Getting Started With Deep Learning: Recurrent Neural Networks

# Exercise 1: Solution

## The course is part of these learning paths

AWS Machine Learning – Specialty Certification Preparation
## Overview

- Difficulty: Beginner
- Duration: 45m
- Students: 368
- Rating: 4.4/5

## Description

From the internals of a neural network to solving real problems with neural networks, this course covers the essentials needed to succeed in machine learning.

This course moves on from cloud computing power and covers Recurrent Neural Networks. Learn how to use recurrent neural networks to train more complex models.

Understand how models are built to allow us to treat data that comes in sequences. Examples of this could include unstructured text, music, and even movies.

This course comprises 9 lectures with 2 accompanying exercises.

## Learning Objectives

• Understand how recurrent neural network models are built
• Learn the various applications of recurrent neural networks

## Prerequisites

## Transcript

Hello and welcome to the solution of exercise one in section eight. Exercise one asked us to reshape the data as a sequence of 12 values, each being a single number. This is pretty easy: we use the reshape method on a NumPy array. And as you can see, our X_train tensor now has a shape of (228, 12, 1): 228 samples, 12 timesteps, and one feature per timestep. So we can pass this to our previous Keras LSTM model. The only thing we need to change is the shape of the input. We do that, and our model now has fewer parameters, because the input size is one and not 12.
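The reshape step and the parameter-count effect can be sketched as follows. The (228, 12, 1) shape comes from the transcript; the layer size of 6 units and the `lstm_params` helper are illustrative assumptions, not the course notebook's actual code:

```python
import numpy as np

# Assumed data: 228 training windows of 12 monthly values each
X_train = np.arange(228 * 12, dtype=float).reshape(228, 12)

# Reshape to (samples, timesteps, features): 12 timesteps of one number each
X_train = X_train.reshape(228, 12, 1)
assert X_train.shape == (228, 12, 1)

# Why the model shrinks: a Keras LSTM layer with `units` cells and
# `input_dim` input features has 4 * (units * (units + input_dim) + units)
# weights (four gates, each with kernel, recurrent kernel, and bias).
def lstm_params(units, input_dim):
    return 4 * (units * (units + input_dim) + units)

print(lstm_params(6, 12))  # input treated as one 12-value vector per step
print(lstm_params(6, 1))   # input as a sequence of single numbers: fewer params
```

With the sequence reshaped this way, only the model's declared input shape needs to change; the rest of the architecture stays the same.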

So we run that model, again with early stopping, and we'll see how well it does. Okay, our model stopped after nine epochs and we can already see that the loss is a bit higher than the previous losses. So if we plot what it got, we see that it didn't really learn the pattern very well. If we remove the early stopping condition and increase the batch size and run the training for much, much longer, this model does learn to reproduce our data really well and even on the test data it performs very well. So this is an important lesson. LSTMs perform amazingly well on sequences, but they do take a lot of time to train. So whenever you are using an LSTM, be prepared to let the training run for a long time. So thank you for watching and see you in the next video.
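The patience-based early stopping used in the video (Keras provides this as the `EarlyStopping` callback) can be sketched in plain Python. The function name and the sample loss values below are made up for illustration:

```python
def early_stop_epoch(losses, patience=3, min_delta=0.0):
    """Index of the epoch where patience-based early stopping halts training.

    Stops once the loss has failed to improve by at least `min_delta`
    for `patience` consecutive epochs; otherwise runs to the last epoch.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(losses):
        if loss < best - min_delta:
            best = loss   # new best loss: reset the patience counter
            wait = 0
        else:
            wait += 1     # no improvement this epoch
            if wait >= patience:
                return epoch
    return len(losses) - 1

# A loss curve that plateaus triggers the stop shortly after the plateau begins
print(early_stop_epoch([1.0, 0.8, 0.79, 0.79, 0.79], patience=2))  # -> 4
```

This is why removing the condition helps here: a slow-learning LSTM may plateau briefly and get cut off, even though its loss would keep falling in a longer run.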