Exercise 3: Solution
Difficulty: Beginner
Duration: 1h 45m
Students: 1085
Ratings: 4.7/5
Description

Learn about the importance of gradient descent and backpropagation, under the umbrella of Data and Machine Learning, from Cloud Academy.

From the internals of a neural net to solving real problems with neural networks, this course covers the essentials needed to succeed in machine learning.

Learning Objectives

  • Understand the importance of gradient descent and backpropagation
  • Be able to build your own neural network by the end of the course

Prerequisites


Transcript

Hey guys, welcome to the solution of exercise three in section five. In this exercise we're going to explore the Keras functional API, which is way more powerful when you are dealing with more complex networks. But it's a bit more complicated, and so that's why we are starting to use it only now. We are asked to essentially rewrite the model using the functional API, and to do that we are given the documentation here. I hope you had a look at it. We need to import two additional classes, the Input class and the Model class. So we do that, and then here is how we define a model. First of all, we define an input layer.
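Here's a minimal sketch of those two imports and the input layer; the input shape of two features is an assumption for illustration, not necessarily the exercise's exact data.

    # Imports for the functional API: the Input and Model classes,
    # plus Dense for the layers used below.
    from tensorflow.keras.layers import Input, Dense
    from tensorflow.keras.models import Model

    # In the functional API the input layer is defined explicitly.
    # The shape of 2 features is an assumed value for illustration.
    inputs = Input(shape=(2,))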

Remember, in the sequential API we did not define the input layer; we just gave the input shape to the first layer. Then the first layer, this Dense, has exactly the same definition, but we say this dense layer is a function that is applied to the input layer. So x here is the output of the first layer. So the first layer is a function that is applied to the inputs and returns x. Same thing for the second layer: it's a function applied to x, which was the output of the first layer. And we return the same variable name, but it gets overwritten. In the sequential API we were used to adding layers to the Sequential class; here we are mapping the inputs through the layers and then treating the output of each layer as the input of the next. So this API, this way of writing the model, makes it much more explicit what a neural network really is.
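As a sketch of that chaining (the layer sizes and activations here are assumptions, not the course's exact values):

    # Each Dense layer is a callable: apply it to a tensor, get a tensor back.
    # x is deliberately overwritten at each step, as described above.
    x = Dense(64, activation='relu')(inputs)  # first layer, applied to inputs
    x = Dense(32, activation='relu')(x)       # second layer, applied to x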

Which is a composition of functions. We take the output of the second-to-last layer and give it a different name, second_to_last, just so that we can retrieve it later. And finally we have the outputs, which is the output of the layer with three nodes and the softmax activation. Notice that this function, this layer, is applied to second_to_last. So we've built our model; now what we need to do is define the model as a Model with inputs set to the input layer and outputs set to the output of the last layer. Then we compile it, the usual compilation, and we train it for 20 epochs. Notice that I've chosen a slightly bigger batch size this time, and as you can see it behaves exactly the same. The model gets trained; in terms of what's happening behind the scenes it's exactly the same thing. But we've written it in a different way.
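Putting that paragraph together as a hedged sketch: the optimizer, loss, batch size, and the X_train/y_train placeholders are assumptions, not the course's exact choices.

    # Name the second-to-last layer so we can retrieve its output later;
    # two nodes so the learned features can be plotted in 2D.
    second_to_last = Dense(2, activation='relu', name='second_to_last')(x)

    # Output layer: three nodes with softmax, one per class.
    outputs = Dense(3, activation='softmax')(second_to_last)

    # Define the model from the input layer to the last layer's output.
    model = Model(inputs=inputs, outputs=outputs)
    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    # X_train / y_train are hypothetical placeholders; batch_size=64
    # stands in for the "slightly bigger batch size" mentioned above.
    model.fit(X_train, y_train, epochs=20, batch_size=64)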

We've specified the model in a different way that is much richer in terms of flexibility, and so it allows us to specify more complex models. We redefine the feature_function, and as you can see, now that we know the names of the layers we can call them by name. So it's a feature_function between the inputs and the output of the second_to_last layer. Now we can apply it and plot the features. And you see, this time it learned a different representation, but still the two features in the second_to_last layer separate the three classes really, really well. So the functional API is more versatile and more powerful, and I hope you'll like it and use it in your models. Thank you for watching and see you in the next video.
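One way to build such a feature_function is with an auxiliary Model from the same inputs to the second_to_last tensor; this is a sketch, and the X_test / y_test arrays are hypothetical placeholders (the course may use a Keras backend function instead).

    import matplotlib.pyplot as plt

    # Auxiliary model mapping the inputs to the intermediate features.
    feature_function = Model(inputs=inputs, outputs=second_to_last)
    features = feature_function.predict(X_test)  # shape: (n_samples, 2)

    # Scatter the two learned features, colored by class label
    # (y_test assumed to hold integer class indices here).
    plt.scatter(features[:, 0], features[:, 1], c=y_test)
    plt.show()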

About the Author
Students: 8529
Courses: 8
Learning Paths: 8

I am a Data Science consultant and trainer. With Catalit I help companies acquire skills and knowledge in data science and harness machine learning and deep learning to reach their goals. With Data Weekends I train people in machine learning, deep learning and big data analytics. I served as lead instructor in Data Science at General Assembly and The Data Incubator, and I was Chief Data Officer and co-founder at Spire, a Y-Combinator-backed startup that invented the first consumer wearable device capable of continuously tracking respiration and activity. I earned a joint PhD in biophysics at the University of Padua and Université de Paris VI and graduated from the Singularity University summer program of 2011.