Learn about the importance of gradient descent and backpropagation, under the umbrella of Data and Machine Learning, from Cloud Academy.
From the internals of a neural network to the kinds of problems neural networks can and cannot solve, this course covers the essentials needed to succeed in machine learning.
- Understand the importance of gradient descent and backpropagation
- Be able to build your own neural network by the end of the course
- It is recommended that you complete the Introduction to Data and Machine Learning course before starting.
About the Author
I am a Data Science consultant and trainer. With Catalit I help companies acquire skills and knowledge in data science and harness machine learning and deep learning to reach their goals. With Data Weekends I train people in machine learning, deep learning and big data analytics. I served as lead instructor in Data Science at General Assembly and The Data Incubator and I was Chief Data Officer and co-founder at Spire, a Y-Combinator-backed startup that invented the first consumer wearable device capable of continuously tracking respiration and activity. I earned a joint PhD in biophysics at University of Padua and Université de Paris VI and graduated from Singularity University summer program of 2011.
Hello, and welcome to section five on gradient descent. In the last section, we met the perceptron, the activation functions, the weights, and all the other ingredients that form the simplest neural network. In this section, we will learn about gradient descent and backpropagation. This may sound a bit technical, but I'll make it as intuitive as possible. And it's important that you learn about these things for two reasons. First of all, knowing about the internals of a neural net will allow you to demystify it a little bit. It will cease to be magic, and you will know exactly how it works. You will see that it's actually not that complicated.
This will help you understand which problems can be solved with a neural network and which cannot, and so it's important to know about this. The second reason is that by knowing how neural networks work internally, you are better able to choose which parameters and which optimizers to tune when you're trying to improve the performance of your model. At the end of the section, we will build our second neural network model and use it to solve a slightly more complex problem. So let's get started with section five on gradient descent.
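As a preview of what this section covers, here is a minimal sketch of the core idea of gradient descent: repeatedly step a parameter in the direction opposite its gradient to reduce a loss. The loss function, its derivative, the starting point, and the learning rate below are illustrative choices of mine, not taken from the course.

```python
# Minimal gradient descent sketch: minimize the loss f(w) = (w - 3)^2,
# whose minimum sits at w = 3. All values here are illustrative.

def loss(w):
    return (w - 3.0) ** 2

def gradient(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2.0 * (w - 3.0)

w = 0.0              # initial guess for the parameter
learning_rate = 0.1  # step size (a tunable hyperparameter)

for step in range(100):
    # Move w a small step opposite the gradient, lowering the loss.
    w -= learning_rate * gradient(w)

print(w)  # w has moved very close to 3, the minimizer of the loss
```

Backpropagation, covered later in the section, is the procedure that computes these gradients efficiently for every weight in a multi-layer network, so the same update rule can be applied to all of them at once.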