Continue your journey into data and machine learning with this course from Cloud Academy.
Previous courses covered the core principles and foundations of data and machine learning and explained best practices.
This course gives an informative introduction to deep learning and introduces neural networks.
This course is made up of 12 expertly instructed lectures along with 4 exercises and their respective solutions.
Please note: the Pima Indians Diabetes dataset can be found at this GitHub repository or at the Kaggle page mentioned throughout the course.
- Understand the core principles of deep learning
- Be able to build, train, and evaluate neural networks
- It would be advisable to complete the Intro to Data and Machine Learning course before starting.
Hey guys, welcome back! In this video, we'll look at deep learning for the first time and we're gonna build some shallow and deeper neural networks. So, let's get started. Load the usual libraries, and then let's build the first neural network. I'll create some data with make_moons, a helper function from scikit-learn's datasets module. This is fake data, it doesn't represent anything real, but the interesting thing is that the two classes are not separable with a line, with a straight boundary, so our model needs to be smart enough to accommodate a boundary that goes around this shape. And so, this pretty much rules out logistic regression, as we shall see. In our input data we have 1,000 points and each of them has two features, one and two. We split our data into training and test sets, choosing a test size of 0.3, that is 30%, and here I've set the random state so everybody gets the same results. The random state, which could be any value, sets the seed of the random number generator so that we all get the same random split. Great, and then we import the Sequential model API, the Dense layer, and a couple of optimizers from Keras.
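The setup just described can be sketched as follows; the noise level passed to make_moons is an assumption, since the lecture doesn't state it:

```python
# Generate the two-moons dataset and split it, as described above.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

# 1,000 points, 2 features each; noise=0.1 (assumed) makes the moons overlap slightly
X, y = make_moons(n_samples=1000, noise=0.1, random_state=0)

# hold out 30% for testing; fixing random_state seeds the generator
# so everybody gets the same split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

print(X.shape)        # (1000, 2)
print(X_train.shape)  # (700, 2)
print(X_test.shape)   # (300, 2)
```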
Okay so, the shallow model is logistic regression. We're just going to test that; I won't go through it in detail because it's the same thing we've done in the previous chapters. So, just build the logistic regression model, train it for a few hundred epochs, and when it's done, we evaluate the result on the test set: just run model.evaluate and check the accuracy score. It's the second number in the results; if we print the results we see two numbers, the first is the 'binary_crossentropy' loss and the second is the 'accuracy', so we take the second one and we get 84%. Okay, that's better than the 50% benchmark. I'll leave it as an exercise, but if you count how many values of y are one or zero, you'll see that it's half and half: 500 points are one and 500 points are zero.
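A minimal sketch of that shallow model: a single sigmoid unit in Keras, which is equivalent to logistic regression. The optimizer choice ('sgd'), epoch count, and make_moons noise level are assumptions, since the lecture doesn't pin them down:

```python
import tensorflow as tf
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

tf.random.set_seed(0)  # reproducible weight initialization
X, y = make_moons(n_samples=1000, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# one Dense unit with a sigmoid activation = logistic regression
model = Sequential([Input(shape=(2,)),
                    Dense(1, activation='sigmoid')])
model.compile(optimizer='sgd', loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train, epochs=200, verbose=0)

# evaluate returns [loss, accuracy] in that order
results = model.evaluate(X_test, y_test, verbose=0)
print(results)
```

Note that the accuracy is the second entry of `results`, exactly as pointed out in the lecture.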
Okay, so we're doing better than the benchmark, but if we plot the decision boundary like we've done other times, we'll see that essentially our model is drawing a straight line. It's classifying all the data on one side as red crosses and all the data on the other side as blue dots. So, it's doing okay, but it's not really learning what we wanted, which was a curved boundary. We probably have to step up and build a deeper model. Now that you know a bit more about deep learning, you know that we build deep models by essentially stacking layers of logistic regressions with many nodes, or layers with different activation functions, but the idea is the same: we take a weighted combination of the outputs of the previous layer and then apply a non-linear activation function.
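The decision-boundary plot works by evaluating the model on a dense grid over the feature space. The helper name below is hypothetical (the lecture's own helper isn't shown), and scikit-learn's LogisticRegression stands in for the Keras model to keep the sketch light; any object with a `.predict` method over 2-D points would work the same way:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # render off-screen
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression

X, y = make_moons(n_samples=1000, noise=0.1, random_state=0)
clf = LogisticRegression().fit(X, y)  # stand-in for the shallow Keras model

def plot_decision_boundary(model, X, y, steps=200):
    # evaluate the model on a dense grid covering the data range
    x_min, x_max = X[:, 0].min() - 0.5, X[:, 0].max() + 0.5
    y_min, y_max = X[:, 1].min() - 0.5, X[:, 1].max() + 0.5
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, steps),
                         np.linspace(y_min, y_max, steps))
    grid = np.c_[xx.ravel(), yy.ravel()]
    zz = model.predict(grid).reshape(xx.shape)
    # shade the predicted class regions, then overlay the data
    plt.contourf(xx, yy, zz, alpha=0.3)
    plt.scatter(X[:, 0], X[:, 1], c=y, s=10)
    return zz

zz = plot_decision_boundary(clf, X, y)
print(zz.shape)  # one prediction per grid cell
```

For the linear model, the shaded regions meet along a straight line, which is exactly what the lecture observes.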
We'll go into detail about activation functions and what the options are later in this section. For now, take it as a fact that we can apply different activation functions, and later we'll see what they are. So, we build our model: we need to make sure the input has two features, then we have four nodes in the first inner layer, two nodes in the second inner layer, and one node as output with the 'sigmoid' activation. 'binary_crossentropy' is the loss, and we also track the 'accuracy', so we compile the model and fit it on the training set for 100 epochs. Call evaluate and wow, now it's behaving much better than the previous one: we got 100% accuracy on the test set, which is great. You can check the confusion matrix and the accuracy score: accuracy on the training set is 99.9%, accuracy on the test set is 100%, and if we plot the decision boundary, we see that our model has learned this neat, curved boundary.
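A sketch of that deeper model, with the 4-node and 2-node inner layers and a sigmoid output. The hidden-layer activation ('tanh'), the optimizer ('adam'), and the make_moons noise level are assumptions; the lecture only says the hidden activations are non-linear:

```python
import tensorflow as tf
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

tf.random.set_seed(0)  # reproducible weight initialization
X, y = make_moons(n_samples=1000, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

model = Sequential([
    Input(shape=(2,)),              # two input features
    Dense(4, activation='tanh'),    # first inner layer: 4 nodes
    Dense(2, activation='tanh'),    # second inner layer: 2 nodes
    Dense(1, activation='sigmoid')  # output: probability of class 1
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train, epochs=100, verbose=0)

loss, acc = model.evaluate(X_test, y_test, verbose=0)
print(acc)  # well above the shallow model's score once the boundary curves
```

The only structural change from the shallow model is the two stacked hidden layers, yet that is enough for the decision boundary to bend around the moons.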
There's this point which, you know, it's expected that the model would get wrong, because there are two points overlapping, so there's no way the model is going to separate them, but that's just how my dataset was generated; yours may be different. The nice thing is, our deep model essentially learned an arbitrary boundary between the two classes. Awesome, so we've built our second deep neural net, like the one we built in the very first chapter although we didn't know what it was, and I hope this starts to show you that building neural networks is actually quite easy. So, thank you for watching and see you in the next video.
I am a Data Science consultant and trainer. With Catalit I help companies acquire skills and knowledge in data science and harness machine learning and deep learning to reach their goals. With Data Weekends I train people in machine learning, deep learning, and big data analytics. I served as lead instructor in Data Science at General Assembly and The Data Incubator, and I was Chief Data Officer and co-founder at Spire, a Y-Combinator-backed startup that invented the first consumer wearable device capable of continuously tracking respiration and activity. I earned a joint PhD in biophysics at the University of Padua and Université de Paris VI, and graduated from the Singularity University summer program in 2011.