Learn about the importance of gradient descent and backpropagation, under the umbrella of Data and Machine Learning, from Cloud Academy.
From the internals of a neural net to solving real problems with neural networks, this course expertly covers the essentials needed to succeed in machine learning.
- Understand the importance of gradient descent and backpropagation
- Be able to build your own neural network by the end of the course
- It is recommended to complete the Introduction to Data and Machine Learning course before starting.
Hey guys, in this video I'm going to show you how to use TensorBoard, an amazing piece of software from the team at Google that developed TensorFlow. The first thing we're going to do is start a terminal: if you go to the main Jupyter notebook home page, there's a New button, and you can start a terminal from there. This is a bash terminal you can use to type any command on your system, so we're going to run the command `tensorboard --logdir=`, followed by the folder you passed to your TensorBoard callback.
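As a sketch, here is how the Keras side of this might look — the layer sizes, the log directory path, and the variable names below are illustrative assumptions, not taken from the video:

```python
# Minimal sketch of wiring up the Keras TensorBoard callback
# (layer sizes and the log path are example assumptions).
from tensorflow import keras

log_dir = '/tmp/tensorboard_logs'  # example path

model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(512, activation='relu'),
    keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# The callback's log_dir is the same folder you then hand to the
# command line:  tensorboard --logdir=/tmp/tensorboard_logs
tb = keras.callbacks.TensorBoard(log_dir=log_dir)
# model.fit(x_train, y_train, validation_split=0.1, callbacks=[tb])
```

The `fit` call is commented out because it needs training data; the point is only that the folder given to the callback and the folder given to the `tensorboard` command must match.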
If you execute this command, it will launch a little server on your local address on port 6006, so we can go and have a look at that in an already open tab. You can see we have this very nice interface with a lot of interesting things. The first thing we're going to look at shows us the history of our metrics: we have the accuracy and the loss for the training, and we also have the accuracy and the loss for the validation, which is awesome because we don't have to plot it ourselves. And if we had run several trainings, we would see them displayed in different colors in the chart. So the Scalars summary is already very interesting.
The other interesting tab is the Graphs tab, where we can explore the model we've built as a graph. Here we have our dense layers, dense_1 and so on, and they're all clickable, so we can expand a layer and see what operations it performs: there are the weights and the bias, the input is multiplied by the weights and then added to the bias, and then we apply a softmax. So you can really explore your whole model visually, as well as all the additional operations, for example the gradients. There are a lot of other nodes here that I'm not going to go into, but as you become more proficient with TensorFlow, this is an interesting application to explore.
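The operations the graph displays for a dense layer can be sketched numerically with NumPy (the shapes below are illustrative assumptions): the input is multiplied by the weights, the bias is added, and softmax is applied to the result.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))   # one input with 4 features (illustrative)
W = rng.normal(size=(4, 3))   # weights of a dense layer with 3 units
b = np.zeros(3)               # bias

# Exactly the chain of ops TensorBoard shows inside a dense node:
# matmul -> bias add -> softmax
y = softmax(x @ W + b)
```

The output `y` is a valid probability vector: every entry is positive and the row sums to one, which is what the softmax node guarantees.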
TensorBoard offers other tabs which are currently not used by Keras, but once you are using TensorFlow directly, you'll also be able to export images and audio and look at the distributions of the weights. For now we use it to monitor the progress of our training, but in the future we may use it for other things. So thank you for watching, and see you in the next video.
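For instance, when using TensorFlow directly, an image summary can be written with the `tf.summary` API (shown here in TF 2.x style; the log directory, the tag name, and the fake image are example assumptions):

```python
import numpy as np
import tensorflow as tf

log_dir = '/tmp/tb_image_demo'  # example path
writer = tf.summary.create_file_writer(log_dir)

with writer.as_default():
    # A batch of one fake 28x28 grayscale image, purely for illustration;
    # in practice you would log a real input or activation.
    img = np.random.rand(1, 28, 28, 1).astype('float32')
    tf.summary.image('sample_input', img, step=0)
writer.flush()
```

Pointing `tensorboard --logdir` at that folder then makes the image appear under the Images tab.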
I am a Data Science consultant and trainer. With Catalit I help companies acquire skills and knowledge in data science and harness machine learning and deep learning to reach their goals. With Data Weekends I train people in machine learning, deep learning, and big data analytics. I served as lead instructor in Data Science at General Assembly and The Data Incubator, and I was Chief Data Officer and co-founder at Spire, a Y-Combinator-backed startup that invented the first consumer wearable device capable of continuously tracking respiration and activity. I earned a joint PhD in biophysics at the University of Padua and Université de Paris VI, and I graduated from the Singularity University summer program of 2011.