Convolutional Neural Networks
In this course, discover convolutions and the convolutional neural networks used in Data and Machine Learning. We introduce the concept of a tensor, which is essential for everything that follows.
Learn to work with the right kind of data, such as images. Images store their information in pixels, but you will discover that it is not the value of each individual pixel that matters.
- Understand why convolutional neural networks are fundamental to Data and Machine Learning.
- It is recommended to complete the Introduction to Data and Machine Learning course before starting.
Hey guys, welcome back. In this video we are going to look at the capabilities of NumPy for dealing with tensors. First of all, we're going to create a few tensors. This is an order-four tensor A, where we've chosen random integers between zero and nine and given it a shape of two, three, four, five. So there are two objects along the outermost axis, then three along the second axis, four along the third axis, and five along the innermost axis, okay? We can access a single element: index zero puts us in the first block, one in the second little rectangle here, zero in the first row, and then three, the fourth element along the innermost axis, so it should be the number eight. Let's see, and it's eight. So, as I said, we can access a single element by giving the indices along each axis, separated by commas, inside square brackets. B is an order-two tensor, so it's a matrix.
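The steps above can be sketched like this (the seed is my addition for reproducibility, so the element values will differ from the video's):

```python
import numpy as np

np.random.seed(0)  # my addition, for reproducibility; not set in the video
A = np.random.randint(0, 10, size=(2, 3, 4, 5))  # order-4 tensor
B = np.random.randint(0, 10, size=(2, 3))        # order-2 tensor, i.e. a matrix

print(A.shape)        # (2, 3, 4, 5)
print(A[0, 1, 0, 3])  # one index per axis selects a single element
```

Indexing with one integer per axis, separated by commas inside a single pair of square brackets, is the NumPy idiom the video relies on.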
We can create a random colored image by creating a random integer tensor. This gives me a random image of four-by-four pixels with three color channels, where each number is between zero and 255, with the data type of unsigned 8-bit integer. So this is our image; let's see what it looks like. We're gonna do a subplot, four subplots all together, where the first plot is the image itself, and then on each of the three other subplots we plot a channel: red, green and blue. And remember, this image doesn't really have colors; we defined a random tensor with three channels, and we're just calling them red, green and blue because that's the convention. Okay, so here they are. This is our random image, and these are the three channels that, combined, give it.
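A sketch of the image construction and the four-subplot layout described above (variable names and figure size are my assumptions, not necessarily what the video uses):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so this sketch also runs without a display
import matplotlib.pyplot as plt

# a random 4x4 "image" with three channels, values in [0, 255], unsigned 8-bit
img = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)

# one subplot for the combined image, one per channel
fig, axes = plt.subplots(1, 4, figsize=(12, 3))
axes[0].imshow(img)
axes[0].set_title("combined")
for i, name in enumerate(["red", "green", "blue"]):
    axes[i + 1].imshow(img[:, :, i], cmap="gray")
    axes[i + 1].set_title(name)
plt.show()
```

Slicing `img[:, :, i]` pulls out one channel as a 4x4 matrix, which is why each channel can be shown as a grayscale plot.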
So let's make it a little smaller so you all can see. Okay. Voila. So here's the combined image: you can see that this square looks teal-ish because green and blue are both strong there, this square is almost all red with a little blue, and this square is predominantly blue, with the other two channels, red and green, small. Okay, so this is how images work, and this is how we will pass images through our convolutional neural networks. Let's see what operations we can do with tensors. Well, since they are arrays, we can definitely multiply them by a number, like we've seen before. Nothing new here. And we can definitely sum tensors that have the same shape. So A + A is an element-wise sum of the tensor A with itself. What's interesting is that there's an extension of the dot product, called tensordot, that allows us to do a dot multiplication of tensors by specifying which axes we are contracting on. So, if I do tensordot of A with B along axes zero and one of the first tensor and axes zero and one of the second tensor, we obtain a matrix; we've contracted over two axes. Let's go back and look at the shapes of A and B. The shape of A is two, three, four, five. The shape of B is two, three.
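A sketch of the element-wise operations and the two-axis contraction just described (variable names are my assumptions):

```python
import numpy as np

A = np.random.randint(0, 10, size=(2, 3, 4, 5))
B = np.random.randint(0, 10, size=(2, 3))

# element-wise operations keep the shape
assert (2 * A).shape == (2, 3, 4, 5)
assert np.array_equal(A + A, 2 * A)

# contract axes 0 and 1 of A with axes 0 and 1 of B:
# (2, 3, 4, 5) . (2, 3) -> (4, 5), a matrix
C = np.tensordot(A, B, axes=([0, 1], [0, 1]))
print(C.shape)  # (4, 5)
```

Contracting over two axes removes both from A and both from B, leaving only A's last two axes of sizes four and five.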
So, we are contracting the tensors A and B, and by contracting I mean we do a dot product along axis zero, which we can do because both tensors have the same number of elements there, and along axis one. So we multiply and sum along both axes, and we are left with only two axes: one of size four and one of size five. If, vice versa, we do the contraction, the dot product, only along axis zero, then can you guess how many axes we will be left with? Let's see, I'll do it. If you thought we were gonna have three axes, that's incorrect, because we also have one axis left from B. By contracting on these two, we are left with this three, four, five from A, and this three at the end from B. This is because we've done a contraction with B, so the three remaining axes of A come first and the last axis is the remaining axis of B. Okay, so we've done a few operations with tensors and shown that NumPy can handle these operations very easily. Thank you for watching and see you in the next video.
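The single-axis contraction described above can be sketched as follows (variable names are my assumptions):

```python
import numpy as np

A = np.random.randint(0, 10, size=(2, 3, 4, 5))
B = np.random.randint(0, 10, size=(2, 3))

# contract only along axis 0: the remaining axes of A (3, 4, 5) come first,
# then the remaining axis of B (3) is appended at the end
D = np.tensordot(A, B, axes=([0], [0]))
print(D.shape)  # (3, 4, 5, 3)
```

This is why the result has four axes rather than three: one axis is consumed from each tensor, and A still has three axes left while B has one.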
I am a Data Science consultant and trainer. With Catalit I help companies acquire skills and knowledge in data science and harness machine learning and deep learning to reach their goals. With Data Weekends I train people in machine learning, deep learning and big data analytics. I served as lead instructor in Data Science at General Assembly and The Data Incubator, and I was Chief Data Officer and co-founder at Spire, a Y-Combinator-backed startup that invented the first consumer wearable device capable of continuously tracking respiration and activity. I earned a joint PhD in biophysics at the University of Padua and Université de Paris VI and graduated from the Singularity University summer program of 2011.