Learn about the importance of gradient descent and backpropagation, under the umbrella of Data and Machine Learning, from Cloud Academy.
From the internals of a neural network to solving practical problems with one, this course covers the essentials needed to succeed in machine learning.
- Understand the importance of gradient descent and backpropagation
- Be able to build your own neural network by the end of the course
- Prerequisite: it is recommended that you complete the Introduction to Data and Machine Learning course before starting this one.
Hey guys, welcome back. In this video, we're going to look a bit closer at NumPy. NumPy is the library in Python that lets us work with multi-dimensional arrays. We'll use it a lot in the next chapters, so I think this is the right moment to familiarize ourselves with it a bit more than just knowing how to create an array. I'll go through a few commands in this short video. First of all, I'm going to import the usual packages, and then I'm going to create an array. This array has four elements, and if I check the type, it's of type numpy.ndarray. Okay, I'm going to create a few more arrays, all two-dimensional, so they're matrices, and I'm going to check their shapes: the first is two rows by three columns, the second is three by two, and the third is six by two.
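A sketch of the setup described above. The transcript only states the shapes, so the actual values below are illustrative fillers (chosen to match the indexing results mentioned later in the video):

```python
import numpy as np

# A one-dimensional array with four elements
a = np.array([1, 2, 3, 4])
print(type(a))   # <class 'numpy.ndarray'>

# Three two-dimensional arrays (matrices) with the shapes from the video;
# the values here are illustrative, not the instructor's exact ones
A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2 rows, 3 columns
B = np.array([[0, 1],
              [2, 3],
              [4, 5]])      # 3 rows, 2 columns
C = np.array([[0, 1],
              [2, 3],
              [4, 5],
              [0, 1],
              [2, 3],
              [4, 5]])      # 6 rows, 2 columns

print(A.shape)   # (2, 3)
print(B.shape)   # (3, 2)
print(C.shape)   # (6, 2)
```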
I got that through the shape property of each array. Shape tells us how many rows and how many columns there are in the array; if the array has more than two dimensions, say three, the shape will contain a comma and another number. We can access the elements of an array with square brackets and an index. So for example, A[0]: what do you think that's going to give us? Will it be the first row, or the first column?
Let's see: A[0] is the first row. The reason is that the array is treated like a nested list, so we're getting the first element of the outermost list. I'll reformat it so you can see it better: first element, second element of this list. So, yes, we just got the first element. Since the array is like a table, we can also access individual elements by giving the row and the column. So if we do C[2, 0], it should be equal to four. Let's see: yes, it's four. And we can grab all the elements in the first column with the colon operator, which says: for every row, give us the first element. For B, this is zero, two, four.
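The indexing examples can be reproduced like this (using the same illustrative values as before, picked so that C's element at row 2, column 0 is four and B's first column is 0, 2, 4):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[0, 1],
              [2, 3],
              [4, 5]])
C = np.array([[0, 1],
              [2, 3],
              [4, 5],
              [0, 1],
              [2, 3],
              [4, 5]])

# A 2-D array behaves like a nested list, so a single index
# returns a whole row (the first element of the outer list)
print(A[0])      # [1 2 3]

# Row index, then column index, picks out a single element
print(C[2, 0])   # 4

# The colon means "every row": this grabs the whole first column
print(B[:, 0])   # [0 2 4]
```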
Okay, great. NumPy is smart in the way it does operations. If we take an array and multiply it by a number, what do you think is going to happen? NumPy multiplies each of the elements of A by that number. So the number three has been broadcast to all the elements of A. In the same way, if we do a sum, it's an element-wise sum: we sum the first element of one array with the first element of the other, and so on. In this case, since both operands are A, it's like multiplying by two, but the sum is done element by element, which is what you would expect for a matrix sum. If we do the product, though, it's the element-wise product.
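A minimal sketch of broadcasting and the element-wise operations, assuming the same illustrative A as above:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

# The scalar 3 is broadcast to every element
print(A * 3)   # [[ 3  6  9]
               #  [12 15 18]]

# Element-wise sum: here it's the same as multiplying by two
print(A + A)   # [[ 2  4  6]
               #  [ 8 10 12]]

# Element-wise product (NOT the matrix product)
print(A * A)   # [[ 1  4  9]
               #  [16 25 36]]

# Division and subtraction are element-wise too
print(A / A)   # all ones
print(A - A)   # all zeros
```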
So, this is the first element times the first element, the second element times the second element, and so on. Same for the division: we get one everywhere. And for the subtraction, we get zero everywhere. So basically, all four operations work element by element, which means we cannot do a sum with two arrays of different shapes. See, we got "operands could not be broadcast together" because the shapes, (2, 3) and (3, 2), are different. If you get a ValueError, just check the shapes; this is a common problem when you try to sum two arrays that don't have the same shape. The same is true for the element-wise product.
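The shape-mismatch error above can be reproduced by catching the ValueError, assuming the same illustrative A and B:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # shape (2, 3)
B = np.array([[0, 1],
              [2, 3],
              [4, 5]])      # shape (3, 2)

# Element-wise operations need matching shapes, so this fails
try:
    A + B
except ValueError as e:
    print(e)   # operands could not be broadcast together with shapes (2,3) (3,2)
```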
What you can do with matrices and vectors of different shapes is an operation called the dot product. Since A and B have shapes (2, 3) and (3, 2), we can do the dot product A.dot(B), but also B.dot(A). A.dot(B) should give us a two by two; let's see: yes, it's a two by two. I can also do it with np.dot, which is the same thing, so I can write A.dot(B) or np.dot(A, B). And I can do B.dot(A), which gives me a three by three. Okay, what about C.dot(A)? Let's check the shapes first: C's shape is (6, 2) and A's shape is (2, 3), so it should work, and we should get a six by three matrix, which is what we get.
But A.dot(C) won't work, because three and six do not match, so we cannot take the dot product. Remember: for a dot product, the last dimension of the first array and the first dimension of the second array always need to match. If we do A.dot(C), NumPy complains again with a ValueError that the shapes are not aligned: the size three here is not the same as the size six, as it should be. Okay, these were a few operations with arrays in NumPy. I hope it's useful; we use NumPy a lot, and I feel that things we take for granted are sometimes actually useful to explain. I hope you found something new in this video. Thank you for watching, and I'll see you in the next video.
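As a final check of the shape rule above, the failing case from the video can be reproduced like this (same illustrative A and C):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # (2, 3)
C = np.array([[0, 1],
              [2, 3],
              [4, 5],
              [0, 1],
              [2, 3],
              [4, 5]])      # (6, 2)

# (2, 3) . (6, 2): the inner dimensions 3 and 6 don't match
try:
    A.dot(C)
except ValueError as e:
    print(e)   # shapes (2,3) and (6,2) not aligned: 3 (dim 1) != 6 (dim 0)
```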
About the Author
I am a Data Science consultant and trainer. With Catalit I help companies acquire skills and knowledge in data science and harness machine learning and deep learning to reach their goals. With Data Weekends I train people in machine learning, deep learning and big data analytics. I served as lead instructor in Data Science at General Assembly and The Data Incubator, and I was Chief Data Officer and co-founder at Spire, a Y-Combinator-backed startup that invented the first consumer wearable device capable of continuously tracking respiration and activity. I earned a joint PhD in biophysics from the University of Padua and Université de Paris VI, and graduated from the Singularity University summer program in 2011.