9 – Calculating The Gradient 1

Okay. So, now we’ll do the same thing as we did before, updating the weights in the neural network to better classify our points. But we’re going to do it formally, so fasten your seat belts because math is coming. On your left, you have a single perceptron with the input vector, the weights and …
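
The excerpt cuts off before the math, but the destination is worth previewing: for a single sigmoid perceptron with cross-entropy error, the gradient simplifies so that each weight is nudged by the prediction error times its input. A minimal sketch under those assumptions (the name gradient_step and the learn_rate value are mine):

    import numpy as np

    def sigmoid(t):
        return 1 / (1 + np.exp(-t))

    # For a sigmoid output with cross-entropy error, the gradient of the
    # error with respect to the weights works out to (y_hat - y) * x.
    def gradient_step(x, y, w, b, learn_rate=0.1):
        y_hat = sigmoid(np.dot(w, x) + b)
        w = w - learn_rate * (y_hat - y) * x
        b = b - learn_rate * (y_hat - y)
        return w, b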

8 – Backpropagation

So now we’re finally ready to get our hands dirty training a neural network. So let’s quickly recall feedforward. We have our perceptron with a point coming in labeled positive, and our equation w1x1 + w2x2 + b, where w1 and w2 are the weights and b is the bias. Now, what the perceptron does …
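
As a concrete recap of that feedforward step, here is a minimal sketch of a perceptron that scores a point with w1x1 + w2x2 + b and predicts positive when the score is non-negative (the weight, bias, and point values are hypothetical):

    def perceptron_prediction(x1, x2, w1, w2, b):
        score = w1 * x1 + w2 * x2 + b   # the linear equation from the video
        return 1 if score >= 0 else 0   # step function: positive or negative

    # A hypothetical point labeled positive, with hypothetical weights.
    print(perceptron_prediction(1.0, 2.0, w1=0.5, w2=-0.2, b=0.1))  # -> 1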

7 – Neural Network Error Function

So, our goal is to train our neural network. In order to do this, we have to define the error function. So, let’s look again at what the error function was for perceptrons. So, here’s our perceptron. On the left, we have our input vector with entries x_1 up to x_n, and one for the …
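
The excerpt stops before the error function is written down. A minimal sketch, assuming the cross-entropy (log-loss) error that is typically paired with sigmoid outputs:

    import numpy as np

    # Cross-entropy error for a single label y in {0, 1} and a predicted
    # probability y_hat in (0, 1): confident wrong answers cost the most.
    def cross_entropy(y, y_hat):
        return -y * np.log(y_hat) - (1 - y) * np.log(1 - y_hat)

    print(cross_entropy(1, 0.9))  # ~0.105: confident and correct
    print(cross_entropy(1, 0.1))  # ~2.303: confident and wrong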

6 – Feedforward

So now that we have defined what neural networks are, we need to learn how to train them. Training them really means figuring out what parameters they should have on their edges in order to model our data well. So in order to learn how to train them, we need to look carefully at how they process …
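
To preview how that processing looks, here is a minimal feedforward sketch for a hypothetical network with two inputs, two hidden units, and one output; all shapes and values below are my own assumptions:

    import numpy as np

    def sigmoid(t):
        return 1 / (1 + np.exp(-t))

    def feedforward(x, W1, b1, W2, b2):
        hidden = sigmoid(W1 @ x + b1)     # hidden layer: two linear models
        return sigmoid(W2 @ hidden + b2)  # output layer: final probability

    x = np.array([0.5, -1.0])
    W1 = np.array([[0.4, -0.6], [-0.3, 0.8]])
    b1 = np.array([0.1, -0.2])
    W2 = np.array([0.5, 0.5])
    print(feedforward(x, W1, b1, W2, b2=0.0))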

5 – Multiclass Classification

We briefly mentioned multi-class classification in the last video, but let me be more specific. It seems that neural networks work really well when the problem consists of classifying between two classes. For example, if the model predicts a probability of receiving a gift or not, then the answer just comes as the output of the …
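
For more than two classes, the standard move is the softmax function, which turns a vector of raw scores into a probability distribution. A minimal sketch:

    import numpy as np

    # Exponentiate each score, then normalize so the results are positive
    # and sum to 1. Subtracting the max keeps the exponentials stable.
    def softmax(scores):
        exp = np.exp(scores - np.max(scores))
        return exp / exp.sum()

    print(softmax(np.array([2.0, 1.0, 0.1])))  # probabilities over 3 classes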

4 – Layers

Neural networks have a certain special architecture with layers. The first layer is called the input layer, which contains the inputs, in this case, x1 and x2. The next layer is called the hidden layer, which is a set of linear models created with this first input layer. And then the final layer is called …
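
Spelled out for this two-input case, with every weight below hypothetical: each hidden unit is a linear model on x1 and x2, and the output layer is one more model built on the hidden results.

    import math

    def sigmoid(t):
        return 1 / (1 + math.exp(-t))

    x1, x2 = 1.0, -1.0                        # input layer
    h1 = sigmoid(0.4 * x1 - 0.6 * x2 + 0.1)   # hidden layer, first model
    h2 = sigmoid(-0.3 * x1 + 0.8 * x2 - 0.2)  # hidden layer, second model
    out = sigmoid(0.5 * h1 + 0.5 * h2 - 0.1)  # output layer combines h1, h2
    print(out)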

3 – Neural Network Architecture 2

So in the previous session we learned that we can add two linear models to obtain a third model. As a matter of fact, we did even more. We can take a linear combination of two models. So, the first model times a constant plus the second model times a constant plus a bias, and …
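
Written out, with m1(x) and m2(x) as the two linear models and c1, c2, b as the constants and bias from the excerpt:

    \[
      m(x) \;=\; c_1\, m_1(x) + c_2\, m_2(x) + b
    \]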

2 – Combining Models

Now I’m going to show you how to create these nonlinear models. What we’re going to do is a very simple trick. We’re going to combine two linear models into a nonlinear model as follows. Visually it looks like this: the two models superimposed create the model on the right. It’s almost like we’re …
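
A worked example with hypothetical outputs: if the two linear models give 0.7 and 0.8 for some point, the combined model adds them and applies the sigmoid to land back between 0 and 1:

    \[
      \sigma(0.7 + 0.8) \;=\; \sigma(1.5) \;\approx\; 0.82
    \]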

11 – Calculating The Gradient 2

So, let us go back to our neural network with our weights and our input. And recall that the weights with superscript 1 belong to the first layer, and the weights with superscript 2 belong to the second layer. Also, recall that the bias is not called b anymore. Now, it is called W31, W32 …
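
One way to read that notation, assuming a network with two inputs and two hidden units: the biases become the third row of the first-layer weight matrix, matched to a constant input of 1.

    \[
      W^{(1)} =
      \begin{pmatrix}
        W^{(1)}_{11} & W^{(1)}_{12} \\
        W^{(1)}_{21} & W^{(1)}_{22} \\
        W^{(1)}_{31} & W^{(1)}_{32}
      \end{pmatrix}
      \qquad \text{(last row: the biases, paired with } x_3 = 1\text{)}
    \]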

10 – Chain Rule

So before we start calculating derivatives, let’s do a refresher on the chain rule, which is the main technique we’ll use to calculate them. The chain rule says, if you have a variable x and a function f that you apply to x to get f of x, which we’re gonna call A, and then …
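
The sentence truncates, but the statement it is building toward is standard; my assumption for the continuation is a second function g applied to A:

    \[
      A = f(x), \quad B = g(A)
      \;\;\Longrightarrow\;\;
      \frac{\partial B}{\partial x} = \frac{\partial B}{\partial A} \cdot \frac{\partial A}{\partial x}
    \]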

1 – Why Neural Networks

So you may be wondering why these objects are called neural networks. Well, the reason why they’re called neural networks is because perceptrons kind of look like neurons in the brain. On the left we have a perceptron with four inputs. The numbers are one, zero, four, and minus two. And what the perceptron does, …
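
To make that concrete, a minimal sketch using the four inputs from the excerpt; the weights, bias, and the yes/no rule are hypothetical, since the excerpt cuts off before giving the perceptron’s actual equation:

    # Inputs from the excerpt: one, zero, four, and minus two.
    inputs = [1, 0, 4, -2]
    weights = [1, 1, 1, 1]  # hypothetical weights
    bias = 0                # hypothetical bias

    score = sum(w * x for w, x in zip(weights, inputs)) + bias  # 1+0+4-2 = 3
    print("outputs yes" if score >= 0 else "outputs no")        # -> outputs yes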
