## 9 – Mean Absolute Error

In the last video, we learned how to decrease an error function by walking along the negative of its gradient. Now, in this video, we’re going to learn formulas for these error functions. The two most common error functions for linear regression are the mean absolute error and the mean squared error. First, we’ll learn …
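
As a quick sketch in code (the function name here is illustrative, not from the course), the mean absolute error averages the vertical distances between the points and the line’s predictions:

```python
import numpy as np

def mean_absolute_error(y, y_pred):
    # MAE = (1/m) * sum(|y_i - y_hat_i|): the average vertical
    # distance from each point to the line's prediction.
    return np.mean(np.abs(np.asarray(y) - np.asarray(y_pred)))

# Example: three points vs. predictions from some line
print(mean_absolute_error([1.0, 2.0, 3.0], [1.5, 2.0, 2.5]))  # 0.333...
```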

So, now that we’ve learned the absolute trick and the square trick, and how they’re used in linear regression, we still want to have some intuition on how these things get figured out. These tricks still seem a little too magical, and we’d like to find their origin, so let’s do this in a much …

## 7 – Square Trick

So here’s another trick that will help us move a line closer to a point, and it’s very similar to the absolute trick, but it has a little bit of extra gravy. It’s based on this premise: if we have a point that is close to a line, then this distance is small, and we …
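
The square trick described above can be sketched like this (a minimal illustration; the function name and learning rate are mine, not the course’s). Both parameters move in proportion to the vertical distance between the point and the line, so a far-away point pulls harder than a nearby one:

```python
def square_trick(w1, w2, p, q, learning_rate=0.01):
    # Line: y = w1 * x + w2; point: (p, q)
    q_pred = w1 * p + w2              # where the line currently sits at x = p
    # Scale the updates by the vertical distance (q - q_pred):
    # the sign of the distance automatically moves the line the right way.
    w1 += learning_rate * p * (q - q_pred)
    w2 += learning_rate * (q - q_pred)
    return w1, w2

print(square_trick(1.0, 0.0, 2.0, 5.0))  # (1.06, 0.03)
```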

## 6 – Absolute Trick

So, here’s our first trick for moving a line closer to a point, which we’re going to use in linear regression. It’s called the absolute trick, and it works like this: we start with a point and a line, and the idea is that the point wants the line to come closer to it. …
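
A small sketch of the absolute trick (function name and learning rate are illustrative): the line moves by a fixed small step toward the point, up if the point is above the line and down if it is below:

```python
def absolute_trick(w1, w2, p, q, learning_rate=0.01):
    # Line: y = w1 * x + w2; point: (p, q)
    q_pred = w1 * p + w2
    if q > q_pred:                 # point above the line: nudge the line up
        w1 += learning_rate * p
        w2 += learning_rate
    else:                          # point below (or on) the line: nudge it down
        w1 -= learning_rate * p
        w2 -= learning_rate
    return w1, w2

print(absolute_trick(1.0, 0.0, 2.0, 5.0))  # (1.02, 0.01)
```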

## 5 – Moving A Line

So, let’s have a little refresher on how we move lines by changing their parameters. So, if we have a line with equation y = w_1x + w_2, where w_1 and w_2 are constants, it looks like this: w_1 is the slope, and w_2 is the y-intercept, which is where the line intersects the y-axis. …
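
In code, the line above is just this function (a trivial sketch; names are mine). Increasing w_2 shifts the line up, and increasing w_1 rotates it to be steeper:

```python
def line(w1, w2, x):
    # y = w1 * x + w2: w1 is the slope, w2 the y-intercept
    return w1 * x + w2

print(line(2, 1, 0))  # 1 -> at x = 0 the line sits at the y-intercept w2
print(line(2, 1, 3))  # 7 -> each unit of x adds the slope w1 = 2
```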

## 4 – Fitting A Line

So here’s a trick that will help us fit a line through a set of points. Let’s say that these are our points, and we start by drawing some random line. We’re going to ask every point what it wants in order for the model to be better, and then listen to them. So let’s go one by …

## 19 – Conclusion

Well, that was it. In this lesson, you have learned linear regression and some of its generalizations. You have also gone hands-on and implemented the gradient descent algorithm for linear regression. You are now ready for the upcoming project, in which you’ll be implementing a regression neural network to analyze real data. Great job.

## 18 – Regularization

The following concept is one that works both for regression and classification, so in this video, we’ll explain it using a classification problem. But as you will see, all the arguments here work with regression algorithms as well. The concept is called regularization; it’s a very useful technique to improve our models and make sure …
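
Applied to regression, the core idea can be sketched as adding a penalty on the size of the model’s coefficients to the error being minimized (the function and parameter names here are mine, and lambda is a tuning knob, not a value from the course):

```python
import numpy as np

def regularized_error(y, y_pred, weights, lam=0.1, kind="L2"):
    # Base error (mean squared error) plus a penalty on coefficient size.
    # L1 penalizes the sum of absolute values; L2 penalizes the sum of squares.
    # Larger lam punishes model complexity more strongly.
    base = np.mean((np.asarray(y) - np.asarray(y_pred)) ** 2)
    w = np.asarray(weights)
    penalty = lam * (np.sum(np.abs(w)) if kind == "L1" else np.sum(w ** 2))
    return base + penalty

print(regularized_error([1.0], [0.0], [2.0], lam=0.1, kind="L2"))  # 1.4
print(regularized_error([1.0], [0.0], [2.0], lam=0.1, kind="L1"))  # 1.2
```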

## 17 – Polynomial Regression

So what happens if we have data that looks like this, where a line won’t really do a good job of fitting it? Maybe we would like to have a curve or some polynomial, maybe something along the lines of 2x cubed minus 8x squared, et cetera. This can be solved using a very similar algorithm to …
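
A minimal sketch of this idea with NumPy (the data here is made up to follow 2x³ − 8x² exactly): the fit is the same least-squares idea as before, just with x² and x³ as extra input columns:

```python
import numpy as np

# Hypothetical curved data: y = 2x^3 - 8x^2, which no straight line fits well
x = np.array([-1.0, 0.0, 1.0, 2.0, 3.0])
y = 2 * x**3 - 8 * x**2

# np.polyfit solves the least-squares problem for the polynomial coefficients
# (highest degree first).
coeffs = np.polyfit(x, y, deg=3)
print(coeffs)  # close to [2, -8, 0, 0]
```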

## 16 – Closed Form Solution

So here’s an interesting observation: in order to minimize the mean squared error, we do not actually need to use gradient descent or the tricks. We can actually do this in closed mathematical form. Let me show you. Here’s our data, x_1, y_1 all the way to x_m, y_m; and in this case, m …
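
The closed-form answer for one input column can be sketched via the normal equation (the sample data here is made up and fits y = 2x + 1 exactly):

```python
import numpy as np

# Data: m points x_1..x_m with labels y_1..y_m
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])   # exactly y = 2x + 1

# Design matrix: one column for x, one column of ones for the intercept
X = np.column_stack([x, np.ones_like(x)])

# Normal equation: solving (X^T X) w = X^T y gives the weights that
# minimize the mean squared error, with no iterative descent needed.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # [2. 1.] -> slope 2, intercept 1
```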

## 15 – Higher Dimensions

So in the previous example, we had a one-column input and a one-column output. The input was the size of the house and the output was the price, so we had a two-dimensional problem. Our prediction for the price would be a line, and the equation would just be a constant times the size plus …
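
With more input columns the prediction becomes a weighted sum, one weight per feature, plus a bias. A sketch (the feature values and weights below are invented for illustration):

```python
import numpy as np

def predict(weights, bias, features):
    # With n input columns (e.g. size, number of rooms), the prediction is
    # a hyperplane: y = w_1*x_1 + ... + w_n*x_n + b
    return np.dot(weights, features) + bias

# Two features: size = 100, rooms = 3; weights and bias are made up
print(predict(np.array([150.0, 10000.0]), 5000.0, np.array([100.0, 3.0])))
# 150*100 + 10000*3 + 5000 = 50000.0
```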

## 14 – Absolute Vs Squared Error 3

So this time there was a solution, and the solution is B. Here’s the reason, which is more subtle: our mean squared error is actually a quadratic function, and quadratic functions have a minimum at the point in the middle. Over here, we can see that the error for line A would be around …
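
A quick numeric sketch of this quadratic behavior (the data and line are made up): tracking the mean squared error as the line y = 2x + b shifts vertically traces a parabola in b, so exactly one shift minimizes it:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 9.0])

def mse_of_shift(b):
    # MSE of the line y = 2x + b as a function of the intercept b
    return np.mean((y - (2 * x + b)) ** 2)

for b in [0.0, 1.0, 2.0]:
    print(b, mse_of_shift(b))
# 0.0 -> 3.0, 1.0 -> 2.0, 2.0 -> 3.0: a parabola with its unique minimum at b = 1
```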

## 13 – Absolute Vs Squared Error 2

Well, that was a tricky quiz. If you said “the same,” then you were correct, because as you can see, moving this line up and down actually keeps the mean absolute error the same. You can convince yourself by looking at the picture and checking that, as you move the line up and down, you’re …
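
A small made-up example of this invariance: with one point above the line and one below, any vertical shift that keeps the line between the two points leaves the mean absolute error unchanged, because what one distance gains the other loses:

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([1.0, 5.0])

def mae_of_shift(b):
    # MAE of the line y = 2x + b; at b = 0 the residuals are [-1, +1],
    # one point below the line and one above it
    return np.mean(np.abs(y - (2 * x + b)))

print(mae_of_shift(-0.5), mae_of_shift(0.0), mae_of_shift(0.5))  # all 1.0
```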

## 12 – Absolute Vs Squared Error

So here’s a question: which is better, the mean absolute error or the mean squared error? Well, there’s no single answer; both are used for a lot of different purposes, but here’s one property that actually tells them apart. So here’s a set of data, and we’re going to try to fit it …

## 11 – Minimizing Error Functions

So far, we’ve learned two algorithms that will fit a line through a set of points. One is using either of the tricks, namely the absolute and the square trick, and the other one is minimizing either of the error functions, namely the mean absolute error and the mean squared error. The interesting thing is …
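
The connection can be sketched in code (names and the ½-factor convention are mine): differentiating a mean squared error of the form E = (1/2m)·Σ(y − (w₁x + w₂))² and stepping against the gradient yields exactly the square-trick updates, averaged over all points:

```python
import numpy as np

def gradient_descent_step(w1, w2, x, y, learning_rate=0.01):
    # One gradient-descent step on E = (1/2m) * sum((y - (w1*x + w2))^2).
    # -dE/dw1 = mean(x * (y - y_pred)), -dE/dw2 = mean(y - y_pred):
    # the same updates as the square trick, averaged over the points.
    y_pred = w1 * x + w2
    w1 += learning_rate * np.mean(x * (y - y_pred))
    w2 += learning_rate * np.mean(y - y_pred)
    return w1, w2

print(gradient_descent_step(0.0, 0.0, np.array([1.0, 2.0]),
                            np.array([3.0, 5.0]), 0.1))  # (0.65, 0.4)
```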

## 10 – Mean Squared Error

Now, in this video, we’ll learn the mean squared error. The mean squared error is very similar to the mean absolute error. Again, here we have our point and our prediction, but now, instead of taking the distance, we’re actually going to draw a square with this segment as its side. So the area is precisely …
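
As with the MAE above, a minimal sketch in code (the function name is illustrative): each term is the area of the square whose side is the point-to-prediction distance, and the MSE is the average of those areas:

```python
import numpy as np

def mean_squared_error(y, y_pred):
    # MSE = (1/m) * sum((y_i - y_hat_i)^2): each squared distance is the
    # area of the square drawn on the point-to-line segment.
    return np.mean((np.asarray(y) - np.asarray(y_pred)) ** 2)

print(mean_squared_error([1.0, 2.0, 3.0], [1.5, 2.0, 2.5]))  # 0.1666...
```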

## 1 – Welcome To Linear Regression

Hi, I’m Louis. Welcome to the linear regression section of this Nanodegree. The two main families of algorithms in predictive machine learning are classification and regression. Classification answers questions of the form yes or no: for example, is this email spam or not, or is the patient sick or not? Regression answers questions of the form how …
