15 – Higher Dimensions

So in the previous example, we had a one-column input and a one-column output: the input was the size of the house and the output was the price. That made it a two-dimensional problem. Our prediction for the price was a line, and the equation was simply a constant times the size plus another constant.

What if we had more columns in the input, for example size and school quality? Now we have a three-dimensional graph, because we have two dimensions for the input and one for the output. Our points no longer live in the plane; they look like points floating in three-dimensional space. Instead of fitting a line through them, we fit a plane, and our equation is no longer a constant times one variable plus another constant. It's a constant times school quality, plus another constant times size, plus a third constant. That's what happens when we're in three dimensions.

So what happens if we're in n dimensions? In this case we have n-1 columns in the input and one in the output; for example, the inputs are size, school quality, number of rooms, et cetera. The same thing happens, except our data lives in n-dimensional space. For the input we have n-1 variables, namely x_1, x_2, up to x_{n-1}, and for the output of the prediction we have one variable, ŷ. Our prediction is an (n-1)-dimensional hyperplane living in n dimensions. Since it's hard to picture n dimensions, just think of a linear equation in n variables, such as ŷ = w_1 x_1 + w_2 x_2 + ... + w_{n-1} x_{n-1} + w_n, and that's how we make predictions in higher dimensions.

In order to find the weights w_1 up to w_n, the algorithm is exactly the same as for two variables: we can use the absolute or square tricks, or we can calculate the mean absolute or mean squared error and minimize it using gradient descent, as in the sketch below.
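To make the gradient descent step concrete, here is a minimal sketch in Python with NumPy. The lecture itself gives no code, so the function name fit_hyperplane, the toy two-feature house data, and the learning-rate and epoch settings are all illustrative assumptions.

```python
# A minimal sketch of fitting an (n-1)-dimensional hyperplane by minimizing
# the mean squared error with gradient descent. Feature meanings (size,
# school quality) and hyperparameters are illustrative assumptions.
import numpy as np

def fit_hyperplane(X, y, learning_rate=0.1, epochs=5000):
    """Fit y_hat = X @ w + b by gradient descent on the mean squared error."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)   # one weight per input column (w_1 ... w_{n-1})
    b = 0.0                    # the constant term (w_n in the lecture's notation)
    for _ in range(epochs):
        y_hat = X @ w + b                       # predictions on the hyperplane
        error = y_hat - y
        # Gradients of the mean squared error with respect to w and b
        grad_w = (2 / n_samples) * (X.T @ error)
        grad_b = (2 / n_samples) * error.sum()
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b
    return w, b

# Toy data: two input columns (size, school quality) and one output (price),
# generated from a known plane so we can check the recovered weights.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 5.0 + rng.normal(0, 0.01, size=200)

w, b = fit_hyperplane(X, y)
print(w, b)  # should land close to [3.0, 2.0] and 5.0
```

Swapping the squared error for the absolute error only changes the two gradient lines: the error term is replaced by its sign, which is exactly the hyperplane version of the absolute trick.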
