4 – PyTorch V2 Part 1 Solution 3 V1

All right, so here's my solution for this exercise. I had you calculate the output of this multi-layer network using the weights and features that we defined up here. It's really similar to what we did before with our single-layer network. We take the features and our first weight matrix and compute a matrix multiplication, torch.mm(features, W1), add the bias B1, and that gives us the values for our hidden layer h. Now we can use h as the input for the next layer of our network. So again, we do a matrix multiplication of these hidden values h with our second weight matrix W2, add on the bias terms, and we get the output.

One of my favorite features of PyTorch is being able to convert between NumPy arrays and Torch tensors in a really easy way. This is useful because a lot of the time you'll be preparing your data and doing some preprocessing with NumPy, and then you want to move it into your network. So you need a bridge between the NumPy arrays holding your data and the Torch tensors you're using in your network. To do this, we can get a tensor from a NumPy array using torch.from_numpy. Here I've created a random four-by-three array, and we create a Torch tensor from it just by calling torch.from_numpy and passing in the array. This gives us a proper tensor in PyTorch that we can use with all of our Torch methods and, eventually, in a neural network. We can also go backwards: we can take a tensor, such as b here, and get back a NumPy array by calling b.numpy(). One thing to remember when you're doing this is that the memory is actually shared between the NumPy array and the Torch tensor.
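The forward pass described above can be sketched like this. The shapes and the sigmoid activation are assumptions for illustration, since the exercise's exact network setup isn't reproduced in the transcript:

```python
import torch

torch.manual_seed(7)  # make the random values reproducible

# Hypothetical sizes: 3 input features, 2 hidden units, 1 output unit
features = torch.randn((1, 3))
W1 = torch.randn((3, 2))  # first weight matrix
B1 = torch.randn((1, 2))  # first bias terms
W2 = torch.randn((2, 1))  # second weight matrix
B2 = torch.randn((1, 1))  # second bias terms

def activation(x):
    # Sigmoid activation (assumed here; the exercise may define it differently)
    return 1 / (1 + torch.exp(-x))

# Hidden layer values, then use them as input for the output layer
h = activation(torch.mm(features, W1) + B1)
output = activation(torch.mm(h, W2) + B2)
```

With these shapes, h is a 1-by-2 tensor and output is a 1-by-1 tensor.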
What this means is that if you do any operation in place on either the NumPy array or the tensor, you'll change the values for the other one as well. For example, if we do an in-place multiplication by two, which changes the values in memory rather than creating a new tensor, then we actually change the values in the NumPy array too. You can see that here: we start with our NumPy array, convert it to a Torch tensor, do the in-place multiplication, and the tensor's values have changed. Then if you look back at the NumPy array, its values have changed as well. That's just something to keep in mind, so you're not caught off guard when your NumPy arrays change because of operations you're doing on the tensor. See you in the next video, cheers.
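Here's a small sketch of the round trip and the shared-memory behavior, using the four-by-three array from the walkthrough:

```python
import numpy as np
import torch

# A random 4x3 NumPy array
a = np.random.rand(4, 3)

# Tensor created from the array; the two share the same memory
b = torch.from_numpy(a)

# And back again: b.numpy() returns a NumPy view of the same data
c = b.numpy()

# In-place multiplication (the trailing underscore means in-place in PyTorch),
# so this modifies the shared memory rather than creating a new tensor
b.mul_(2)

# Both a and c now reflect the doubled values
```

If you want independent copies instead, you can use torch.tensor(a) or b.clone(), which allocate new memory.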
