Hi. Here’s my solution for building and training this network using dropout. Just like I showed you before, we define our dropout module as self.dropout using nn.Dropout, giving it a drop probability, in this case 20 percent, and then we add it to our forward method on each of our hidden layers. Our validation code looks basically the same as before, except now we’re calling model.eval(). Again, this puts our model into evaluation or inference mode, which turns off dropout. Then, the same way as before, we just go through the data in the test set and calculate the losses and accuracy, and after all that we call model.train() to set the model back into train mode, which turns dropout back on, and then we continue training.

So now we’re using dropout, and if you look at the training loss and the validation loss over the epochs we’re training, you actually see that the validation loss sticks a lot closer to the train loss as we train. So here, with dropout, we’ve managed to at least reduce overfitting. The validation loss isn’t as low as we got without dropout, but you can see that it’s still dropping, so if we kept training for longer, we would most likely manage to get our validation loss lower than without dropout.
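To make this concrete, here’s a minimal sketch of what that walkthrough looks like in PyTorch. The dropout probability of 20 percent, the model.eval()/model.train() switching, and the shape of the validation pass follow the description above; the hidden layer sizes, the Fashion-MNIST data loaders, the epoch count, and the optimizer settings are assumptions for illustration, not necessarily the exact values from the video.

```python
import torch
from torch import nn
import torch.nn.functional as F
from torchvision import datasets, transforms

# Fashion-MNIST loaders, assumed from the earlier videos in this series.
transform = transforms.Compose([transforms.ToTensor(),
                                transforms.Normalize((0.5,), (0.5,))])
trainset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/', download=True,
                                 train=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True)
testset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/', download=True,
                                train=False, transform=transform)
testloader = torch.utils.data.DataLoader(testset, batch_size=64, shuffle=True)

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Hidden layer sizes are an assumption for illustration.
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 128)
        self.fc3 = nn.Linear(128, 64)
        self.fc4 = nn.Linear(64, 10)
        # Dropout module with a 20 percent drop probability
        self.dropout = nn.Dropout(p=0.2)

    def forward(self, x):
        x = x.view(x.shape[0], -1)               # flatten the input image
        x = self.dropout(F.relu(self.fc1(x)))    # dropout on each hidden layer
        x = self.dropout(F.relu(self.fc2(x)))
        x = self.dropout(F.relu(self.fc3(x)))
        return F.log_softmax(self.fc4(x), dim=1)  # no dropout on the output

model = Classifier()
criterion = nn.NLLLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.003)

for epoch in range(30):
    for images, labels in trainloader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

    # Validation pass: eval mode turns off dropout; no_grad skips gradients.
    test_loss, accuracy = 0.0, 0.0
    model.eval()
    with torch.no_grad():
        for images, labels in testloader:
            log_ps = model(images)
            test_loss += criterion(log_ps, labels).item()
            ps = torch.exp(log_ps)
            top_p, top_class = ps.topk(1, dim=1)
            equals = top_class == labels.view(*top_class.shape)
            accuracy += torch.mean(equals.float()).item()
    model.train()  # back into train mode: dropout is on again

    print(f"Epoch {epoch+1}: "
          f"test loss {test_loss/len(testloader):.3f}, "
          f"accuracy {accuracy/len(testloader):.3f}")
```

If you record the train and validation losses from a run like this and plot them over the epochs, you should see the behavior described above: with dropout, the validation loss tracks the training loss much more closely than it did without it.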