9 – 07 CharRNN Solution V1

We wanted to define a character RNN with a two-layer LSTM. Here in my solution, I'm running this code on a GPU, and here's my code for defining our character-level RNN. First, I defined an LSTM layer, self.lstm. This takes in an input size, which is going to be the length of a …
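
For reference, here's a minimal sketch of what that definition might look like in PyTorch. The class name, layer sizes, and drop_prob default here are assumptions based on the walkthrough, not the exact solution code:

```python
import torch.nn as nn

class CharRNN(nn.Module):
    def __init__(self, tokens, n_hidden=512, n_layers=2, drop_prob=0.5):
        super().__init__()
        self.n_hidden = n_hidden
        self.n_layers = n_layers
        self.chars = tokens

        # the LSTM's input size is the number of unique characters,
        # since each character arrives as a one-hot vector
        self.lstm = nn.LSTM(len(tokens), n_hidden, n_layers,
                            dropout=drop_prob, batch_first=True)
        self.dropout = nn.Dropout(drop_prob)
        # the final, fully-connected layer maps each hidden state
        # to a score for every possible next character
        self.fc = nn.Linear(n_hidden, len(tokens))

    def forward(self, x, hidden):
        r_output, hidden = self.lstm(x, hidden)
        out = self.dropout(r_output)
        # stack the LSTM outputs so every time step shares the fc layer
        out = out.contiguous().view(-1, self.n_hidden)
        return self.fc(out), hidden
```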

8 – 06 Defining Model V2

All right. So, we have our mini-batches of data, and now it's time to define our model. This is a little diagram of what the model will look like. We'll have our characters put into our input layer and then a stack of LSTM cells. These LSTM cells make up our hidden, recurrent layer …
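
As a rough sketch of that diagram in PyTorch (the vocabulary and hidden sizes here are made up for illustration): one-hot characters flow into a stacked LSTM, and a fully-connected layer turns each hidden state into character scores.

```python
import torch
from torch import nn

n_chars, n_hidden = 83, 256            # hypothetical vocabulary and hidden sizes
lstm = nn.LSTM(n_chars, n_hidden, num_layers=2, batch_first=True)
fc = nn.Linear(n_hidden, n_chars)

x = torch.zeros(8, 100, n_chars)       # (batch, seq_length, one-hot length)
out, (h, c) = lstm(x)                  # the hidden, recurrent layer
scores = fc(out)                       # one score per character, per time step
print(scores.shape)                    # torch.Size([8, 100, 83])
```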

7 – 05 Batching Data V1

So, this is my complete get_batches code that generates mini-batches of data. The first thing I wanted to do here is get the total number of complete batches that we can make. To do that, I first calculated how many characters are in a complete mini-batch. So, in one mini-batch, there's going …
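
A sketch of a get_batches generator along those lines, assuming the text has already been encoded as a NumPy array of integers (the variable names are mine):

```python
import numpy as np

def get_batches(arr, batch_size, seq_length):
    # characters in one complete mini-batch
    chars_per_batch = batch_size * seq_length
    # total number of complete batches we can make
    n_batches = len(arr) // chars_per_batch

    # keep only enough characters to fill complete batches,
    # then arrange the data into batch_size rows
    arr = arr[:n_batches * chars_per_batch].reshape((batch_size, -1))

    for n in range(0, arr.shape[1], seq_length):
        x = arr[:, n:n + seq_length]
        # targets are the inputs shifted over by one character
        y = np.zeros_like(x)
        y[:, :-1] = x[:, 1:]
        try:
            y[:, -1] = arr[:, n + seq_length]
        except IndexError:
            y[:, -1] = arr[:, 0]   # wrap around at the end of the data
        yield x, y
```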

6 – 04 Implementing CharRNN V2

This is a notebook where you'll be building a character-wise RNN. You're going to train this on the text of Anna Karenina, which is a really great, but also quite sad, book. The general idea behind this is that we're going to be passing one character at a time into a recurrent neural network. …
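
The first step in a notebook like this is typically to load the text and encode each character as an integer. A minimal sketch, where the file path and lookup names are assumptions:

```python
import numpy as np

# hypothetical path to the novel's text file
with open('data/anna.txt', 'r') as f:
    text = f.read()

# build lookup tables between characters and integers
chars = tuple(set(text))
int2char = dict(enumerate(chars))
char2int = {ch: ii for ii, ch in int2char.items()}

# encode the whole text as an array of integers
encoded = np.array([char2int[ch] for ch in text])
```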

5 – Sequence-Batching

One of the most difficult parts of building networks, for me, is getting the batches right. It's more of a programming challenge than anything deep-learning specific. So here I'm going to walk you through how batching works for RNNs. With RNNs, we're training on sequences of data like text, stock values, audio, etc. By …
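
To make the idea concrete, here's a toy example of splitting a sequence of twelve numbers into two batch rows (the sizes are arbitrary):

```python
import numpy as np

seq = np.arange(1, 13)                  # toy sequence: 1 through 12
batch_size, seq_length = 2, 3

# keep complete batches only, then split the sequence into batch_size rows
n_batches = len(seq) // (batch_size * seq_length)
rows = seq[:n_batches * batch_size * seq_length].reshape(batch_size, -1)
# rows == [[ 1  2  3  4  5  6],
#          [ 7  8  9 10 11 12]]

# each mini-batch is a seq_length-wide window sliding across the rows
first_batch = rows[:, :seq_length]      # [[1 2 3], [7 8 9]]
```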

4 – Character-Wise RNN

Coming up in this lesson, you'll implement a character-wise RNN. That is, the network will learn about some text one character at a time and then generate new text one character at a time. Let's say we want to generate new Shakespeare plays. As an example, take "to be or not to be." We'd pass the …
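
In other words, for each character of the input, the training target is simply the character that follows it. For that example line:

```python
text = "to be or not to be"

# at every step the network sees one character and is trained
# to predict the character that comes next
inputs, targets = text[:-1], text[1:]
for x, y in zip(inputs[:6], targets[:6]):
    print(f"input {x!r} -> target {y!r}")
# input 't' -> target 'o'
# input 'o' -> target ' '  ... and so on
```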

3 – 03 Training Memory V1

Last time, we defined a model, and next, I want to actually instantiate it and train it using our training data. First, I'll specify my model hyperparameters. The input and output sizes will just be one, since it's just one sequence value at a time that we're processing and outputting. Then, I'll specify a hidden dimension, which is …
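
As a sketch of what that instantiation might look like (the hidden dimension and learning rate here are placeholder values, not the ones from the video):

```python
import torch
from torch import nn

# one value in, one value out: we process one sequence value at a time
input_size, output_size = 1, 1
hidden_dim, n_layers = 32, 1            # hypothetical hyperparameter choices

rnn = nn.RNN(input_size, hidden_dim, n_layers, batch_first=True)
fc = nn.Linear(hidden_dim, output_size)

# regression setup: mean squared error loss plus the Adam optimizer
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(
    list(rnn.parameters()) + list(fc.parameters()), lr=0.01)
```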

2 – 02 Time Series Prediction V2

To introduce you to RNNs in PyTorch, I've created a notebook that will show you how to do simple time series prediction with an RNN. Specifically, we'll look at some data and see if we can create an RNN to accurately predict the next data point given a current data point, and this is really …
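
A minimal version of that setup might look like this: sample points from a sine wave, and pair each input point with the point that follows it as the target (all names here are illustrative):

```python
import numpy as np
import torch

# sample a sine wave as our toy time series
time_steps = np.linspace(0, np.pi, 25)
data = np.sin(time_steps).reshape(-1, 1)     # shape: (25, 1)

# input is every point but the last; target is every point but the first,
# so the network learns to predict the next data point
x = torch.Tensor(data[:-1]).unsqueeze(0)     # (1, 24, 1): batch, seq, feature
y = torch.Tensor(data[1:]).unsqueeze(0)
```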

10 – 08 Making Predictions V3

Now, the goal of this model is to train it so that it can take in one character and produce the next character, and that's what this next step, Making Predictions, is all about. We basically want to create functions that can take in a character and have our network predict the next character. Then, …
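
A sketch of such a predict function, assuming a one_hot_encode helper and that the network stores its character set and char2int/int2char lookups as attributes, as set up earlier in the notebook:

```python
import numpy as np
import torch
import torch.nn.functional as F

def predict(net, char, h, top_k=5):
    # encode the input character and one-hot it
    x = np.array([[net.char2int[char]]])
    x = one_hot_encode(x, len(net.chars))    # assumed helper from the notebook
    inputs = torch.from_numpy(x)

    # detach the hidden state so we don't backprop through history
    h = tuple(each.data for each in h)
    out, h = net(inputs, h)

    # turn the output scores into a probability distribution
    p = F.softmax(out, dim=1).data
    # sample the next character from the top_k most likely options
    p, top_ch = p.topk(top_k)
    p, top_ch = p.numpy().squeeze(), top_ch.numpy().squeeze()
    char = np.random.choice(top_ch, p=p / p.sum())

    return net.int2char[char], h
```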

1 – M4L31 HSA Implementing RNNs V2

Hi again! So, in the last couple of lessons, Ortal and Luis introduced you to recurrent neural networks and LSTMs. In this lesson, Mat and I will be going over some implementations of these networks. Because RNNs have a kind of built-in memory, they're really useful for tasks that are time- or sequence-dependent. For example, …