4 – Architecture in More Depth

Now that we have an unrolled example of the input and output of the network, let's go another level deeper and look at some of the parameters of the model. At this point in the course you know that you can't just feed words directly to the network. We need to turn the words into …
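The idea of turning words into vectors before feeding them to the network can be sketched as below. This is a minimal illustration, not the course's actual code: the vocabulary, embedding size, and random weights are all made-up assumptions.

```python
import numpy as np

# Hypothetical tiny vocabulary and embedding size, for illustration only.
vocab = {"<pad>": 0, "how": 1, "are": 2, "you": 3}
embedding_dim = 4

# Random embedding matrix: one row of numbers per vocabulary entry.
# In a real model these rows are learned during training.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def words_to_vectors(sentence):
    """Map each word to its index, then to its row in the embedding matrix."""
    ids = [vocab[w] for w in sentence.lower().split()]
    return embeddings[ids]

vectors = words_to_vectors("How are you")
print(vectors.shape)  # (3, 4): one embedding vector per word
```

The network then consumes these vectors one time step at a time rather than the raw strings.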

3 – Architecture: Encoder-Decoder

Let's look more closely at how sequence-to-sequence models work. We'll start with a high-level look and then go deeper and deeper. Here are our two recurrent nets. The one on the left is called the encoder. It reads the input sequence, then hands over what it has understood to the RNN and …
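The encoder/decoder hand-off described above can be sketched as a toy example. This is a plain-NumPy sketch with random, untrained weights, just to show the data flow: the encoder reads the whole input sequence into a single context vector, and the decoder starts from that vector to produce its outputs. All sizes and names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
hidden, emb = 8, 4  # assumed hidden-state and embedding sizes

# Random (untrained) weights for a vanilla RNN cell in each network.
W_enc = rng.normal(scale=0.1, size=(hidden, hidden + emb))
W_dec = rng.normal(scale=0.1, size=(hidden, hidden + emb))

def rnn_step(W, h, x):
    """One vanilla RNN step: new state from previous state and current input."""
    return np.tanh(W @ np.concatenate([h, x]))

def encode(inputs):
    h = np.zeros(hidden)
    for x in inputs:          # read the input sequence one step at a time
        h = rnn_step(W_enc, h, x)
    return h                  # final state = the context handed to the decoder

def decode(context, steps):
    h, x = context, np.zeros(emb)  # decoder's initial state is the context
    outputs = []
    for _ in range(steps):
        h = rnn_step(W_dec, h, x)
        outputs.append(h)
    return outputs

context = encode([rng.normal(size=emb) for _ in range(5)])
print(context.shape, len(decode(context, 3)))  # (8,) 3
```

Note the design point this makes concrete: the entire input sequence, whatever its length, is squeezed into one fixed-size context vector, which is exactly the bottleneck that later attention mechanisms address.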

2 – Applications of seq2seq

I do want to say a couple of words on applications before delving deeper into the concept. That's because the term sequence-to-sequence RNN is a little bit abstract and doesn't convey how many amazing things we can do with this type of model. So let's think of it like this. We have a model that …

1 – Jay’s Introduction

Hello, my name is Jay. I'm a content developer at Udacity, and today we'll be talking about a powerful RNN technique called sequence-to-sequence. In a previous lesson, Andrew Trask showed us how to do sentiment analysis using normal feedforward neural networks. The network was able to learn how positive or negative each word …