6 – Sentiment RNN 2

Welcome back. So now I’m going to go through my solutions to the exercises I had you do before. So you might have come here because you were having difficulties actually implementing that stuff or maybe you just want to see how I did it. You probably ended up doing it differently than me, which …

5 – Training The Network

(speaker) So here I am calculating the output. So like I said before, we’re only really concerned with the final output, which is what we’re going to use to predict our sentiment. So we just need to grab the last one and we can do that using outputs. So this is saying, give us all …
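Grabbing the last output can be sketched with NumPy standing in for the RNN’s output tensor (the shapes and the `outputs` name here are illustrative assumptions, not the instructor’s exact graph):

```python
import numpy as np

# Simulated RNN outputs with shape (batch_size, seq_len, lstm_size).
# In the real graph this tensor comes from the RNN itself; NumPy is
# used here only to illustrate the slicing.
batch_size, seq_len, lstm_size = 2, 5, 4
outputs = np.arange(batch_size * seq_len * lstm_size, dtype=float).reshape(
    batch_size, seq_len, lstm_size)

# "Give us all batches, the last time step, and all units":
last_output = outputs[:, -1]

print(last_output.shape)  # (2, 4)
```

Only `last_output` is fed into the final prediction layer; the outputs at earlier time steps are discarded.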

4 – Building The RNN 1

(speaker) Array. So now we’re going to build the graph. So first thing that we need to do is define our hyperparameters. The first one is the LSTM size. So this is the number of units in the hidden layers in the LSTM cells. So LSTM cells actually have four different network layers in them. …
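A minimal sketch of the hyperparameter definitions being described (the specific values below are illustrative assumptions, not necessarily the instructor’s choices):

```python
# Hyperparameters for the sentiment RNN (illustrative values).
lstm_size = 256       # number of units in each LSTM cell's hidden layers
lstm_layers = 1       # number of stacked LSTM layers
batch_size = 500      # reviews fed through the network per training step
learning_rate = 0.001 # step size for the optimizer
```

These constants are then used when constructing the LSTM cells and the training op in the rest of the graph.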

3 – Creating Testing Sets

The next thing to do is create our training, validation, and test sets. This is something you’re probably going to have to do for every network you build, so it’s good practice to do this for every data set you’re using. And I found that a lot of, basically like every different data set has …
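One common way to carve out the three sets is to give a fraction to training and split the remainder in half between validation and test. This sketch assumes NumPy arrays and a hypothetical `split_frac` of 0.8:

```python
import numpy as np

def split_data(features, labels, split_frac=0.8):
    """Split into train / validation / test sets.

    split_frac of the data goes to training; the remainder is halved
    between validation and test. (0.8 is an assumed default.)
    """
    split_idx = int(len(features) * split_frac)
    train_x, remaining_x = features[:split_idx], features[split_idx:]
    train_y, remaining_y = labels[:split_idx], labels[split_idx:]

    half = len(remaining_x) // 2
    val_x, test_x = remaining_x[:half], remaining_x[half:]
    val_y, test_y = remaining_y[:half], remaining_y[half:]
    return (train_x, train_y), (val_x, val_y), (test_x, test_y)

features = np.arange(100).reshape(50, 2)
labels = np.arange(50)
train, val, test = split_data(features, labels)
print(len(train[0]), len(val[0]), len(test[0]))  # 40 5 5
```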

2 – Data Preprocessing

So now that we have all the words, what we need to do is we need to encode all of our reviews as integers. So we’re going to pass in the reviews where every word is an integer and that’s going to go into our embedding layer. So the first step, which I’m going to …
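The word-to-integer encoding can be sketched like this (the toy reviews and the `vocab_to_int` name are assumptions for illustration; 0 is reserved for padding, which is a common convention):

```python
from collections import Counter

reviews = ["great movie great acting", "terrible movie"]

# Build the vocabulary, most frequent words first, and map each word
# to an integer starting at 1 (0 is reserved for padding).
words = " ".join(reviews).split()
counts = Counter(words)
vocab = sorted(counts, key=counts.get, reverse=True)
vocab_to_int = {word: i for i, word in enumerate(vocab, 1)}

# Encode every review as a list of integers for the embedding layer.
reviews_ints = [[vocab_to_int[w] for w in r.split()] for r in reviews]
print(reviews_ints)
```

Each integer then indexes a row of the embedding matrix, so the embedding layer can look up a dense vector per word.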

1 – Sentiment Prediction

(instructor) Hello again, everyone. So this week we are going to be talking about Sentiment Analysis with Recurrent Neural Networks. So hopefully this will give you some more insight, and some more understanding about how recurrent neural networks work. So you’ve seen a lot of this before. We’re going to be using the same data …

9 – Understanding Inefficiencies in our Network

All right. So in the last section, we optimized our neural network to better find the correlation in our data set by removing some distracting noise. And the neural network attended to the signal so much better. It trained 8 …

8 – Understanding Neural Noise

Okay, so in this section we’re going to talk about noise versus signal. Now its job is to look for correlation, and neural nets can be very, very good at that. However, once again, this talks about framing the problem so that the neural nets have the most advantage and can train and understand the most …

7 – Mini Project 3 Solution

All right, so in project three we’re going to build our neural network. That is going to predict whether or not a review has positive or negative sentiment by using the counts of words that are inside of our review. Now the changes I made first were, to create a pre-process data function, …

6 – Building a Neural Network

So in this section we’re going to take everything we’ve learned and we’re going to build our first neural network to train over the datasets that we just created. Now what I’d like for you to do for this project is to start with your neural net from the last chapter. I guess the last …

5 – Mini Project 2 Solution

Right, so in this project we’re going to create our input and output data. So, for input data, we’re going to count all the words that occur in a review, and then we’re going to put them into a fixed-length vector, where each place in the vector is for one of our words of …
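The fixed-length count vector can be sketched as follows (the toy vocabulary and the `layer_0` / `word2index` names are assumptions in the spirit of the mini project):

```python
import numpy as np

vocab = ["good", "bad", "movie", "plot"]
word2index = {w: i for i, w in enumerate(vocab)}

def review_to_counts(review):
    """Turn a review into a fixed-length vector of word counts,
    with one slot per vocabulary word."""
    layer_0 = np.zeros(len(vocab))
    for word in review.split():
        if word in word2index:
            layer_0[word2index[word]] += 1
    return layer_0

print(review_to_counts("good good movie"))  # [2. 0. 1. 0.]
```

Every review, whatever its length, becomes a vector of the same size, which is what lets it serve as the input layer of the network.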

4 – Transforming Text into Numbers

Now that we have validated our theory that the individual words inside of a review are predictive of that review’s positive or negative label, it’s time to transform our datasets into numbers in a way that respects this theory and this belief, so that our neural network can search for correlation in this particular way. …

3 – Mini Project 1 Solution

All right, so. Presumably you kind of took a stab at validating our theory that words are predictive of labels. So now I’m going to show you how I would attack this problem, and then we can kind of compare notes. This learning style, I really, really like. Because I think that it’s most beneficial …

2 – Framing the Problem

Let’s start by curating a dataset. Neural networks by themselves can’t really do anything. All a neural network really does is search for direct or indirect correlation between two datasets. So in order for a neural network to train on anything, we have to present it with two meaningful datasets. The first dataset must represent what we …

13 – Analysis: What’s Going on in the Weights

Welcome back. So, in this section we’re going to be talking a little more about what’s going on, a little more theory, a little less project based. And it’s really just about trying to understand what are these weights doing? What can I attach my mind to when I’m thinking about a neural net training …

12 – Mini Project 6 Solution

All right, welcome back. So we’re in project six where we’re going to be reducing noise by strategically reducing the vocabulary. So what we’ve done is we’ve taken these metrics that we’ve kind of used earlier as a heuristic to see whether an idea was a good idea. And we’re going to use it to …
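One way to use such a metric to prune the vocabulary is a positive-to-negative log ratio with a polarity cutoff. This is a sketch under assumed toy counts and an assumed cutoff of 0.5 — not the course’s exact numbers:

```python
from collections import Counter
import numpy as np

# Toy counts of how often each word appears in positive vs. negative
# reviews (illustrative numbers, not from the real dataset).
pos_counts = Counter({"great": 50, "movie": 40, "the": 100, "bad": 2})
neg_counts = Counter({"great": 3, "movie": 38, "the": 98, "bad": 45})

# Positive-to-negative log ratio: near 0 means the word carries little
# sentiment signal; large magnitude means it is strongly polarized.
pos_neg_ratios = {}
for word in pos_counts:
    ratio = pos_counts[word] / float(neg_counts[word] + 1)
    pos_neg_ratios[word] = np.log(ratio)

# Keep only words whose polarity exceeds the cutoff (0.5 is assumed).
polarity_cutoff = 0.5
vocab = [w for w, r in pos_neg_ratios.items() if abs(r) >= polarity_cutoff]
print(sorted(vocab))  # ['bad', 'great']
```

Neutral, high-frequency words like “the” fall near a ratio of 0 and get dropped, shrinking the input layer while keeping the words that actually carry sentiment.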

11 – Further Noise Reduction

So in the last section we significantly increased the speed at which our neural network trains. Even in the last section, we were seeing something in the realm of 1,500 reviews per second on our test data. I mean, just absolutely screaming. Now, in this section we’re going to go back one chapter and continue to try to …

10 – Mini Project 5 Solution

All right, so in this section, we’ve made our network more efficient. We’ve done this by getting rid of the multiplication by 1, because we don’t need to, because it doesn’t change anything. And we’ve gotten rid of processing any of the words whose input is 0 altogether. And this really should …
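The two optimizations being described can be sketched side by side (the weight matrix and the `review_indices` list are assumptions for illustration): instead of multiplying a mostly-zero input vector through the weights, just sum the weight rows for the words that actually appear.

```python
import numpy as np

np.random.seed(1)
vocab_size, hidden_size = 6, 3
weights_0_1 = np.random.randn(vocab_size, hidden_size)

review_indices = [1, 4]  # words present in the review (each with value 1)

# Naive version: build a mostly-zero input vector and multiply it
# through the weight matrix.
layer_0 = np.zeros(vocab_size)
for idx in review_indices:
    layer_0[idx] = 1
slow = layer_0.dot(weights_0_1)

# Efficient version: since the inputs are only 0s and 1s, just sum the
# weight rows for the words that appear -- no multiplications by 1,
# and no work at all for the zero entries.
fast = np.zeros(hidden_size)
for idx in review_indices:
    fast += weights_0_1[idx]

print(np.allclose(slow, fast))  # True
```

Both paths produce the same hidden layer, but the second touches only the handful of rows a review actually uses, which is where the speedup comes from.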