9 – Understanding Neural Noise

Okay, so in this section we’re going to talk about noise versus signal. Now our job is to look for correlation, and neural nets can be very, very good at that. However, once again, this comes back to framing the problem so that the neural net has the most advantage and can train and understand the most …

8 – Mini Project 3 Solution

All right, so in project three we’re going to build our neural network that is going to predict whether or not a movie review has positive or negative sentiment by using the counts of words that are inside of our review. Now the first change I made was to create a preprocess data function, …
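A rough sketch of what a preprocessing step like that might look like — building a vocabulary and a word-to-index lookup from the raw reviews. The names `preprocess_data` and `word2index` are illustrative, not necessarily the course's exact API:

```python
def preprocess_data(reviews):
    """Collect every word seen in the reviews and map each one to a column index."""
    vocab = set()
    for review in reviews:
        for word in review.split(" "):
            vocab.add(word)
    vocab = sorted(vocab)  # stable ordering so indices are reproducible
    word2index = {word: i for i, word in enumerate(vocab)}
    return vocab, word2index

# toy reviews just to show the shape of the output
reviews = ["this movie was great", "this movie was terrible"]
vocab, word2index = preprocess_data(reviews)
```

With these two toy reviews, `vocab` ends up with the five distinct words, and `word2index` assigns each of them its own slot in the input vector.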

7 – Building a Neural Network

So in this section we’re going to take everything we’ve learned and we’re going to build our first neural network to train over the datasets that we just created. Now what I’d like for you to do for this project is to start with your neural net from the last chapter. I guess the last …

6 – Mini Project 2 Solution

Right, so in this project we’re going to create our input and output data. So, for input data, we’re going to count all the words that happen in a review, and then we’re going to put them into a fixed-length vector, where each place in the vector is for one of our words of …
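A minimal sketch of that counting step, assuming a `word2index` mapping from vocabulary word to vector slot (the toy vocabulary here is made up for illustration):

```python
import numpy as np

# Assumed mapping from each vocabulary word to its slot in the input vector.
word2index = {"this": 0, "movie": 1, "was": 2, "great": 3, "terrible": 4}

def review_to_counts(review, word2index):
    """Turn one review into a fixed-length vector of word counts."""
    layer_0 = np.zeros(len(word2index))
    for word in review.split(" "):
        if word in word2index:
            layer_0[word2index[word]] += 1
    return layer_0

v = review_to_counts("this movie was great great", word2index)
```

Here `v[word2index["great"]]` holds 2.0, and every other slot holds that word's count in the review.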

5 – Transforming Text into Numbers

Now that we have validated our theory that individual words inside a review are predictive of that review’s positive or negative label, it’s time to transform our datasets into numbers in a way that respects this theory and this belief, so that our neural network can search for correlation in this particular way. …
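One small piece of that transformation is encoding the labels themselves as numbers. A sketch, assuming the labels arrive as the strings "POSITIVE" and "NEGATIVE":

```python
# Map sentiment labels to the numbers the network will try to predict.
# Assumes labels come in as the strings "POSITIVE" / "NEGATIVE".

def get_target_for_label(label):
    """1 for a positive review, 0 for a negative one."""
    return 1 if label == "POSITIVE" else 0

targets = [get_target_for_label(l) for l in ["POSITIVE", "NEGATIVE", "POSITIVE"]]
```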

4 – Mini Project 1 Solution

All right, so presumably you took a stab at validating our theory that words are predictive of labels. So now I’m going to show you how I would attack this problem, and then we can compare notes. I really, really like this learning style, because I think that it’s most beneficial …
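The kind of check being described can be sketched like this — count how often each word shows up in positive versus negative reviews, then look at a log ratio. The toy reviews and the add-one smoothing are my assumptions, just to keep the example self-contained and the log well-defined:

```python
import math
from collections import Counter

positive_reviews = ["great great movie", "what a great film"]
negative_reviews = ["terrible movie", "a terrible terrible film"]

positive_counts, negative_counts, total_counts = Counter(), Counter(), Counter()
for review in positive_reviews:
    for word in review.split(" "):
        positive_counts[word] += 1
        total_counts[word] += 1
for review in negative_reviews:
    for word in review.split(" "):
        negative_counts[word] += 1
        total_counts[word] += 1

# Log ratio of positive to negative usage, with +1 smoothing (an assumption).
pos_neg_ratios = {
    word: math.log((positive_counts[word] + 1.0) / (negative_counts[word] + 1.0))
    for word in total_counts
}
```

Words with a strongly positive ratio lean positive ("great"), strongly negative ratios lean negative ("terrible"), and neutral words like "movie" sit near zero — which is exactly the signal the theory predicts.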

3 – Framing the Problem

Let’s start by curating a dataset. Neural networks by themselves can’t really do anything. All a neural network really does is search for direct or indirect correlation between two datasets. So in order for a neural network to train on anything, we have to present it with two meaningful datasets. The first dataset must represent what we …

14 – Analysis: What’s Going on in the Weights

Welcome back. So, in this section we’re going to be talking a little more about what’s going on — a bit more theory, a little less project-based. And it’s really just about trying to understand: what are these weights doing? What can I attach my mind to when I’m thinking about a neural net training …

13 – Mini Project 6 Solution

All right, welcome back. So we’re in project six, where we’re going to be reducing noise by strategically reducing the vocabulary. So what we’ve done is we’ve taken the metrics that we used earlier as a heuristic to see whether an idea was a good one. And we’re going to use them to …
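That vocabulary-reduction idea can be sketched as a filter over the counts and ratios: keep only words that appear often enough and whose positive/negative log ratio is far from zero. The function name and the `min_count` / `polarity_cutoff` values here are illustrative assumptions, not fixed course values:

```python
from collections import Counter

def reduce_vocab(total_counts, pos_neg_ratios, min_count=10, polarity_cutoff=0.1):
    """Keep words that are both frequent and strongly polarized."""
    return {
        word
        for word, cnt in total_counts.items()
        if cnt >= min_count and abs(pos_neg_ratios[word]) >= polarity_cutoff
    }

# toy statistics just to show the behavior
total_counts = Counter({"the": 100, "great": 40, "terrible": 30, "zxqy": 2})
pos_neg_ratios = {"the": 0.01, "great": 1.2, "terrible": -1.1, "zxqy": 2.0}
vocab = reduce_vocab(total_counts, pos_neg_ratios)
```

In this toy example, "the" is dropped for having no polarity and "zxqy" for being too rare, leaving only the strongly predictive words.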

12 – Further Noise Reduction

So in the last section we significantly increased the speed at which our neural network trains. Even in the last section, we were seeing something in the realm of 1,500 reviews per second on our test data. I mean, just absolutely screaming. Now, in this section we’re going to go back one chapter and continue to try to …

11 – Mini Project 5 Solution

All right, so in this section, we’ve made our network more efficient. We’ve done this by getting rid of the multiplication by 1, because it doesn’t change anything, and we’ve gotten rid of processing any of the words whose input is 0 altogether. And this really should …
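A sketch of that efficiency trick: instead of multiplying a mostly-zero input vector by the whole weight matrix, just sum the weight rows for the words that actually appear — their input is 1, and `1 * w == w`. The sizes and variable names here are illustrative:

```python
import numpy as np

vocab_size, hidden_size = 5, 3
rng = np.random.default_rng(0)
weights_0_1 = rng.normal(size=(vocab_size, hidden_size))

review_indices = [1, 3]  # indices of the words present in this review

# Dense version (what we replaced): full vector-matrix multiply.
layer_0 = np.zeros(vocab_size)
layer_0[review_indices] = 1
dense_hidden = layer_0.dot(weights_0_1)

# Sparse version (what we do now): skip the zeros and the multiply-by-1.
sparse_hidden = np.zeros(hidden_size)
for index in review_indices:
    sparse_hidden += weights_0_1[index]
```

Both versions produce the same hidden layer, but the sparse one does work proportional to the words in the review rather than the whole vocabulary.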

10 – Understanding Inefficiencies in our Network

All right. So in the last section, we optimized our neural network to better find the correlation in our dataset by removing some distracting noise, and the neural network attended to the signal so much better. It trained to 83% accuracy on the training data, and the testing accuracy was up to 85%. And this was …

1 – Introducing Andrew Trask

Hi there! Welcome to week two of the program. This week we have a guest instructor, Andrew Trask. Andrew is a Ph.D. student at the University of Oxford, where he studies deep learning for natural language processing. He’s also the author of Grokking Deep Learning, which teaches deep learning to anyone with Python skills, and …