4-4-1-14. Quiz: TensorFlow ReLUs

TensorFlow ReLUs

TensorFlow provides the ReLU function as tf.nn.relu(). Applying tf.nn.relu() to the hidden_layer effectively turns off any negative values, acting like an element-wise on/off switch. Adding further layers after the activation function, such as the output layer, turns the model into a nonlinear function. This nonlinearity allows the network to solve more complex problems; a sketch of the idea follows below.
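
A minimal sketch of tf.nn.relu() in action, assuming TensorFlow 2.x eager execution; the tensor values and the hidden_layer name are illustrative, not the quiz's actual code:

import tensorflow as tf

# Pre-activation output of a hidden layer (illustrative values)
hidden_layer = tf.constant([[-1.5, 0.0, 2.0],
                            [ 3.0, -0.5, 1.0]])

# ReLU passes positive values through unchanged and clamps every
# negative value to zero, acting like an element-wise on/off switch
activated = tf.nn.relu(hidden_layer)
# activated is [[0, 0, 2], [3, 0, 1]]: the negative entries are now zero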

4-4-1-13. Two-layer Neural Network

Multilayer Neural Networks

In the previous lessons and the lab, you learned how to build a neural network with a single layer. Now you'll learn how to build multilayer neural networks with TensorFlow. Adding a hidden layer to a network allows it to model more complex functions. Also, using a non-linear activation function on the hidden layer lets the network model non-linear functions; a sketch of such a two-layer network follows below.
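
A minimal two-layer sketch, again assuming TensorFlow 2.x eager execution; the layer sizes, weights, and variable names are illustrative rather than the lab's actual values:

import tensorflow as tf

# Input: 2 samples with 4 features each (illustrative values)
features = tf.constant([[ 1.0,  2.0,  3.0,  4.0],
                        [-1.0, -2.0, -3.0, -4.0]])

# Hidden layer parameters: 4 inputs -> 3 hidden units
hidden_weights = tf.Variable(tf.random.normal([4, 3]))
hidden_biases = tf.Variable(tf.zeros([3]))

# Output layer parameters: 3 hidden units -> 2 outputs
output_weights = tf.Variable(tf.random.normal([3, 2]))
output_biases = tf.Variable(tf.zeros([2]))

# Hidden layer: linear transform followed by the non-linear ReLU activation
hidden_layer = tf.nn.relu(tf.add(tf.matmul(features, hidden_weights), hidden_biases))

# Output layer stacked on top of the hidden layer; the ReLU in between
# is what makes the whole model a non-linear function of the input
logits = tf.add(tf.matmul(hidden_layer, output_weights), output_biases)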