4-4-1-14. Quiz: TensorFlow ReLUs
TensorFlow ReLUs

TensorFlow provides the ReLU function as tf.nn.relu(). Applying tf.nn.relu() to the hidden_layer zeroes out any negative activations, acting like an on/off switch for each hidden unit. Adding additional layers, like the output layer, after an activation function turns the model into a nonlinear function. This nonlinearity allows the network to solve more complex problems.
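The original code snippet is not reproduced here, but the idea can be sketched as follows. The tensor values and layer sizes below are hypothetical, chosen only to illustrate a linear step followed by tf.nn.relu() and then an output layer:

```python
import tensorflow as tf

# Hypothetical input and parameters: 1 sample, 4 features, 2 hidden units.
features = tf.constant([[1.0, 2.0, 3.0, 4.0]])
weights_hidden = tf.constant([[ 0.5, -0.5],
                              [ 0.2,  0.4],
                              [-0.3,  0.1],
                              [ 0.6, -0.2]])
bias_hidden = tf.constant([0.1, -0.1])

# Linear step, then ReLU: any negative activation becomes 0.
hidden_layer = tf.add(tf.matmul(features, weights_hidden), bias_hidden)
hidden_layer = tf.nn.relu(hidden_layer)

# Output layer stacked after the activation makes the model nonlinear.
weights_out = tf.constant([[1.0], [-1.0]])
bias_out = tf.constant([0.5])
output = tf.add(tf.matmul(hidden_layer, weights_out), bias_out)
```

Here the first hidden unit produces a positive activation and passes through ReLU unchanged, while the second produces a negative one and is switched off to 0 before reaching the output layer.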