4-4-1-7. Quiz: TensorFlow Softmax

TensorFlow Softmax

The softmax function squashes its inputs, typically called logits or logit scores, to be between 0 and 1, and also normalizes the outputs so that they sum to 1. This means the output of the softmax function is equivalent to a categorical probability distribution. It's the perfect function to use as the output activation for a network predicting multiple classes.
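
To make the "squash and normalize" behavior concrete, here is a minimal sketch of the math in plain Python (no TensorFlow required); the function name and input values here are just for illustration:

import math

def softmax(logits):
    # Exponentiate each logit, then divide by the total so the
    # outputs land between 0 and 1 and sum to 1
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([2.0, 1.0, 0.2]))       # roughly [0.652, 0.240, 0.108]
print(sum(softmax([2.0, 1.0, 0.2])))  # 1.0, up to floating-point error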

We’re using TensorFlow to build neural networks and, appropriately, there’s a function for calculating softmax.

x = tf.nn.softmax([2.0, 1.0, 0.2])

Easy as that! tf.nn.softmax() implements the softmax function for you. It takes in logits and returns softmax activations.
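
Note that tf.nn.softmax() returns a tensor, not the computed values. As with any TensorFlow 1.x op, you evaluate it inside a session; a quick sketch using the x defined above:

with tf.Session() as sess:
    print(sess.run(x))  # roughly [0.652, 0.240, 0.108]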

Quiz

In the quiz below, use tf.nn.softmax() to return the softmax of the logits.

quiz.py

# Solution is available in the other "solution.py" tab
import tensorflow as tf


def run():
    output = None
    logit_data = [2.0, 1.0, 0.1]
    logits = tf.placeholder(tf.float32)
    
    # TODO: Calculate the softmax of the logits
    # softmax =     
    
    with tf.Session() as sess:
        # TODO: Feed in the logit data
        # output = sess.run(softmax,    )
        pass

    return output

solution.py

# Quiz Solution
# Note: You can't run code in this tab
import tensorflow as tf


def run():
    output = None
    logit_data = [2.0, 1.0, 0.1]
    logits = tf.placeholder(tf.float32)

    softmax = tf.nn.softmax(logits)

    with tf.Session() as sess:
        output = sess.run(softmax, feed_dict={logits: logit_data})

    return output
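
For the logit data [2.0, 1.0, 0.1], run() returns roughly [0.659, 0.242, 0.099]: three probabilities between 0 and 1 that sum to 1.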
