4-4-1-8. Quiz: TensorFlow Cross Entropy

Cross Entropy in TensorFlow

As with the softmax function, TensorFlow gives us the pieces to do the cross entropy calculation. Building it takes two new functions:

  • tf.reduce_sum()
  • tf.log()

Reduce Sum

    x = tf.reduce_sum([1, 2, 3, 4, 5])  # 15
    

The tf.reduce_sum() function takes an array of numbers and sums them together.
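
Note that the # 15 in the comment above is the evaluated result; the call itself returns a tensor. A minimal sketch of evaluating it in a session (TensorFlow 1.x, matching the quiz code below):

    import tensorflow as tf

    # tf.reduce_sum() builds a tensor; sess.run() evaluates it
    x = tf.reduce_sum([1, 2, 3, 4, 5])

    with tf.Session() as sess:
        print(sess.run(x))  # 15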

Natural Log

    x = tf.log(100.0)  # 4.60517
    

This function does exactly what you would expect it to do: tf.log() takes the natural log of a number.
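
Together, these two functions give you everything you need to implement the cross entropy formula:

    D(ŷ, y) = -Σᵢ yᵢ · ln(ŷᵢ)

where ŷ is the softmax output and y is the one-hot encoded label.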

Quiz

Print the cross entropy using softmax_data and one_hot_data.


quiz.py

    # Solution is available in the other "solution.py" tab
    import tensorflow as tf
    
    softmax_data = [0.7, 0.2, 0.1]
    one_hot_data = [1.0, 0.0, 0.0]
    
    softmax = tf.placeholder(tf.float32)
    one_hot = tf.placeholder(tf.float32)
    
    # TODO: Print cross entropy from session
    

solution.py

    # Quiz Solution
    # Note: You can't run code in this tab
    import tensorflow as tf
    
    softmax_data = [0.7, 0.2, 0.1]
    one_hot_data = [1.0, 0.0, 0.0]
    
    softmax = tf.placeholder(tf.float32)
    one_hot = tf.placeholder(tf.float32)
    
    # Cross entropy: D = -sum(one_hot * log(softmax))
    cross_entropy = -tf.reduce_sum(tf.multiply(one_hot, tf.log(softmax)))
    
    with tf.Session() as sess:
        print(sess.run(cross_entropy, feed_dict={softmax: softmax_data, one_hot: one_hot_data}))