Implementation of categorical_crossentropy in Keras

I am trying to apply the concept of distillation, basically training a new, smaller network to do the same job as the original one but with less computation.

I have softmax outputs for each sample instead of logits.

My question is: how exactly is the categorical cross-entropy loss function implemented? Does it take only the index where the original labels have their maximum value and multiply it with the corresponding predicted value at the same index, or does it sum over all classes (one-hot coding), as the formula says:

loss = -sum_i( y_i * log(p_i) )

where y_i are the target probabilities and p_i the predicted ones.
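To make the two readings concrete, here is a minimal NumPy sketch of what I mean (the numbers and variable names are just made up for illustration):

import numpy as np

y_true = np.array([0.1, 0.7, 0.2])   # soft targets, e.g. softmax of the original network
y_pred = np.array([0.2, 0.5, 0.3])   # softmax output of the smaller network

# reading 1: take only the class where the target has its maximum value (argmax)
i = np.argmax(y_true)
loss_argmax = -np.log(y_pred[i])

# reading 2: sum over all classes, as the formula says
loss_sum = -np.sum(y_true * np.log(y_pred))

print(loss_argmax, loss_sum)   # the two differ whenever y_true is not one-hot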

Thanks!


1 answer


I see you used the tensorflow tag, so I assume that is the backend you are using?

def categorical_crossentropy(output, target, from_logits=False):
    """Categorical crossentropy between an output tensor and a target tensor.

    # Arguments
        output: A tensor resulting from a softmax
            (unless `from_logits` is True, in which
            case `output` is expected to be the logits).
        target: A tensor of the same shape as `output`.
        from_logits: Boolean, whether `output` is the
            result of a softmax, or is a tensor of logits.

    # Returns
        Output tensor.
    """

This code comes from the Keras source code. Looking directly at the code should answer all your questions :) If you need more information, just ask!
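For example, a minimal usage sketch (assuming a Keras 2.x-style keras.losses wrapper around this backend function; the values are made up):

import numpy as np
import keras.backend as K
from keras.losses import categorical_crossentropy

# soft targets from the original network and softmax outputs of the smaller one
y_true = K.constant(np.array([[0.1, 0.7, 0.2]]))
y_pred = K.constant(np.array([[0.2, 0.5, 0.3]]))

loss = categorical_crossentropy(y_true, y_pred)   # note the (y_true, y_pred) order here
print(K.eval(loss))                               # one cross-entropy value per sample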

EDIT:



Here is the code you are interested in:

# Note: tf.nn.softmax_cross_entropy_with_logits
# expects logits, Keras expects probabilities.
if not from_logits:
    # scale preds so that the class probas of each sample sum to 1
    output /= tf.reduce_sum(output,
                            reduction_indices=len(output.get_shape()) - 1,
                            keep_dims=True)
    # manual computation of crossentropy
    epsilon = _to_tensor(_EPSILON, output.dtype.base_dtype)
    output = tf.clip_by_value(output, epsilon, 1. - epsilon)
    return - tf.reduce_sum(target * tf.log(output),
                           reduction_indices=len(output.get_shape()) - 1)

If you look at the return statement, you can see that it sums over the class axis ... :)
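To make that summation explicit, here is a NumPy re-implementation of the snippet above (a sketch only; the epsilon value is my assumption, matching Keras' default of 1e-7):

import numpy as np

def categorical_crossentropy_np(output, target, eps=1e-7):
    # scale preds so that the class probabilities of each sample sum to 1
    output = output / np.sum(output, axis=-1, keepdims=True)
    # clip to avoid log(0)
    output = np.clip(output, eps, 1.0 - eps)
    # sum over the class axis: every class contributes, not just the argmax
    return -np.sum(target * np.log(output), axis=-1)

soft_targets  = np.array([[0.1, 0.7, 0.2]])   # e.g. softmax of the original network
student_preds = np.array([[0.2, 0.5, 0.3]])
print(categorical_crossentropy_np(student_preds, soft_targets))

With one-hot targets this reduces to -log of the predicted probability of the true class, while with soft (distillation) targets every class contributes to the sum.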
