Problem implementing a custom loss function in Keras

I am writing a custom loss function in Keras. The output of the model is a 10-dimensional softmax layer. To calculate the loss, I first need to find the index of the neuron that fires 1, and then take the difference between the predicted index and the true index. I am doing the following:

from keras import backend as K

def diff_loss(y_true, y_pred):

    # find the index of the neuron firing 1 (argmax over the class axis, not the batch axis)
    true_ind = K.argmax(y_true, axis=-1)
    pred_ind = K.argmax(y_pred, axis=-1)

    # cast the integer indices to float32 so they can be subtracted
    x = K.cast(true_ind, 'float32')
    y = K.cast(pred_ind, 'float32')

    return K.abs(x - y)

but it gives the error "ValueError: None values not supported." What's the problem?



1 answer


This is because your function is not differentiable: argmax returns an integer index, which is piecewise constant with respect to the inputs, so its gradient is None everywhere (that is exactly what the error is complaining about).

There is simply no way around this if you want argmax as the result.
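To see this directly, here is a minimal sketch (assuming the TensorFlow backend; the placeholder input is hypothetical, not code from the question) that asks the backend for the gradient of an argmax-based expression. It comes back as None, which is what the optimizer then rejects:

from keras import backend as K

y_pred = K.placeholder(shape=(None, 10))  # hypothetical 10-class output
ind = K.cast(K.argmax(y_pred, axis=-1), 'float32')  # piecewise-constant in y_pred

grads = K.gradients(ind, [y_pred])
print(grads)  # [None] -> "ValueError: None values not supported."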


An approach to try

Since you are using "softmax", only one class is correct at a time (you never have two classes simultaneously).

And since you need the difference between indices, perhaps you could work with a single continuous result (continuous values are differentiable).

Work with only one output in the -0.5 to 9.5 range and obtain the classes by rounding the result.



This way your last layer can have just one unit:

lastLayer = Dense(1, activation='sigmoid', ....)  # or another layer type if it is not Dense

And shift the range with a Lambda layer (the sigmoid outputs values in (0, 1), so 10*x - 0.5 maps them to (-0.5, 9.5)):

lambdaLayer = Lambda(lambda x: 10*x - 0.5)

Now your loss can simply be 'mae' (mean absolute error).
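For completeness, a minimal end-to-end sketch of this idea (the input shape, hidden layer, and optimizer below are placeholder choices, not part of the original question):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Lambda

model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(20,)))  # hypothetical hidden layer
model.add(Dense(1, activation='sigmoid'))  # single continuous output in (0, 1)
model.add(Lambda(lambda x: 10 * x - 0.5))  # rescale to (-0.5, 9.5)

# 'mae' is differentiable, unlike the argmax-based loss above
model.compile(optimizer='adam', loss='mae')

# Note: the targets must now be the class indices 0..9 (as floats),
# not one-hot vectors. At prediction time, recover the class by rounding:
# classes = np.clip(np.round(model.predict(x_test)[:, 0]), 0, 9).astype(int)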

The disadvantage of this approach is that the "sigmoid" activation is not evenly distributed across the classes: some classes will be more likely than others. But since having a bounded output is important, it still seems like the better idea to start with.

This will only work if the classes follow a meaningful ascending sequence. (I'm guessing they do, otherwise you wouldn't be trying such a loss, right?)


