What are the initial gradients in TF's automatic gradient method?

I went through how TF calculates and applies the gradients of the loss with respect to the trainable variables in **tf.train.Optimizer.minimize(loss, global_step=None, var_list=None, gate_gradients=1, aggregation_method=None, colocate_gradients_with_ops=False, name=None, grad_loss=None)**.

So I looked at the source code, where I found the basic gradients function. In that code I don't understand why they initialize the gradients with ones at line 408.
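For context, reverse-mode differentiation starts from the derivative of the loss with respect to itself, which is 1. A minimal pure-Python sketch (not TF's actual implementation, just an illustrative toy) of how that seed value propagates back through the chain rule:

```python
# Toy reverse-mode differentiation for L = 3 * x**2.
# Backprop is seeded with dL/dL = 1.0 -- the same role as the
# ones-initialized gradients in the TF source being asked about.

def forward_and_backward(x):
    # Forward pass
    a = x ** 2
    L = a * 3

    # Backward pass: seed with dL/dL = 1.0
    grad_L = 1.0
    grad_a = grad_L * 3        # dL/da = 3
    grad_x = grad_a * 2 * x    # da/dx = 2x, so dL/dx = 6x
    return L, grad_x

L, g = forward_and_backward(2.0)
print(L, g)  # L = 12.0, dL/dx = 12.0
```

If the seed were anything other than 1, every downstream gradient would be scaled by that factor, which is what the `grad_loss` argument of `minimize` lets you do deliberately.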







