Forward propagation vs. forward pass

I've seen some papers say forward propagation and other papers say forward pass.

Do these two terms mean the same thing?



2 answers


Yes, they usually mean the same thing, with some minor terminology caveats.



Specifically, in typical backpropagation-based neural network training there are two main stages: the forward pass and the backward pass. During the forward pass, input values are propagated forward through the network; during the backward pass, the error is propagated back through the network.
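
To make the two stages concrete, here is a minimal NumPy sketch of both passes for a one-hidden-layer network; the layer sizes, the sigmoid activation, and the squared-error loss are assumptions made purely for illustration.

```python
import numpy as np

# A tiny one-hidden-layer network; all sizes here are illustrative.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
x = rng.normal(size=(1, 3))    # one input example
y = np.array([[1.0]])          # its target value

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: values flow from the input toward the output.
h = sigmoid(x @ W1)                 # hidden activations
y_hat = h @ W2                      # network output
loss = 0.5 * np.sum((y_hat - y) ** 2)

# Backward pass: the error flows from the loss back toward the input,
# applying the chain rule at each layer.
d_y_hat = y_hat - y                 # dL/d(y_hat)
dW2 = h.T @ d_y_hat                 # dL/dW2
d_h = d_y_hat @ W2.T                # dL/dh
dW1 = x.T @ (d_h * h * (1 - h))     # dL/dW1, using the sigmoid derivative
```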



Neural networks are trained with gradient methods, which require the partial derivative of the loss function with respect to each parameter. To get it, you first compute the value of the loss function given the current parameter values, and then, using the recursive nature of the chain rule, you compute the derivatives, this time starting from the loss layer. Both terms refer to the same operation of computing values forward through the network, so in the context of neural networks they have the same meaning.

The term forward pass (or forward algorithm) is also used for hidden Markov models, as part of their inference algorithm. The two uses are hard to confuse, though; which algorithm is meant should be clear from context.
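
For contrast with the neural-network usage, here is a minimal sketch of the forward algorithm for a hidden Markov model; the two-state model and all of its probabilities are made up for this example.

```python
import numpy as np

# An illustrative 2-state HMM over a binary observation alphabet.
pi = np.array([0.6, 0.4])          # initial state distribution
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])         # A[i, j] = P(next state j | state i)
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])         # B[i, o] = P(symbol o | state i)
obs = [0, 1, 0]                    # an observation sequence

# Forward pass / forward algorithm:
# alpha[j] = P(observations so far, current state = j).
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

likelihood = alpha.sum()           # P(obs) under the model
print(likelihood)
```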









