Activation functions - neural network

I work with neural networks in my spare time. I have already implemented a simple XOR operation with a neural network, but I don't know how to choose the right activation function.

Is there a trick to it, or is it just mathematical reasoning?


2 answers


There are many options for activation functions, such as the identity, logistic (sigmoid), tanh, ReLU, etc. The choice of activation function can be guided by how its gradient behaves during backpropagation. For example, the logistic function is differentiable everywhere, but it saturates when the magnitude of its input is large, so its gradient becomes very small and slows down optimization. In that case, ReLU is preferred over the logistic function. This is just one simple example of how to choose an activation function; it really depends on the actual situation. Also, I don't think the activation functions that work for an XOR network are representative of what a more complex application needs.
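To make the saturation point concrete, here is a minimal sketch in NumPy (the function names and sample inputs are just for this illustration) comparing the gradients of the logistic function and ReLU:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: sigma(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    # Derivative of the logistic function: sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1.0 - s)

def relu_grad(z):
    # Derivative of ReLU: 0 for z <= 0, 1 for z > 0
    return (z > 0).astype(float)

z = np.array([0.0, 2.0, 10.0])
print(sigmoid_grad(z))  # [0.25, 0.105, ~0.0000454] -> vanishes as |z| grows
print(relu_grad(z))     # [0., 1., 1.]              -> stays 1 for positive z
```

At z = 10 the logistic gradient is already close to zero, which is exactly the saturation described above, while the ReLU gradient stays constant for positive inputs.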




You can solve your problem with sigmoid neurons; in that case, the activation function is:

\sigma(z) = \frac{1}{1 + e^{-z}}

Where:



z = \sum_{j} w_j x_j + b

In this formula, w_j are the weights for each input, x_j are the inputs, and b is the bias. Finally, you can use backpropagation to compute the gradient of a cost function and train the weights.
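As a minimal sketch of a single sigmoid neuron in NumPy (the weights, bias, and input below are made-up values just for illustration):

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical example values: two inputs, as in one XOR pattern
x = np.array([1.0, 0.0])   # inputs x_j
w = np.array([0.5, -0.3])  # weights w_j, one per input
b = 0.1                    # bias

z = np.dot(w, x) + b       # z = sum_j w_j * x_j + b
a = sigmoid(z)             # the neuron's activation
print(a)                   # ~0.6457
```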







