How to model a for loop in a neural network

I am currently studying neural networks and can understand basic examples like AND, OR, addition, multiplication, etc.

Right now I'm trying to build a neural network that takes two inputs x and n and calculates pow(x, n). That would require the network to have some form of loop, and I'm not sure how I can simulate a network with a loop.

Is it possible to simulate this kind of computation with a neural network? I'm guessing it's possible, based on a recently released paper (Neural Turing Machines), but I'm not sure how. Any pointers would be very helpful.

Thanks!



2 answers


Plain feedforward neural networks are not Turing-complete and, in particular, cannot model loops of arbitrary length. However, if you fix the maximum n that you want to handle, you can set up an architecture that simulates loops of up to n iterations. For example, each layer can act as one iteration of the loop, so you might need n layers.
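To make the unrolling idea concrete, here is a minimal sketch in plain Python (not a trained network; the fixed bound MAX_N and the hard-coded gates are assumptions standing in for what a real network would have to learn):

```python
# Sketch: pow(x, n) as a loop unrolled over MAX_N fixed "layers".
# Each layer performs one loop iteration, gated on whether the
# iteration index is still below n. In a real network the gate
# and the multiply would be learned, not hard-coded.
MAX_N = 8  # hypothetical fixed bound on n

def unrolled_pow(x, n):
    acc = 1.0
    for i in range(MAX_N):            # MAX_N layers, fixed at "build time"
        gate = 1.0 if i < n else 0.0  # layer i is active only while i < n
        # gated update: multiply by x if active, pass through otherwise
        acc = gate * (acc * x) + (1.0 - gate) * acc
    return acc
```

Note that any input with n > MAX_N is silently truncated to MAX_N iterations, which is exactly the limitation the answer describes.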



For a more general architecture that is Turing-complete, you can use Recurrent Neural Networks (RNNs). A popular example in this class is the Long Short-Term Memory (LSTM) network by Hochreiter and Schmidhuber. Be aware, however, that training such RNNs is quite different from training classical feedforward networks.
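The recurrent idea can be sketched as reusing one cell n times rather than stacking n distinct layers. This is hand-coded Python, not a trained LSTM; the "cell" here is a hypothetical multiply-by-input update used purely for illustration:

```python
# Sketch: pow(x, n) as a recurrence. The same cell is applied n
# times, with the hidden state carrying the running product.
# In a learned RNN the cell's weights would be trained, and the
# network would also have to learn when to stop.
def rnn_pow(x, n):
    h = 1.0              # initial hidden state
    for _ in range(n):   # recurrence: one shared cell, reused each step
        h = h * x        # hand-coded "cell": multiply state by input
    return h
```

The key contrast with the unrolled version is weight sharing: one set of parameters handles any n, which is what makes the recurrent architecture more general.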



As you pointed out, Neural Turing Machines seem to work well for learning basic algorithms. For example, the copy task implemented in the paper suggests that an NTM can learn the algorithm itself. For now, NTMs have only been applied to simple tasks, so exploring their scope with pow(x, n) would be interesting, given that the copy task works well. I suggest reading "Reinforcement Learning Neural Turing Machines - Revised" for a deeper understanding.



In addition, recent developments in the field of Memory Networks let us tackle more complex tasks. Hence, getting a neural network to learn pow(x, n) should be possible. So go ahead and give it a shot!







