Torch7: how to connect neurons of the same layer?
Is it possible to implement, using Torch, an architecture in which neurons are connected to other neurons within the same layer?
What you are describing is called a recurrent neural network. Note that it requires a completely different network structure, input data format, and learning algorithm to work.
There is an rnn library for working with recurrent neural networks in Torch.
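To make the idea concrete, here is a minimal sketch (in Python/NumPy rather than Torch, purely for illustration) of what "neurons connected within the same layer" means in a recurrent network: each hidden unit receives the previous activations of all hidden units in its own layer through a recurrent weight matrix `U`, in addition to the current input through `W`. All names here are illustrative, not part of any Torch API.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 3, 4
W = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input -> hidden weights
U = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (same-layer) weights
b = np.zeros(n_hidden)

def step(x, h_prev):
    # Each hidden unit sees the previous state of ALL units in the
    # same layer via U, which is the "same-layer connection".
    return np.tanh(W @ x + U @ h_prev + b)

h = np.zeros(n_hidden)          # initial hidden state
for t in range(5):              # unroll over 5 time steps
    x = rng.normal(size=n_in)
    h = step(x, h)

print(h.shape)  # (4,)
```

The rnn library provides ready-made modules for exactly this kind of hidden-to-hidden feedback, so in practice you would use it rather than wiring the recurrence by hand.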
Yes, it is possible. Torch has everything other languages have: logical operations, read/write operations, array operations — which is all that is needed to implement any neural network. Since Torch can use CUDA, you can even implement a neural network that runs faster than some C# or Java implementations. The performance gain may depend on the number of if/else branches executed per iteration.