MLP with Inputs for Pattern Recognition and Prediction: A Bad Idea?

This refers to the 3-layer MLP (Input, Hidden, Output) in Ward Systems' NeuroShell 2.

I would prefer to split the two input classes, PR (pattern recognition) and F (forecasting), into two separate sub-networks, each with its own hidden layer, both feeding a single output layer; that gives a three-layer network. There could also be a four-layer version that uses a new hidden layer to merge the two streams (see the sketch after the list):

1) Inputs (split into F and PR classes)
2) Hidden layer (split into F and PR classes)
3) Hidden layer (a fully connected "blend" layer)
4) Output
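
Since NeuroShell 2 cannot express this, here is a minimal sketch of the four-layer version in the Keras functional API, purely for illustration. The input counts, hidden sizes, and activation choices are all hypothetical, picked to reflect the roughly 10:1 PR-to-F ratio mentioned below:

```python
# Sketch of the 4-layer split architecture (not NeuroShell 2).
# All sizes and activations here are hypothetical.
from tensorflow import keras
from tensorflow.keras import layers

n_pr, n_f = 100, 10  # hypothetical input counts (PR:F ratio of 10:1)

# 1) Inputs, split into PR and F classes
pr_in = keras.Input(shape=(n_pr,), name="pr_inputs")
f_in = keras.Input(shape=(n_f,), name="f_inputs")

# 2) Hidden layers, one per input class, with no cross-connections
pr_hidden = layers.Dense(32, activation="tanh", name="pr_hidden")(pr_in)
f_hidden = layers.Dense(8, activation="tanh", name="f_hidden")(f_in)

# 3) Fully connected "blend" layer that merges the two streams
merged = layers.Concatenate(name="merge")([pr_hidden, f_hidden])
blend = layers.Dense(16, activation="tanh", name="blend_hidden")(merged)

# 4) Single output; the whole graph trains end to end with one loss
out = layers.Dense(1, name="output")(blend)

model = keras.Model(inputs=[pr_in, f_in], outputs=out)
model.compile(optimizer="adam", loss="mse")
# model.fit([pr_data, f_data], targets, epochs=...)
```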

Both branches would train at the same time, rather than training two separate networks, getting a result/prediction from each, and then averaging those two numbers.

I've found that while averaging the outputs works, letting the network do the combining works even better. But that requires partitioned layers, which my platform (NeuroShell 2) does not support. And I've never read a paper where anyone tries to do better than simple averaging.
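
For contrast, the averaging baseline amounts to a fixed 50/50 blend of two independently trained nets, with nothing learned in the combination step. A sketch (pr_model, f_model, and the data names are hypothetical):

```python
# Averaging baseline: two independently trained networks, combined by a
# fixed mean rather than learned weights. Model and data names are
# hypothetical placeholders.
pred_pr = pr_model.predict(pr_data)  # output of the PR-only network
pred_f = f_model.predict(f_data)     # output of the F-only network
pred_avg = (pred_pr + pred_f) / 2.0  # fixed 50/50 blend; no learning here
```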

FYI: the ratio of PR to F input signals is 10:1.

Most of the discussion of networks covers forecasting, typically with about 10 inputs. Pattern recognition runs orders of magnitude higher, from 100 to 1,000 inputs and beyond.

In fact, when searching the research, the two types of problem seem almost mutually exclusive.

So my conclusion is that putting both types of structure in the same network is probably a very bad idea.

Agreed?



1 answer


Not a bad idea! In fact, this approach is very common; you've just missed some of the jargon.

Basically, what you are trying to do here is ensemble prediction. The best way to approach it is to train two completely separate networks, one for each half of your problem, and then use their outputs as the inputs to a new neural network.
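
As a concrete illustration, here is a minimal sketch of that two-stage setup in Keras (an assumption; the answer names no library). The build_branch helper, the layer sizes, and the data names (pr_data, f_data, etc.) are all hypothetical:

```python
# Sketch of the stacked-ensemble approach described above. Library
# choice, helper names, and all sizes are assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_branch(n_inputs, n_hidden):
    """One small standalone MLP for one half of the problem."""
    model = keras.Sequential([
        keras.Input(shape=(n_inputs,)),
        layers.Dense(n_hidden, activation="tanh"),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Stage 1: train the two base networks completely separately.
pr_net = build_branch(n_inputs=100, n_hidden=32)
f_net = build_branch(n_inputs=10, n_hidden=8)
# pr_net.fit(pr_data, targets, ...); f_net.fit(f_data, targets, ...)

# Stage 2: the base networks' predictions become the inputs of a new,
# small combiner network, trained on held-out data to avoid leakage.
# stacked_inputs = np.column_stack([pr_net.predict(pr_val),
#                                   f_net.predict(f_val)])
combiner = build_branch(n_inputs=2, n_hidden=4)
# combiner.fit(stacked_inputs, val_targets, ...)
```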



The field is known as ensemble learning (stacking, in this particular arrangement), and the results are often good.

As for your question about blending recognition and prediction, it is impossible to make a call without knowing more about the data you are working with; but just because people haven't tried it doesn't mean you shouldn't.
