How to train a Keras model on multiple input/output datasets with different batch sizes

I have a supervised learning problem that I am solving with the Keras functional API.

Since this model predicts the state of a physical system, I know that its predictions must satisfy additional physical constraints.

I would like to add these constraints as an extra loss term that penalizes the model for making predictions that violate them. Unfortunately, the number of training examples for the supervised problem differs greatly from the number of constraint examples, so the two datasets cannot share a batch size.

Basically, what I'm trying to do is:

[Model summary diagram]

Minimize both the supervised learning error and the constraint error, with the constraint acting as an auxiliary loss.
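Concretely, the setup I am after looks something like the sketch below. The layer sizes, the example constraint (predictions should be non-negative), and the penalty weight are all placeholders for illustration, not my actual model:

```python
import tensorflow as tf
from tensorflow import keras


class ConstraintPenalty(keras.layers.Layer):
    """Adds weight * mean(relu(-y)) as an extra loss term.

    The constraint y >= 0 is only an illustrative stand-in for the
    real physical constraint.
    """

    def __init__(self, weight=0.1, **kwargs):
        super().__init__(**kwargs)
        self.weight = weight

    def call(self, y):
        self.add_loss(self.weight * tf.reduce_mean(tf.nn.relu(-y)))
        return y


x_sup = keras.Input(shape=(4,), name="x_supervised")   # labelled samples
x_con = keras.Input(shape=(4,), name="x_constraint")   # constraint samples

# One shared network, scored two ways: by MSE against the labels,
# and by the constraint penalty on the unlabelled constraint samples.
shared = keras.Sequential([keras.layers.Dense(8, activation="relu"),
                           keras.layers.Dense(1)])
y_sup = shared(x_sup)
y_con = ConstraintPenalty(weight=0.1)(shared(x_con))

model = keras.Model([x_sup, x_con], [y_sup, y_con])
# The supervised output gets MSE; the constraint output contributes
# only through the layer's add_loss, so its loss entry is None.
model.compile(optimizer="adam", loss=["mse", None])
```

Because the two inputs have independent batch dimensions inside the graph, nothing in the model itself forces them to match; the difficulty is only in how `fit()` feeds them.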

I do not believe that alternating training batches between the two datasets will work, because the gradient would then capture the error of only one problem at a time, when I really want the physical constraint to act as a regularizer on the supervised task. (If my interpretation is wrong, please let me know.)
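For what it's worth, my reading is that a single combined loss gives the sum of both gradients at every step, which is what makes the constraint behave like a regularizer, whereas alternating batches follows only one gradient at a time. A quick finite-difference check on two toy quadratic losses (both functions are made up purely for illustration):

```python
import numpy as np

def l_sup(w):            # toy supervised loss, minimized at w = 2
    return (w - 2.0) ** 2

def l_con(w):            # toy constraint penalty, minimized at w = -1
    return (w + 1.0) ** 2

def grad(f, w, eps=1e-6):
    """Central finite-difference derivative of f at w."""
    return (f(w + eps) - f(w - eps)) / (2 * eps)

lam = 0.5                # weight of the constraint term
w = 0.0
g_combined = grad(lambda v: l_sup(v) + lam * l_con(v), w)
g_sum = grad(l_sup, w) + lam * grad(l_con, w)
print(abs(g_combined - g_sum) < 1e-4)  # True: one loss, summed gradients
```

So with one combined loss every update is pulled by both objectives at once, which is the behaviour I want.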

I know this can be done in pure TensorFlow or Theano, but I am hesitant to leave the Keras ecosystem that makes everything else so convenient. If anyone knows how to train a model with batch sizes that vary across different inputs, I'd really appreciate some help.
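The closest I have come to keeping different effective batch sizes inside Keras is to pair independent batches from the two datasets in a generator and pass them to `fit()` as the model's two inputs. The array names and batch sizes below are made up for illustration:

```python
import numpy as np

def paired_batches(x_sup, y_sup, x_con, batch_sup=32, batch_con=128):
    """Yield ((supervised batch, constraint batch), labels) forever.

    Each dataset is sampled independently, so the supervised and
    constraint inputs can use different batch sizes.
    """
    n_sup, n_con = len(x_sup), len(x_con)
    while True:
        i = np.random.randint(0, n_sup, size=batch_sup)   # labelled samples
        j = np.random.randint(0, n_con, size=batch_con)   # constraint samples
        yield (x_sup[i], x_con[j]), y_sup[i]
```

Whether `fit()` accepts per-input batch dimensions that differ like this may depend on the Keras version, so I could be wrong that it works out of the box; that is essentially what I am asking.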
