Keras Training warm_start

Is it possible to continue training a Keras estimator with all hyperparameters (including a reduced learning rate) and the weights preserved from previous epochs, as is done in scikit-learn with the warm_start parameter? Something like this:

estimator = KerasRegressor(build_fn=create_model, epochs=20, batch_size=40, warm_start=True)

Specifically, warm_start should do this (from the scikit-learn docs):

warm_start : bool, optional, default False. If set to True, reuse the solution of the previous call to fit as initialization, otherwise, just erase the previous solution.
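For reference, a minimal sketch of that scikit-learn behaviour; the MLPRegressor and the toy data are just stand-ins chosen for illustration:

from sklearn.datasets import make_regression
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=200, n_features=5, random_state=0)

model = MLPRegressor(max_iter=50, warm_start=True, random_state=0)
model.fit(X, y)  # first round of training
model.fit(X, y)  # continues from the previous solution instead of re-initializing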

Is there anything like this in Keras?


1 answer


Yes, it is possible, but rather cumbersome. You need to use train_on_batch, which keeps all model parameters (including the optimizer state) between calls.

This is cumbersome because you have to split your dataset into batches yourself, and you also lose the ability to apply Callbacks and to use the automatic progress bar. I hope this option will be added to the fit method in a new version of Keras.
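For illustration, a minimal sketch of such a manual loop; the toy data, the model definition, and the epoch/batch sizes below are placeholder assumptions, not something from the question:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Toy data and model, standing in for the questioner's create_model().
x_train = np.random.rand(1000, 8)
y_train = np.random.rand(1000, 1)

model = Sequential([Dense(16, activation="relu", input_shape=(8,)), Dense(1)])
model.compile(optimizer="adam", loss="mse")

epochs, batch_size = 20, 40
for epoch in range(epochs):
    # You have to shuffle and split the data into batches yourself.
    idx = np.random.permutation(len(x_train))
    for start in range(0, len(x_train), batch_size):
        batch = idx[start:start + batch_size]
        loss = model.train_on_batch(x_train[batch], y_train[batch])
    print("epoch %d/%d  loss: %.4f" % (epoch + 1, epochs, loss))  # no automatic progbar

# Running the loop again later continues from the current weights and
# optimizer state, i.e. a warm start.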
