What does nb_epoch mean in a neural network?

I am just starting to use the Keras library for deep learning. In the training stage an integer number of epochs is chosen, but I don't know what this choice is based on.

In the MNIST example, the number of epochs is set to 4:

model.fit(X_train, Y_train,
          batch_size=128, nb_epoch=4,
          show_accuracy=True, verbose=1,
          validation_data=(X_test, Y_test))

Can anyone tell me why and how we choose the correct number of epochs?

+5




3 answers


You may be using an older version of Keras: nb_epoch refers to the number of epochs, and it has been replaced by epochs.

If you look at the Keras documentation, you will see that nb_epoch is deprecated.

One epoch means you have trained on the entire dataset (every record) once: if you have 384 records, one epoch means your model has been trained on all 384 of them. The batch size is the amount of data used in one iteration; a batch size of 128 means your model takes 128 records at a time and performs a single forward and backward pass (backpropagation) on them [this is called one iteration].

To break it down with this example: in the first iteration, your model takes 128 records [the 1st batch] out of the 384 and performs a forward and backward pass. The 2nd batch takes records 129 to 256 and performs another iteration. Then the 3rd batch takes records 257 to 384 and performs a 3rd iteration. At that point, we say the model has completed one epoch. The number of epochs tells the model how many times to repeat all of the above before stopping.
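
To make that arithmetic concrete, here is a small Python sketch (the 384 records, batch size 128, and 4 epochs are the numbers used above and in the question):

import math

num_records = 384   # total training examples (one epoch = all of them)
batch_size = 128    # examples consumed per iteration
epochs = 4          # how many full passes over the dataset

# One iteration = one forward + backward pass on a single batch.
iterations_per_epoch = math.ceil(num_records / batch_size)  # 3
total_iterations = iterations_per_epoch * epochs            # 12

print(iterations_per_epoch, total_iterations)  # 3 12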



There is no single right way to choose the number of epochs; it is done by experiment. Usually, when the model stops learning (the loss is no longer decreasing), you reduce the learning rate. If the loss still does not decrease after that, and the results look more or less as expected, then you choose the epoch at which the model stopped learning.
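
Keras has a ReduceLROnPlateau callback for exactly that "reduce the learning rate when the loss stops decreasing" step. A minimal sketch using the current Keras API, assuming model, X_train, Y_train, X_test, Y_test are defined as in the question:

from keras.callbacks import ReduceLROnPlateau

# Halve the learning rate whenever val_loss has not improved
# for 3 consecutive epochs, down to a floor of 1e-6.
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.5,
                              patience=3, min_lr=1e-6)

model.fit(X_train, Y_train,
          batch_size=128, epochs=100,
          validation_data=(X_test, Y_test),
          callbacks=[reduce_lr])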

I hope this helps

+5




Since Keras 2.0, the nb_epoch argument has been renamed everywhere to epochs.

Neural networks are trained iteratively, making multiple passes over the entire dataset. Each pass through the entire dataset is called an epoch.

There are two possible ways to choose the optimal number of epochs:



1) Set epochs to a large number and stop training when the validation accuracy or loss stops improving, a technique known as early stopping:

from keras.callbacks import EarlyStopping

# Stop training when val_loss has not improved for 4 epochs.
early_stopping = EarlyStopping(monitor='val_loss', patience=4, mode='auto')

model.fit(X_train, Y_train,
          batch_size=128, epochs=500,
          verbose=1,
          validation_data=(X_test, Y_test),
          callbacks=[early_stopping])

2) Treat the number of epochs as a hyperparameter and choose the best value based on a set of test runs over a grid of values for epochs.
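
A minimal sketch of this second approach, assuming a hypothetical build_model() helper that returns a freshly compiled model and a held-out validation split X_val / Y_val (neither appears in the answers above):

best_epochs, best_loss = None, float('inf')

for n_epochs in [5, 10, 20, 40]:   # grid of candidate epoch counts
    model = build_model()          # hypothetical helper: fresh weights per run
    model.fit(X_train, Y_train,
              batch_size=128, epochs=n_epochs, verbose=0)
    loss = model.evaluate(X_val, Y_val, verbose=0)  # scalar loss, no extra metrics
    if loss < best_loss:
        best_epochs, best_loss = n_epochs, loss

print(best_epochs)  # the epoch count with the lowest validation loss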

+5




In neural networks, an epoch is equivalent to training the network on each data point once.

The number of epochs, nb_epoch, is therefore how many times you reuse your data during training.

+1








