Automatically choosing the batch size in Keras

Is there a way to choose the batch size automatically in Keras, based on available GPU memory? As far as I know, the batch size largely determines how much VRAM the model uses during training. At the moment I find a good batch size by trial and error, but I don't know if there is an easier way. This works, but it's a little tedious. If I want to queue up several different models and leave my computer running for a few days, I'd like each model to use the GPU as efficiently as possible.

Edit: Some pseudocode

# What I currently do
batch = 32
model.fit(x_train, y_train, batch_size=batch)
# here I have to change batch by hand to make full use of my GPU

# What I would prefer to do
model.fit(x_train, y_train, batch_size='auto')
# where 'auto' takes the size of my GPU into account
# and adjusts the batch size automatically
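In the absence of a built-in `batch_size='auto'`, the trial-and-error step itself can be automated: start with a deliberately large batch size and halve it until a fit attempt succeeds without running out of memory. The sketch below uses a hypothetical `try_fit` callback as a stand-in for `model.fit`; with a real Keras/TensorFlow model you would catch `tf.errors.ResourceExhaustedError` (the exception TensorFlow raises on GPU OOM) instead of the plain `MemoryError` used here for illustration.

```python
# Sketch: automate the trial-and-error by halving the batch size on OOM.
# `try_fit` is a hypothetical stand-in for a call to model.fit with a
# given batch_size; it should raise on out-of-memory.

def find_max_batch_size(try_fit, start=1024, floor=1):
    """Return the largest batch size (halving from `start`) that fits."""
    batch = start
    while batch >= floor:
        try:
            try_fit(batch)       # attempt one training run at this size
            return batch         # succeeded: this batch size fits in memory
        except MemoryError:      # with Keras: tf.errors.ResourceExhaustedError
            batch //= 2          # did not fit: try half the size
    raise RuntimeError("even batch_size=%d does not fit" % floor)

# Toy stand-in for demonstration: pretend anything above 128 exhausts VRAM.
def fake_fit(batch):
    if batch > 128:
        raise MemoryError

print(find_max_batch_size(fake_fit))  # 128
```

Note that a fresh model should be built before each attempt, since a failed `fit` can leave the GPU allocator in a fragmented state.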

keras

No one has answered this question yet

Check out similar questions:

10
Keras + Tensorflow: multiple gpus prediction
8
How to enable Keras with Theano to use multiple GPUs
7
Why does the Keras LSTM batch size used for prediction have to be the same as the training batch size?
3
Keras model.fit UnboundLocalError
3
Difference between Keras model.fit using only batch_size and using only steps_per_epoch
2
Understanding Keras LSTMs: The Role of Batch Size and State
1
Keras fit_generator uses large memory even with small batch sizes
0
Keras predict not working for multiple GPUs
0
Why is Keras not using the full GPU memory?