Automatically choose batch size in Keras
Is there a way to automatically choose the batch size based on available GPU memory in Keras? As far as I know, the batch size largely determines how much VRAM the model uses during training. I currently find a good batch size by trial and error, but I don't know if there is an easier way. This works, but it's tedious. If I want to queue up several models and leave my computer running for a few days, I'd like each model to train as efficiently as possible.
Edit: Some pseudocode
# What I currently do
batch = 32
model.fit(x_train, y_train, batch_size=batch)
# here I have to manually change `batch` to maximize GPU utilization

# What I would prefer to do
model.fit(x_train, y_train, batch_size='auto')
# where 'auto' takes the size of my GPU into account
# and adjusts the batch size automatically
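Keras has no built-in `batch_size='auto'`, but the trial-and-error loop can be automated: start from a large batch size, attempt a short fit, and halve on an out-of-memory error. The sketch below is an assumption, not a Keras API; `try_fit` stands in for a one-step `model.fit` call, and in real use you would catch `tf.errors.ResourceExhaustedError` rather than the generic exceptions shown here.

```python
def find_max_batch_size(try_fit, start=1024, min_size=1):
    """Halve the batch size until try_fit(batch) succeeds.

    try_fit: a callable that runs a short training step at the given
    batch size and raises (e.g. RuntimeError or MemoryError here;
    tf.errors.ResourceExhaustedError with TensorFlow) when the batch
    does not fit in GPU memory.
    Returns the largest batch size, among the halving sequence, that fits.
    """
    batch = start
    while batch >= min_size:
        try:
            try_fit(batch)       # attempt a cheap trial fit
            return batch         # success: this batch size fits
        except (RuntimeError, MemoryError):
            batch //= 2          # OOM: try half the batch size
    raise RuntimeError("even the minimum batch size does not fit")
```

In practice you might pass something like `lambda b: model.fit(x_train, y_train, batch_size=b, epochs=1, steps_per_epoch=1)` (a hypothetical wrapper, and note that TensorFlow may not fully release GPU memory between failed attempts, so running each trial in a subprocess is more robust).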