Memory error when using Keras fit_generator and datagen.flow

I am trying to use datagen.flow with the ImageDataGenerator class in Keras, and I am getting the following memory error:

Traceback (most recent call last):
  File "scratch_6.py", line 284, in <module>
    history = model.fit_generator(datagen.flow(train_X, train_y,
        batch_size=batch_size, save_to_dir='test_RA', save_format='png'),
  File "/usr/local/lib/python3.5/dist-packages/keras/preprocessing/image.py", line 455, in flow
    save_format=save_format)
  File "/usr/local/lib/python3.5/dist-packages/keras/preprocessing/image.py", line 764, in __init__
    self.x = np.asarray(x, dtype=K.floatx())
  File "/usr/local/lib/python3.5/dist-packages/numpy/core/numeric.py", line 531, in asarray
    return array(a, dtype, copy=False, order=order)
MemoryError


I have 128 GB of RAM. I tried reducing the batch size, but nothing changed. Any help is appreciated. Thanks.
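For scale, the np.asarray(x, dtype=K.floatx()) call in the traceback converts the whole array to float32 at once, which quadruples the footprint of uint8 image data. A rough estimate (the dataset dimensions below are assumptions, not taken from the question):

```python
import numpy as np

# Hypothetical dataset shape: 200,000 RGB images of 256x256 (assumption)
n_images, h, w, c = 200_000, 256, 256, 3

uint8_bytes = n_images * h * w * c  # 1 byte per pixel value
float32_bytes = uint8_bytes * 4     # np.asarray(x, dtype='float32') needs 4x

print(f"uint8:   {uint8_bytes / 1e9:.1f} GB")
print(f"float32: {float32_bytes / 1e9:.1f} GB")
```

With dimensions like these, the float32 copy alone can exceed 128 GB, which is why shrinking the batch size does not help: the conversion happens before any batching.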



1 answer


This is a common problem in deep learning when the dataset is too large to hold in RAM. datagen.flow requires the entire dataset as a single in-memory array: Keras calls np.asarray(x, dtype=K.floatx()) on your input, so if your images are stored as uint8, converting them to float32 multiplies the memory footprint by four. This conversion happens before batching, which is why reducing the batch size makes no difference.

The solution is to preprocess the images (and precompute any data augmentation you need), store the result in an HDF5 file on your hard disk, and then load and train batch by batch. Training may take longer this way, but memory usage stays bounded by the batch size rather than the dataset size.



Thanks, Kunal







