MATLAB runs out of memory on Linux despite regular "clear all"

I am processing batch files (~ 200) in MATLAB, essentially

for i = 1:n, process(i); end


where process(i) opens a file, reads it, and writes the output to another file. (I am not posting the details of process here because it is hundreds of lines long, and I will readily admit that I do not fully understand the code, having inherited it from someone else.)

It runs out of memory after every dozen files or so. Of course, the memory function is not available on Linux, so we have to investigate "manually". Well, I suspected a memory leak, so I tried releasing everything with clear all after every run, i.e.

for i = 1:n, process(i); clear all; end


No luck; it still runs out of memory. At the point where this happens, whos says there are only two small arrays in memory (<100 elements each). Note that exiting MATLAB and restarting it solves the problem, so there is clearly enough memory on the machine to process one item.
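Since the memory function is Windows-only, one way to do the "manual" check is to watch the MATLAB process's memory from a Linux shell via /proc. A minimal sketch (the pgrep lookup is an assumption; substitute the real PID if you have several MATLAB processes):

```shell
# Report virtual and resident memory of the MATLAB process on Linux.
# Assumption: a single MATLAB process; falls back to this shell's own
# PID purely so the command is runnable as a demo.
pid=$(pgrep -n MATLAB || echo $$)
grep -E 'VmSize|VmRSS' "/proc/$pid/status"
```

Running this between iterations shows whether the process's resident set keeps growing even while whos reports almost nothing, which is the signature of memory held outside the MATLAB workspace (e.g. unclosed files or figure handles).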

Any ideas to help me determine where the leak is coming from would be appreciated.



1 answer


This is probably not the solution you are hoping for, but as a workaround you can use a shell script that loops over multiple calls to MATLAB, so each batch gets a fresh process and the OS reclaims whatever leaked when it exits.
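A minimal sketch of such a wrapper, assuming process.m is on the MATLAB path and that your release supports matlab -batch (R2019a+; on older releases use matlab -nodisplay -r "process(i); exit" instead):

```shell
#!/bin/sh
# Run each file in a fresh MATLAB process so leaked memory is freed on exit.
# Hypothetical names: process.m is the user's function; n is the file count.
run_batch() {
    n=$1        # number of input files (~200 in the question)
    matlab=$2   # MATLAB launcher: "matlab", or "echo" for a dry run
    i=1
    while [ "$i" -le "$n" ]; do
        "$matlab" -batch "process($i)" || return 1
        i=$((i + 1))
    done
}

# e.g.:  run_batch 200 matlab
```

Startup cost is paid once per file, but since restarting MATLAB demonstrably clears the problem, this trades a few seconds per item for a run that actually finishes.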


