Why does MATLAB run slower and slower during a long-running program?

I have a program running in MATLAB. Since it contains two giant nested for loops, we expect it to run for over 10 hours. We have MATLAB print the loop counter on every iteration.

Initially (during the first hour), the counter on screen increases very rapidly; over time it gets slower and slower. Now (after more than 20 consecutive hours running the same ".m" file, and it is still not finished), it is almost 20 times slower than it was originally.

The initial memory usage was about 30%; now, after 20 hours of running, it has grown considerably (screenshot omitted).


My computer specs are below (screenshot omitted).

What can I do to let Matlab keep its original speed?





3 answers


I can only guess, but my bet is that you have array variables that were not preallocated, and hence their size grows on each iteration of the for loop. As a result, MATLAB has to reallocate memory on every iteration. Reallocation slows things down, and the larger these variables get, the worse it becomes, since MATLAB needs to find an ever larger block of contiguous memory. This would explain why the program slows down over time.



If this is indeed the cause, the solution is to preallocate those variables. If their size is not known in advance, you can make a guess and preallocate an approximate size to avoid at least part of the reallocations. Or, if your program is not memory-constrained, you can preallocate using an upper bound on the variable size; then, after the loop, trim the arrays by removing the unused entries.
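A minimal sketch of the idea (the variable names and sizes here are made up for illustration):

```matlab
n = 1e6;   % assumed upper bound on the number of results

% Slow: the array grows on every iteration, forcing repeated reallocation.
% results = [];
% for k = 1:n
%     results(end+1) = k^2;
% end

% Fast: allocate once with the upper bound, fill in place, trim afterwards.
results = zeros(1, n);           % preallocate to the upper bound
count = 0;
for k = 1:n
    count = count + 1;
    results(count) = k^2;
end
results = results(1:count);      % trim any unused entries after the loop
```

The preallocated version touches the same memory block throughout the loop, so no time is spent searching for ever-larger contiguous regions.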





Some general hints; if they don't help, I suggest adding some code to the question.



  • Don't print to the console; this output slows down execution, and the output is stored in memory. If you need it, write to a log file. For a simple progress display, use waitbar.
  • Make sure you preallocate all variables.
  • Check which function calls depend on the loop index, such as those that increase the size of variables.
  • If any MEX functions are used, double-check them for memory leaks. The standard procedure is: call the function with random sample data and don't store the output. If memory usage increases, the function has a memory leak.
  • Use the profiler. Profile your code for the first n iterations (where n corresponds to about 10 minutes) and generate an HTML report. Then let the program run for about 2 hours and generate a report for n iterations again. Compare the two reports to see where time is lost.
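The first hint above can be sketched as follows (the file name and iteration counts are hypothetical):

```matlab
nIter = 1e5;
fid = fopen('progress.log', 'w');     % log file instead of console output
h = waitbar(0, 'Running...');         % simple progress indicator
for k = 1:nIter
    % ... loop body ...
    if mod(k, 1000) == 0              % report only occasionally, not every iteration
        fprintf(fid, 'iteration %d of %d\n', k, nIter);
        waitbar(k / nIter, h);
    end
end
fclose(fid);
close(h);
```

Reporting only every 1000th iteration keeps the overhead of logging and of updating the waitbar negligible compared with printing on every pass.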




I want to point everyone to the following page in the MATLAB documentation: Memory Efficient Strategies. This page describes a set of techniques and best practices that MATLAB users should know.

The OP reminded me of this by mentioning that memory usage tends to increase over time. There is in fact a known issue with long-running MATLAB instances on win32 systems, where a memory leak exists and escalates over time (this is also described at the link above).

I would also like to add to Luis' answer the following advice that a friend of mine once received in his correspondence with Yair Altman:

Allocating the largest vars first helps by assigning the largest free contiguous blocks to the largest vars, but it is easy to show that in some cases this can actually be harmful, so it is not a general solution.

The only sure way to solve the memory fragmentation problem is to restart MATLAB, or better yet, restart Windows.
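The "largest vars first" idea amounts to nothing more than ordering your preallocations; a tiny illustrative sketch (all names and sizes are made up, and, as the quote warns, this is not guaranteed to help):

```matlab
% Allocate the biggest arrays first so they claim the largest
% free contiguous blocks, before smaller allocations fragment them.
bigMatrix  = zeros(5000, 5000);   % largest allocation goes first
midVector  = zeros(1, 1e6);
smallCache = zeros(1, 100);       % small allocations come last
```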

More details on memory allocation performance can be found in the following Undocumented Matlab posts:









