Is this a memory leak?

My code is leaking memory. After a couple of hours it fills all available memory and the process crashes. I have simplified my code below; can anyone tell me whether this looks like a leak? Thank you.

var request = require('request').forever(), // keep-alive agent, as per [1]
    async = require('async'),
    kue = require('kue'),
    jobs = kue.createQueue(),
    pool = { maxSockets: 1 }; // shared pool, limited to one socket

function main(job, done) {
    async.series(
        [function (callback) {
            // first request, routed through the shared pool
            request({url: job.data.URL1, pool: pool}, function (err, resp, body) {
                //stuff...
                callback(err);
            });
        },
        function (callback) {
            // second request
            request({url: job.data.URL2}, function (err, resp, body) {
                //stuff...
                callback(err);
            });
        }],
        function (err) {
            //stuff...
            done();
        }
    );
}

jobs.process('job_name', function (job, done) { // many jobs with 'job_name' in the queue
    main(job, done);
});

      

[1] https://groups.google.com/d/msg/nodejs/ZI6WnDgwwV0/sFm4QKK7ODEJ





2 answers


I don't think your code is to blame. I had the same problem using kue. To make sure I wasn't doing anything wrong, I made a super simple worker like this:

var kue    = require('kue'),
    config = require("../../config/local.js"),
    jobs   = kue.createQueue({ redis: config.redis });

jobs.process('testjobs', function processJob(job, done) {
    console.log(job.data);
    done();
});
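
To see the growth for yourself, you can log the process's heap usage on an interval; a minimal sketch (the interval and output format are arbitrary):

// print heap usage every 30 seconds to watch for steady growth
setInterval(function () {
    var mem = process.memoryUsage();
    console.log('heapUsed:', Math.round(mem.heapUsed / 1048576) + ' MB');
}, 30000);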

      

Running even this bare-bones worker showed the same memory growth, which told me the leak is in kue itself and not in my code. The workaround is to use pm2: it starts your program and restarts it if memory goes through the roof. I use the JSON app declaration to set the maximum memory allowed before the process is restarted:



{
  "apps" : [
    {
      "name": "test_worker",
      "script": "test.js",
      "instances": 1,
      "max_restarts": 10,
      "max_memory_restart" : "10M",
      "ignore_watch": [
        "[\\/\\\\]\\./",
        "node_modules"
      ],
      "merge_logs": true,
      "exec_interpreter": "node",
      "exec_mode": "fork_mode"
    }
  ]
}
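
Assuming the declaration above is saved as processes.json (the filename is arbitrary), you can then start the worker with pm2 start processes.json; pm2 will restart it whenever it crosses the 10M limit.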

      

Hope it helps.





If, as it sounds, jobs are being added to the queue faster than they are processed, you will see your memory usage grow. It isn't really a memory leak; it's part of how Kue makes job-level events work.

By default, Kue holds on to a job in memory until the job completes or fails. It does this so that it can emit job-level events (e.g. start, progress, complete, failed) in the process that created the job.

This means that every job waiting in the queue is also held in the memory of the process that created it (until that process restarts). As long as the queue doesn't back up, you won't see memory grow. If a backlog does build up, though, the memory usage can look alarming.
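
One way to confirm a backlog is to compare Kue's queue counts; a minimal sketch using its counting helpers (the logging is illustrative):

queue.inactiveCount(function (err, waiting) {
    queue.activeCount(function (err2, active) {
        // a steadily growing "waiting" count means jobs arrive
        // faster than they are processed
        console.log('waiting:', waiting, 'active:', active);
    });
});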

What can you do? If you disable job-level events, Kue will not hold on to the job after it has been queued. You can do this globally with the jobEvents flag:

 kue.createQueue({jobEvents: false});

      



Or you can enable or disable job-level events per job using the job's events method:

var job = queue.create('test').events(false).save();
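
In practice you would usually pass job data and a save callback as well; a quick sketch (the job type, data, and error handling are illustrative):

var job = queue.create('test', { url: 'http://example.com' })
    .events(false)
    .save(function (err) {
        if (err) console.error('could not enqueue job:', err);
    });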

      

These work if you don't need to react to job events at all. However, if you do need to handle events for a job, you can use queue-level events instead. Since the job is no longer held in memory, you have to fetch it from redis before you can do anything with it:

queue.on('job complete', function(id, result){
  kue.Job.get(id, function(err, job){
    // do something with the job
  });
});
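
A common refinement of the handler above is to remove completed jobs so their data doesn't accumulate in redis either; a sketch (the error handling is illustrative):

queue.on('job complete', function (id) {
    kue.Job.get(id, function (err, job) {
        if (err) return;
        // deleting the job frees its data and search indexes in redis
        job.remove(function (err) {
            if (err) console.error('failed to remove job', id, err);
        });
    });
});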

      









