Performance considerations with `Promise.all` and many asynchronous operations

When using `Promise.all` with asynchronous code (with synchronous code there is nothing to worry about), you can run into serious problems when you fire off entire batches of requests (tens, hundreds, thousands, or even millions), if the target of your asynchronous operations (the local file system, an HTTP server, a database, etc.) does not gracefully handle many concurrent requests.

In that case it would be ideal if we could tell `Promise.all` how many promises we want in flight at the same time. However, since Promises/A+ is meant to be a thin specification, adding these kinds of features there certainly doesn't make sense.

So what would be the best way to achieve this?


2 answers


Well, first of all, it is not possible to pass a concurrency argument to `Promise.all`, since promises represent operations that are already running, so you cannot queue them up or delay their execution.

What you want is to limit the concurrency of promise-returning functions. Luckily for you, Bluebird ships with this feature (since version 2.x) via `Promise.map`:

 Promise.map(largeArray, promiseReturningFunction, {concurrency: 16});


The concurrency parameter determines how many operations may run at once - note that this is not a global limit, but applies only to this chain. For example:



Promise.map([1,2,3,4,5,6,7,8,9,10], function(i){
    console.log("Starting operation", i);
    return Promise.delay(1000);
}, {concurrency: 2});


Fiddle

Please note that the order of execution is not guaranteed.
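If you are not using Bluebird, a similar effect can be achieved with plain promises. The following is a minimal sketch (the function name `mapWithConcurrency` and its signature are my own, not part of any library): a fixed pool of workers pulls items off the array, so at most `limit` operations are in flight at any time.

```javascript
// A minimal concurrency limiter using only standard Promises (no Bluebird).
// Runs `fn` over `items`, keeping at most `limit` operations in flight at
// once; results come back in input order.
function mapWithConcurrency(items, fn, limit) {
  var results = new Array(items.length);
  var next = 0; // index of the next item to start

  function worker() {
    if (next >= items.length) return Promise.resolve();
    var i = next++;
    return Promise.resolve(fn(items[i], i)).then(function (res) {
      results[i] = res;
      return worker(); // this worker picks up the next pending item
    });
  }

  // Start `limit` workers; each one chains onto the next pending item.
  var workers = [];
  for (var k = 0; k < Math.min(limit, items.length); k++) {
    workers.push(worker());
  }
  return Promise.all(workers).then(function () { return results; });
}
```

Note that, just as with `Promise.map`, completion order within the pool is not guaranteed, but the result array is in input order.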


Since I was unable to find a pre-existing library that takes care of batching promises, I wrote a simple primitive myself. It is a function that takes an array of functions to execute and splits it into batches of a given size, waiting for each batch to complete before starting the next. This is a rather naive implementation; a full-blown throttling mechanism would probably be desirable in some network scenarios.

Fiddle.



Code:

/**
 * Executes any number of functions, one batch at a time.
 *
 * @see http://jsfiddle.net/93z8L6sw/2
 */
var Promise_allBatched = (function() {
    var Promise_allBatched = function(arr, batchSize) {
        if (arr.length === 0) return Promise.resolve([]);
        batchSize = batchSize || 10;

        var results = [];
        return _runBatch(arr, batchSize, results, 0)
        .return(results);        // .return is Bluebird-specific: resolve with all results
    };

    function _runBatch(arr, batchSize, results, iFrom) {
        // run next batch
        var requests = [];
        var iTo = Math.min(arr.length, iFrom + batchSize);

        for (var i = iFrom; i < iTo; ++i) {
            var fn = arr[i];
            var request;
            if (fn instanceof Function) {
                request = fn();
            }
            else {
                request = fn;
            }
            requests.push(request);            // start promise
        }

        return Promise.all(requests)        // run batch
        .then(function(batchResults) {
            results.push.apply(results, batchResults);    // store all results in one array

            console.log('Finished batch: ' + results.length + '/' + arr.length);
        })
        .then(function() {
            if (iTo < arr.length) {
                // keep recursing
                return _runBatch(arr, batchSize, results, iTo);
            }
        });
    }

    return Promise_allBatched;
})();
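The code above relies on Bluebird's `.return`. For readers not using Bluebird, the same batch-at-a-time idea can be sketched with only standard promises (`allBatched` is a name I made up; it assumes every array entry is a promise-returning function):

```javascript
// Batch-at-a-time execution with standard Promises: start `batchSize`
// functions, wait for all of them to settle, then start the next batch.
function allBatched(fns, batchSize) {
  var results = [];
  var i = 0;

  function runNextBatch() {
    if (i >= fns.length) return Promise.resolve(results);
    var batch = fns.slice(i, i + batchSize).map(function (fn) {
      return fn(); // start every promise in this batch
    });
    i += batchSize;
    return Promise.all(batch).then(function (batchResults) {
      results.push.apply(results, batchResults); // collect in one array
      return runNextBatch(); // recurse until every batch has run
    });
  }

  return runNextBatch();
}
```

As with the original, a whole batch must finish before the next one starts, so a single slow operation stalls everything behind it; the sliding-window approach of `Promise.map` avoids that.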



