Can you create thousands of DOM elements asynchronously?

I am building an emoji collector, and the hardest part is creating ~1500 DOM elements (one per emoji), which blocks the page / renders it unresponsive for about 500-700ms.

I debugged this and it looks like the creation of DOM elements is what blocks the rest of the JS execution:

function load_emojis(arr){
  var $emojis = [];
  // arr.length = 1500
  _.each(arr, function(){
    $emojis.push($('<span/>').append($('<img/>')));
  });

  $('.emojis').html($emojis);
}


Is there a way to do this all asynchronously / on a different thread so that it doesn't block JS after it?

I tried wrapping it in setTimeout, but it still executes on the same thread, so it still blocks JS execution.

+3




5 answers


JavaScript is not multithreaded; like most user-interface frameworks, it does everything on one thread with an event loop. Asynchronous work may run on a background thread (invisible to the JS programmer, who never manages or even sees those threads), but the results are always delivered back to the single foreground thread for processing.

Rendering of the elements doesn't actually take place until your JS code finishes and control returns to the event loop; if you insert too many elements, the lag occurs when the browser needs to lay out and paint them, and there is little you can do about it. The most you can do is reduce parsing overhead by creating elements explicitly rather than handing the browser a pile of text to parse, but your options are limited.
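A minimal illustration of the event-loop behavior: even a zero-delay setTimeout callback cannot run until the current synchronous work finishes.

```javascript
// Illustration: a zero-delay timeout does not interrupt running code.
var fired = false;
setTimeout(function () { fired = true; }, 0);

// Simulate a long synchronous block (like building 1500 elements).
for (var i = 0; i < 1e6; i++) {}

// The callback has still not run: the event loop only regains control
// after this whole script finishes.
console.log(fired); // false
```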



Things that may help, depending on browser, moon phase, etc., include:

  • Create and populate one parent element that is not part of the DOM, then add that single parent to the DOM once it is fully populated (this may avoid repeated work maintaining the live DOM structure).
  • Create a template node for the structure you will add repeatedly, then use cloneNode(true) to copy the template and fill in the small differences afterwards; this avoids re-parsing markup for what is really the same node tree repeated many times with different attribute values.
  • If there are more images than fit in the viewport, only insert / render the initially visible elements, adding the rest in the background as the user scrolls (perhaps also adding them proactively, but in small enough batches, with setTimeout / setInterval calls in between, so that no single insertion takes long and the UI stays responsive).
  • The big win for an "array of many images" is to stop using 1500+ separate images and use one monolithic sprite image instead. Then you either:

    a. Reuse the monolithic image repeatedly with fixed offsets and sizes via CSS (the image is decompressed and rendered once, and views of it are drawn at different offsets); or

    b. Use <map> / <area> tags to insert the image only once but make clicks behave differently on each region of it (this reduces DOM layout work to a single image; the <area> elements must exist in the tree but never need to be rendered).
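A sketch combining the first two ideas above (the buildEmojis helper and its urls parameter are illustrative names, not from the question):

```javascript
// Build all nodes inside a detached container, cloning one template,
// then attach the container to the live DOM in a single operation.
function buildEmojis(urls) {
  var container = document.createElement('div');
  var template = document.createElement('span');
  template.appendChild(document.createElement('img'));

  for (var i = 0; i < urls.length; i++) {
    var node = template.cloneNode(true); // deep copy, no HTML parsing
    node.firstChild.src = urls[i];       // fill in the per-item difference
    container.appendChild(node);
  }
  return container; // still detached: no layout work has happened yet
}

// Usage (browser only): a single insertion into the live DOM.
// document.querySelector('.emojis').appendChild(buildEmojis(urlArray));
```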

+2




Here is a function that divides work into chunks:

function load_emojis(arr){
    var chunk = 20;
    $('.emojis').html(''); // clear before start, just to be sure
    (function loop(i) {
        if (i >= arr.length) return; // all done
        var $emojis = [];
        $.each(arr.slice(i, i+chunk), function(){
            $emojis.push($('<span/>').append($('<img/>')));
        });
        $('.emojis').append($emojis);
        setTimeout(loop.bind(null, i+chunk));
    })(0);
}


This schedules a setTimeout for each batch of 20 elements in your array.

Obviously, the total completion time will be longer, but other JS and user events can run during the many small pauses.



I left out the second argument to setTimeout, as the default (0) is enough to let other tasks in the event queue run.

Also, I found your use of html() a little odd: the documentation says the argument should be a string or a function, but you passed it an array of jQuery elements ... In that case, jQuery will use append() internally.

Play with the chunk size to find the sweet spot. It will probably be more than 20.
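Instead of tuning a fixed count, you can also chunk by a time budget. This helper is a sketch (processInBudget and the 8 ms budget are illustrative, not from the answer):

```javascript
// Process items until the time budget is spent, then yield via
// setTimeout so user events can run; repeat until the array is done.
function processInBudget(items, handle, budgetMs) {
  var i = 0;
  (function loop() {
    var start = Date.now();
    while (i < items.length && Date.now() - start < budgetMs) {
      handle(items[i++]);
    }
    if (i < items.length) setTimeout(loop, 0); // yield, then continue
  })();
}

// e.g. processInBudget(arr, function (item) { /* append one emoji */ }, 8);
```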

+1




My first solution doesn't work; it only postpones the problem. Your best bet is to optimize this function so it finishes quickly:

function load_emojis(arr) {
  var emojis = new Array(arr.length);
  var emojisDiv = document.querySelector('.emojis');
  for (var i = 0; i < arr.length; i++) {
    emojis[i] = '<span><img src="" alt=""/></span>';
  }
  emojisDiv.innerHTML = emojis.join('');
}


DOES NOT WORK: There is one nice solution, but it is not cross-browser and only works in Chrome, Firefox and Opera: window.requestIdleCallback.

function load_emojis(arr) {
  window.requestIdleCallback(function() {
     var $emojis = [];
     // arr.length = 1500
     _.each(arr, function(){
       $emojis.push($('<span/>').append($('<img/>')));
     });

     $('.emojis').html($emojis);
  });
}


see https://developer.mozilla.org/en-US/docs/Web/API/Window/requestIdleCallback

Here is the polyfill: https://github.com/PixelsCommander/requestIdleCallback-polyfill
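If you do use requestIdleCallback, the idle deadline lets you spread the work across idle periods instead of doing it all in one callback. A sketch (loadEmojisIdle is an illustrative name; assumes jQuery and requestIdleCallback support):

```javascript
// Do work only while the browser reports idle time remaining,
// then re-schedule; this avoids one long blocking callback.
function loadEmojisIdle(arr) {
  var i = 0;
  function work(deadline) {
    while (i < arr.length && deadline.timeRemaining() > 1) {
      $('.emojis').append($('<span/>').append($('<img/>')));
      i++;
    }
    if (i < arr.length) {
      requestIdleCallback(work); // items left: wait for the next idle period
    }
  }
  requestIdleCallback(work);
}
```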

0




JavaScript is single-threaded, so the short answer is no. setTimeout will execute the function on the same thread; it just lets the rest of your code keep running until the timeout fires (at which point the callback will block again). For more on the event loop, MDN has a great explanation.

That said, browsers implement the Web Workers API, which lets you spawn separate threads where JS code runs concurrently with your main thread, so you can achieve some degree of multithreading in the browser. Unfortunately, workers do not have the same API access as your main thread, and in particular cannot touch the DOM directly, as described in this post.
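What a worker can do is the string work: build the big HTML blob off the main thread, then hand it back for a single DOM insertion. A sketch (the inline Blob worker and loadEmojisViaWorker are illustrative, not from the answer):

```javascript
// The worker concatenates the markup; only the final innerHTML
// assignment touches the DOM, on the main thread.
var workerSource =
  "self.onmessage = function (e) {" +
  "  var html = e.data.map(function (url) {" +
  "    return '<span><img src=\"' + url + '\"/></span>';" +
  "  }).join('');" +
  "  self.postMessage(html);" +
  "};";

function loadEmojisViaWorker(urls) {
  var blob = new Blob([workerSource], { type: 'text/javascript' });
  var worker = new Worker(URL.createObjectURL(blob));
  worker.onmessage = function (e) {
    document.querySelector('.emojis').innerHTML = e.data; // one DOM write
    worker.terminate();
  };
  worker.postMessage(urls);
}
```

Note that the string building itself is cheap, so the gain here is modest; the main point is keeping every last bit of non-DOM work off the UI thread.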

0




There is only one thread where you can do DOM operations. You can execute non-DOM stuff on a separate thread using WebWorkers, but that obviously won't work as you are adding to the DOM.

You perform 1501 DOM operations at once, and each DOM operation is very expensive. You can easily reduce this to a single DOM operation:

function load_emojis(arr){
  var $emojis = [];
  // arr.length = 1500
  _.each(arr, function(){
    $emojis.push('<span><img src="..."/></span>');
  });

  $('.emojis').html($emojis.join(''));
}


This should still be an order of magnitude slower than a single simple DOM operation, but that is much better than 1500 times slower. You have to set src before the markup is added to the DOM, since you can no longer easily access the individual images afterwards.

0








