How to keep $.ajax() async inside a $.each() loop, but react to the result on each iteration

I'm having problems with some dumb architecture because I'm dumb. I am trying to loop through YouTube videos posted on Reddit, extract their URLs, and write them into a .m3u playlist.

The complete subreddit code is on the YouTube Source tab: Play YouTube music from subreddits in Foobar with foo_youtube

At some point I got the idea that I could check each URL to see whether the video is still available, and suggest an alternative if it has been removed.

So I am making AJAX requests to the YouTube API, and if one returns an error I have to react to it and change that item's URL.

But the problem is that this only works when the AJAX request is not asynchronous - the requests take many seconds, during which the page is frozen.
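For reference, the blocking version looks roughly like this (a sketch; async: false is how jQuery makes a request synchronous, and videoID comes from the pseudocode below):

$.ajax({
    type: "HEAD",
    url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
    async: false, // blocks the UI thread until the response arrives
    error: function() {
        // mark this video as dead and change itemURL
    }
});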

I would like AJAX to be asynchronous, but I don't know how I should structure my code.

Here's the PSEUDOCODE for how it's done now:

var listing = // data from reddit API
$.each(listing, function(key, value) {
    var url = // post URL from reddit posts listing 
    // ( "http://youtu.be/XmQR4_hDtpY&hd=1" )
    var aRegex = // to parse YouTube URLs 
    // ( (?:youtube(?:-nocookie)?.com/....bla...bla )
    var videoID = // YouTube video ID, extracted with regex 
    // ( "XmQR4_hDtpY" )
    var itemArtist = // parsed from reddit posts listing 
    // ( "Awesome Artist" )
    var itemTitle = // parsed from reddit posts listing 
    // ( "Cool Song (Original Mix)" )
    var itemURL = // url, further processed 
    // ( "3dydfy://www.youtube.com/watch?v=XmQR4_hDtpY&hd=1" )

    $.ajax({
            type: "HEAD",
            url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
            error: function() { 
                // If it no longer available 
                // (removed, deleted account, made private)
                deadVid++; // chalk up another dead one, for showing progress
                itemURL = // dead videos should get a different URL 
           // ( "3dydfy-search://q=Awesome%20Artist%20Cool%20Song....." )
            }
        });

    // further process itemURL!
    // at this point I want itemURL changed by the .ajax() error callback
    // but I'm trying to keep the requests async 
    // to not jam the page while a hundred HTTP requests happen!
    if (condition){
        itemURL += // append various strings
    }
    // Write the item to the .m3u8 playlist
    playlist += itemURL + '\n';
}); // end .each()

      

1 answer


Basically you want to know:

  • which requests failed, and
  • when each ajax request has completed.

For the first part, if you push the failures into an array as they happen, the full list will be ready at the end (though its order is not guaranteed).

For the second part, if you keep an array of the promises returned by each $.ajax() call, you can apply $.when() to them and wait for them all to complete with always().

As a basic example (other details removed):

var listing = {}; // data from reddit API
var promises = [];
var results = [];
$.each(listing, function(key, value) {
    // [snip]
    promises.push($.ajax({
            type: "HEAD",
            url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
            error: function() { 
                //[snip]
                results.push({
                   itemArtist: itemArtist,
                   videoID: videoID,
                   url: itemURL});
            }
        }));
}); // end .each()
// Wait for all promises to complete (pass or fail) 
$.when.apply($, promises).always(function(){
    // process list of failed URLs
});

      

Sorry for any typos. This was typed straight into the answer, but you get the idea.
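As a rough sketch of the final step (reusing names from the question; the search-URL format is the one shown in the question's comments): once always() fires, the results array holds every dead video, and the fallback URLs can be substituted before the playlist is written:

$.when.apply($, promises).always(function(){
    // Every request has finished by this point
    $.each(results, function(i, dead){
        // Build the fallback search URL for each dead video
        dead.url = "3dydfy-search://q=" +
            encodeURIComponent(dead.itemArtist);
    });
    // ...then append the (possibly replaced) URLs to the playlist
});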

I note that you mentioned 100 requests, but the browser will only run a few of them at a time, so there is no need for any extra throttling on your side.



If always() doesn't work ($.when() is rejected as soon as any one promise fails, so its callback can fire before the remaining requests finish), you can add your own deferred objects, which are resolved on either success or failure:

var listing = {}; // data from reddit API
var promises = [];
var results = [];
$.each(listing, function(key, value) {
    var deferred = $.Deferred();
    promises.push(deferred.promise());
    // [snip]
    $.ajax({
            type: "HEAD",
            url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
            complete: function(){
                // resolve on success or fail
                deferred.resolve();
            },
            error: function() { 
                //[snip]
                results.push({
                   itemArtist: itemArtist,
                   videoID: videoID,
                   url: itemURL});
            }
    });
}); // end .each()
// Wait for all promises to complete (pass or fail) 
$.when.apply($, promises).always(function(){
    // process list of failed URLs
});
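A variation on the same idea (a sketch, assuming jQuery 1.8+, where then() returns a new chained promise): convert each failure into a resolved promise inline, so no separate deferred object is needed and $.when() still waits for every request:

promises.push(
    $.ajax({
        type: "HEAD",
        url: "https://gdata.youtube.com/feeds/api/videos/" + videoID
    }).then(null, function(){
        results.push({itemArtist: itemArtist, videoID: videoID, url: itemURL});
        // Returning a resolved deferred from the fail filter turns
        // this failure into a success, so $.when() keeps waiting
        return $.Deferred().resolve();
    })
);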

      

Update for 2015

Here's another cool way to chain parallel promises together without using an array (as long as you don't need the data values passed to the callbacks):

Simplified code:

   var promise;   // Undefined is also a resolved promise when passed to $.when()
   $.each(listing, function(key, value) {
       // Each new Ajax promise is chained, in parallel, with the previous promise
       promise = $.when(promise, $.ajax({...}));
   });
   // When they are all finished, fire a final callback
   $.when(promise).always(function(){
       // All done!
   });

      

This has drawn some criticism, mostly from people who consider it "unclean", but the trade-off is minimal for how much it simplifies waiting on parallel promises.

I realized this was possible when I saw someone using promise = promise.then(newpromise) to chain a sequence of events. After some experimentation, I found I could do the same thing in parallel using promise = $.when(promise, newpromise).
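Side by side, the two chaining patterns look like this (a sketch, assuming jQuery 1.8+ and a hypothetical urls array):

// Sequential: each request starts after the previous one finishes
var seq = $.Deferred().resolve();
$.each(urls, function(i, url){
    seq = seq.then(function(){
        return $.ajax({url: url});
    });
});

// Parallel: all requests start at once; the combined promise
// settles only when the slowest one does
var par;
$.each(urls, function(i, url){
    par = $.when(par, $.ajax({url: url}));
});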
