Cloud Functions for Firebase: completing long processes without touching the maximum timeout

I need to transcode videos from webm to mp4 when they are uploaded to Firebase Storage. I have a demo of the code that works, but if the uploaded video is too large, the Firebase function times out before the conversion is complete. I know it is possible to increase the timeout limit for the function, but that seems messy, since I cannot confirm that the process will take less time than the timeout limit.

Is there a way to stop the Firebase function from timing out without increasing the maximum timeout limit?

If not, is there a way to complete time-consuming processes (like video conversion) while still triggering each process with Firebase function triggers?

Even if running time-consuming processes in Firebase functions is not really a thing, is there a way to speed up the fluent-ffmpeg conversion without touching its quality? (I understand this part is a lot to ask. I plan on lowering the quality if absolutely necessary, as the reason the webms are converted to mp4 in the first place is related to iOS devices.)
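From what I can tell so far, libx264's -preset flag mostly trades encode time against file size rather than visual quality at a fixed CRF, so something like the sketch below (placeholder paths, not my actual code) is the kind of knob I mean:

// Minimal sketch of the speed/size tradeoff: at a fixed CRF, a faster
// libx264 preset encodes quicker but produces a larger file, rather
// than visibly lower quality.
const ffmpeg = require('fluent-ffmpeg');
const ffmpeg_static = require('ffmpeg-static');

ffmpeg('/tmp/input.webm')               // placeholder input path
    .setFfmpegPath(ffmpeg_static.path)
    .outputOptions([
        '-c:v', 'libx264',
        '-preset', 'veryfast', // default is "medium"
        '-crf', '23',          // constant quality target; lower = higher quality
        '-threads', '0'        // use all available cores
    ])
    .output('/tmp/output.mp4')          // placeholder output path
    .on('end', () => console.log('done'))
    .run();

But even the fastest preset does not make a large video transcode fast enough to be safe against the timeout.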

For reference, here's the bulk of the demo I mentioned. As I said, the complete code can be found here, but this section, copied over, is the part that creates the Promise and makes sure the transcoding is complete. The complete code is only 70 lines, so it should be relatively easy to go through if needed.

const functions = require('firebase-functions');
const mkdirp = require('mkdirp-promise');
const gcs = require('@google-cloud/storage')();
const Promise = require('bluebird');
const ffmpeg = require('fluent-ffmpeg');
const ffmpeg_static = require('ffmpeg-static');

      

(There is a bunch of text-parsing code here, and then the next piece of code runs inside the onChange event.)

// Wraps a fluent-ffmpeg command so completion can be awaited:
// resolve when the encode ends, reject if it errors.
function promisifyCommand (command) {
    return new Promise( (resolve, reject) => {
        command
        .on( 'end',   ()      => { resolve()     } )
        .on( 'error', (error) => { reject(error) } )
        .run();
    });
}
return mkdirp(tempLocalDir).then(() => {
    console.log('Directory created');
    // Download the item from the bucket
    const bucket = gcs.bucket(object.bucket);
    return bucket.file(filePath).download({destination: tempLocalFile}).then(() => {
      console.log('File downloaded to convert. Location:', tempLocalFile);
      let cmd = ffmpeg({source: tempLocalFile})
               .setFfmpegPath(ffmpeg_static.path)
               .inputFormat(fileExtension)
               .output(tempLocalMP4File);
      cmd = promisifyCommand(cmd);
      return cmd.then(() => {
        // Getting here takes forever, because video transcoding takes forever!
        console.log('mp4 created at', tempLocalMP4File);
        return bucket.upload(tempLocalMP4File, {
            destination: MP4FilePath
        }).then(() => {
          console.log('mp4 uploaded at', filePath);
        });
      });
    });
  });

      



2 answers


Cloud Functions for Firebase is poorly suited to (and not supported for) long-running tasks that may exceed the maximum timeout. Your only real chance of doing very heavy computation using only Cloud Functions is to find a way to split the work across multiple function invocations, then join the results of all that work into the final product. For something like video transcoding, that is a very difficult task.



Instead, consider using a function to trigger a long-running task in App Engine or Compute Engine.
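To make the hand-off concrete, here is a minimal sketch of a storage-triggered function that delegates the transcoding to a service you run yourself instead of doing the work in the function. The hostname, route, and payload are all hypothetical; the point is only that the function's job shrinks to sending a notification:

const functions = require('firebase-functions');
const https = require('https');

// Delegate the heavy work: notify an external service about the new upload.
exports.requestTranscode = functions.storage.object().onChange(event => {
    const object = event.data;
    const payload = JSON.stringify({
        bucket: object.bucket,
        filePath: object.name
    });
    return new Promise((resolve, reject) => {
        const req = https.request({
            hostname: 'your-project.appspot.com', // placeholder App Engine host
            path: '/transcode',                   // placeholder route
            method: 'POST',
            headers: { 'Content-Type': 'application/json' }
        }, (res) => {
            res.resume();           // drain the response body
            res.on('end', resolve); // done as soon as the request is acknowledged
        });
        req.on('error', reject);
        req.write(payload);
        req.end();
    });
});

For this to help, the receiving service must acknowledge the request immediately and transcode asynchronously; if it holds the connection open until the transcode finishes, the function is back to waiting out its timeout.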



For any random anonymous passerby trying to figure out how to handle video transcoding or some other lengthy process: here is a version of the same code I gave above that instead sends an HTTP request to a Google App Engine process, which transcodes the file. There is no documentation for it at the moment, but looking at the firebase/functions/index.js code and the app.js code should help you solve the problem.

https://github.com/Scew5145/GCSConvertDemo
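The split in that repository is a Cloud Function that sends the request and an App Engine app that does the transcoding. As a rough sketch of what the receiving side can look like (assuming Express; every name and path here is illustrative, not the repository's actual code):

const express = require('express');
const ffmpeg = require('fluent-ffmpeg');

const app = express();
app.use(express.json());

// Acknowledge immediately, then transcode in the background,
// free of the Cloud Functions timeout.
app.post('/transcode', (req, res) => {
    const { bucket, filePath } = req.body; // sent by the Cloud Function
    res.status(202).send('Transcode queued');
    // Downloading from the bucket and re-uploading the mp4 would go here;
    // the local paths below are placeholders.
    ffmpeg('/tmp/input.webm')
        .output('/tmp/output.mp4')
        .on('end', () => console.log('Transcode finished for', bucket, filePath))
        .on('error', (err) => console.error('Transcode failed:', err))
        .run();
});

app.listen(process.env.PORT || 8080);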



Good luck.







