Can OpenMAX in the Android NDK be used to stream live video/audio to a server? Example?

OpenMAX has a "data sink" concept.

The spec says this about data sinks:

Its data locator, which identifies where the data resides. Possible locators include:
• URIs (such as a filename)
• Memory addresses
• I/O devices
• Output Mixes
• Cameras


Here's some sample code from the spec:

/* Setup the data sink structure */
uri.locatorType = XA_DATALOCATOR_URI;
uri.URI = (XAchar *) "file:///recordsample.wav";
audioSink.pLocator = (void*) &uri;
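
For context, here is a fuller, self-contained version of that fragment, along the lines of the spec's recording use case. This is only a sketch: the struct and constant names come from OpenMAXAL.h, the <OMXAL/OpenMAXAL.h> include path is the Android NDK's, and the WAV path and MIME settings are illustrative.

#include <OMXAL/OpenMAXAL.h>

/* Fill in a URI data sink for a WAV recording, as in the spec's example.
   The caller must keep uri and mime alive while audioSink is in use. */
static void setup_wav_sink(XADataLocator_URI *uri, XADataFormat_MIME *mime,
                           XADataSink *audioSink)
{
    /* Locator: where the recorded data should end up */
    uri->locatorType = XA_DATALOCATOR_URI;
    uri->URI = (XAchar *) "file:///recordsample.wav";

    /* Format: the container/encoding the sink expects (WAV here) */
    mime->formatType = XA_DATAFORMAT_MIME;
    mime->mimeType = (XAchar *) "audio/x-wav";
    mime->containerType = XA_CONTAINERTYPE_WAV;

    /* The data sink ties the locator and the format together */
    audioSink->pLocator = (void *) uri;
    audioSink->pFormat = (void *) mime;
}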


Elsewhere the spec says:

URIs pointing to audio files in the local file system


Does anyone know whether OpenMAX can use these URI data sinks to implement a streaming application, one that captures live data from the microphone / camera and streams it to a server? Example?
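
To make that concrete, here is roughly what I am hoping is possible, written against the plain OpenMAX AL 1.0 API. Everything below is a sketch: the rtsp:// address is made up, the MIME settings are guesses, and whether any implementation (Android's in particular) accepts a network URI as a recorder sink is exactly what I am asking.

#include <stddef.h>
#include <OMXAL/OpenMAXAL.h>

/* Hypothetical: record the default microphone straight to a network URI.
   'engine' is assumed to come from an already-realized engine object
   (xaCreateEngine + Realize). */
static XAresult create_streaming_recorder(XAEngineItf engine,
                                          XAObjectItf *recorderObj)
{
    /* Source: the default audio input (microphone) as an I/O-device locator */
    XADataLocator_IODevice micLoc;
    micLoc.locatorType = XA_DATALOCATOR_IODEVICE;
    micLoc.deviceType  = XA_IODEVICE_AUDIOINPUT;
    micLoc.deviceID    = XA_DEFAULTDEVICEID_AUDIOINPUT;
    micLoc.device      = NULL;

    XADataSource micSrc;
    micSrc.pLocator = &micLoc;
    micSrc.pFormat  = NULL;

    /* Sink: a network URI instead of a local file -- purely hypothetical */
    XADataLocator_URI netUri;
    netUri.locatorType = XA_DATALOCATOR_URI;
    netUri.URI = (XAchar *) "rtsp://example.invalid/live";  /* made-up address */

    XADataFormat_MIME mime;
    mime.formatType    = XA_DATAFORMAT_MIME;
    mime.mimeType      = (XAchar *) "audio/mp4";            /* illustrative */
    mime.containerType = XA_CONTAINERTYPE_MP4;              /* illustrative */

    XADataSink streamSink;
    streamSink.pLocator = &netUri;
    streamSink.pFormat  = &mime;

    /* Spec-level creation call; the open question is whether the
       implementation will accept a non-file URI as the sink. */
    return (*engine)->CreateMediaRecorder(engine, recorderObj,
                                          &micSrc, NULL, &streamSink,
                                          0, NULL, NULL);
}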



1 answer


The short answer is no. The NDK documentation explicitly states that, to date, the Android implementation of OpenMAX AL does not expose any features beyond those of the Java MediaPlayer API.
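
What the NDK does expose for native audio capture is OpenSL ES, so the usual workaround (not something OpenMAX AL gives you) is to record the microphone into an Android simple buffer queue and push the PCM buffers to the server with your own networking code. A minimal sketch, assuming an already-realized SLEngineItf named engine and the RECORD_AUDIO permission:

#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

/* Create an audio recorder whose sink is a buffer queue the app drains itself. */
static SLresult create_mic_recorder(SLEngineItf engine, SLObjectItf *recorderObj)
{
    /* Source: the device's default audio input (microphone) */
    SLDataLocator_IODevice micLoc = {
        SL_DATALOCATOR_IODEVICE, SL_IODEVICE_AUDIOINPUT,
        SL_DEFAULTDEVICEID_AUDIOINPUT, NULL };
    SLDataSource micSrc = { &micLoc, NULL };

    /* Sink: an Android simple buffer queue delivering 16 kHz, 16-bit mono PCM;
       a registered callback receives each filled buffer and can send it on. */
    SLDataLocator_AndroidSimpleBufferQueue bqLoc = {
        SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2 };
    SLDataFormat_PCM pcm = {
        SL_DATAFORMAT_PCM, 1, SL_SAMPLINGRATE_16,
        SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
        SL_SPEAKER_FRONT_CENTER, SL_BYTEORDER_LITTLEENDIAN };
    SLDataSink bqSink = { &bqLoc, &pcm };

    /* Request the buffer-queue interface so the app can enqueue and read buffers */
    const SLInterfaceID ids[1] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE };
    const SLboolean req[1]     = { SL_BOOLEAN_TRUE };

    return (*engine)->CreateAudioRecorder(engine, recorderObj,
                                          &micSrc, &bqSink, 1, ids, req);
}

After realizing the recorder object you would register a buffer-queue callback, enqueue a couple of buffers, and set the record state to SL_RECORDSTATE_RECORDING; everything from the buffers to the server (RTP, a socket, whatever) is code you write yourself.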


