How do I stream video as it is recorded?

Ok, so in my application I have a ViewController

that handles video recording from the camera and saves the result into my app's Documents directory. What I want to do is, while recording is still in progress, simultaneously upload the parts of the file already written to disk to a server (I'm new to this, but I'm guessing an HTTP server). The reason is that I want to add support for streaming a video to Chromecast as you shoot it. This should be possible, because the EZCast application already performs a similar function.

I have already worked out how to upload video to an HTTP server, cast video from that HTTP server to Chromecast, and record video from the camera, using these sources:

Chromecast SDK: https://developers.google.com/cast/

Chromecast iOS sample: https://github.com/googlecast/CastVideos-ios

HTTP server: https://github.com/robbiehanson/CocoaHTTPServer

Camera recording on an iDevice: https://github.com/BradLarson/GPUImage

To cast a video I obviously need a connection first, and my app requires the Chromecast connection to be established before it lets you into the recording view. My code for simply playing an .mp4 looks like this:

-(void)startCasting
{
    [self establishServer];

    self.mediaControlChannel = [[GCKMediaControlChannel alloc] init];
    self.mediaControlChannel.delegate = self;
    [self.deviceManager addChannel:self.mediaControlChannel];
    [self.mediaControlChannel requestStatus];

    NSString *path = [NSString stringWithFormat:@"http://%@:%hu/%@",
                      [self getIPAddress], [httpServer listeningPort], @"Movie.mp4"];

    self.metadata = [[GCKMediaMetadata alloc] init];
    [self.metadata setString:@"" forKey:kGCKMetadataKeySubtitle]; // Description here
    [self.metadata setString:[NSString stringWithFormat:@"Casting %@", @"Movie.mp4"]
                      forKey:kGCKMetadataKeyTitle]; // Title here

    NSString *type = @"video/mp4"; // MIME type

    // Define the media information
    GCKMediaInformation *mediaInformation =
    [[GCKMediaInformation alloc] initWithContentID:path
                                        streamType:GCKMediaStreamTypeNone
                                       contentType:type
                                          metadata:self.metadata
                                    streamDuration:0
                                        customData:nil];

    // Cast the video
    [self.mediaControlChannel loadMedia:mediaInformation autoplay:YES playPosition:0];
}

- (NSString *)getIPAddress
{
    NSString *address = @"error";
    struct ifaddrs *interfaces = NULL;
    struct ifaddrs *temp_addr = NULL;

    // Retrieve the current interfaces - getifaddrs returns 0 on success
    if (getifaddrs(&interfaces) == 0) {
        // Loop through the linked list of interfaces
        for (temp_addr = interfaces; temp_addr != NULL; temp_addr = temp_addr->ifa_next) {
            // ifa_addr can be NULL, so check it before dereferencing
            if (temp_addr->ifa_addr && temp_addr->ifa_addr->sa_family == AF_INET) {
                // en0 is the Wi-Fi interface on the iPhone
                if ([[NSString stringWithUTF8String:temp_addr->ifa_name] isEqualToString:@"en0"]) {
                    // Convert the C string to an NSString
                    address = [NSString stringWithUTF8String:
                               inet_ntoa(((struct sockaddr_in *)temp_addr->ifa_addr)->sin_addr)];
                }
            }
        }
        // Free memory
        freeifaddrs(interfaces);
    }
    return address;
}


Now, before casting, I need to stand up my HTTP server. This is simple and requires only a little setup after adding CocoaHTTPServer to your project. My code to start the server looks like this:

static const int ddLogLevel = LOG_LEVEL_VERBOSE;

-(void)establishServer
{
    [httpServer stop];

    // Configure our logging framework.
    // To keep things simple and fast, we're just going to log to the Xcode console.
    [DDLog addLogger:[DDTTYLogger sharedInstance]];

    // Create the server
    httpServer = [[HTTPServer alloc] init];

    // Tell the server to broadcast its presence via Bonjour.
    // This allows browsers such as Safari to automatically discover our service.
    [httpServer setType:@"_http._tcp."];

    // Normally there's no need to run our server on any specific port.
    // Technologies like Bonjour allow clients to dynamically discover the server port at runtime.
    // However, for easy testing you may want to force a certain port so you can just hit the refresh button.
    // [httpServer setPort:12345];

    // Serve files from the app's Documents folder
    NSString *webPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    DDLogInfo(@"Setting document root: %@", webPath);
    [httpServer setDocumentRoot:webPath];

    [self startServer];
}


- (void)startServer
{
    // Start the server (and check for problems)
    NSError *error = nil;
    if ([httpServer start:&error])
    {
        DDLogInfo(@"Started HTTP Server on port %hu", [httpServer listeningPort]);
    }
    else
    {
        DDLogError(@"Error starting HTTP Server: %@", error);
    }
}


Finally, I use this code to start displaying and recording from the iPhone's camera:

- (void)viewDidLoad
{
    [super viewDidLoad];

    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
//    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
//    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
//    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080 cameraPosition:AVCaptureDevicePositionBack];

    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = NO;
    videoCamera.horizontallyMirrorRearFacingCamera = NO;

//    filter = [[GPUImageSepiaFilter alloc] init];
//    filter = [[GPUImageTiltShiftFilter alloc] init];
//    [(GPUImageTiltShiftFilter *)filter setTopFocusLevel:0.65];
//    [(GPUImageTiltShiftFilter *)filter setBottomFocusLevel:0.85];
//    [(GPUImageTiltShiftFilter *)filter setBlurSize:1.5];
//    [(GPUImageTiltShiftFilter *)filter setFocusFallOffRate:0.2];
//    filter = [[GPUImageSketchFilter alloc] init];
//    filter = [[GPUImageSmoothToonFilter alloc] init];
//    GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRightFlipVertical];
    filter = [[GPUImageFilter alloc] init];

    [videoCamera addTarget:filter];
    GPUImageView *filterView = (GPUImageView *)self.view;
//    filterView.fillMode = kGPUImageFillModeStretch;
//    filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;

    // Record a movie into /Documents, visible via iTunes file sharing
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
//    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
//    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720.0, 1280.0)];
//    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1080.0, 1920.0)];
    movieWriter.encodingLiveVideo = YES;

    [filter addTarget:movieWriter];
    [filter addTarget:filterView];

    [videoCamera startCameraCapture];
}

bool recording;

- (IBAction)Record:(id)sender
{
    if (recording == YES)
    {
        [Record setTitle:@"Record" forState:UIControlStateNormal];
        recording = NO;

        double delayInSeconds = 0.1;
        dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
        dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
            [filter removeTarget:movieWriter];
            videoCamera.audioEncodingTarget = nil;
            [movieWriter finishRecording];
            NSLog(@"Movie completed");

//            [videoCamera.inputCamera lockForConfiguration:nil];
//            [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOff];
//            [videoCamera.inputCamera unlockForConfiguration];
        });

        UIAlertView *message = [[UIAlertView alloc] initWithTitle:@"Do You Wish To Store This Footage?"
                                                          message:@"Recording has finished. Do you wish to store this video in your camera roll?"
                                                         delegate:self
                                                cancelButtonTitle:nil
                                                otherButtonTitles:@"Yes", @"No", nil];
        [message show];

        [self dismissViewControllerAnimated:YES completion:nil];
    }
    else
    {
        double delayToStartRecording = 0.5;
        dispatch_time_t startTime = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
        dispatch_after(startTime, dispatch_get_main_queue(), ^(void){
            NSLog(@"Start recording");

            videoCamera.audioEncodingTarget = movieWriter;
            [movieWriter startRecording];

//            NSError *error = nil;
//            if (![videoCamera.inputCamera lockForConfiguration:&error])
//            {
//                NSLog(@"Error locking for configuration: %@", error);
//            }
//            [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOn];
//            [videoCamera.inputCamera unlockForConfiguration];

            recording = YES;
            [Record setTitle:@"Stop" forState:UIControlStateNormal];
        });

        [self startCasting];
    }
}


You can probably see that I am trying to start casting immediately after recording begins, pointing the Chromecast at the file's path on the server. This doesn't work, and I believe it's because the file isn't playable at that path until the stop button is pressed and the writer finalizes it. How can I fix this? Can anyone help?

ChromeCast Supported Media Types: https://developers.google.com/cast/docs/media
