UIImageJPEGRepresentation causes a memory warning

When using UIImageJPEGRepresentation I get a memory warning. Is there a way to avoid it? The app doesn't crash, but I would like to avoid the warning if possible, because it intermittently prevents the [[UIApplication sharedApplication] openURL:url]; call below from running:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];
    NSData *imageToUpload = UIImageJPEGRepresentation(image, 1.0);

    // code that sends the image to a web service (omitted)
    // on success from the service
    // this sometimes does not get run; I assume it has to do with the memory warning?
    [[UIApplication sharedApplication] openURL:url];
}

      

3 answers


Using UIImageJPEGRepresentation (in which you round-trip the asset through a UIImage) can be problematic, because with a compressionQuality of 1.0 the resulting NSData can actually be considerably larger than the original file. (And you're also holding a second copy of the image in the UIImage.)

For example, I just picked a random image from my iPhone photo library: the original asset was 1.5 MB, but the NSData created by UIImageJPEGRepresentation with a compressionQuality of 1.0 required 6.2 MB. And holding the image in a UIImage, itself, can take even more memory (because, uncompressed, it can require, for example, four bytes per pixel).

Instead, you can retrieve the original asset with the getBytes:fromOffset:length:error: method:

#import <AssetsLibrary/AssetsLibrary.h>   // for ALAssetsLibrary, ALAsset, ALAssetRepresentation

static NSInteger kBufferSize = 1024 * 10;

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *url = info[UIImagePickerControllerReferenceURL];

    // self.library is an ALAssetsLibrary instance
    [self.library assetForURL:url resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        long long remaining = representation.size;
        NSString *filename  = representation.filename;

        long long representationOffset = 0ll;
        NSError *error;
        NSMutableData *data = [NSMutableData data];

        uint8_t buffer[kBufferSize];

        // copy the asset's bytes one buffer-sized chunk at a time
        while (remaining > 0ll) {
            NSInteger bytesRetrieved = [representation getBytes:buffer fromOffset:representationOffset length:sizeof(buffer) error:&error];
            if (bytesRetrieved <= 0) {
                NSLog(@"failed getBytes: %@", error);
                return;
            } else {
                remaining -= bytesRetrieved;
                representationOffset += bytesRetrieved;
                [data appendBytes:buffer length:bytesRetrieved];
            }
        }

        // you can now use the `NSData`

    } failureBlock:^(NSError *error) {
        NSLog(@"assetForURL error = %@", error);
    }];
}

This avoids staging the image in a UIImage, and the resulting NSData can be (for photographs, anyway) considerably smaller. Note, this also has the advantage that it preserves the metadata associated with the image.



By the way, while the above represents a significant memory improvement, there is a potentially more dramatic memory reduction available: rather than loading the entire asset into an NSData at once, you can stream it (subclass NSInputStream and use this getBytes routine to fetch bytes as they're needed, instead of loading the whole asset into memory in one go). There are some annoyances involved in this process (see BJ Homer's article on the topic), but if you're looking for a dramatic memory reduction, that's the way to go. There are a couple of approaches here (BJ Homer's, streaming from a staging file, etc.), but the key is that streaming can dramatically reduce the memory footprint.
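
As a rough sketch of the staging-file variant (my own illustration, not code from the original answer; writeAssetRepresentation:toPath: is a made-up helper name, and error handling is minimal), you could spool the asset to disk one chunk at a time and then let the URL loading system stream the file as the request body:

// Hypothetical helper: spool the asset to `path` one buffer-sized chunk at a
// time, so only kBufferSize bytes are ever held in memory.
- (BOOL)writeAssetRepresentation:(ALAssetRepresentation *)representation toPath:(NSString *)path
{
    NSOutputStream *output = [NSOutputStream outputStreamToFileAtPath:path append:NO];
    [output open];

    long long remaining = representation.size;
    long long offset    = 0ll;
    uint8_t buffer[kBufferSize];
    NSError *error;

    while (remaining > 0ll) {
        NSInteger bytesRead = [representation getBytes:buffer fromOffset:offset length:sizeof(buffer) error:&error];
        if (bytesRead <= 0) {
            NSLog(@"failed getBytes: %@", error);
            [output close];
            return NO;
        }
        if ([output write:buffer maxLength:bytesRead] != bytesRead) {
            NSLog(@"failed write: %@", output.streamError);
            [output close];
            return NO;
        }
        remaining -= bytesRead;
        offset    += bytesRead;
    }

    [output close];
    return YES;
}

You could then point the request's HTTPBodyStream at [NSInputStream inputStreamWithFileAtPath:path] (setting the Content-Length header from the file size), so the upload never holds the whole image in memory either.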

But just by avoiding UIImage and UIImageJPEGRepresentation (which avoids both the memory taken by the image and the larger NSData that UIImageJPEGRepresentation yields), you may make considerable headway. Also, make sure you don't have redundant copies of the image data in memory at one time (for example, don't load the image data into one NSData and then build a second NSData for the HTTPBody ... see if you can do it in one fell swoop). And if worse comes to worst, you can pursue the streaming approaches.
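
To make the "one fell swoop" point concrete (a sketch only; the boundary string, form field name, and request variable are illustrative assumptions, not part of the original answer), you could append the asset's bytes straight into the multipart body as you read them, instead of first collecting them into a standalone NSData and then copying that into the body:

NSString *boundary = @"ExampleBoundary";  // illustrative boundary string
NSMutableData *body = [NSMutableData data];

// multipart header for the file part (field and file names are made up)
NSString *partHeader = [NSString stringWithFormat:@"--%@\r\nContent-Disposition: form-data; name=\"file\"; filename=\"%@\"\r\nContent-Type: image/jpeg\r\n\r\n", boundary, representation.filename];
[body appendData:[partHeader dataUsingEncoding:NSUTF8StringEncoding]];

// read the asset's bytes directly into the request body; no intermediate image NSData
long long remaining = representation.size;
long long offset    = 0ll;
uint8_t buffer[kBufferSize];
NSError *error;
while (remaining > 0ll) {
    NSInteger bytesRead = [representation getBytes:buffer fromOffset:offset length:sizeof(buffer) error:&error];
    if (bytesRead <= 0) { NSLog(@"failed getBytes: %@", error); return; }
    [body appendBytes:buffer length:bytesRead];
    remaining -= bytesRead;
    offset    += bytesRead;
}

[body appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", boundary] dataUsingEncoding:NSUTF8StringEncoding]];
request.HTTPBody = body;  // `request` is an NSMutableURLRequest set up elsewhere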


Provided as an answer for formatting and images.

Use Instruments to check for leaks and for memory loss due to retained but not leaked memory. The latter is unused memory that is still pointed to. Use Mark Generation (Heapshot) in the Allocations instrument in Instruments.

For how to use Heapshot to find memory creep, see the bbum blog.

Basically the method is: run the Allocations instrument, take a heapshot, run an iteration of your code and take another heapshot, repeating 3 or 4 times. This will point to memory that is allocated and not released during the iterations.

To examine the results, disclose them to see the individual allocations.

If you need to see where retains, releases and autoreleases occur for an object, use Instruments:

Run in Instruments; in Allocations, turn on "Record reference counts" (for Xcode 5 and below you have to stop recording to set the option). Cause the app to run, stop recording, drill down, and you will be able to see where all the retains, releases and autoreleases occurred.



In ARC: just put your code inside a small @autoreleasepool block:

@autoreleasepool {
    NSData *data = UIImageJPEGRepresentation(img, 0.5);
    // do something with data
}
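
Applied to the delegate method from the question, that might look like this (a sketch; uploadImageData: is a hypothetical helper, and the compressionQuality of 0.5 is my assumption):

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    @autoreleasepool {
        UIImage *image = info[UIImagePickerControllerOriginalImage];
        NSData *imageToUpload = UIImageJPEGRepresentation(image, 0.5);
        [self uploadImageData:imageToUpload];  // hypothetical upload helper
    }  // autoreleased temporaries created above can be freed here
}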

      
