Unable to process image from iPhone camera?

On the iPhone 3GS, the camera captures images at 2048 x 1536 pixels. If my math is correct, opening this image in a CGLayer will consume roughly 12 MB (at 4 bytes per pixel).

Springboard will terminate any application that reaches about 12 MB (at least, that's what happens to me).

Manipulating this image with a function such as CGContextDrawLayer will consume at least another 12 MB.

That's 24 MB.

How can you manipulate images this large on the iPhone without the program being killed?

Is there a way to reduce the memory footprint of the image captured by the camera without reducing its dimensions?

Any clues? Thanks.

2 answers


You should consider using NSInputStream to process your image in chunks of whatever size suits you. For example, you could read 1 MB of data, process it, write the result to an NSOutputStream, and then repeat 11 more times until EOF.



Most likely, your image processing algorithm will determine the optimal block size.

Your screen is only 320 x 480 pixels, so putting anything larger on a layer seems like a waste of memory.

So you can translate the origin and scale the original image down from 2048 x 1536 pixels to fit 320 x 480 pixels before applying it to the layer.

If you are using a UIScrollView to present the layer, for example, you will need to write code so that pinch-to-zoom computes a new 320 x 480 pixel representation based on the current zoom level, as determined from the frame and view bounds. Similarly, dragging should translate the origin and recompute the newly exposed parts.



You can see this effect in Safari when zooming a document: it goes from blurry to sharp as the new view is rendered. Likewise, when you drag a view, its newly exposed parts are computed and added to the view.

Whatever the touch event, you generally only want to keep about 320 x 480 pixels' worth of image on the layer at any time.
