How to render an image with an effect faster using UIKit

I am building an iOS app that cycles through a large number of images across multiple UIImageViews (a loop that sets each UIImageView's image property from a pool of images). Sometimes some of those images also need a graphic effect applied, for example a multiply blend.

The easiest way to do this is with a CIFilter, but the problem is that CALayer on iOS does not support the "filters" property, so the effect has to be applied to the image before setting the "image" property. That is very slow when the screen is refreshed frequently.
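
For reference, here is a minimal sketch of that CIFilter approach, using the built-in CIMultiplyCompositing filter; the tint image used as the second input is a hypothetical placeholder:

#import <CoreImage/CoreImage.h>

// Sketch: multiply `image` by `tintImage` with Core Image and return a new UIImage.
- (UIImage *)multipliedImage:(UIImage *)image withTint:(UIImage *)tintImage
{
    CIContext *context = [CIContext contextWithOptions:nil]; // expensive - reuse a single context in real code
    CIImage *input = [CIImage imageWithCGImage:image.CGImage];
    CIImage *tint  = [CIImage imageWithCGImage:tintImage.CGImage];

    CIFilter *multiply = [CIFilter filterWithName:@"CIMultiplyCompositing"];
    [multiply setValue:input forKey:kCIInputImageKey];
    [multiply setValue:tint forKey:kCIInputBackgroundImageKey];

    CIImage *output = multiply.outputImage;
    CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}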

So I tried using Core Graphics directly: drawing into a UIGraphics image context with kCGBlendModeMultiply. This is much faster than using a CIFilter, but since the multiplication still has to be applied before the image is rendered, the app still feels noticeably slower when displaying images with the multiply effect than when displaying regular images.
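
A minimal sketch of that Core Graphics variant (again with a hypothetical tint image as the second input):

// Sketch: multiply-blend `tintImage` over `image` in a bitmap context.
- (UIImage *)cgMultipliedImage:(UIImage *)image withTint:(UIImage *)tintImage
{
    CGRect rect = CGRectMake(0, 0, image.size.width, image.size.height);
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);

    [image drawInRect:rect];                                              // base image
    [tintImage drawInRect:rect blendMode:kCGBlendModeMultiply alpha:1.0]; // multiply on top

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}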

I suspect the main problem with both approaches is the round trip: the effect is processed on the GPU, the result is read back by the CPU, and then the result is uploaded to the GPU again for display, so a lot of time is wasted transferring data between the CPU and the GPU. So I tried changing the superclass from UIImageView to UIView, moving the Core Graphics drawing code into the drawRect: method, and calling setNeedsDisplay from the "image" property's didSet observer. But that works even worse: every call to setNeedsDisplay makes the app much slower, even slower than using a CIFilter, perhaps because multiple views are being redrawn.
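
Roughly, the idea was something like this (sketched here in Objective-C with a setter override standing in for the didSet observer; the class and property names are placeholders):

// Sketch of the UIView-subclass attempt: redraw the blended image in drawRect:.
@interface TintedImageView : UIView
@property (nonatomic, strong) UIImage *image;
@property (nonatomic, strong) UIImage *tintImage; // hypothetical multiply source
@end

@implementation TintedImageView

- (void)setImage:(UIImage *)image
{
    _image = image;
    [self setNeedsDisplay]; // schedules drawRect: for the next display pass
}

- (void)drawRect:(CGRect)rect
{
    [self.image drawInRect:self.bounds];
    [self.tintImage drawInRect:self.bounds blendMode:kCGBlendModeMultiply alpha:1.0];
}

@end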

I guess I could probably fix this with OpenGL, but I'm wondering whether it can be solved with UIKit alone?



1 answer


As far as I understand, you need to apply the same kind of change to many different images. Therefore the one-time initialization cost is not critical for you, but each individual image should be processed as fast as possible. First of all, it is critical to generate the new images on a background queue / thread (a dispatch sketch follows the list below). There are two good ways to process / generate images quickly:

  • Use CIFilter from CoreImage

  • Use GPUImage library
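
A minimal sketch of that background-queue pattern, assuming a processing method like the roundShadowImageForImage: shown further down (MyImageProcessor is a placeholder class name):

// Process the image off the main thread, then assign the result on the main thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *processed = [MyImageProcessor roundShadowImageForImage:sourceImage];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = processed; // UIKit must only be touched on the main thread
    });
});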

If you use Core Image, check that you are using CIFilter and CIContext correctly. Creating a CIContext takes quite a long time, but it can be SHARED between different CIFilters and images - so you only have to create the CIContext once! A CIFilter can also be SHARED between different images, but since it is not thread-safe, you need a separate copy of the CIFilter for each thread.



In my code, I have the following:

+ (UIImage*)roundShadowImageForImage:(UIImage*)image {
    // Both objects are created once and reused; the filter is copied per call
    // below because CIFilter is not thread-safe.
    static CIContext *_context;
    static CIFilter *_filter;

    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^
    {
        NSLog(@"CIContext and CIFilter generating...");
        _context = [CIContext contextWithOptions:@{ kCIContextUseSoftwareRenderer: @NO,
                                                    kCIContextWorkingColorSpace : [NSNull null] }];

        // roundShadowImage and roundWhiteImage are helper class methods defined
        // elsewhere in this class (the static shadow background and the mask).
        CIImage *roundShadowImage = [CIImage imageWithCGImage:[[self class] roundShadowImage].CGImage];
        CIImage *maskImage = [CIImage imageWithCGImage:[[self class] roundWhiteImage].CGImage];

        _filter = [CIFilter filterWithName:@"CIBlendWithAlphaMask" 
                             keysAndValues:
                   kCIInputBackgroundImageKey, roundShadowImage,
                   kCIInputMaskImageKey, maskImage, nil];
        NSLog(@"CIContext and CIFilter are generated");
    });

    if (image == nil) {
        return nil;
    }
    NSAssert(_filter, @"Error: CIFilter for cover images is not generated");

    CGSize imageSize = CGSizeMake(image.size.width * image.scale, image.size.height * image.scale);

    // CIContext and CIImage objects are immutable, so they can be shared safely among threads,
    // but CIFilter is not thread-safe - work on a per-thread copy instead.
    CIFilter *filterForThread = [_filter copy];

    // coverSize, coverSide and extraBorder are layout constants defined elsewhere in this class.
    CGAffineTransform imageTransform = CGAffineTransformIdentity;
    if (!CGSizeEqualToSize(imageSize, coverSize)) {
        NSLog(@"Cover image. Resizing image %@ to required size %@", NSStringFromCGSize(imageSize), NSStringFromCGSize(coverSize));
        CGFloat scaleFactor = MAX(coverSide / imageSize.width, coverSide / imageSize.height);
        imageTransform = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
    }
    imageTransform = CGAffineTransformTranslate(imageTransform, extraBorder, extraBorder);

    CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];
    ciImage = [ciImage imageByApplyingTransform:imageTransform];

    if (image.hasAlpha) { // hasAlpha is assumed to be a UIImage category helper defined elsewhere
        CIImage *ciWhiteImage = [CIImage imageWithCGImage:[self whiteImage].CGImage];
        CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"
                                      keysAndValues:
                            kCIInputBackgroundImageKey, ciWhiteImage,
                            kCIInputImageKey, ciImage, nil];
        [filterForThread setValue:filter.outputImage forKey:kCIInputImageKey];
    }
    else
    {
        [filterForThread setValue:ciImage forKey:kCIInputImageKey];
    }

    CIImage *outputCIImage = [filterForThread outputImage];
    CGImageRef cgimg = [_context createCGImage:outputCIImage fromRect:[outputCIImage extent]];
    UIImage *newImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return newImage;
}


If you are still not satisfied with the speed, try GPUImage. It is a very good library, and it is also very fast because it uses OpenGL ES to generate images.
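
For the multiply effect, a rough GPUImage sketch would look like this (based on the GPUImage README's still-image example; treat the exact call order as an assumption):

// Sketch: multiply two UIImages on the GPU with GPUImageMultiplyBlendFilter.
GPUImagePicture *base = [[GPUImagePicture alloc] initWithImage:sourceImage];
GPUImagePicture *tint = [[GPUImagePicture alloc] initWithImage:tintImage];
GPUImageMultiplyBlendFilter *multiply = [[GPUImageMultiplyBlendFilter alloc] init];

[base addTarget:multiply];
[tint addTarget:multiply];

[multiply useNextFrameForImageCapture];
[base processImage];
[tint processImage];

UIImage *result = [multiply imageFromCurrentFramebuffer];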
