
Realtime Blurring on iOS


From bokeh to that fancy iOS Control Center overlay, blurry backgrounds just look cool. Unfortunately, programmatically blurring things is an expensive operation; a blur is literally a convolution, so it’s a convoluted process by definition! This means that realtime blurring of dynamically displayed images in an interactive user interface can pose a real performance challenge. And great performance is something our team takes very seriously.

Challenge

To illustrate where the challenge comes in, let’s consider an example. Smooth scrolling of a UITableView (or any animation, for that matter) requires displaying frames at a desirable frame rate. 60 fps is a good target because at that rate things tend to look very smooth. Using this target, we have approximately 16.7 ms (1 sec / 60) per frame to process and render everything. If we require blurring a region of each UITableViewCell before displaying it, achieving our target frame rate becomes considerably harder. Not only that, but we don’t want to blur each region again and again: the same image may appear many times in the table view, and the user may scroll up and down, bringing it into view repeatedly. In addition, we must perform the blurring in a memory-efficient manner; repeatedly loading large images while doing heavy processing is a sure-fire way to bloat resource utilization and risk crashes. Finally, being the considerate developers that we are, we want to avoid unnecessarily draining our users’ batteries too.
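
As a quick aside, one way to see this frame budget in practice is with a CADisplayLink, which fires once per screen refresh; the deltas between callbacks reveal dropped frames. This little diagnostic is our own illustration of the 16.7 ms budget, not part of the implementation described below:

#import <QuartzCore/QuartzCore.h>

@interface FrameWatcher : NSObject
- (void)start;
@end

@implementation FrameWatcher {
    CADisplayLink *_link;
    CFTimeInterval _lastTimestamp;
}

- (void)start {
    // CADisplayLink invokes tick: once per display refresh (~60 Hz)
    _link = [CADisplayLink displayLinkWithTarget:self selector:@selector(tick:)];
    [_link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link {
    if (_lastTimestamp > 0) {
        CFTimeInterval delta = link.timestamp - _lastTimestamp;
        // anything well past 1/60 sec means we missed at least one frame
        if (delta > (1.0 / 60.0) * 1.5) {
            NSLog(@"Dropped frame: %.1f ms since last tick", delta * 1000.0);
        }
    }
    _lastTimestamp = link.timestamp;
}

@end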

Solution

[Image: Newsfeed mock-up]

It just so happens that our Product team designed us a layout that requires blurred overlays of downloaded images for events in our app’s UITableView representation of a newsfeed.

The solution we chose leverages the fact that we must download the images we’re displaying in each cell. Downloading the images on the main thread would be an obvious no-no, since that would block the thread and cause dreadful scrolling performance. So, since we’re already downloading a cell’s image on a background thread, we can do the blurring on the same thread. The cell’s contents that are available immediately are displayed immediately, while the image and its blurred overlay are filled in asynchronously.

When a cell’s image download is complete, we scale the image to the required dimensions. Since our blurred region is smaller than the scaled image, we can tell our blurring function to blur just that region, conserving memory and improving processing time. The relevant Objective-C looks something like this:

self.text = self.event.text;
__weak typeof(self) wself = self; // typed weak reference so the block can use property dot syntax
NSString *imageUrl = self.event.imageUrl;

self.image = nil;
self.overlay.image = nil;

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // cellWidth and overlayRect are assumed to be available from the enclosing scope
    UIImage *image = [[wself downloadImage:imageUrl] scaleWithWidth:cellWidth];
    UIImage *blurredImage = [image blurRect:overlayRect];

    dispatch_async(dispatch_get_main_queue(), ^{
        // skip if the cell’s event has changed
        if (![imageUrl isEqualToString:wself.event.imageUrl]) {
            return;
        }

        wself.overlay.image = blurredImage;
        wself.image = image;
    });
});
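
The blurRect: category method above isn’t something UIKit provides, and the post doesn’t show its implementation. As a minimal sketch, here is what it might look like using Core Image’s CIGaussianBlur; the method name, the fixed radius, and the pixel-coordinate assumption are ours, and a production iOS 7 implementation might instead use vImage from the Accelerate framework:

#import <CoreImage/CoreImage.h>

@implementation UIImage (Blur)

// Blur only the given region (assumed to be in pixel coordinates of the
// underlying CGImage) and return just that blurred region as a new image.
- (UIImage *)blurRect:(CGRect)rect {
    // Crop first so we only pay for the pixels we actually need
    CGImageRef croppedRef = CGImageCreateWithImageInRect(self.CGImage, rect);
    CIImage *input = [CIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:@8.0 forKey:kCIInputRadiusKey]; // radius chosen arbitrarily

    // The blur expands the image's edges, so crop back to the original extent
    CIImage *output = [[blur outputImage] imageByCroppingToRect:input.extent];

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef blurredRef = [context createCGImage:output fromRect:output.extent];
    UIImage *result = [UIImage imageWithCGImage:blurredRef
                                          scale:self.scale
                                    orientation:self.imageOrientation];
    CGImageRelease(blurredRef);
    return result;
}

@end

One note on this sketch: creating a CIContext is itself expensive, so in real code you would want to create one context up front and reuse it across calls.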

Optimization

To optimize and reduce redundancy, the downloaded image and the output of the blurring function are cached in two caches: one reserved for the original images, the other for the blurred images. Both the original and blurred images are stored under the key of the image’s URL. Therefore, when an image is fetched for display by its URL, we can check both caches for a hit before downloading or blurring. If there’s a hit, we pull the image from the cache and avoid repeating expensive operations. Modifying the above code snippet, we end up with the below:

self.text = self.event.text;
__weak typeof(self) wself = self;
NSString *imageUrl = self.event.imageUrl;
self.image = nil;
self.overlay.image = nil;

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Check the original-image cache before downloading
    UIImage *image = [[ImageCache sharedCache] get:imageUrl];

    if (!image) {
        image = [[wself downloadImage:imageUrl] scaleWithWidth:cellWidth];
        [[ImageCache sharedCache] put:image withKey:imageUrl];
    }

    // Check the blurred-image cache before blurring
    UIImage *blurredImage = [[ImageCache blurredCache] get:imageUrl];

    if (!blurredImage) {
        blurredImage = [image blurRect:overlayRect];
        [[ImageCache blurredCache] put:blurredImage withKey:imageUrl];
    }

    dispatch_async(dispatch_get_main_queue(), ^{
        // skip if the cell’s event has changed
        if (![imageUrl isEqualToString:wself.event.imageUrl]) {
            return;
        }

        wself.overlay.image = blurredImage;
        wself.image = image;
    });
});
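
For completeness, the ImageCache class isn’t shown in the post either. A minimal sketch, assuming an NSCache-backed store behind the get:/put:withKey: interface used above, could look like this:

@interface ImageCache : NSObject
+ (instancetype)sharedCache;   // original, scaled images
+ (instancetype)blurredCache;  // blurred overlay images
- (UIImage *)get:(NSString *)key;
- (void)put:(UIImage *)image withKey:(NSString *)key;
@end

@implementation ImageCache {
    NSCache *_cache;
}

+ (instancetype)sharedCache {
    static ImageCache *cache;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ cache = [[ImageCache alloc] init]; });
    return cache;
}

+ (instancetype)blurredCache {
    static ImageCache *cache;
    static dispatch_once_t once;
    dispatch_once(&once, ^{ cache = [[ImageCache alloc] init]; });
    return cache;
}

- (instancetype)init {
    if ((self = [super init])) {
        _cache = [[NSCache alloc] init];
    }
    return self;
}

- (UIImage *)get:(NSString *)key {
    return [_cache objectForKey:key];
}

- (void)put:(UIImage *)image withKey:(NSString *)key {
    if (image && key) {
        [_cache setObject:image forKey:key];
    }
}

@end

NSCache is a natural fit here: it’s thread-safe, which matters since we call it from a background queue, and it evicts entries automatically under memory pressure, which lines up with the memory-efficiency goal above.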

Next Steps

[Image: Newsfeed scroll]

All this keeps the UITableView feeling zippy when scrolling. However, it’s not without disadvantages. Namely, we have to wait until the blurring is complete so that we can show the image and the overlay together; if we display the image first and delay the overlay until the blurring is finished, things look a bit odd. We’d also rather not use additional storage to cache the blurred images. While the process is serving us well, we designed this solution when we were still supporting iOS 7, so we may now be able to take advantage of newer APIs, such as UIVisualEffectView (introduced in iOS 8). This is an area we will be looking to improve during a future round of profiling. Definitely stay tuned, as there is more to come!
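
For reference, the UIVisualEffectView approach would look roughly like the snippet below. The view names here are hypothetical; the key difference is that the effect view blurs whatever is rendered behind it in real time, so no per-image blurring or blurred-image cache is needed:

// iOS 8+ only: a live-blurring overlay, no preprocessing required
UIBlurEffect *blur = [UIBlurEffect effectWithStyle:UIBlurEffectStyleLight];
UIVisualEffectView *overlayView = [[UIVisualEffectView alloc] initWithEffect:blur];
overlayView.frame = overlayRect; // the region of the cell to blur
[cell.imageView addSubview:overlayView];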

Written by Neil Lokanth
