Issue 25

iOS 7 blur

Mihai Fischer

In the last couple of years the whole mobile world, including the big names Android, iOS and Windows, has adopted the concept of flat design. The purpose is the same everywhere: offer users a better experience through clean, understandable interfaces.

With iOS 7, Apple brought flat design to its operating system, along with some new user experience concepts such as the idea of "always know where you are", which gives better access to different types of data. One of the best examples is the Notification Center, which overlays the Home Screen to show you missed calls and other information. This way the user never has to leave the Home Screen (it stays blurred in the background) and can focus better on the newly displayed data.

How does Apple do it?

To achieve the Notification Center blur, Apple works with the iPhone's GPU, which shows in the responsiveness of the animations and of the displayed data. The fact that they can have video or moving components in the blurred background, the so-called "live blur", is also due to this. Unfortunately, Apple doesn't provide any API for this in the iOS 7 SDK, probably for security reasons. So, if you want to do it yourself, it will take a lot of code to drive the GPU, and, to be honest, you probably only want to show an alert.


The natural question, then, is: how can we create this effect in our own apps?

There are several ways to do it, but the one I'll present in this article is very easy to implement, using a category provided by Apple at WWDC 2013, called "UIImage+ImageEffects.h". The idea is to capture the current screen, blur the resulting image and set it as the background of the new screen, alert or whatever you are displaying.

We"ll start off by creating a custom view controller based on UIViewController, in which we"ll import the earlier mentioned class.

#import "UIImage+ImageEffects.h"

In one of the life cycle methods of the controller we add the following, and we already have the simplest way to obtain a blur effect:

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    UIImage *snapshot = [self takeScreenSnapshot];
    UIColor *tintColor =
        [UIColor colorWithWhite:0.2 alpha:0.15];
    // UIView has no backgroundImage property, so we set the
    // blurred snapshot as a pattern color instead
    self.view.backgroundColor = [UIColor colorWithPatternImage:
        [snapshot applyBlurWithRadius:8
                            tintColor:tintColor
                saturationDeltaFactor:1.8
                            maskImage:nil]];
}

- (UIImage *)takeScreenSnapshot {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 0);
    if ([self.view respondsToSelector:
            @selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
        // iOS 7+: the fast snapshot path
        [self.view drawViewHierarchyInRect:self.view.bounds
                        afterScreenUpdates:NO];
    } else {
        // Fallback for older systems: render the layer directly
        [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    }
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Re-encoding as JPEG trades a little quality for less memory
    NSData *imageData = UIImageJPEGRepresentation(image, 0.75);
    image = [UIImage imageWithData:imageData];
    return image;
}
The parameters are the following:
  1. applyBlurWithRadius is the number of pixels taken into account when calculating the blur. The bigger the number, the stronger the blur.
  2. tintColor is the tint applied to the blurred image. In my example, I used a darker tint similar to the one in Notification Center.
  3. saturationDeltaFactor is the image saturation level.
  4. maskImage, although nil here, can be used to mask portions of the image, so that only parts of it are blurred.
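As a sketch of how these parameters interact, a lighter, "frosted glass" variant could look like the helper below. The method name and the values are my own illustrations, not presets from Apple's category:

```objc
// Hypothetical helper showing a different parameter combination;
// the values are illustrative, not taken from Apple's code.
- (UIImage *)lightBlurOfImage:(UIImage *)snapshot
{
    // A nearly white, mostly transparent tint gives a frosted look
    UIColor *lightTint = [UIColor colorWithWhite:1.0 alpha:0.3];
    // A smaller radius keeps more detail; a factor above 1 boosts colors
    return [snapshot applyBlurWithRadius:4
                               tintColor:lightTint
                   saturationDeltaFactor:1.4
                               maskImage:nil];
}
```

Playing with radius, tint and saturation like this is usually enough to match the look of any of the system overlays.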

Going deeper

For those of you who want better performance, or simply another type of blur, we can look inside "UIImage+ImageEffects.h" and change a few things.

typedef enum {
    BOXFILTER,
    TENTFILTER
} BlurType;

@import UIKit;

@interface UIImage (ImageEffects)

- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyDarkEffectWithTent:(CGFloat)radius;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;

- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius
                        blurType:(BlurType)blurType
                       tintColor:(UIColor *)tintColor
           saturationDeltaFactor:(CGFloat)saturationDeltaFactor
                       maskImage:(UIImage *)maskImage;

@end


As you can see from the newly added declarations, I added applyDarkEffectWithTent, which switches from the default box filter to a tent filter algorithm. The tent filter is more powerful than the box filter, which is why it needs only one pass instead of the three passes of the box algorithm, so we obtain the blurred image even faster.

- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius
                        blurType:(BlurType)blurType
                       tintColor:(UIColor *)tintColor
           saturationDeltaFactor:(CGFloat)saturationDeltaFactor
                       maskImage:(UIImage *)maskImage
{
    // ... inside the blur step, pick the convolution filter:
    if (blurType == BOXFILTER) {
        // Box filter: three passes, alternating between the two
        // buffers, are needed to approximate a Gaussian blur
        vImageBoxConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
        vImageBoxConvolve_ARGB8888(&effectOutBuffer, &effectInBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
        vImageBoxConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
    } else {
        // Tent filter: a single pass is enough
        vImageTentConvolve_ARGB8888(&effectInBuffer, &effectOutBuffer, NULL, 0, 0, radius, radius, 0, kvImageEdgeExtend);
    }
    // ...
}

By doing this we reduce the runtime from 184 ms to just 16 ms. After all, we want to give our users a seamless experience while using our app, just like in Notification Center.
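If you want to check numbers like these in your own app, a simple sketch is to time the call with CFAbsoluteTimeGetCurrent (here assuming the modified category with the blurType parameter from this section; for serious profiling, use Instruments instead):

```objc
// Rough timing sketch around the blur call
CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
UIImage *blurred = [snapshot applyBlurWithRadius:8
                                        blurType:TENTFILTER
                                       tintColor:nil
                           saturationDeltaFactor:1.8
                                       maskImage:nil];
CFAbsoluteTime elapsed = CFAbsoluteTimeGetCurrent() - start;
NSLog(@"Blur took %.1f ms", elapsed * 1000.0);
```

Measure on a device, not the simulator, since the timings depend heavily on the hardware.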

I"ll finish by suggesting that we use this blur technique only when we think blur is needed to enhance the user"s experience and not on every corner of the app, making it an overloaded frozen blur.




