Autofocus – Phase Detection versus Contrast Detection

One of the imaging features of the iPhone 6(+) is the inclusion of phase detection alongside the existing contrast detection. The iPhone isn’t the first smartphone to do this; the earlier Galaxy S5 features the same technology, and the first device to introduce it was a compact camera from Fuji back in 2010.

Which makes sense, given that Apple doesn’t design optics or imaging sensors, but instead buys them from companies like Sony.

Nonetheless, the camera as a whole in the iPhone 6(+) is, in all probability, the best on any currently available smartphone, combining the best of what you might find individually on other devices: optical image stabilization (in the 6+), phase detection, and large sensor pixels. The camera is such a critical aspect of a smartphone that it really is baffling why some $600+ devices go with an imaging sensor that is $2 cheaper when it compromises the whole product so substantially.

But what is phase detection, and how does it differ from contrast detection? Anyone with an SLR has probably already learned this while wondering why some modes are so much better than others. For everyone else, I considered making a quick focus simulation app that demonstrated both (I make a fun challenge of such exercises), only to discover that a pretty good one already exists: Phase Detection versus Contrast Detection. Those simulations detail how an SLR focuses, so of course there are differences with on-sensor phase detection, but the principle is the same, and a fuller understanding comes from reading the dpreview article about that Fuji camera.

Contrast detection measures the contrast of the desired focus area of the image. It can have difficulty with low-contrast subjects: one of my sons is essentially unfocusable for contrast detection cameras, his porcelain skin and light hair presenting little for the algorithm to differentiate. I generally have to focus on something high-contrast at the same distance before taking the picture.
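
To make that concrete, here is a minimal sketch of a contrast-detection loop: score each candidate lens position with a focus metric (variance of a Laplacian here, though real cameras use various metrics) and keep the position that maximizes it. The `capture_at` callback, the metric, and the coarse linear sweep are all illustrative stand-ins of mine, not how any particular camera does it; real firmware sweeps, overshoots, and backtracks. But it shows why a flat, low-contrast subject defeats the approach: every lens position scores about the same.

```python
import numpy as np

def contrast_score(image):
    """Focus metric: variance of a 3x3 Laplacian. Sharp, in-focus
    detail produces strong local intensity changes, so the score
    peaks at best focus and flattens out on low-contrast subjects."""
    lap = (np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0) +
           np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1) -
           4 * image)
    return lap.var()

def contrast_detect_focus(capture_at, positions):
    """Hill-climb over candidate lens positions, keeping whichever
    frame scores highest. capture_at(p) is a hypothetical stand-in
    for driving the lens to position p and grabbing a frame."""
    best_pos, best_score = None, -np.inf
    for p in positions:
        score = contrast_score(capture_at(p).astype(float))
        if score > best_score:
            best_pos, best_score = p, score
    return best_pos
```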

Phase detection measures how out of phase the incoming light is from opposite sides of the lens. SLRs have a fixed number of these sensors at specific locations in the frame (the more expensive the SLR, the more sensors), and on-sensor phase-detection pixels are no different, if anything even more limited given that they interfere with light capture.
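
Here is a rough sketch of what one of those phase-detect points computes, under the assumption of paired left- and right-masked pixels: each half-masked pixel sees the scene through one side of the lens, and a defocused subject shows up as the same signal laterally displaced between the two views. The function name and the simple sum-of-absolute-differences matcher are my own illustration, not the in-camera implementation.

```python
import numpy as np

def phase_offset(left, right, max_shift=32):
    """Estimate the disparity between the left- and right-masked
    pixel signals: slide one across the other and return the shift
    with the smallest sum of absolute differences. Zero means the
    two views coincide (in focus); the sign says which way to move
    the lens, the magnitude roughly how far."""
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        err = np.abs(a - b).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# A defocused edge appears as the same signal shifted between the
# two half-pupil views; one correlation pass recovers the shift.
x = np.linspace(0, 8 * np.pi, 256)
left_view = np.sin(x)
right_view = np.roll(left_view, 5)
print(phase_offset(left_view, right_view))  # -> -5
```

That single measurement is why phase detection can drive the lens directly to roughly the right position in one move, where contrast detection has to hunt back and forth.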

As with the Fuji described at the outset, it is likely that the iPhone features phase detection only for subjects near the center of the frame, and, given that the phase-detection sites compromise some pixels, it presumably applies software adjustments to make up for the light lost at those locations.
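
As a rough illustration of the kind of adjustment that might be involved (entirely an assumption on my part as to how Apple actually handles it): phase-detect sites are half-masked and capture roughly half the light, and the simplest fix-up treats them like defective pixels, reconstructing their values from unmasked neighbors.

```python
import numpy as np

def correct_pd_pixels(raw, pd_mask):
    """Hypothetical correction for pixels that double as phase-detect
    sites. pd_mask is True at PD locations; each is replaced with the
    mean of its normal neighbors, assuming PD sites are isolated
    (no two adjacent), much like defective-pixel correction."""
    out = raw.astype(float).copy()
    for y, x in zip(*np.nonzero(pd_mask)):
        neigh = raw[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
        mask = pd_mask[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
        out[y, x] = neigh[~mask].mean()
    return out
```

Whatever the actual pipeline looks like, some combination of interpolation and gain seems necessary once you sacrifice imaging pixels to focusing.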

It is an interesting and novel technology, though it comes just as contrast detection has gotten much better than it once was: using Live View on a Canon camera from even a few years ago was a pretty miserable experience, the sensor taking so long to compute zone contrast and discern deltas that focusing was a brutish affair. Modern image sensors have largely erased this speed problem in most situations, though they still struggle in low light and with low-contrast subjects.