iPhone 12's Computational Imaging Chops Far Outpace Earlier Generations

The last time I wrote anything about the iPhone 12 series when it comes to photography, commenter Blake B wrote, “… computational imaging is way more important than you make it out to be.” I’m starting to think he might be right.

In a real-world test, YouTuber Tyler Stalman compares the photographic prowess of the new iPhone 12 and 12 Pro models to previous-generation iPhones, specifically the iPhone 11 series and the iPhone X. While the new phones didn’t seem that impressive on paper, whatever Apple is doing with the A14 chip versus the A13 in the iPhone 11s is actually making a difference. Stalman’s photos show a clear improvement in what the iPhone 12 models are able to capture versus previous generations, not only in low-light ability but also in clarity and sharpness. Computational imaging makes a huge difference here. It was visible, to an extent, in trick modes on other phones, like Google’s Night Sight on its Pixel series, but on the iPhone 12, it seems baked into every photo to great effect.
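To make “computational imaging” a little more concrete: the core trick behind night modes like Night Sight, and multi-frame processing generally, is capturing a burst of short, noisy exposures and merging them so random sensor noise averages out while scene detail stays. Below is a minimal sketch of that one idea in Swift; the averageFrames function and the toy pixel data are illustrative assumptions, not Apple’s or Google’s actual pipeline, which also aligns frames, merges per-tile, and does far more besides:

```swift
// A minimal sketch of one core idea in multi-frame low-light imaging:
// averaging several short, noisy exposures so random sensor noise
// cancels while the underlying scene signal remains.
// Illustration only — real pipelines align and merge far more cleverly.

/// Averages a burst of same-sized grayscale frames (pixel values 0...1).
func averageFrames(_ frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var result = [Float](repeating: 0, count: first.count)
    for frame in frames {
        for i in 0..<result.count {
            result[i] += frame[i]
        }
    }
    let count = Float(frames.count)
    return result.map { $0 / count }
}

// Simulate a static scene captured as eight noisy exposures.
let scene: [Float] = [0.2, 0.5, 0.8, 0.3]
let burst = (0..<8).map { _ in
    scene.map { min(max($0 + Float.random(in: -0.1...0.1), 0), 1) }
}
let merged = averageFrames(burst)
print(merged) // each pixel lands close to the true scene value
```

Averaging n frames cuts random noise by roughly a factor of the square root of n, which is why a handheld burst can rival a much longer single exposure without the blur.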

And it’s crazy that this is even the case, because let’s be real, how many Canon or Nikon shooters could actually see the difference one generation of DIGIC or EXPEED processor to the next? It’s a testament to engineers that realized that they couldn’t physically cram more in the way of sensor size or optics into a phone, and so, they went a different route.

Camera manufacturers have always seemed focused on speed. If I’m being truthful, the biggest difference I could see jumping from my Canon EOS D30 to the EOS 5D Mark II to the EOS 5D Mark IV was speed, and that’s great and all, but by the time photographers hit the 5D Mark III or so, operational speed was already pretty awesome. With the 5D Mark IV not pushing that many more megapixels, I’m wondering where my computational imaging smarts are. Imagine a camera with a huge sensor, interchangeable lenses, and the processing power of an iPhone or a Pixel-series phone. That would be a reason to buy “real” cameras again. As it stands, these kinds of advancements in phones are making 1-inch-sensor cameras irrelevant and awfully pricey for what they offer.

Stalman’s examples push the limits of what iPhones are capable of in low light and even point out how some of the phone’s video features aren’t viewable on most monitors (specifically, Apple’s Dolby Vision HDR support). Maybe if Apple did a better job of explaining and marketing all of this, there might have been more excitement around the iPhone 12 among photographers in the first place.

While I’ve had my tiffs with computational imaging in the past, it seems it’s time to learn to “stop worrying and love the bomb.” As much as I love my Pixel, it’s following the trend of my previous Android phones and getting just a touch flaky after a year, and so some next-gen Apple computational imaging might be in my future. Is it in yours? Leave your thoughts in the comments below.

Wasim Ahmad is an assistant teaching professor teaching journalism at Quinnipiac University. He's worked at newspapers in Minnesota, Florida and upstate New York, and has previously taught multimedia journalism at Stony Brook University and Syracuse University. He's also worked as a technical specialist at Canon USA for Still/Cinema EOS cameras.
