Can an iPhone Really Match Full Frame in Low Light?

The announcement of Night Mode promises “low-light shots never before possible on iPhone.” Between the new mode and the addition of a truly wide lens, the iPhone is more competitive than ever. But do the shots actually hold up in the field? I tested it against my Nikon Z 7, with surprising results.

Before the comment section sharpens its pitchforks, let me clarify that this isn’t a scientific test. While I tried to keep things comparable and rigorous, I undertook this test entirely to build my own understanding of my new phone and, hopefully, to offer everyone some insight into what the new iPhones can do.

Two Roads Diverged

What makes this an interesting comparison, at least to me, is the two very different paths taken to create these shots. Apple is using a tiny sensor, which, even paired with a 26mm f/1.8 lens, shouldn’t be able to compete with the comparatively huge full frame sensor in the Z 7. To make up for the hardware deficits, Apple relies heavily on software processing, leveraging the significant processing power of the A13 along with techniques like frame stacking to produce a finished image. Nikon, meanwhile, provides a raw file from a large sensor and fast lens, and expects users to adapt their shooting and post-processing techniques to create the output they want.

Computational photography-focused hardware has been explored in a number of ways over the last few years, including Lytro and Light’s L16. Computational post-processing techniques like focus stacking and noise-reduction stacking have gained wider use, but hardware implementations in major cameras are still lacking. As a result, I think this test can be treated as a tiny preview of the two directions cameras are evolving toward.
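To make the stacking idea concrete, here’s a minimal Python sketch of noise-reduction stacking: average a burst of short exposures so that random sensor noise cancels out while the static scene reinforces itself. This is an illustration of the general technique, not Apple’s actual pipeline (which also handles frame alignment, tile-based merging, and tone mapping), and the file names are hypothetical.

```python
import numpy as np
import cv2

def stack_frames(paths):
    """Average a burst of pre-aligned frames to suppress random noise."""
    frames = [cv2.imread(p).astype(np.float64) for p in paths]
    # Averaging N frames cuts random noise by roughly a factor of sqrt(N),
    # which is the core trick behind night modes and astro stacking.
    return np.mean(frames, axis=0).astype(np.uint8)

# Hypothetical burst of eight handheld exposures, assumed already aligned.
result = stack_frames([f"burst_{i:02d}.jpg" for i in range(8)])
cv2.imwrite("stacked.jpg", result)
```

A real implementation would register the frames first and reject motion-blurred regions; that alignment step is where most of the engineering effort goes.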

The Test

The main thing I wanted to test was the iPhone’s ability to replace a walkaround camera, particularly in lower light. With that use case in mind, I kept the setup on the Z 7 minimal: no tripod, just a higher ISO and VR. Mounted on the Z 7 was a 24mm f/1.4 lens via the FTZ adapter. The iPhone was entirely stock, including the use of the default camera app. Both shots have minimal post-processing, which does lend an advantage to the iPhone, but again, goes along with the emphasis on ease of use.

With all the disclaimers out of the way, let’s take a look at a shot. Can you guess which comes from the iPhone and which comes from a full frame mirrorless camera with 24mm f/1.4 lens?

The shot on the bottom comes from the Z 7. Both at full resolution and resized to match the lower-resolution iPhone shot, I prefer the iPhone’s output (top shot). While the fine detail isn’t there, the straight-out-of-camera look is definitely cleaner: partly because of the iPhone’s much more aggressive noise reduction, and partly because the iPhone struck a better white balance. The iPhone shot also has much higher contrast, with no easy way to tweak it.

iPhone

Z 7

In this second set of shots, you can see how the very things that made the first iPhone shot look better are now cutting against it. With a mixed lighting source, the iPhone’s higher contrast and saturation make this shot look overprocessed, particularly in comparison with the Z 7 shot. Again, the Z 7 shot isn’t significantly processed, so its color and contrast can definitely be refined beyond this flatter look. When tweaking the Z 7 shot in Lightroom, the highlights have far more headroom for recovery, while the shadow areas retain much more detail.

iPhone

Z 7

This third set of shots is the perfect summary of the two different looks these cameras create. The iPhone shot has the characteristic one-dimensional rendering of a cellphone shot, with generous depth of field, higher contrast, and a preference to expose for the shadows. The Z 7 shot is much darker, as the meter was trying to save the highlights in the exposed light fixtures, but it opens up easily with a quick bump in exposure.

Z 7 with minimal post-processing

For this last one, I wanted to include a comparison of a tweaked Z 7 shot and the iPhone shot. The Z 7 shot was quickly adjusted in Lightroom to roughly match the iPhone’s contrast, vibrance, and overall exposure.

My Thoughts

Even now, a few days after getting the iPhone, I’m impressed by its capabilities. Night Mode is a massive leap beyond past iPhones’ low-light ability and clearly worthy of praise. In comparison to the full frame camera, particularly at smaller output sizes, it can trade blows with straight-out-of-camera shots.

Where it falls short won’t surprise most photographers: the lack of control inherent to a phone camera means shots can be hit or miss. The Z 7’s files have far greater latitude for processing, while the iPhone’s shots are comparatively “brittle.” Additionally, the Z 7 provides far greater flexibility in shutter speed and depth-of-field control.

The Night Mode option isn’t great for subjects with more movement, as you’ll need to hold still for 3 to 10 seconds to allow the iPhone enough time to build its image. An additional issue I noticed was a surprising amount of lens flare, despite a factory-clean lens. Given enough time and scratches, this could become a real problem.

These aren’t deal-breakers, however, as the easy portability makes the iPhone a great choice for a pocket camera. The inclusion of a truly wide-angle lens leaves most point-and-shoots in the dust, in my opinion.

To answer the title question: yes, the iPhone can match a full frame camera at night, but with some big asterisks. Handicapping the Z 7 on post-processing hugely benefits the heavily processed iPhone shots, while resizing further tilts the field in the iPhone’s favor. While I won’t be replacing my camera with my phone anytime soon, I’m very happy to have another tool in my bag, particularly to fill the niche of an unobtrusive, pocketable camera.

More broadly, these results should leave photographers excited about the possibilities of computational photography. While makers have just dipped their toes in the water, with support for sensor shifting and focus stacking, I believe the field has great potential. We’re already seeing new lens designs that make greater use of digital corrections for analog compromises, like distortion and vignetting. The processing power and software know-how that power the iPhone could do amazing things with a higher caliber sensor and lens.
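As a rough illustration of that kind of digital correction, here’s a toy Python sketch that undoes vignetting by dividing out an idealized cos^4 falloff. The model and its strength parameter are assumptions for demonstration only; real cameras use measured per-lens profiles rather than an analytic formula, and the file names are hypothetical.

```python
import numpy as np
import cv2

def correct_vignetting(img, strength=0.9):
    """Brighten edges by dividing out an idealized cos^4 falloff model."""
    h, w = img.shape[:2]
    y, x = np.indices((h, w), dtype=np.float64)
    cx, cy = (w - 1) / 2, (h - 1) / 2
    # Normalized distance from the image center (0 at center, 1 at corner).
    r = np.hypot(x - cx, y - cy) / np.hypot(cx, cy)
    falloff = np.cos(np.arctan(r * strength)) ** 4
    corrected = img.astype(np.float64) / falloff[..., None]
    return np.clip(corrected, 0, 255).astype(np.uint8)

fixed = correct_vignetting(cv2.imread("frame.jpg"))
cv2.imwrite("frame_corrected.jpg", fixed)
```

The same divide-out-the-model approach generalizes to distortion: characterize the lens’s analog compromise once, then reverse it in software on every frame.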


Alex Coleman is a travel and landscape photographer. He teaches workshops in the American Southwest, with an emphasis on blending the artistic and technical sides of photography.

Comments

Hey downtown Gilbert! A great place to take photos. :D I just got the new iPhone as well; I'm excited to see what it can really do.

Always good to test out a camera where there's plenty of good restaurants!

:D Indeed!

I'm a little confused. Did you compare the Z 7 raw files to the iPhone JPEGs? If so, this makes no sense to me; apples and oranges, as they say. I understand you wanted to evaluate the "walking around" capability of the platforms, but why, then, didn't you set the Z 7 to shoot direct to JPEG and evaluate those files against the iPhone's? My understanding: raw files are meant to capture the most data the chip can produce from the scene. To accomplish this, they sacrifice what we deem more pleasing aesthetics to ensure the most data per pixel is captured, with the understanding that you'll "grade" the photos in processing software afterward (same as log files from a professional video camera). A JPEG is an already-graded raw file; it's just that instead of a human pushing the sliders, an algorithm is doing it for you. So comparing the two is apples and oranges to me (forgive the pun): both fruit (images), but vastly different tastes (aesthetic looks).

There is no doubt that computational photography is the wave of the future. It has already changed the aesthetic expectations and preferences of the public to the point where professionals will be forced to use it to meet those expectations in order to sell images. Therefore, it will eventually leak its way up into pro-body cameras. But there will be a switch, so we can turn it off when we want. (On another subject, for another article/discussion at another time: I predict the public taste will eventually tire of the hyper-HDR, oversaturated, extreme-contrast look that's in vogue right now and swing back toward a reality-based aesthetic.) Those of us who like to print big will certainly enjoy the benefits of having computational photography applied to a file from a full-frame or medium-format sized sensor.

Now that would be a compelling exercise, and article: find the rogue Apple imaging engineer who's willing to apply Apple's computational photography algorithms to a camera with a full-frame or medium-format sensor and evaluate the results.

The raw files were processed with a Lightroom profile matched to the Z 7's picture control settings. While that's not pixel-identical to a JPEG from the camera, it'll look the same for these purposes.

In-camera computational photography is going to be quite a ways off for Nikon/Sony/Canon/etc., at least at the level phone manufacturers are implementing it, for both software and processing-power reasons. Personally, that's fine: as long as the camera supports an easy way to shoot a focus stack, I'm happy to edit it on a computer, where I'll have much more control.

Great test. I've always wondered about this and how iPhones have touched and changed our business.

It's really getting to the place where I could see shooting and using the results for some purposes. Between the wide-angle lens and better performance in marginal light, I feel a lot less limited to the old phone paradigm of bright light and a 28mm-style view.

You should've also tested the Pixel 4 with Night Sight alongside. The Pixel can save raw files shot in Night Sight. Without it, this basically sounds like an ad for Apple, who no doubt have made a big jump from their poor showing last year, but remember that Pixels have had this level of low-light photography (including raw) for about a year already.

Anything besides Apple or Sony isn't allowed here and will get them banned.

It's not a big conspiracy. I own an iPhone, I own a Z7. I shot this test with an iPhone and Z7. If Google wants to send over a Pixel to test, I'd be just as happy to compare them.