The announcement of Night Mode promises “low-light shots never before possible on iPhone.” Between the new mode and the addition of a truly wide lens, the iPhone is more competitive than ever. But do the shots actually hold up in the field? I tested against my Nikon Z 7, with surprising results.
Before the comment section sharpens its pitchforks, let me clarify that this isn’t a scientific test. While I tried to keep things comparable and rigorous, I undertook this test entirely to build my understanding of my new phone’s capabilities and, hopefully, to provide some insight into the new iPhones for everyone.
Two Roads Diverged
What makes this an interesting comparison, at least to me, is the two very different paths taken to create these shots. Apple is using a tiny sensor, which, even paired with a 26mm f/1.8 lens, shouldn’t be able to compete with the comparatively huge full frame sensor in the Z 7. To make up for the hardware deficit, Apple leans heavily on software, pairing the A13’s significant processing power with techniques like stacking to produce a result. Meanwhile, Nikon provides the raw file from a large sensor and fast lens, with users expected to adapt their shooting and post-processing techniques to create their desired output.
Computational photography-focused hardware has been explored in a number of ways over the last few years, including Lytro and Light’s L16. Computational post-processing techniques have gained wider use in implementations like focus stacking and noise-reduction stacking, but hardware implementations in major cameras are still lacking. As a result, I think this test can be treated as a tiny preview of the two directions cameras are evolving towards.
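To illustrate why the stacking mentioned above helps in low light: averaging N aligned exposures reduces random noise by roughly the square root of N. Here is a minimal sketch on simulated data; it is not Apple’s actual pipeline, which also has to align frames and reject motion between them:

```python
# Toy noise-reduction stacking: averaging N aligned exposures of the
# same scene cuts random noise by roughly sqrt(N). Frames here are
# simulated flat gray patches with Gaussian noise.
import random

def make_frame(signal, noise_sigma, size):
    """Simulate one noisy exposure of a flat gray patch."""
    return [signal + random.gauss(0, noise_sigma) for _ in range(size)]

def stack_average(frames):
    """Average each pixel position across all frames."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def noise_estimate(frame, signal):
    """Root-mean-square deviation from the true signal level."""
    return (sum((p - signal) ** 2 for p in frame) / len(frame)) ** 0.5

random.seed(0)
frames = [make_frame(100.0, 10.0, 10_000) for _ in range(9)]  # 9-shot stack
single = noise_estimate(frames[0], 100.0)
stacked = noise_estimate(stack_average(frames), 100.0)
# With 9 frames, noise should drop by about a factor of 3.
print(f"single-frame noise ~{single:.1f}, 9-frame stack ~{stacked:.1f}")
```

The 9-frame count isn’t arbitrary for this sketch: reporting on Night Mode and Deep Fusion describes stacks of roughly that size.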
The Test
The major thing I wanted to test was the iPhone’s ability to be used as a replacement for a walkaround camera, particularly in lower light. To keep this use case in mind, I kept the setup on the Z 7 minimal, with no tripod, but using a higher ISO and VR. On the Z 7 was a 24mm f/1.4 with the FTZ adapter. The iPhone was entirely stock, including the use of the default camera app. Both shots have minimal post-processing, which does lend an advantage to the iPhone, but again, fits the emphasis on ease of use.
With all the disclaimers out of the way, let’s take a look at a shot. Can you guess which comes from the iPhone and which comes from a full frame mirrorless camera with 24mm f/1.4 lens?
The shot on the bottom comes from the Z 7. Both at full resolution and resized to match the lower resolution iPhone shot, I prefer the iPhone’s output (top shot). While the fine detail isn’t there, the straight-out-of-camera look is definitely cleaner. Partly, this is because of the iPhone’s much more aggressive noise reduction, and partly because the iPhone struck a better white balance. The iPhone shot also has much higher contrast, with no easy way of tweaking that.
In this second set of shots, you can see how the very things that made the iPhone shot look better are now cutting against it. With a mixed lighting source, the iPhone’s higher contrast and saturation make this shot look overprocessed, particularly in comparison with the Z 7 shot. Again, the Z 7 isn’t significantly processed, so the color and contrast can definitely be refined beyond this more flat look. When tweaking the Z 7 shot in Lightroom, the lights have much more headroom for recovery, while the shadow areas have retained much more clarity.
This third set of shots is the perfect summary of the two different looks these cameras create. The iPhone shot has the characteristic one-dimensional rendering of a cellphone shot, with generous depth of field, higher contrast, and a preference to expose for the shadows. The Z 7 shot is much darker, as the meter was trying to save the highlights in the exposed light fixtures, but with a quick bump in exposure, opens up easily.
For this last one, I wanted to include a comparison of a tweaked Z 7 shot and the iPhone shot. The Z 7 shot was quickly brought up in LR to roughly match the iPhone’s contrast, vibrance, and overall exposure.
My Thoughts
Even now, a few days after getting the iPhone, I’m impressed by the capabilities. Night Mode is a massive leap beyond past iPhones’ low-light ability and clearly worthy of praise. In comparison to the full frame camera, particularly at smaller sizes, it can trade blows with SOOC shots.
Where it falls short isn’t going to surprise most photographers: the lack of control inherent to a phone camera means shots can be hit or miss. The Z 7’s files have far greater latitude for processing, with the iPhone’s shots being relatively “brittle.” Additionally, the Z 7 provides far greater flexibility in choice of shutter speed and control of DoF.
The Night Mode option isn’t great for subjects with more movement, as you’ll need to hold still for roughly 3-10 seconds to allow the iPhone enough time to build its image. An additional issue I noticed was the large amount of lens flare, despite the lens being factory clean. Given enough time and scratches, this could become a problem.
These aren’t deal-breakers, however, as the easy portability means the iPhone is a great choice for a pocket camera. The inclusion of a truly wide angle lens leaves most point and shoots in the dust, in my opinion.
To answer the title question: yes, the iPhone can match a full frame camera at night, but with some big asterisks on that answer. Handicapping the Z 7 in terms of post-processing is a huge benefit for the heavily processed iPhone shots, while resizing further tilts the field in favor of the iPhone. While I won’t be replacing my camera with my phone anytime soon, I’m very happy to have another tool in my bag, particularly to fill the niche of an unobtrusive, pocketable camera.
More broadly, these results should leave photographers excited about the possibilities of computational photography. While makers have just dipped their toes in the water, with support for sensor shifting and focus stacking, I believe the field has great potential. We’re already seeing new lens designs that make greater use of digital corrections for analog compromises, like distortion and vignetting. The processing power and software know-how that power the iPhone could do amazing things with a higher caliber sensor and lens.
Nice test. I do think that if you take away the ability to process RAW afterwards the phone is giving better SOOC results. I’ve been blown away by what it gives me the few chances I’ve had to use it. It’s not “professional” quality but certainly a means to capture scenes on the go that didn’t used to be possible. Scratches shouldn’t be an issue since the lenses are sapphire (unless you’re also carrying diamonds in your pocket).
That's one thing I really wanted to get across in the test - these are just SOOC tests. There's a lot you can do to tilt things in favor of the actual camera, including throwing it on a tripod, more post processing, etc, but if you're just looking for a walk-around shot, the phone has really started to compete even in low light.
According to JerryRigEverything (YouTube), Apple's sapphire isn't really any better than good-quality glass.
It's no more scratch resistant than the industry standard 'Gorilla Glass' from Corning.
These tests just reinforce how much smaller the gap between mobile photos and DSLRs is getting. All encouraged by the ability to share beautiful moments instantly on social networks from the back of your pants pocket. Why is it that I hate looking at the screen of my DSLR to take a shot, but doing the same with my phone doesn't bother me?
There are fewer situations where the DSLR/mirrorless is a clear winner, for sure.
I get it, I really do, and it's great that the new iPhone is doing so well, but enlargement has really not been taken into account. Just as it was with film, sensor size is important to enlargement. The larger the sensor, the less enlargement required for print. This is the reason why a crop-sensor medium format GFX100 is $10k and the full frame PhaseOne is $55,000! There is an even larger and more dramatic difference between an iPhone sensor and a full frame camera, but if you're just posting to Instagram I guess it does not matter lol. I'm not trying to be a bummer and I totally get the point of the article and test, but you kind of have to throw out things like enlargement for it to matter.
Nobody is saying this is going to replace MF, FF, or the choice of lens and control available with a dedicated camera.
What's interesting, however, is the increasing number of situations where a phone can work just fine in lieu of a compact camera, MFT, or APS-C camera, particularly when the final output isn't going to be anything more than IG or web use. That's why this test has those constraints, the output is viewed at web-size, etc.
Yeah, like I said, I get it completely and it's nice to see where the new iPhone is. I'm a photojournalist and most of what I produce is used for web, but there's no way I could do my job as well with an iPhone. I use my 400mm f/2.8 VR almost every day and there's just no way an iPhone is going to produce anything remotely close to what that lens can. For walking around, though, it's nice to see what's possible and my hat's off to Apple. I'm just not someone who does casual walking-around photography, and I'm the guy who takes his D4s and 70-200mm f/2.8 to his daughter's dance recital lol. If I'm taking a picture of something, it's important to me or to someone, so the end result is important and I go all in. I'm not a big phone person and still use an iPhone 7 and have no interest in paying $1200 for a new iPhone. Most of the images on my iPhone were transferred there from my D500. One reason I love the D500 is it takes SD and XQD cards and it works well with SnapBridge now, so I can transfer images taken with any of my three cameras: D4s, D500, and D7200. I rarely even bring a laptop anymore, I just do quick edits and transfer images from my iPhone if I need to send images to an editor.
Nice test. I love the wide angle lens and the night mode on my iPhone. But I wish the night mode was also available on the wide angle...
That would be nice to see.
Another test with no actual raw files available for download. No interest.
Night mode produces JPEGs, while the Z 7 raws were minimally processed, as the test is meant to evaluate the "walking around" capabilities of both cameras.
Why minimal? The iPhone isn't minimally processing the photo... Process the raw and export to JPEG. Would be nice if one of these tests would show something a little more in-depth. How about a human subject in low light, shot from a distance of 10-15 feet and cropped in to the face or even the eyes? Nothing against this test. But they always seem like softballs.
Minimal: the iPhone isn't minimally processing the photo, but requires no user input to perform that processing. Spending 10 minutes per shot working noise reduction, HSL sliders, and more on the Z 7 shot would indicate it's a subject worth that effort, meaning I wouldn't bother with the iPhone to begin with.
This test is for my particular use case - can the phone work as a walkaround camera beyond the envelope of past phone/compact cameras. I'm not a people photographer, particularly in this scenario, hence they aren't in my tests. Assuming the person is posed, I wouldn't expect a huge difference from these results, although the higher saturation is less aesthetically pleasing for people IMO.
If Night Mode can only provide JPEG, I can right away imagine the result at full printable resolution. Your files are not full resolution even in JPEG, so to me, again, if I can't see them, the test has no value. Not saying there are no improvements, but I have totally stopped finding interest in reviews that don't provide high-res or RAW. It's just a rule: too many partial reviews of too many products, and too many people who want my attention. I have to trim to what I want to see. In fact, I am slowly moving to reading comments first before deciding to read stories.
I don't need raw files, but why is there no exposure information? There's no EXIF data either :/
Ah, and THERE's the pitchfork. 😂
Best review EVER!!! Fixed it!
Imagine the images you can get with a $200 phone, your existing camera gear and a bunch of money left over from not buying the iPhone
Ha. With ray tracing, it's just a matter of time until it's indistinguishable.
Do you all realize the future of "photography" will be a shot taken with zero photosensor and no optics? Just push a button on the smartphone: the device sends its geolocation to the cloud, the cloud picks a backdrop from an internet photo database based on that location and facing direction, matches the lighting to the online weather report, then pastes in the user's face/body from their digital identity. If friends or family are there too, the system pastes them in as well, with faces adjusted using the mood captured by the always-on mic.
Frankly, this is a really beautiful future for mass photography, don't you think?
It is so exciting to stop having to hold a device and frame the shot; there are already plenty of Instagram shots of the same location, no need for such pain. A real shame no manufacturer has delivered such a revolutionary device. Or not...
FF, probably not, but M4/3, another fellow tiny sensor, maybe.
The difference in sensor size between MFT and FF is a lot smaller than the difference between MFT and the iPhone's 1/2.55"
It is a smaller difference, but if I'm reading their comment correctly, they're referring to the title. Essentially saying an iPhone vs MFT sensor is an even closer matchup than iPhone vs full frame.
It'd be a comparison I'd be interested to see, but couldn't perform myself.
I have an a7S II, which can go to ISO 409K. When I put an old Nikkor 55mm f/1.2 on it, it is a NO-light camera. It will bring out details in a dark alley, perfect for nighttime street photography.
I remember my first shots with my D3s, which was a low light king of its time - literally seeing something from nothing was impressive. Canon has demoed a camera that does 4.5 million ISO, which is absurd.
That feature won't escape the cripple hammer until long after I'm dead.
Remember that max ISO doesn't mean max usable ISO, just that the camera can go that high. Usually the last two or so stops are horrendously speckled. Most of the time, the last stop or two are just a lightened version of one stop lower, negating any gain from going so high and actually hurting quality.
If I wanted to test those two products, I would have been fair and used the full capability of the Z 7 and Photoshop; then there would be no doubt about what produced the best images. Let's not fool around here, a test is a test.
It is a test of what the cameras can do with minimal user-originated processing when shooting casual images in low light. Adding a tripod, post processing, external lights, or anything else changes it into a different test.
I looked at high ISO tests for every camera for a decade before buying a pro body. Now I realize how silly that was, considering you need good light to make good photographs. There are exceptions to the rule, but show me a good photo where you turned off all the lights because you heard your camera has low noise at a high ISO.
Google "A7S".
What iPhone?
11 Pro
Night mode was introduced on iPhone 11. Mention of the "truly wide" lens also means that it must be an 11 Pro that has three cameras. (Note, night mode does not work on the extreme wide angle lens.)
Good detective work. And yes, unfortunately night doesn't work on the wide angle, which is disappointing. Guess they have to save something for the 12 Pro...
I love computational photography and as an iPhone user since forever, still prefer the Pixel's slam-it-out Night Sight to Apple's more conservative Night mode. But it sure is terrific to have these capabilities on our phones.
It's really impressive how phones have advanced at the marginal situations. Wide angle lenses, telephoto, low light...
Good test. It's actually pretty common to see smartphones winning this kind of test, generally because the tester doesn't really know how to use a pro camera. And the phone doesn't stand a chance if the photographer's going to shoot some extra shots and employ their own stacking techniques in Photoshop.
I wasn't surprised about white balance. For one, because you're shooting raw, you have no actual white balance SOOC. You set one in post. But also, I was recently reading a bit about Google's Night Sight, and specifically, their color processing. They actually get some pretty awful, noisy, bad color results directly off the sensor. In the process of their Night Sight image fusion algorithm, they have a deep learning AI model that is used to set the white point and tweak color. I imagine Apple has a very similar AI. And yeah, it gets fooled in mixed light, and it's responsible for the increasingly common photos that look fine as standalones, but with color and shadow rendering only barely suggested by the original scene.
Idk about photos, but the low-light performance with video is still pretty bad on the iPhone. I shot some B-roll with the new iPhone for a book opening, and even in a very well lit interior space the video footage was very noisy.
As usual, video needs much more consideration when it comes to lighting than photography. If not to lower noise, then to direct the viewer's eye.
Apple did mention at least one video mode that's taking two-shots for every frame. But when you're counting on a 9-shot computational composite for anything beyond bright light for your photos (both Night Mode and Deep Fusion use 9-shot stacks), it should be obvious you're not able to do much for video.
The newer sensor chips, those ridiculous 40, 48, 64, and 108 megapixel Quad Bayer chips, usually have some kind of single-shot HDR mode, where 2 or 4 pixels in the 4 pixel group each get their own exposure. That could certainly be used for video, though I do wonder if four 800nm pixels at different sensitivities is really better than one 1600nm pixel.
I think the justification for those crazy high count sensors is that even if the binning results in slightly lower efficiency in lower light, they'll never perform well in that range anyway, while the higher resolution can come into play in good conditions.
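The 2x2 binning those Quad Bayer chips perform can be sketched as combining each 2x2 group of pixels into one output pixel, which is how a 48MP readout becomes a 12MP image. A toy model (on a real Quad Bayer sensor the four binned pixels share one color filter; this sketch ignores color entirely):

```python
# Toy 2x2 pixel binning: each 2x2 block of the input image is summed
# into one output pixel, quartering the resolution while pooling the
# collected light. Ignores the color filter array for simplicity.

def bin_2x2(img):
    """Sum each 2x2 block of a 2D list into one output pixel."""
    h, w = len(img), len(img[0])
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return [
        [img[2*y][2*x] + img[2*y][2*x+1] + img[2*y+1][2*x] + img[2*y+1][2*x+1]
         for x in range(w // 2)]
        for y in range(h // 2)
    ]

img = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]
print(bin_2x2(img))  # -> [[14, 22], [46, 54]]
```

Summing (rather than averaging) mirrors the light-pooling intuition: the binned pixel sees roughly the photons of all four sub-pixels, which is why binned output can approach, though not quite match, a native large-pixel sensor in low light.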
Doesn't the Z 7 have that low-light feature where it takes several images and merges them into one JPEG (I shoot Sony, so I know nothing of Nikon)? Since the premise of the test was walk-around minimalism, wouldn't that have been a more apples-to-apples comparison had that feature been used on the Z 7? That way, you'd be comparing in-camera low-light processed image vs. in-camera low-light processed image.
It does not. You can shoot a focus stack, with the camera shifting focus automatically, but not a noise-reducing stack.
Interesting this hasn't caught on or been improved in more cameras.
My old Canon 6D had a 4-shot mode that would either add or average the shots, your choice. The adding part is the basic function in Google's Night Sight, and for HDR, they don't shoot brackets at all. Of course, like most of these functions on a camera, you did need a tripod, and the 6D would only produce a JPEG, though maybe some of the more expensive Canons delivered a merged raw.
I think there's a lot to be said about what's missing from the firmware of these cameras. It'd be great to see expanded support for shooting things like this, or bracketed shots that are automatically grouped in LR, for instance.
While shooting an image for stacking is easy, for higher resolution cameras, you'd probably have to do the actual composite on your computer anyway, since the processor in an iPhone is miles ahead of the processor in a camera.
These are just static objects. Try to shoot some action in low light! At best, this compares the brilliant software in an iPhone with the raw file from a camera. At least they should be processed the same, or just show us the unprocessed raw files with EXIF.
Bingo. That's why I ask for files but never seem to get them. Yes, phones are getting as good-looking as DSLRs or mirrorless, but only if you remove all conditions that are not in favor of phones. It starts with showing at lower res, which can render a slightly blurred image almost perfectly sharp (the case of a car, for example). I personally like to see the structure of the pixel organization, and right away you can see the difference between the two. And then you have this following video where top-of-the-line phones do really weird things. Check 3:45; I don't think two cameras would have that much variation, and then it becomes more evident that there is more work behind the processing than the capturing of the actual images when you stop at 3:58 and 4:14. Both are 12MP. Even my old 3MP Canon D30 didn't do that badly. - https://fstoppers.com/gear/apple-iphone-11-pro-max-vs-google-pixel-4-xl-...