Apple’s ProRAW Does Some Seriously Weird Things to Photos

After many years of non-pro iPhones, Apple’s new 48-megapixel iPhone 14 Pro was enough to convince me to pull the trigger. I was ready to fall into the embrace of Apple’s AI-assisted ProRAW format. While it has its uses, in its current state it’s still kind of a mixed bag.

When Apple first launched ProRAW with the iPhone 12 Pro series, I wondered just how “raw” the ProRAW files really were, what with some AI-assisted processing baked in. After a couple of years and a few hardware revisions, it’s clear the answer is: not very.

In good light, the slight boost from AI is hardly noticeable, and the quality of the iPhone 14 Pro’s main camera shot in ProRAW shines:

In these photos of Quinnipiac University’s campus, I pitted a Canon EOS R5 with the RF 24-105mm f/4L IS USM lens against the best lens/sensor combination the iPhone 14 Pro has to offer: its 1x (24mm equivalent) lens backed by the 48-megapixel sensor. While it’s hard to tell what the AI is doing in any given photo, as Apple’s not very transparent about it, both photos were pretty much equal in quality. The corners were slightly sharper on the iPhone (an AI assist, maybe?), but the trees were slightly less detailed. Both photos were processed from each camera’s raw file in Photoshop to match as closely as possible.

But when I pushed the iPhone in low light against the R5 wearing a geriatric, almost-20-year-old Canon EF 17-40mm f/4L USM lens, even that older glass easily pulled away from the iPhone.

Here’s a look at Newport, Rhode Island’s Castle Hill Lighthouse with both the R5 and the iPhone at the same focal length.

The differences are obvious even at small sizes, and it’s clear, to my eyes at least, what the AI is doing. The shadow details in the rocks are basically gone, so the iPhone filled them in with its best guess, resulting in the “waxy, painterly patterns” I described two years ago. All that time and several hardware revisions later, it’s still that bad. Even simpler patterns, such as the brickwork on the lighthouse itself, show the “Vaseline smeared over the lens” look I described back then.

Much of this also comes down to the ability to change settings. If I want to use ProRAW, I’m stuck with Apple’s default camera app, which chose a shutter speed of 1/19 second at ISO 1000 with the main lens’ f/1.8 aperture. By comparison, I shot a 0.6-second exposure at ISO 100 and f/8 on the Canon.
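To put numbers on how far apart those two exposures really are, here’s a quick back-of-the-envelope check using the standard ISO-adjusted exposure value formula, EV100 = log2(N²/t) − log2(ISO/100). This is just a sketch plugging in the settings quoted above:

```swift
import Foundation

/// ISO-adjusted exposure value: EV100 = log2(N^2 / t) - log2(ISO / 100),
/// where N is the f-number and t is the shutter speed in seconds.
func ev100(fNumber: Double, shutterSeconds: Double, iso: Double) -> Double {
    log2((fNumber * fNumber) / shutterSeconds) - log2(iso / 100.0)
}

let iphone = ev100(fNumber: 1.8, shutterSeconds: 1.0 / 19.0, iso: 1000) // ≈ 2.6
let canon  = ev100(fNumber: 8.0, shutterSeconds: 0.6, iso: 100)         // ≈ 6.7

print(String(format: "Gap: %.1f stops", canon - iphone))                // ≈ 4.1 stops
```

Part of that roughly four-stop gap is fading sunset light (more on that below), but it also shows the core handicap: with no shutter control, the phone’s only lever is cranking ISO, and ISO 1000 on a tiny sensor is exactly where the AI’s guesswork has to take over.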

The strange part, though, is that my initial reaction to seeing these photos on the small phone screen was awe at how good they looked. In blind (unscientific) tests of both sets of photos among my social media following of mostly photographers and media professionals, even seasoned veterans guessed wrong, with many preferring the iPhone’s take on things.

And I think that’s what Apple’s betting on here: that AI will make the photo look better to most folks rather than produce what’s accurate.

Where this should have most photographers worried is that Apple calls this mode “ProRAW,” when, with this type of heavy-handed editing of what’s supposed to be a digital negative, it’s anything but.

All that said, I'll probably take some flak for the light levels being a bit lower in the iPhone photo of the lighthouse above. Indeed, the photos were shot 9 minutes apart at sunset, but for comparison, here's a shot from the iPhone taken at the same time as the R5's, and while it's better, the same issues remain to an extent:

The iPhone 14 Pro photo of the Castle Hill Lighthouse from the same time as the R5, above.


While the issues with what's happening in the ProRAW files are still there in the same spots (the bricks, the shadows of the rocks), this photo does track much closer to the R5's.

All of this leads me to a question: how did camera manufacturers let it get this close? When a phone can go almost toe-to-toe with a professional camera and lens, or at least get close enough for most people, what's the point of it all?

What are your thoughts on Apple’s approach to raw files? Leave your thoughts in the comments below. 

Wasim Ahmad is an assistant teaching professor teaching journalism at Quinnipiac University. He's worked at newspapers in Minnesota, Florida and upstate New York, and has previously taught multimedia journalism at Stony Brook University and Syracuse University. He's also worked as a technical specialist at Canon USA for Still/Cinema EOS cameras.

8 Comments

Finally! Thank you very much for sharing an honest observation about the iPhone 14; until today, I felt like I was the only one with that feeling of Vaseline on the lens. I wish Apple would replace ProRAW with JustRAW :) or give 3rd-party apps access to its 48 MP sensor. Generally, the sensor looks great for a phone, but the way details vanish in low light kills all the good things I can say. I shoot a lot at twilight, and many times my photos come out corrupted. They look great on the phone screen, but when I zoom in on a monitor, sometimes I cry.

So the future is: Apple takes the photos, not you anymore.

Phone photos are between quality camera photos and Polaroids.

When you edit a pic in Lightroom, you can change the amount of "Pro" in the "ProRAW" profile. My assumption is that if you dial it down to zero, you're left with an actual raw image.

When I import ProRAW files into Lightroom, changing the camera profile from "ProRAW" to "Adobe Color" seems to make the file act more like a true raw file.

Definitely going to have to play with this one!

Just a note: 3rd-party apps have access to both ProRAW and the 48-megapixel sensor mode. Here's a super-detailed article from the folks who make Halide about exactly that: https://lux.camera/iphone-14-pro-camera-review-a-small-step-a-huge-leap/
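For the curious, both are plumbed through public AVFoundation API. Here's a minimal sketch of my own (not Halide's code; it assumes iOS 16+, an already-configured capture session, a device whose format supports both features, and omits error handling) for opting a photo output into ProRAW at the 48-megapixel dimensions:

```swift
import AVFoundation

/// Minimal sketch: enable Apple ProRAW and the largest photo dimensions on an
/// already-configured AVCapturePhotoOutput. Assumes iOS 16+ and a device whose
/// active format supports both (e.g., the iPhone 14 Pro main camera for 48 MP).
func makeProRAWSettings(output: AVCapturePhotoOutput,
                        device: AVCaptureDevice) -> AVCapturePhotoSettings? {
    // Opt in to ProRAW (available since iOS 14.3); this must be enabled on
    // the output before building the per-shot settings.
    guard output.isAppleProRAWSupported else { return nil }
    output.isAppleProRAWEnabled = true

    // Request the largest dimensions the active format offers (iOS 16+);
    // on the 14 Pro's main camera, that's the 48 MP mode.
    if let largest = device.activeFormat.supportedMaxPhotoDimensions
        .max(by: { $0.width * $0.height < $1.width * $1.height }) {
        output.maxPhotoDimensions = largest
    }

    // Pick a ProRAW pixel format from those the output advertises.
    guard let rawFormat = output.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) })
    else { return nil }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    settings.maxPhotoDimensions = output.maxPhotoDimensions
    return settings
}
```

From there you'd pass the returned settings to capturePhoto(with:delegate:) as usual, and the delegate receives a ProRAW DNG it can write out, so an app like Halide can make all the choices Apple's camera app makes for you.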