Did Apple actually develop ProRAW for professionals?

If Apple insists on calling it a 'Pro' iPhone, the camera should be more professional than this

I’ve recently switched back to iPhone, and after spending a few months on Android, I’m struggling to deal with Apple’s proprietary ProRAW photo file format.

Since the iPhone 12 series, all iPhones with the ‘Pro’ moniker can capture RAW image files. These are larger than a JPEG but retain uncompressed data, giving you more room to tweak highlights, shadows, colours and other aspects in a photo editing app.

However, in true Apple fashion, the company needed to (overcomplicate) simplify things, which is where ‘ProRAW’ comes in.

What’s the difference between ProRAW and RAW?

A RAW image is a professional file that gives photographers more data to edit in post. RAW photos capture precisely what the camera sees without applying stylized colours, subtle edits or HDR effects.

It’s also worth noting that RAW is a generalized term for highly editable photos. Most consumer devices (phones, action cameras, drones) take RAW images and save them as a .DNG file. Camera companies like Fuji and Canon also developed their own RAW formats, named .RAF and .CR3, respectively.


These files usually won’t look great until they’re edited, but they all work towards the same goal: giving you a basic uncompressed image to build on. The key is that a RAW file is designed to be edited by you, not by an algorithm.

When you shoot a ProRAW image, you end up with a pre-edited RAW file enhanced with Apple’s photo algorithms. You can tweak this photo heavily like a normal RAW. Still, the process of capturing and editing the picture is a bit more convoluted than with typical RAW files because the phone aims to offset its hardware shortcomings with computational photography.


This means ProRAW gives you a pleasing photo that looks like a regular iPhone shot, but you can edit it much more than the standard file you get when using the camera normally.

This might not sound drastically different from a standard RAW, but because Apple expects the picture to be 80 percent processed once you hit the shutter button, taking and editing a ProRAW image requires a different mindset.
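For developers, that difference shows up directly in Apple’s camera API: ProRAW isn’t a separate capture mode so much as a flag on the photo output, layered on top of the usual RAW pixel-format selection. As a rough sketch (assuming a configured `AVCaptureSession`, which is elided here), enabling it with AVFoundation looks something like this:

```swift
import AVFoundation

// Minimal delegate: for RAW captures, the resulting photo's
// fileDataRepresentation() is the DNG data you would write to disk.
final class RAWCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Handle photo.fileDataRepresentation() here.
    }
}

// In a real app this output would already be attached to a running
// AVCaptureSession; isAppleProRAWSupported is false otherwise.
let photoOutput = AVCapturePhotoOutput()
let delegate = RAWCaptureDelegate()

if photoOutput.isAppleProRAWSupported {
    // Must be enabled before requesting a ProRAW pixel format.
    photoOutput.isAppleProRAWEnabled = true
}

// Pick a ProRAW pixel format from whatever the device offers.
if let proRAWFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
    AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
}) {
    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

The notable part is that the computational pipeline rides along automatically once `isAppleProRAWEnabled` is set; there’s no switch in the API to get the sensor data without Apple’s processing baked in, which is essentially the complaint in the rest of this piece.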

What’s the problem?

You can see one of the main issues prominently when shooting RAW images with the Halide camera app. The live view and finished product look drastically different.

It’s extremely difficult to line up a perfect RAW capture with any iPhone camera app. They often show a processed photo in the viewfinder instead of what the camera hardware captures. While many pro photographers can work around this as they did with film and older DSLRs, it shouldn’t be necessary on iPhone.

This means that even though you’re shooting RAW, you still need to treat your iPhone like a point-and-shoot camera because that gets you the cleanest and most predictable results.

Apple will try to capture what it thinks is an accurate rendering of the scene, even if you’re trying to get something more stylized. When they hit, the photos are fantastic since it’s a solid mobile camera system, but there’s a steep learning curve to using the format expertly.

A demo of me dragging the ProRAW slider to both ends and then turning on Adobe Colour.

For example, in Lightroom, a slider at the top of the toolkit lets you adjust how strong the ProRAW effect is—dragging it to the left leaves you with an image with perfectly exposed highlights but very dark shadows. Dragging the slider to the right ramps up the shadows and the overall brightness of the scene. Leaving it in the middle gets you a flat iPhone picture.

None of these are what I would call a good starting point to edit professionally.

“I wish there were a way to turn off all the major computational photography assists so I could set up my shots how I want.”

In Lightroom, you can also apply standard Adobe colour profiles to your images that remove the ProRAW effect, and for most of my work, I like the familiarity of that. However, sometimes the highlights or shadows still end up crunched because I couldn’t see what my photo looked like before I took it, leaving me with a drastically underexposed shot that’s almost impossible to use without the iPhone’s HDR effect.


As you can see in the images above, ProRAW did manage to get a cool photo. Still, when you look at the version with the slider set to the ‘far left,’ you can see that the camera chose to prioritize the sky for its HDR algorithm instead of focusing on the car, leaving the vehicle weirdly lit in the ProRAW pictures. In my opinion, the car looks more realistic in the Adobe versions.

Other times, I needed to switch back to the ProRAW colour profile to recover the HDR data because the iPhone did focus on the car. That meant editing the photo differently from all the others, which made it difficult to match colours, exposures and other edits.

How do we solve this?

I think it’s time for Apple to rework its camera app to facilitate professional workflows. As the iPhone gets more powerful, the point-and-shoot mantra of the app is holding back the camera’s full potential. The thought process behind RAW is to give photographers a starting point to edit from. ProRAW, on the other hand, gives you a finished product and often makes you work backwards to get the desired results, which doesn’t make it feel very pro.

Needing to tweak every image with the ProRAW slider in Lightroom makes things feel unpredictable. Sometimes the highlights react in strange ways since the photo retained HDR data. Other times, the shadows might be washed out because the iPhone raised those to match the extreme HDR highlights. They’re still flexible RAW files, but the lack of transparency around how ProRAW affects pictures makes it annoying compared to standard RAW files.

Wondering, “did my iPhone over- or underexpose this shot?” is a real problem, and not knowing until I take it into Lightroom makes me not want to rely on ProRAW. I’m generally happy with the files, and the 48-megapixel RAW images give me lots of detail to work with. I wish there were a way to turn off all the major computational photography assists so I could set up my shots how I want.

We recently saw Apple add a slightly more manual video camera app to Final Cut on iPad, so perhaps all my issues will be remedied when Apple shows off iOS 17 at WWDC on June 5th.