iPhone 11 Pro and 11 Pro Max Camera Review: Lighting up the night

With the iPhone, Apple had reached a point where it could no longer claim to have the best mobile camera on the market. The most ubiquitous, perhaps, but not the best. Despite that, the iPhone 11 Pro and 11 Pro Max are the biggest step forward in years.

Why it took this long for one of the most valuable companies in the world to figure out low-light photography is challenging to ascertain. It was a glaring issue for years, and despite extra lenses and new features along the way, night photography received negligible improvement.

These two phones reverse that to some degree. Yes, there is a new ultra-wide lens in the rear, but the onus was on the software to see in the dark better than ever before. There's more at work here than the extra lens, along with some caveats, and the results are interesting.

For the full review of both the iPhone 11 Pro and 11 Pro Max, you can check out MobileSyrup managing editor Patrick O’Rourke’s take on Apple’s latest flagship devices.

The camera layout

The standard wide 12-megapixel camera is virtually unchanged from last year, whereas the 12-megapixel telephoto lens gets a wider f/2.0 aperture compared to the previously tighter f/2.4. The newest addition is a 12-megapixel ultra-wide with an f/2.4 aperture and the equivalent of a 13mm focal length, showing a 120-degree field of view.

It’s the first time Apple has gone with a triple-lens array, though as has often been the case lately, the company caught up to what competitors had already been doing. Ultra-wide lenses go back to the LG G5 in 2016, soon followed by the likes of Asus, Huawei, OnePlus and Xiaomi.

Samsung and Apple only added it this year, while Google has yet to embrace an ultra-wide shooter. Such a wide view, coupled with the 2x optical zoom on the telephoto lens, means there’s an iPhone that finally covers most of the angles users will shoot at.

Apple also had to catch up from a software perspective. The last two iPhone flagships proved that a wider aperture wasn’t enough to compete. Portrait mode effects were neat, except there was little to no control over how a photo would ultimately turn out.

That’s not to say Apple suddenly put the levers of power in users’ hands with these two phones, but it built better trust that the tools will actually work consistently. Deep Fusion, the processing Apple hyped as part of the A13 Bionic chip’s neural engine, is meant to help with medium to low-light scenes, basically the middle ground between bright and dark.

What the company neglected to mention at the time was that the ultra-wide lens wouldn’t benefit from any of that. Neither Deep Fusion nor Night mode applies to that particular lens. On the telephoto lens, Deep Fusion may kick in, but only when the camera decides the scene isn’t bright enough to go without it. Because all of this works in the background, you have no idea whether the camera applied it or not.

Seeing at night

Rather than present a feature that others would have to match, Apple was the underdog in this instance. After being soundly beaten year after year in low-light performance, the iPhone 11 Pro and 11 Pro Max finally brought something worthwhile to the table.

Apple’s mode works much the same way others do. Machine learning helps analyze a scene, snap a series of different exposures, choose the best ones, and flatten them together for the best possible output.
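To make that concrete, here’s a rough sketch of the general multi-exposure fusion idea in Swift. This is an illustrative toy, not Apple’s pipeline: the real system also aligns frames and leans on machine learning to select and merge them, while this simplified example just blends grayscale exposures using per-pixel weights that favour well-exposed values.

```swift
import Foundation

// Toy multi-exposure fusion. All names and the weighting scheme are
// illustrative assumptions, not Apple's implementation. Frames are
// grayscale buffers with pixel values in [0, 1].

// Score a pixel by how well-exposed it is: values near 0.5 score highest,
// crushed shadows and blown highlights score lowest (a Gaussian around 0.5).
func wellExposedness(_ value: Double) -> Double {
    let sigma = 0.2
    let diff = value - 0.5
    return exp(-(diff * diff) / (2 * sigma * sigma))
}

// Fuse several already-aligned exposures via a per-pixel weighted average.
func fuseExposures(_ frames: [[Double]]) -> [Double] {
    guard let pixelCount = frames.first?.count else { return [] }
    var result = [Double](repeating: 0, count: pixelCount)
    for i in 0..<pixelCount {
        var weightedSum = 0.0
        var totalWeight = 0.0
        for frame in frames {
            let weight = wellExposedness(frame[i]) + 1e-6  // avoid divide-by-zero
            weightedSum += weight * frame[i]
            totalWeight += weight
        }
        result[i] = weightedSum / totalWeight
    }
    return result
}

// Three bracketed exposures of the same two-pixel scene.
let dark: [Double]   = [0.05, 0.10]
let mid: [Double]    = [0.40, 0.55]
let bright: [Double] = [0.85, 0.95]
print(fuseExposures([dark, mid, bright]))  // pixels pulled toward the best-exposed samples
```

The real trick, and where phones differ, is in the alignment and the smarts deciding which frames to trust; the blend itself is the easy part.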

The results were encouraging as I took the phone out in low-light situations. Cityscapes that would have lacked any punch or colour now appeared vibrant. Dimmer restaurants, bars, clubs or lounges were also easier to shoot in, negating the need for a flash.

The two visual factors that make modes like this successful are dynamic range and reduced noise. Both were horrendous in previous devices, whereas now, the odds of shooting something good as light dims increase tremendously.

That’s why I’m perplexed as to why Apple would still restrict it. Night mode only works on the standard wide lens — a limitation that, frankly, makes no sense. If it’s software-based, why not apply it to all three lenses? Competitors offering a night mode and multiple lenses allow for that, whereas Apple has placed it in a silo for no apparent reason.

My other issue is that Night mode is automatic. You can’t select it as a mode to use whenever you want because it’s supposed to kick in on its own. I can understand Apple’s tendency to keep things simple for everyone, but there’s also no way to adjust how strongly the effect applies. In fairness, Google and Samsung do the same thing, allowing no manual input over the duration of the effect.

Huawei does, and is much better for it. Long exposure photography is a varied process, and it would be nice to be able to scale it back when one wants to. If Apple can handle a slider to adjust bokeh in Portrait mode, it can manage something similar for Night mode. As is, the mode assumes its automatic judgment is right for every single shot, which is simply not the case.

Measuring quality

The irony here is that some of the features Apple threw in work without you knowing. It probably follows the “it just works” mantra the company’s espoused for a long time, but there are times when the results appear a little mysterious.

The excessive Portrait mode skin smoothing from last year was already adjusted in an iOS update, and is a little better this time around. Apple took a nice stride last year in separating the subject from the background for the bokeh effect, and that separation now benefits from the ultra-wide lens. There are two choices for capturing portraits, and the wider angle makes a noticeable difference.

The iPhone was always good in ideal conditions, like daytime and bright indoor settings. Nothing’s really changed there, though I would like to see an option to adjust colour temperature. Apple skews toward a warmer tone, adding a yellowish tint to many photos. Google swerves the other way, going for a cooler tone, which is why Pixel photos have a blue tint to them.

That warmer colour temperature is pervasive, regardless of what time of day or night I shot a photo. I suspect most users won’t notice or care, but if you do, you may feel differently after seeing what other phones produce.

Where these latest iPhones really shine is in a feature Apple neglects — RAW output. Third-party apps, like Moment, Halide, VSCO, Camera+ 2 and RAW+, will offer that, letting you capture RAW images you can truly tinker with in editing apps afterward. The beauty of RAW is the level of detail it captures, giving the photo a more lively look. Some of the photos presented here were shot and edited that way.
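For the curious, those apps reach RAW through Apple’s public AVFoundation framework. The sketch below is a minimal illustration, not any particular app’s code; it assumes a capture session is already running and `photoOutput` is an AVCapturePhotoOutput attached to it.

```swift
import AVFoundation

// Minimal RAW capture sketch. Session setup (inputs, preview, permissions)
// is omitted; `photoOutput` is assumed to be attached to a running
// AVCaptureSession. The output path is illustrative.
final class RawCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let dngData = photo.fileDataRepresentation() else { return }
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("capture.dng")
        try? dngData.write(to: url)  // hand the DNG off to an editing app from here
    }
}

func captureRaw(with photoOutput: AVCapturePhotoOutput,
                delegate: RawCaptureDelegate) {
    // RAW pixel formats are only listed when the active camera supports them.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        print("RAW not available on this device or lens")
        return
    }
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```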

Those apps I mentioned also offer manual controls Apple doesn’t bother with. The only problem is that Night mode and Deep Fusion don’t apply to other apps. If I want to adjust shutter speed and ISO while still taking advantage of Deep Fusion, I can’t. Granted, the latter feature is entirely automated, but that also goes back to my earlier point about the lack of any manual input at all. Still, if you care about photo quality, you should use some of those apps to get more out of the camera.
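Those manual controls come from the same framework. Here’s a minimal sketch of dialling in shutter speed and ISO with AVFoundation’s custom exposure mode, assuming `device` is the active camera and supports custom exposure; the clamping mirrors what any careful app has to do.

```swift
import AVFoundation

// Manual exposure sketch: an assumption-laden illustration, not any app's code.
// `device` must support the .custom exposure mode.
func setManualExposure(on device: AVCaptureDevice,
                       shutterSeconds: Double, iso: Float) {
    do {
        try device.lockForConfiguration()
        // Clamp requested values to what the active format actually supports.
        let format = device.activeFormat
        let requested = CMTime(seconds: shutterSeconds, preferredTimescale: 1_000_000)
        let duration = min(max(requested, format.minExposureDuration),
                           format.maxExposureDuration)
        let clampedISO = min(max(iso, format.minISO), format.maxISO)
        device.setExposureModeCustom(duration: duration,
                                     iso: clampedISO,
                                     completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock camera for configuration: \(error)")
    }
}

// Usage: a 1/60s shutter at ISO 400.
// setManualExposure(on: device, shutterSeconds: 1.0 / 60.0, iso: 400)
```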

Video prowess

This will hardly be a surprise, but Apple maintains its pole position in video recording. I’ve long preferred the iPhone for most video scenarios, save for the fact there’s not much to adjust in the options. Again, third-party apps can expand on that much further, particularly ones like Filmic Pro and Filmmaker Pro.

I do appreciate that there’s a 24fps option for 4K resolution, though I wish Apple would include it for 1080p too. Beyond that, both 30fps and 60fps are available at both resolutions, which is better than the Pixel 4’s strange omission of 60fps in 4K. It would be nice to have more manual control over how to shoot, but that’s what other apps are for, I suppose.
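For what it’s worth, forcing a frame rate like 24fps at 1080p is a small AVFoundation configuration for third-party apps. A minimal sketch, assuming `device` is the active camera with the desired resolution already selected:

```swift
import AVFoundation

// Frame rate sketch: illustrative only. Apply the rate only if the active
// format actually supports it.
func setFrameRate(_ fps: Int32, on device: AVCaptureDevice) throws {
    let supported = device.activeFormat.videoSupportedFrameRateRanges.contains {
        $0.minFrameRate...$0.maxFrameRate ~= Double(fps)
    }
    guard supported else { return }

    try device.lockForConfiguration()
    let frameDuration = CMTime(value: 1, timescale: fps)  // e.g. 1/24s per frame
    device.activeVideoMinFrameDuration = frameDuration
    device.activeVideoMaxFrameDuration = frameDuration
    device.unlockForConfiguration()
}

// Usage: try? setFrameRate(24, on: device)
```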

Framing better shots

The camera array in the iPhone 11 Pro and 11 Pro Max was a bigger leap forward for Apple, though not a leapfrog over the competition. Low-light photos are far better this time around, catching the company up to the best we saw in 2018, not what we’ve seen in 2019.

The Pixel 3 is already on par, and the Pixel 4 is better at reproducing highlights and shadows with some vibrancy, largely because you have greater control over exposure and shadows. The Huawei P30 Pro is a better all-around shooter for the options it provides in one app, though Apple is often better regarding dynamic range.

Deep Fusion could be something transformative down the line, but for now, it’s hard to truly measure its ability. Apple likes to hype its newest features, yet fails to add context as to why they work well in certain scenarios. The good news is that the sum of this camera’s parts is a noticeable step forward. It’s still a game of catch-up for engineers in Cupertino, but a first-place finish is within sight if they push to help people capture images they’ve never seen before.
