There are few cameras as ubiquitous as the iPhone. Walk around almost anywhere and you will likely find someone carrying the device or readying to snap something with it. The iPhone XS (and by extension, the XS Max) is supposed to be Apple’s crowning photography jewel, thanks to new, smarter features.
It’s now expected that ‘S’ year iPhones are entirely iterative, and over the years, that’s always included something new on the camera side. Here, the hardware differences are marginal, since both XS models borrow so heavily from the iPhone X.
Apple presumably chose to hold off on any major hardware changes until next year, pushing its luck on the software side instead. Not that there isn’t precedent. If Google and others could do it, why not Apple with these two phones?
Playing it safe
The iPhone XS and XS Max feature larger pixels at 1.4µm, compared to the iPhone X’s 1.22µm. The larger the pixels, the better your chances when shooting at night or in low light, which has been the most inconsistent part of iPhone photography. Where some shots turn out admirably well, others come out abysmal.
Rather than widen the aperture, Apple has stuck with f/1.8 on the standard rear lens and f/2.2 on the telephoto lens. Both feature optical image stabilization and can use the True Tone LED flash. I know Apple promoted the flash as a viable assistant when taking photos, but I remain unconvinced. Maybe I’m biased against smartphone flashes, but I have rarely seen them work well on any phone — unless you find something translucent to use as a diffuser.
The secret sauce may be in the A12 Bionic chipset Apple used for both devices, particularly when paired with the built-in image signal processor. Setting aside the technical jargon, the key is that a lot of processing takes place during and after a photo is taken. The results largely manifest as Smart HDR and Depth Control, two software-driven features powered by those hardware components.
In a nutshell, both are designed to improve composition in discernible and pragmatic ways. Why make adjustments after the fact when you don’t need to? Or why not make them when the selection process is so easy? This kind of choice defines the camera in these two phones to a certain extent.
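Apple hasn’t published how Smart HDR works internally, but it resembles the general technique of exposure fusion: blend several bracketed frames, giving the most weight to well-exposed pixels. A minimal sketch of that idea for a single pixel, with an illustrative Gaussian weighting that is my assumption rather than Apple’s pipeline:

```python
import math

def fuse_pixels(values, sigma=0.2):
    """Blend the same pixel sampled from several bracketed frames.

    values: brightness samples in [0, 1], one per frame.
    Samples near mid-grey (0.5) get the highest weight, so a dark
    frame contributes its highlights and a bright frame its shadows.
    (Illustrative weighting, not Apple's actual algorithm.)
    """
    weights = [math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2)) for v in values]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total

# An underexposed sample and an overexposed one pull toward mid-grey
fused = fuse_pixels([0.1, 0.9])
```

Run over every pixel of a burst, this kind of weighting is why Smart HDR can hold both a bright sky and a shadowed face in the same frame.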
Make some adjustments
What bothered me most about the iPhone X’s camera was its lack of flexibility. That’s been a pain point when using iPhones for some time because Apple’s camera app is too limiting.
The problem is that the nuances of Apple’s stock app don’t always apply to others. For example, I used Camera+ 2 a lot for this review, and the Depth Mode in that app bears no resemblance to Portrait Mode in the regular Camera app. Depth Control within Portrait Mode puts the bokeh effect on a sliding scale. From what I’ve seen thus far, no third-party camera app treats these photos exactly the same way.
To adjust that depth of field, Apple gives you a virtual scale ranging from f/1.4 to f/16. The lower you go, the blurrier the background. Other manufacturers have already done this, only Apple’s method is a little better at knowing who is in the foreground. The bokeh looks natural, not forced. It struggles, however, when the subject isn’t at one focal depth.
Flowers, trees and plants are good examples. Despite the wider focal point, it’s a problem that has carried over since the iPhone 7 Plus. The same can happen with strands of hair or angled faces, where the furthest point sort of bleeds into the background blur.
Stranger still is the softening effect applied to every shot in that mode. MobileSyrup managing editor Patrick O’Rourke covered this so I won’t rehash, but I echo his findings.
Take the example selfie done with the iPhone X and iPhone XS. The difference is patently obvious, with the XS’ soft filtering producing a smoothing effect that simply isn’t natural, though the background is demonstrably better. I get that some selfie shooters will like the smoothing, but not me. This happens with the rear lenses as well, though Apple has recently acknowledged the issue and plans to release a fix in iOS 12.1.
The thing about Portrait Mode is that it shoots faces well, as its name implies, but a wider frame means it’s harder to mimic a macro shot. I had to move too far back for the effect to kick in, making it harder to capture more detail from a subject. Even if I did, the softening might negate it. That wasn’t always the case in prior models, and if Apple is determined to keep it, then making it elective would be the best way to go.
When I need to shoot in low-light or nighttime, the iPhone is not the device I reach for. Others are simply doing it better these days, but at least there’s some light at the end of the tunnel — literally.
The XS does better at night when there is a distinct bright light source that can anchor the shot without blowing out the highlights. Glowing signs are good examples. Abundant dimmer lighting also does okay. Once help like that goes away, though, the phone struggles. Using other apps, like Camera+ 2, VSCO, Halide, ProCam or Filmic, presents some breathing room to tinker with manual controls or settings to salvage a shot before even taking it.
Lowering the exposure or raising the shutter speed a little could help, except Apple essentially claimed Smart HDR could do most of the work within the company’s own camera app. In some cases, that’s actually true, but certainly not when night falls.
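Those two manual controls trade off against each other in a well-defined way. The standard exposure-value formula is EV = log2(N²/t) at ISO 100, so every halving of the shutter time costs exactly one stop of light. A minimal sketch, with the function name my own:

```python
import math

def exposure_value(f_number, shutter_s):
    """Exposure value at ISO 100: EV = log2(N^2 / t).

    Higher EV means the settings admit less light, so raising the
    shutter speed (shrinking t) darkens the shot by one stop per halving.
    """
    return math.log2(f_number ** 2 / shutter_s)

# The XS wide lens at f/1.8: going from 1/60s to 1/120s costs one stop
ev_slow = exposure_value(1.8, 1 / 60)
ev_fast = exposure_value(1.8, 1 / 120)
```

This is why salvaging a night shot manually is a balancing act: every stop you claw back against blur or blown highlights comes straight out of brightness.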
On the other hand, some of the daytime shots I took facing the sun turned out really well. Crisp detail and decent contrast came without sacrificing any part of the image. The sky maintained a blue hue, not the blown-out off-white that’s common in shots like that. Even when I shot someone backlit, the results turned out fairly well. The balance I achieved was truly impressive.
What I liked most was the phone’s ability to not go haywire once I focused on a different part of the frame. Past iPhones had a bad habit of completely over- or underexposing images that way, but not so much here. It can still happen, just less often when Smart HDR is doing its thing.
I noticed no real difference in autofocus speed. If there is an uptick there, I never saw it. Camera+ 2 does it faster, with greater control over exposure, too. I’d also like to see Apple’s camera app launch a little faster from the lock screen. It’s not bad, but when you have a fast processor under the hood like these phones do, little things like that matter.
Google’s Pixel 2 and Pixel 2 XL are still better to me, shot-for-shot, mainly because they offer more realistic contrast, and the Pixel 3 lineup appears to have already improved on that. The iPhone XS and XS Max produce warmer and softer shots, which bodes well for faces and pets. The subtle brightness adds some flair to them that I think most would find pleasing.
It’s a closer race with Samsung’s 2018 flagships, and Huawei’s P20 Pro is far better at night through its Night Mode (provided you’re not shooting a moving subject). Even a phone like the LG G7 in Manual Mode is a worthy challenger, offering greater flexibility, including native RAW support.
Shooting in RAW is possible on the iPhone XS and XS Max through an app like Halide, with Lightroom ready to edit them, though I would only really do that with shots that I want to work on. It’s pretty time-consuming to go through a big workflow that way.
This image format goes back to iOS 11, so it’s not new to the iPhone XS or XS Max. However, it does play tricks with compatibility sometimes. To be fair, this deviation from JPEG isn’t Apple’s fault. The newer HEIF (High Efficiency Image File Format), which Apple saves with the .heic extension, was developed by MPEG, the group behind the most popular audio and video compression standards. Apple adopted it as a space-saving measure to stave off users’ storage woes.
iOS editing apps like Enlight, Snapseed, Photofox and others have already adapted to it. Sharing photos on social media has been business as usual because those images are converted to JPEG on the fly. That’s not always the case with desktop apps. Photoshop, Photoshop Elements and GIMP can’t read them natively. I’ve had trouble with iMovie, too. Windows PCs still have issues with the format.
There is the option to go back to saving in JPEG on the iPhone itself by going to Settings > Camera > Formats > Most Compatible. Or you could use a program to convert the files; one I found success with on the Mac is iMazing. Sticking with Most Compatible is tough, though, because Apple forces your hand: shooting in 4K at 60fps or 1080p at 240fps only works in High Efficiency.
The camera in the iPhone XS and XS Max is undoubtedly the best Apple has produced to date, even if it doesn’t feel like a huge leap forward. It’s a good step in the right direction because it’s starting to address shortcomings. For a brand that once held off competitors in mobile photography, Apple has been playing catch-up for years. At least the gap is smaller now than it was before.
I suspect the average user will feel satisfied with the results, but I do wonder about those who have experienced shooting with different devices. I have surprised more than a few iPhone users with results from other phones. In some circumstances, the iPhone XS and XS Max might turn the tables instead.
Update 23/10/2018 6:22pm: The story has been updated with information related to Apple’s plans to fix the iPhone XR’s and iPhone XS’ photography skin-smoothing issue with iOS 12.1.