Google’s rivals took 2019 seriously.
This is why there is improved camera performance across the board from Huawei, Samsung, OnePlus and Apple, leading to plenty of anticipation over what the Pixel 4 and Pixel 4 XL cameras would bring to the fight.
A second lens is the most obvious, but Google’s camera has always used software as the foundational approach. That mantra remains the same here, despite the additional hardware thrown in. The extra lens comes with its own caveats, but it’s the rest of the package that stands out for different reasons.
The merits of the latest Pixel devices, as far as all their features go, require a different perspective. You can find that from MobileSyrup’s own Dean Daley and Jonathan Lamont, who reviewed both devices.
I was a little surprised Google went with the exact same 12.2-megapixel sensor with f/1.8 aperture and lens for its standard wide camera. I thought the company might try to squeeze in a larger sensor or widen the aperture further. The reasons for not doing it may have everything to do with the software. This will be a familiar theme for the Pixel 4 (I’ll be referring to both devices as ‘Pixel 4’ unless otherwise noted).
And so, the new 16-megapixel f/2.2 lens with 2x optical zoom would seem an odd choice. Everyone else is going wider, so why narrow the focus further with what is the equivalent of a 48mm lens? It’s even more peculiar because it contradicts Google’s ability to maintain sharpness through digital zoom. If you continually make zooming in digitally a respectable way to shoot photos, then why double down on a telephoto lens?
Google believes zoom is more practical out in the wild — as subjective a statement as you’ll find. But is it true? Personally, I don’t think so, and in this particular case, I feel it’s because there are already software alternatives within the camera to handle specific scenarios.
For example, the Pixel 4’s Portrait Mode doesn’t necessarily need the telephoto lens when we’ve already seen how good it is with just one lens. I previously mentioned the Super Res Zoom that effectively sharpens images without optical help. Macro photography is possible, except I’m not sure a zoom lens was necessary to get there with the Pixel 4.
Google also didn’t make it immediately obvious how to access the telephoto lens in the first place. When I didn’t see it as a toggle option on the interface, I double-tapped onscreen and noticed it went straight to 2x zoom — doing the same reverts to the standard lens. It’s more intuitive than I would’ve thought, and faster than selecting a lens otherwise. Mind you, it’s easier to do that when you have two lenses instead of three.
Controlling the image
Of all the software features Google introduced, the best was dual exposure control with a shadow slider. Exposure control is old hat on phone cameras, but shadows are another thing altogether. For the first time with a Pixel phone, users have some semblance of basic control over how much dynamic range they can capture from an image.
Frankly, it’s a feature I hope others emulate because it can have a positive impact in more ways than one. For example, the Pixel 4 does okay with backlit subjects, but now, by focusing on the foreground subject, you can dial up the shadows to pull more detail out of the darker areas without blowing out the highlights. It’s not perfect, mind you; it’s just the closest you can get to manual control on these devices.
One of the reasons this is a brilliant move is that it offsets Google’s own preference for contrast. Compare a Pixel photo in good lighting to most other phones, and you will notice it is rich in colour and relatively high in contrast. That’s especially clear when comparing them to the iPhone, which skews too far toward warmer colour temperatures. The slider lets users decide how much shadow they deem necessary each time they tap the shutter.
Live HDR+ is Google’s way of showing you what it will do to the image after processing it. What you see is what you’ll get. No more looking at one thing, and then seeing the imposed effects when viewing it later.
Top Shot and Motion also return, helping assuage the fear of missing the perfect moment when snapping a photo. I didn’t see any real performance boost for these modes, and while I don’t know how much people use them, Google wisely leaves them off out of the box.
Shooting for the stars on any phone is anything but easy. I recall trying to capture the northern lights in Iceland two years ago with an LG V30 using a long 30-second exposure on a tripod. One image turned out decently, but it was tough to get anything coming close to the Fuji mirrorless camera I used in the same shoot.
Google’s Astrophotography mode trims down the time and preparation involved, but not necessarily the parameters. Living in a big city all but renders the mode useless because of light pollution, though a clear night might deliver something worth sharing.
You need a dark, clear sky, plus a tripod and phone mount to pull off a photo in this mode. The slightest nudge of the tripod or phone will blur the picture — basically, the same thing with a long exposure shot on any other camera. In ideal conditions, the mode will pop up on its own so you can shoot.
What’s truly fascinating about this is how software automates what would otherwise demand painstaking manual work. The mode captures a series of 15-second exposures — about four minutes of capture in total — then aligns and stitches them automatically. It’s impressive computing, and should bring out some great creativity in the right conditions.
To be fair, Google had no hope of getting the same results a big lens or telescope could capture. The phone lens and diaphragm are simply too small to match that output. Still, it’s a feature that can blow people away when done right.
Portraits and selfies
Google didn’t drastically change much on the standard camera mode, though portraits are better at keeping all the subject’s details focused. I noticed that hair strands were less susceptible to washing out with the background.
As I always note in these reviews, I don’t indulge in selfies, but in testing out the camera, I found the results were good. Unfortunately, taking out the wider-angle front-facing lens makes it harder to squeeze more into the frame behind subjects. That lens was one of the highlights of the Pixel 3, and while Google left it out of the Pixel 3a, I was among those who expected it to come back in the Pixel 4.
I’ve always advocated that it’s better to shoot portrait photos with the rear lens anyway. The volume buttons are hard shutter buttons by default. Angle your thumb just the right way and you can easily take selfies in portrait orientation. Landscape is admittedly tougher. If only Google engineers would use the squeeze feature to also act as a shutter in that scenario. I’ve brought it up for three Pixel iterations now.
I’d also like to see Google experiment with studio lighting, like Apple does. Surely there’s software in the company’s arsenal that can do it better? We’ll have to see.
Overall image quality
When Google showed off ‘Night Sight,’ it was a spectacular showcase of what software can do to capture the darkness without a mess of noise. The results spoke for themselves once I got to shoot with it myself. It’s still one of the best low-light shooting modes with the Pixel 4.
The HDR processing seems to home in on brightly-lit sources more vigorously than the iPhone’s latest camera does, for instance. That was already the case compared to Huawei and Samsung too. In cityscape scenes, it usually works out well, breathing life into the architecture and people, as it did on a recent trip to Nashville.
My one gripe (that I’m carrying over from last year) is the lack of manual input when shooting. Huawei wisely lets you adjust shutter speed and ISO in advance, or stop the exposure midway when using its night mode. It’s a great way to neutralize any overzealous processing. Night Sight would benefit from simply letting users stop the processing midway with a simple tap on the shutter.
Despite that, the Pixel 4 is a stalwart under almost all conditions. Day, night, indoor, outdoor — it covers the bases well. There is room for improvement, though. Since Google is so reluctant to include a full manual mode, the next best thing is to shoot in RAW. That’s easy to do by selecting the settings within the camera menu and enabling it there.
You do need to use a program like Adobe Lightroom or something similar to process and edit the images, but it’s well worth it. These are the only images that don’t have Google’s own processing, leaving you to work with them how you see fit.
Which brings me to the zoom lens. Whether it’s 2x optical zoom or going all the way to 8x with tons of digital zoom thrown in, the Pixel 4 may be the best there is at getting closer to a subject. I’m just not sure why Google didn’t go further. If you insist on going with a zoom lens, and could conceivably produce pretty good results at, say, 20x digital zoom, why not put that in?
Google played it safe here, opting for output consistency over winning a numbers game. I noted earlier that I didn’t think the company was right in choosing one lens (a telephoto) over another (an ultra-wide). And I made that distinction in spite of the solid visual evidence showing me that the zoom lens was not a gimmick.
Perhaps if it chose to avoid putting in its Soli chip, it could have added an ultra-wide lens and the telephoto one. It’s not that I’m surprised Google did right by its telephoto lens, it’s that it only lays bare what could have been with an ultra-wide.
Users and media alike chastised Google for not including 4K video capture at 60fps. That is an odd omission for a flagship device, and I don’t see any hardware or software restriction that would otherwise hinder it. Well, except for maybe one.
Google’s notion is that 4K video at that framerate would take up a lot of storage. Indeed, that is true, and pertinent for devices that also come in 64GB variants (ridiculous unto itself). But the bigger reason, at least to me, probably has to do with battery life. The Pixel 4 and 4 XL are hardly sipping juice throughout a day, as was also noted by others.
I still don’t understand why 24fps isn’t an option, either. That motion picture effect is nice, yet there are barebones options on the video side of things. It’s neat that you can automatically shoot a video by holding down the shutter. It would’ve been nicer if there were a few more key features and settings to tinker with before shooting clips.
While quality is certainly not bad, the Pixel 4 won’t come out on top against Apple’s or Samsung’s flagships.
As a photographer, the Pixel 3 was a phone I could trust. The narrower focal length did limit it in some respects, but I could count on its image quality. That trend continues with the Pixel 4, which is still among the best in the business.
However, it’s no longer a clear leader, instead running neck-and-neck with Huawei. Competitors closed the gap this year, and Google didn’t pull further away from the pack. Virtually all the new camera software features and modes were shipped out to Pixel 3, 3 XL and 3a devices via software updates, lessening the incentive to get the Pixel 4.
The one major missing link? Not the telephoto lens, but rather the dual exposure control. Those older models won’t get that, and as great as it is, it may not be enough to push the Pixel 4 into a league of its own.