
Google publishes explainer on Pixel 4’s Astrophotography mode

A mix of AI and long exposures helps the Pixel 4 capture stunning images of the night sky

If you’ve had a chance to play around with the Google Pixel 4 and its Astrophotography mode, you may have found yourself wondering how a smartphone can capture such incredible images of the night sky.

Well, Google has published a blog post explaining what goes on behind the scenes to let the Pixel 4 produce images that previously required large cameras, extra equipment and tricky post-processing. The feature relies on artificial intelligence (AI) to get the job done (big surprise).

First up, to capture photos of the night sky, you need long exposures. Google increased the Pixel 4’s maximum exposure time to four minutes, up from just one minute on the Pixel 3. However, a single four-minute exposure wouldn’t work, as the stars move across the sky and would blur into streaks. Instead, the Pixel 4 takes up to 15 exposures, each no longer than 16 seconds, which together add up to that four-minute total. While this keeps the stars looking like little points of light instead of blurry streaks, it does introduce some other issues.
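To make the idea a little more concrete, here’s a minimal sketch in Python with NumPy (not Google’s actual pipeline) of stacking a burst of short, already-aligned frames instead of taking one long exposure. The frame count and per-frame limit mirror the numbers above; the synthetic frames and function names are purely illustrative.

    import numpy as np

    # Minimal stacking sketch -- not Google's pipeline. Frames are assumed to
    # be captured and aligned already; averaging them reduces noise while
    # keeping stars as points rather than streaks.
    MAX_FRAMES = 15          # up to 15 frames...
    MAX_FRAME_SECONDS = 16   # ...of at most 16 seconds each (about four minutes total)

    def stack_frames(frames):
        """Average a list of aligned frames (as float arrays)."""
        return np.stack([f.astype(np.float32) for f in frames]).mean(axis=0)

    # Toy usage with synthetic noisy frames standing in for real captures.
    rng = np.random.default_rng(0)
    frames = [rng.normal(0.1, 0.02, size=(8, 8)) for _ in range(MAX_FRAMES)]
    result = stack_frames(frames)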

One issue is the presence of “hot” pixels. These are caused by “unavoidable imperfections in the sensor’s silicon substrate” and show up as tiny dots when taking long exposure photos. In a regular shot, these imperfections are invisible. To fix the issue, the Pixel 4 looks for bright outlying pixels and hides them “by replacing their value with the average of their neighbours.” While this does result in a loss of image information, Google says it doesn’t “noticeably affect the image quality.”
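As a rough illustration of that idea (a simplified sketch, not Google’s actual code), a hot pixel can be treated as one that is far brighter than its eight neighbours and replaced with their average; the threshold value here is an assumption.

    import numpy as np

    def suppress_hot_pixels(img, threshold=4.0):
        """Replace pixels much brighter than their 8 neighbours with the
        neighbourhood average. Simplified sketch; threshold is an assumption."""
        src = img.astype(np.float32)
        out = src.copy()
        height, width = src.shape
        for y in range(1, height - 1):
            for x in range(1, width - 1):
                patch = src[y - 1:y + 2, x - 1:x + 2]
                neighbour_mean = (patch.sum() - src[y, x]) / 8.0
                # A "hot" pixel stands far above its surroundings.
                if src[y, x] > threshold * neighbour_mean + 1e-3:
                    out[y, x] = neighbour_mean
        return out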

AI helps detect the sky and make it look natural

Another issue Google had to tackle was that the viewfinder can’t show details of the night sky. Because of how the viewfinder works, it can’t display the long exposure live as it’s being captured. Instead, the Pixel 4 shows each exposure as it captures it. If users don’t like what they see, they can move the phone and it’ll start again. It handles autofocus in a similar way, taking a couple of one-second exposures to find focus. If it can’t find focus, the lens focuses at infinity.

Finally, with all the exposures the phone captures, the Pixel 4 can actually make the night sky too bright, especially if there’s a full moon. AI kicks in here to detect the sky and dim it to make it look more natural. Further, by detecting the sky, Google says the Pixel 4 can perform sky-specific noise reduction and selectively increase contrast to make features like clouds, colour gradients or star formations stand out.
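A very rough sketch of that dimming step might look like the following (again Python/NumPy, and only an illustration: sky_mask stands in for the output of a sky-segmentation model, and the dimming factor is made up).

    import numpy as np

    def dim_sky(img, sky_mask, dim_factor=0.7):
        """Darken only the regions a segmentation model marked as sky.

        img        -- image as floats in [0, 1]
        sky_mask   -- per-pixel sky probability in [0, 1] (hypothetical model output)
        dim_factor -- how much brightness sky pixels keep (assumed value)
        """
        img = img.astype(np.float32)
        # Blend toward a dimmed version, weighted by how "sky-like" each pixel is.
        return img * (1.0 - sky_mask * (1.0 - dim_factor))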

However, for all these software techniques, there are still some things the mode can’t do. For example, Astrophotography mode can’t capture the moon and stars simultaneously, as the moon’s brightness will overpower the stars. Further, capturing a starry sky with no moon leaves the landscape as a dark silhouette. But those drawbacks don’t make Astrophotography mode any less impressive.

Source: Google Via: Engadget
