Google’s impressive-sounding Pixel Visual Core image processing unit (IPU) continues to be a confusing piece of silicon.
The IPU could be the start of Google eventually designing its own processors and other smartphone components — a big step for the Pixel line and Android devices in general. As it stands right now, though, the Visual Core doesn’t really do much, at least not when it comes to Google’s own camera app.
Google activated the Visual Core this week, stating that, as of right now, the proprietary chip would only be used for creating HDR+ photos in third-party apps like Instagram and Snapchat. However, the company also hinted that the Pixel Visual Core could have more purposes in the future.
In an interview published by FoneArena back in November of last year, the Pixel 2 camera team explained that it does extensive optimization within the Pixel camera app to ensure the best possible performance, which means the Pixel Visual Core isn’t needed for images captured with the native camera app.
So in a sense, even though the Pixel Visual Core now handles third-party HDR+ integration, it doesn’t appear to be essential to the Pixel 2’s camera experience.
An excerpt from FoneArena’s interview reads as follows:
“So we’re really looking forward to see what they do with it. Turns out we do pretty sophisticated processing, optimizing and tuning in the camera app itself to get the maximum performance possible. We do ZSL and fast buffering to get fast HDR capture. So we don’t take advantage of the Pixel Visual Core, we don’t need to take advantage of it. So you won’t see changes in the pictures captured from the default camera app in the coming weeks. What you’ll see is that pictures taken in 3rd party apps will get significantly better as they’ll start taking benefit from some of the HDR processing.”