Intel executive shares vision for photorealistic graphics in recruiting video

The recruiting video also touts the benefits of starting from scratch


Intel has posted a new recruiting video for its discrete graphics processing unit (GPU) project on Twitter that teases the company’s vision for the future of GPUs: photorealism.

The video features the company’s senior vice president of core and visual computing and chief architect Raja Koduri along with new CEO Bob Swan talking about their plans for GPUs and Intel’s history.

Koduri kicks things off by saying the team is starting from “zero.” Further, he says the team sees this as an opportunity to release discrete graphics from the “constraints of integrated graphics.”

For the unfamiliar, a discrete GPU is a standalone graphics chip, separate from the CPU, unlike the integrated graphics built directly into a processor. Intel has long relied on integrated graphics in its processors, so a discrete GPU project marks an expansion beyond that approach.

Koduri also discussed his vision for the future of graphics, specifically noting that he wants “photorealistic, immersive worlds.”

While Koduri looks to the future, Swan acknowledges Intel’s history as a leader in computing technology and says the company will lead into the future with a greater focus on graphics.

As for how Intel hopes to achieve these lofty goals, Koduri likens the effort to a Lego project, with Intel’s extensive IP portfolio acting as the blocks.

“Every day I find something cool and new that I want to pull in,” Koduri said. “As an engineer, how do you build a product that delivers a particular function with all of this IP? It’s kind of like putting [together] a very complex Lego structure.”

However, don’t expect photorealistic graphics from the company’s first GPU, expected in 2020. The industry as a whole is still far from that goal, and while Intel may bring a fresh perspective as it enters the market, it likely has many hurdles to clear before it achieves photorealistic graphics.

Source: Twitter Via: Digital Trends