According to a new book, Inside Apple, by Fortune magazine's Adam Lashinsky, future versions of the iPhone may have camera sensors that capture the entire light field instead of a single fixed-focus image.
In June 2011, Steve Jobs met with Lytro inventor and CEO Ren Ng, who created the Lytro "whole light field" camera. The simple point-and-shoot box camera has an f/2 aperture and an 8x optical zoom lens, but its genius is in the way it captures light: when you press the shutter button, you record the entire light field instead of a fixed-focus grid of pixels.
Though the first model captures fairly low-quality photos of roughly one megapixel (1080×1080 pixels), each one can be manipulated in a variety of ways. You can change the focus, depth of field, and lighting after the photo is taken, lessening the need to nail that perfect shot the first time. It's a pretty fundamental change in the way we take photos, and it could come to the iPhone.
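To see why recording the light field enables refocusing after the fact, consider the classic shift-and-add idea: each sub-aperture view of the scene sees objects shifted by an amount proportional to their depth, so sliding the views back into alignment before averaging brings a chosen depth into focus. Below is a minimal numpy sketch of that idea on a toy, synthetic light field; it is an illustration of the general technique, not Lytro's actual processing pipeline, and all names (`refocus`, `shift_per_view`) are invented for this example.

```python
import numpy as np

def refocus(light_field, shift_per_view):
    """Shift-and-add refocus: slide each sub-aperture view by an
    amount proportional to its (u, v) offset from the center view,
    then average. light_field has shape (U, V, H, W)."""
    U, V, H, W = light_field.shape
    cu, cv = U // 2, V // 2
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round((u - cu) * shift_per_view))
            dv = int(round((v - cv) * shift_per_view))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Toy light field: a single scene plane whose views are shifted
# copies of a base image (parallax of 1 pixel per view step).
rng = np.random.default_rng(0)
base = rng.random((32, 32))
U = V = 5
lf = np.zeros((U, V, 32, 32))
for u in range(U):
    for v in range(V):
        lf[u, v] = np.roll(base, shift=(u - 2, v - 2), axis=(0, 1))

sharp = refocus(lf, shift_per_view=-1.0)   # undo the parallax: in focus
blurred = refocus(lf, shift_per_view=0.0)  # plain average: out of focus

err_sharp = np.abs(sharp - base).mean()
err_blur = np.abs(blurred - base).mean()
```

Choosing a different `shift_per_view` focuses a different depth plane, which is exactly the knob a light-field photo lets you turn after capture.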
The size of the sensor is probably an issue right now, as is the overall image quality, but that may soon change. Light-field capture would also let manufacturers build fixed-focus lenses into their devices, shrinking the thickness of our smartphones: high-quality autofocus lens modules are the main reason devices can't get much thinner than they are. Sony just introduced a tiny 13MP autofocus sensor, but fixed-focus alternatives will always be thinner.
But let's not jump to conclusions. While the Lytro camera is certainly cool, Apple has always chosen the practical over the revolutionary: it delayed the launch of an LTE iPhone for at least a year until the technology matured. It's unlikely the 2012 iPhone will include such technology, and the 2013 model probably won't either. But it's good to know that, in addition to televisions, Steve Jobs also wanted to revolutionize mobile photography. Considering how popular the iPhone 4 and 4S are on Flickr, the prospect isn't hard to believe.