ARKit Hands-on: Apple aims to enter the augmented reality space


While it’s been rumoured for months now that Apple is interested in the augmented reality space, at WWDC today we got our first glimpse of the tech giant’s initial step into AR.

To some extent, Apple is playing catch-up with Google, which has already launched its Tango augmented reality platform — though Tango devices are still difficult to come by. During Apple’s WWDC keynote, however, the company’s senior vice president of software engineering, Craig Federighi, revealed ARKit and boasted that it will be available across the company’s iOS ecosystem, creating “the largest AR platform in the world.”


During the media-only portion of the event following the main keynote presentation, I was able to briefly go hands-on with ARKit. The platform bears a striking resemblance to Google’s Tango: I placed a lamp, a vase and various other objects on a surface, and was then able to move the new 10.5-inch iPad Pro’s camera around and look at them from different angles. While I’ve only spent a brief period of time with Tango, I’d say the experience is very comparable to Apple’s ARKit.

I was also shown an AR version of the well-known Star Wars holographic chess game, complete with a surprisingly realistic version of Jabba the Hutt.


On stage, Federighi explained how Pokémon Go’s object recognition will improve thanks to ARKit, allowing Poké Balls to bounce across surfaces rather than just float. The creative lead of Peter Jackson’s Wingnut AR studio also showed off a more complex augmented reality experience, which utilizes Unity and SceneKit in conjunction with ARKit and is set to launch “later this year.” Unfortunately, this particular demo was not available in the hands-on area.

Of course, like all of Apple’s development kit utilities, the tech giant’s new ARKit will only be a success if the company manages to convince app developers to get on board with it. Google seems to have attracted only a small number of developers to its AR platform, and given Apple’s vibrant app ecosystem, it’s likely the Cupertino-based company will fare better.



  • ciderrules

    The Asus Zenfone AR (and previous Tango devices) used three cameras/sensors (motion sensing, depth perception and “normal” primary camera) to do AR.

    Yet Apple was demonstrating this on an iPad with a single camera and no extra sensors. So they obviously have some advanced processing going on to perform similar functions without needing the extra cameras/sensors.

    I wonder if it works even better/faster on devices like the iPhone 7 Plus with dual cameras? Or whether the iPhone 8 will even include these extra sensors (as has been rumored).

    • Smanny

      It cannot work better than Tango. Especially when Tango has mm accuracy using a depth sensor camera as well as an IR camera, not to mention a regular camera as well. Apple’s ARKit has motion tracking, plus plane, lighting and scale estimation. That scale estimation is done using a single camera on an iPhone or iPad. There is no way on this planet Apple’s current ARKit has comparable accuracy.

    • It’s Me

      Speaking in absolutes shows a lack of confidence and knowledge.

      It’s entirely possible their upcoming phones will have specialized hardware, but that doesn’t mean they can’t do well without it, just better with it.

      In the meantime, Google is leaving users of almost all current phones out in the cold, and they’ll have to upgrade to get any benefit at all. Apple would be crucified for anything close to that (e.g. when they restricted the original Siri to phones with the specialized DSP, the same old haters jumped all over that).

    • Smanny

      It’s Me, Apple’s own ARKit relies on scale estimation, which just doesn’t sound very accurate, period. That’s not to say new iPhones and iPads will not have new hardware, sure. However, existing iPhones and iPads do not have extra hardware. So clearly it cannot be accurate with the single camera that is used. Please, It’s Me, you have to acknowledge that.

      Clearly, when Patrick said “While I’ve only spent a brief period of time with Tango, I’d say the experience is very comparable to Apple’s ARKit,” the experience might be comparable, but there is no way the accuracy could be comparable to Tango’s.

    • It’s Me

      I’ll acknowledge that two cameras will be better for things like depth-of-field analysis, or rather, that it would be easier to do that well with two cameras than to do depth analysis in software. But software on powerful enough hardware can do some pretty amazing things.

      ARKit is currently just an SDK. Tango today includes the SDK and a hardware reference. Whether ARKit will eventually include specialized hardware, or how that would compare with a still fairly immature Tango, is yet to be seen.

    • ciderrules

      Yet the article writer, who has actually used both, says the experience is very similar between the two.

      Have you used Tango and ARKit yet?
