XTouch makes tablet actions out of table taps

Imagine navigating a tablet without touching its screen, and without being confined to the narrow field of view of its front-facing camera.

This is becoming reality thanks to the team of researchers and developers at XTouch, a startup being incubated at JOLT, a fund based out of Toronto's MaRS hub. Currently available as an iPad technology demo, XTouch acquires data from an iOS device's microphone and turns the sound into meaningful actions on the device itself.
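XTouch's actual signal pipeline isn't public, but the core idea of listening to the microphone and treating sharp acoustic transients as input events can be sketched in a few lines. The AVAudioEngine-based detector below is purely illustrative: the class name, energy threshold and spike heuristic are assumptions for the example, not XTouch's API.

```swift
import AVFoundation

// Purely illustrative tap detector: watch the microphone's short-term
// energy and treat a sudden spike as a tap. Thresholds are invented;
// a real system would also need mic permission and noise handling.
final class TapListener {
    private let engine = AVAudioEngine()
    private var previousEnergy: Float = 0

    func start(onTap: @escaping () -> Void) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            guard let samples = buffer.floatChannelData?[0] else { return }
            let n = Int(buffer.frameLength)

            // Mean squared amplitude of this ~1024-sample window.
            var energy: Float = 0
            for i in 0..<n { energy += samples[i] * samples[i] }
            energy /= Float(max(n, 1))

            // A tap shows up as a sharp jump relative to the previous window.
            if energy > 10 * self.previousEnergy, energy > 1e-4 {
                DispatchQueue.main.async(execute: onTap)
            }
            self.previousEnergy = energy
        }
        try engine.start()
    }
}
```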

The beauty of XTouch is that its algorithms need only a single calibration point on any vibrating surface to function, so it works regardless of the surface's size. The demo video (seen below) provides a good example: say you are cooking and want to avoid smudging the screen of your propped-up iPad. By calibrating a compatible recipe app, you can scroll up, down, left or right depending on where on the surface you tap.
“Each surface responds differently depending on its physical structure,” says Amin Heidari, co-founder and CEO. “From the device’s perspective, these two taps,” he continues, knocking the small wooden table in front of me in two spots, “sound completely different, which is how the app knows to respond a certain way.”
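How the app actually tells those taps apart is proprietary, but the general recipe for this kind of acoustic classification is well established: record a reference signature per tap zone during calibration, then match each incoming tap against the stored signatures. The sketch below uses cosine similarity over a magnitude spectrum as a stand-in; the type names and the similarity measure are assumptions, not XTouch's implementation.

```swift
import Accelerate

// Illustrative only: one stored acoustic "signature" per calibrated tap
// zone, with new taps classified by nearest cosine similarity.
struct TapZone {
    let action: String      // e.g. "scrollUp", "scrollDown"
    let signature: [Float]  // magnitude spectrum recorded at calibration
}

func classify(tap: [Float], zones: [TapZone]) -> String? {
    // Assumes all spectra have the same length.
    func cosine(_ a: [Float], _ b: [Float]) -> Float {
        var dot: Float = 0, na: Float = 0, nb: Float = 0
        vDSP_dotpr(a, 1, b, 1, &dot, vDSP_Length(a.count))
        vDSP_svesq(a, 1, &na, vDSP_Length(a.count))
        vDSP_svesq(b, 1, &nb, vDSP_Length(b.count))
        return dot / max((na * nb).squareRoot(), .leastNonzeroMagnitude)
    }
    // Pick the calibrated zone whose signature best matches this tap.
    return zones.max { cosine(tap, $0.signature) < cosine(tap, $1.signature) }?.action
}
```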

Heidari envisions a future where, in addition to WiFi and Bluetooth, smartphones and tablets have the XTouch SDK built into them, allowing for various out-of-the-box lifestyle applications like expanded keyboards, music controls and more.

“We’d love to see XTouch eventually integrated into furniture like dining tables, night stands, countertops,” he says. Imagine an alarm app where you could tap your desk once to snooze, or in a pre-set combination to disable it.
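That alarm scenario maps naturally onto simple tap-pattern logic. As a hypothetical sketch (the counts and time window here are invented, not a shipping XTouch feature), it could look like this:

```swift
import Foundation

enum AlarmAction { case snooze, dismiss }

// Hypothetical mapping: one tap snoozes, three taps within 1.5 seconds
// dismisses. Timestamps come from whatever tap detector feeds the app.
func interpret(tapTimestamps: [TimeInterval]) -> AlarmAction? {
    guard let last = tapTimestamps.last else { return nil }
    let recentTaps = tapTimestamps.filter { last - $0 <= 1.5 }
    switch recentTaps.count {
    case 1:   return .snooze
    case 3...: return .dismiss
    default:  return nil
    }
}
```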


The team, which emerged from U of T’s Mobile Applications Lab, is currently in discussions with OEMs to integrate the one-line SDK, and Heidari hints at a possible Android version in the coming months. He’s extremely proud of his team’s work, mainly because sound-based input allows for the kind of omnidirectional tracking that cameras, restricted to their field of view, can’t support.

Samsung, for example, attempted to use the front-facing camera on its Galaxy S3 and S4 to detect when a user had stopped watching a video, but the result was finicky and hard to reproduce consistently. While sound-based input, like voice, obviously requires active participation, its input lag is roughly on par with a Bluetooth controller’s (around 30ms), so it can be used for gaming, too.

There’s also great potential for XTouch to help users with disabilities such as hearing or visual impairments, giving them a more expansive canvas to work with than a typical smartphone or tablet screen, particularly when paired with iOS’ and Android’s extensive accessibility features.

Right now, XTouch exists as two technology demos, created in-house, called Voodoo Tap Frogs and Magic Xylophone. Both are built for iPhone and iPad, and demonstrate how easily tap-based input can be applied to multiplayer gaming and music, respectively. With the company’s native SDK coming soon, there are plenty of reasons to be excited.
