Hand Tracking for the Masses: A Conversation with uSens

Greenlight Insights | Analysis

Mike Boland is a Contributing Analyst with Greenlight Insights. Boland is currently Chief Analyst and VP of Content for BIA/Kelsey, and Chapter President of the VR/AR Association. He was one of Silicon Valley’s first tech reporters of the internet age, as a staff reporter for Forbes starting in 2000. Boland will be the emcee for the upcoming Virtual Reality Strategy Conference 2016.  

One critical link in VR’s value chain — besides ample content — is hand tracking. Rendering hand movements joins head and positional tracking as a key step toward realistic presence.

As we’re often reminded, the first thing many people do in first-time VR experiences is look down at their hands. When hands don’t appear, it immediately breaks presence.

This is the corner of the VR development ecosystem that uSens occupies. Like Leap Motion and a few others — though with its own approach — it develops skeletal tracking hardware and SDKs.

Fresh off a $20 million Series A round, uSens offers as its main product the Fingo series of sensors: three models that each feature 26-degrees-of-freedom hand tracking.

But the biggest differentiator is modularity: the energy-efficient sensors can be attached to existing headsets, bringing inside-out 3D hand tracking even to mobile VR.

This makes it a sizable opportunity, given mobile VR’s greater accessibility compared to tethered HMDs. And that just became truer with last week’s Daydream View launch.

CTO and Co-founder Dr. Yue Fei hopes this modularity brings biometric tracking to the masses. It’s a key step in graduating from gamepads to true presence, letting users hold virtual tools in a natural way.

With a longstanding pedigree in the field, Dr. Fei explains that skeletal tracking continues to gain accessibility with the forward march of Moore’s law — driven mostly by growing mobile CPU capacity.

One of the milestones in the science was, in fact, the introduction of Microsoft’s Kinect. To skeletal tracking scholars, it was a proof of concept for both the science and the market.

Since then, a long series of computing advancements has enabled a standalone inside-out 3D hand tracking unit. The magic is in the CPU payload and power consumption, says Fei.

Remaining barriers include rendering, field of view and latency — all keys to improving presence and reducing nausea. And of course the holy grail is haptic feedback.

Fei is also interested in tackling pre-market integrations in addition to the standalone aftermarket flagship. Going after these partnerships will be one use for that $20 million infusion.

“We want to be embedded in HMDs,” he said. “We can do the reference design and the core algorithm for making it work; they can be platform owners.”

Photo Courtesy of uSens