Imagine a virtual reality experience unlike any you’ve had before — one where the sophisticated hardware you’re wearing could sense your stress levels, your confusion, or even when your attention is beginning to wane — well before you’ve realized it.
That was the vision of Tico Ballagas, a senior manager at HP Labs, who led the development of the HP Omnicept Solution: the HP Reverb G2 Omnicept Edition headset and SDK (software development kit) for VR developers. Together with an all-star team of researchers, Ballagas set out to change how humans and machines interact. But like all ambitious projects, it went through many iterations before it was ready for prime time. After five years of development that spanned four countries and two reorganizations, HP Omnicept will hit the market this spring. Outfitted with a fleet of biosensors and sophisticated AI, the HP Omnicept VR headset can do what no other VR hardware can — it measures how hard your brain is working, and soon it will even gauge your mood and emotions. It’s the next step in the future of computing, one set to revolutionize every aspect of work, from how we train surgeons to how we prepare for public speaking to how we make and consume entertainment. Here’s how the Omnicept came to life.
2016 — A team sport
To make a monumental leap forward in computing, Ballagas looked backward. He was inspired by his longtime hero Doug Engelbart, a pioneering inventor and computer engineer who in 1959 posited that instead of replacing people, computers could work alongside them, augmenting human intelligence rather than supplanting it. His ideas became the basis of tools we still use today, like the mouse, hypertext, and videoconferencing. “Now we realize it’s not either/or,” says Ballagas. “We can use computers and we can use AI to augment human intelligence as well.” But revolutionizing computing isn’t a one-person job. First, Ballagas had to assemble the right experts, starting with Jishang Wei, a top AI architect.
2017 — Getting inside our heads
Wei was then leading a project called “Emotion AI” to train computers to interpret people’s emotions from their biological responses, in order to craft experiences that adapt to the user. Ballagas and Wei found common ground in their visions and teamed up to tackle a central problem: We could understand machines, but they could barely understand us. For all their incredible computing power, machines can’t anticipate our feelings, or recognize when our attention is wandering or when we’re upset, tired, or anxious. “We saw an opportunity to change that with VR,” Ballagas says. Wei and Ballagas demonstrated their pilot results to HP’s business unit, which expressed strong interest, and they assembled a crack team, including Mithra Vankipuram, a specialist in data science and user experience; Kevin Smathers, a software engineer; and nearly a dozen other talented scientists.