The making of VR’s most revolutionary headset

How multidisciplinary researchers at HP Labs took theory to practice and created HP Omnicept, the most intelligent virtual reality solution on the market.

By Sara Harrison — March 9, 2021

Imagine a virtual reality experience unlike any you’ve had before — one where the sophisticated hardware you’re wearing could sense your stress levels, your confusion, or even when your attention is beginning to wane — well before you’ve realized it.

That was the vision of Tico Ballagas, a senior manager at HP Labs, who led the development of the HP Omnicept Solution: the HP Reverb G2 Omnicept Edition headset and SDK (software development kit) for VR developers. Together with an all-star team of researchers, Ballagas set out to change how humans and machines interact. But like all ambitious projects, it went through many iterations before it was ready for prime time. After five years of development that spanned four countries and two reorganizations, HP Omnicept will hit the market this spring. Outfitted with a fleet of biosensors and sophisticated AI, the HP Omnicept VR headset can do what no other VR hardware does: it measures how hard your brain is working, and it will soon gauge your mood and emotions as well. It's the next step in the future of computing, one that's set to revolutionize every aspect of work, from how we train surgeons to how we prepare for public speaking to how we make and consume entertainment. Here's how the Omnicept came to life.

2016 — A team sport

To make a monumental leap forward in computing, Ballagas looked backward. He was inspired by his longtime hero Doug Engelbart, a pioneering inventor and computer engineer who in 1959 posited that instead of replacing people, computers could work alongside them, augmenting intelligence rather than supplanting it. His ideas became the basis of tools we still use today, like the mouse, hypertext, and videoconferencing. "Now we realize it's not either/or," says Ballagas. "We can use computers and we can use AI to augment human intelligence as well." But revolutionizing computing isn't a one-person job. First, Ballagas had to assemble the right experts, starting with Jishang Wei, a top AI architect.

2017 — Getting inside our heads

Wei was then leading a project called "Emotion AI," which trained computers to interpret people's emotions from their biological responses in order to craft experiences that adapt to them. Ballagas and Wei found common ground in their visions and teamed up to tackle a central problem: We could understand machines, but they couldn't understand us terribly well. For all their incredible computing power, machines can't anticipate our feelings, or recognize when our attention is wandering or when we're upset or tired or anxious. "We saw an opportunity to change that with VR," Ballagas says. Wei and Ballagas demonstrated their pilot results to HP's business unit, which expressed strong interest, and they assembled a crack team, including Mithra Vankipuram, a specialist in data science and user experience; Kevin Smathers, a software engineer; and nearly a dozen other talented scientists.

HP Omnicept Virtual Reality headset

Courtesy of HP

The HP Reverb G2 Omnicept Edition headset has biometric sensors, cutting-edge optics, inside-out tracking, spatial 3D audio, and improved controllers.

The team brainstormed for hours about how to create a machine that could work with humans. But how could they get inside someone's head? They needed to measure emotions in real time, and they couldn't stick people who were at work inside an MRI machine. Instead, they had to rely on proxy measures: biological signals that indicate emotional and psychological states, such as pupil dilation, which can signal mental strain or arousal, or increased heart rate, which can indicate stress. "We had a lot of failure stories," like skin conductivity sensors that didn't work and EEG sensors that didn't get clean data, says Wei, the team's machine-learning maestro.

2018 — Focusing on cognitive load

After reading the work of Nobel Prize-winning behavioral economist Daniel Kahneman, Ballagas realized that the team didn't need all that extra information from brain waves and skin conductivity. They could get great data from just four biometric sensors that tracked changes in pupil dilation, heart and respiratory rates, and head movement. And they could use those measurements to make important inferences about one specific, but very telling, mental state: a person's cognitive load. Just like the RAM in a computer, humans can only hold and process so much information at any given time. As we take on more and increasingly difficult tasks, remember more details, or recall more facts, our brains have to work harder and our cognitive load grows. When it exceeds what our working memory can hold, we start to forget things, have a hard time focusing, and struggle to keep up with new information coming at us. Everyone from air-traffic controllers to fighter pilots to surgeons has to manage a careful balance called the Goldilocks Zone: engaged enough to stay focused, but not so challenged and overloaded with information that they get overwhelmed or burn out. Ballagas reasoned that knowing someone's cognitive load could help them train more efficiently. The Omnicept could measure cognitive load, spot where people were struggling and needed more time and practice, and tailor that training to their unique needs. But Ballagas also kept an eye on the future, adding a camera that captures facial expressions, which could eventually be used to create avatars that mimic users' faces in real time. "Cognitive load is really just our first inference," says Ballagas. "We aspire to do a lot more."
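HP's actual model is proprietary, but the basic idea of fusing the four sensor signals into a single load estimate can be sketched in a few lines. The baselines, spreads, and weights below are invented for illustration; a real system would calibrate them per user and learn the mapping from data rather than hand-tuning it.

```python
from dataclasses import dataclass
import math

@dataclass
class BiometricSample:
    pupil_dilation_mm: float    # pupil diameter
    heart_rate_bpm: float
    respiration_rate_bpm: float
    head_movement: float        # e.g. magnitude of angular velocity

# Illustrative per-user baselines and typical spreads (would be calibrated).
BASELINE = BiometricSample(3.5, 70.0, 14.0, 0.2)
SPREAD = BiometricSample(0.8, 12.0, 4.0, 0.3)
WEIGHTS = (0.5, 0.25, 0.15, 0.10)  # pupil response weighted most heavily here

def cognitive_load(sample: BiometricSample) -> float:
    """Map deviations from a user's baseline to a 0..1 load score."""
    features = (sample.pupil_dilation_mm, sample.heart_rate_bpm,
                sample.respiration_rate_bpm, sample.head_movement)
    baselines = (BASELINE.pupil_dilation_mm, BASELINE.heart_rate_bpm,
                 BASELINE.respiration_rate_bpm, BASELINE.head_movement)
    spreads = (SPREAD.pupil_dilation_mm, SPREAD.heart_rate_bpm,
               SPREAD.respiration_rate_bpm, SPREAD.head_movement)
    # Z-score each signal against the baseline, then take a weighted sum.
    z = sum(w * (f - b) / s
            for w, f, b, s in zip(WEIGHTS, features, baselines, spreads))
    # Squash to a 0..1 score with a logistic function.
    return 1.0 / (1.0 + math.exp(-z))
```

A sample exactly at baseline scores 0.5; elevated pupil dilation, heart rate, and respiration push the score toward 1.0.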

"Cognitive load is really just our first inference. We aspire to do a lot more."

—Tico Ballagas, senior manager, HP Labs

2019 — Unwavering commitment to scientific excellence

In April 2019, HP disbanded the Immersive Experiences Lab, but the core team, including Ballagas, Wei, Smathers, Vankipuram, Sarthak Ghosh, and Hiroshi Horii, stayed within HP Labs. With the sensors perfected, Wei and team started refining the AI features in the Omnicept, combining emerging physiological research with state-of-the-art machine-learning techniques. The result is an algorithm that uses the sensor data to make inferences about how hard the brain is working. The team was also careful to create a diverse data set so the algorithm was relevant for all types of users. They launched a global data collection effort, partnering with international labs like the CriaLab in Brazil.

2020 — Opening up the science

In early 2020, Ballagas brought in even more scientific muscle by recruiting Dr. Erika Siegel, a psychophysiologist who studies how physical cues like heart rate reflect our internal state. She was impressed by how committed HP is to creating products supported by cutting-edge science. “This [testing] has been very rigorous from the beginning,” she says. “This is how I would have designed it.” The team also started collaborating with Jeremy Bailenson, founding director of the Virtual Human Interaction Lab at Stanford University, who helped design the studies. Bailenson says he’s usually leery of companies promising they can measure emotions, but this opportunity was different. “The team that Tico put together was what changed my mind,” he says, describing the mix of expertise as “mind-blowingly good.” Even as COVID-19 caused a global shutdown, the team continued to collect data by sending out sterilized headsets to more than 600 subjects in the United States, Mozambique, Brazil, and Taiwan. This data makes the Omnicept’s algorithm one of the most inclusive and most representative AIs ever developed. And though the pandemic kept the core team sequestered in their homes, work on the Omnicept continued. “There’s just a good energy to the team,” says Siegel, who credits Ballagas for keeping everyone on deadline while still fostering creativity.

Hiroshi Horii

The HP Omnicept VR solution will initially be used with workplace training applications, like flight simulators that train new pilots.

2021 — The future of Omnicept

After years of development, the Omnicept will initially be used with workplace training applications, like flight simulators that train new pilots, and programs like Ovation, which helps people practice public-speaking skills. As they collect more user data, Ballagas and team will continue to refine and expand Omnicept’s inferences to include emotions beyond cognitive load. In the future it will be used to test how drivers feel about a new car interior before it’s built or improve the outcome of virtual meetings and collaboration by capturing the nonverbal cues of participants. Already Ballagas is considering new ways to unlock the secrets of the brain. “I think that’s the future of computing, the future of communication, and it will be really powerful.”




The HP Reverb G2 Omnicept Edition features

Along with the HP Labs researchers, teams from XR Hardware R&D, XR Software R&D, and Personal Systems Software helped develop the finished product: a headset with biometric sensors, cutting-edge optics, inside-out tracking, spatial 3D audio, and improved controllers with natural gestures.

Heart Rate: By monitoring a user’s heart rate, the headset makes it possible to know how they are responding to a VR experience or workplace training. 

Motion tracking: With four cameras and internal sensors for position detection, users can track more of their arm movements in a VR experience. 

Eye motion and gaze: Tracking the user’s focus area offers an understanding of how they are responding to content, and monitoring pupil size provides insight into engagement levels. 

Facial expression: Cameras that follow mouth movements will support more natural collaboration in VR environments. 

Foveated rendering: Integrated eye tracking from Swedish tech company Tobii makes it possible to discern the user's gaze direction, so the headset can render full detail where the user is looking, improving image quality and enhancing realism.

Improved visuals and sound: Valve’s industry-leading lenses and speakers deliver a lifelike experience.
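As a hypothetical illustration of how a training application might use readings like these (this is not HP's SDK or algorithm), a flight-simulator-style program could keep a trainee in the Goldilocks Zone described above by nudging scenario difficulty up or down based on a cognitive-load estimate. The thresholds and the difficulty scale here are assumptions for the sketch.

```python
# Illustrative bounds of the target "Goldilocks Zone" for a 0..1 load score.
LOW, HIGH = 0.35, 0.75

def adjust_difficulty(current: int, load: float) -> int:
    """Return a new difficulty level (clamped to 1..10) given a load score."""
    if load < LOW:         # under-challenged: attention may start to wander
        return min(current + 1, 10)
    if load > HIGH:        # overloaded: risk of errors and burnout
        return max(current - 1, 1)
    return current         # in the zone: keep the challenge steady
```

A training loop would call this once per scenario step, so the program backs off when a trainee is struggling and ramps up when the material becomes too easy.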