Why AR/VR Eye Tracking is a Huge Deal: Guest Post by Lucas Rizzotto

October 18, 2018

Last year, Japanese company FOVE released the world’s first VR headset with built-in eye tracking. The technology showed a lot of promise, and in the months that followed, Facebook, Apple, and Google all acquired eye-tracking startups to incorporate the technology into their respective XR devices.

So what’s the big deal with AR/VR eye tracking, and how can it affect the technology industry?

Better Performance & Natural Focus

Eye tracking allows developers to optimize the performance of VR/AR experiences by concentrating rendering resources where the user is actually looking, a technique known as foveated rendering. This not only lowers VR’s high hardware barrier to entry but also gives creators the ability to build breathtaking visuals by spending their processing budget wisely.
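To make that concrete, here’s a minimal Python sketch of the tiered-shading idea behind foveated rendering. The angular boundaries and shading rates are illustrative assumptions, not values from any shipping headset:

```python
import math

# Illustrative tiers (assumed values): full detail near the gaze point,
# progressively cheaper shading toward the periphery.
SHADING_TIERS = [
    (5.0, 1.0),            # within 5 degrees of gaze: full shading rate
    (15.0, 0.5),           # 5 to 15 degrees: half rate
    (float("inf"), 0.25),  # beyond 15 degrees: quarter rate
]

def shading_rate(gaze_deg, region_deg):
    """Return the fraction of full shading work to spend on a screen
    region, based on its angular distance from the tracked gaze point."""
    eccentricity = math.hypot(region_deg[0] - gaze_deg[0],
                              region_deg[1] - gaze_deg[1])
    for max_angle, rate in SHADING_TIERS:
        if eccentricity <= max_angle:
            return rate

# The user looks at the centre of the view; a peripheral tile about
# 20 degrees away gets shaded at a quarter of the full rate.
print(shading_rate((0.0, 0.0), (20.0, 4.0)))  # -> 0.25
```

Real implementations blend between regions to hide the transitions, but the budget-saving logic is the same: the fovea covers only a couple of degrees of our visual field, so most of each frame can be rendered far more cheaply than the spot we’re actually looking at.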

 

Realistic natural focus will finally arrive to XR devices.

Another major visual improvement comes from the fact that eye-tracking technology can simulate natural focus realistically, a feature that has so far been absent from VR headsets: once a headset knows exactly where you are looking, it can sharpen and blur the scene the way your eyes would in the real world.

A New Way to Design User Interfaces and UX

With the screen-based devices we use today, whenever we want to perform an action we need to tell our device what we want it to do. Usually we do this by touching a certain area of the screen (touchscreen interactions) or by pointing at things with a cursor (using a mouse).

Before doing any of those things, however, we always look at what we’re about to interact with, and this is where eye-tracking comes in.

 

Your eyes will play a major role in the UIs of the future, effectively replacing standards like cursors, crosshairs, and touch interactions.

Eye tracking cuts out the middleman, allowing us to engage with content simply by looking at it. This will give rise to new ways of building user interfaces that feel natural and are incredibly accurate, replacing the need for cursors and most touch-based interactions altogether. Eye-tracking interactivity is also discreet by nature, and may allow us to use immersive computers in public spaces, possibly answering one of the biggest design questions in VR/AR today.
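As a sketch of how this could work in practice, here’s a toy dwell-based “gaze button” in Python. The 0.6-second dwell threshold and the class itself are illustrative assumptions; real gaze UIs tune these values carefully and often pair gaze with a confirmation gesture:

```python
import time

DWELL_SECONDS = 0.6  # assumed threshold; real UIs tune this per control

class GazeButton:
    """Toy gaze-activated button: fires once the user's gaze has rested
    on it for DWELL_SECONDS without looking away."""

    def __init__(self, name, on_activate):
        self.name = name
        self.on_activate = on_activate
        self._gaze_start = None

    def update(self, is_gazed_at, now=None):
        """Call every frame with whether the gaze ray hits this button."""
        now = time.monotonic() if now is None else now
        if not is_gazed_at:
            self._gaze_start = None  # gaze left the button: reset the timer
        elif self._gaze_start is None:
            self._gaze_start = now   # gaze just arrived: start timing
        elif now - self._gaze_start >= DWELL_SECONDS:
            self.on_activate(self.name)
            self._gaze_start = None  # require a fresh dwell to fire again

button = GazeButton("play", lambda name: print(f"{name} activated"))
button.update(True, now=0.0)
button.update(True, now=0.7)  # prints "play activated"
```

The dwell timer is what separates looking from choosing: without it, every passing glance would trigger an action, the classic “Midas touch” problem of gaze interfaces.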

 

An Analytics Oasis

Eye tracking will give VR/MR creators access to an unprecedented level of usage analytics: not only will they know exactly what users looked at or ignored throughout an experience, they’ll also be able to accurately measure engagement through pupil tracking.

You may have heard that human pupils dilate in response to physical attraction, but it goes much further than that. Pupil dilation betrays not only physical attraction but also mental strain and emotional engagement. It can even predict a user’s actions seconds before they take them (explored and explained in detail in my article about the future of immersive education).

 

Your eye tells more about how you’re feeling than you could ever imagine.

All of this will be immensely powerful for developers, allowing them to combine these signals into immersive software that reacts to a user’s emotional state and genuinely understands what’s going through their mind as they move deeper into the experience.
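As a rough illustration of what that data pipeline could look like, here’s a small Python sketch that turns a log of gaze samples into per-object attention stats. The sample format (timestamp, object looked at, pupil diameter) is an assumption for the example, not any particular tracker’s API:

```python
from collections import defaultdict

def summarize_gaze(samples):
    """Aggregate raw gaze samples into per-object dwell time and
    average pupil size, a crude proxy for engagement."""
    dwell = defaultdict(float)  # seconds spent looking at each object
    pupil = defaultdict(list)   # pupil diameters recorded while looking
    for (t0, obj, size), (t1, _, _) in zip(samples, samples[1:]):
        dwell[obj] += t1 - t0
        pupil[obj].append(size)
    return {
        obj: {
            "dwell_s": round(dwell[obj], 2),
            "mean_pupil_mm": round(sum(pupil[obj]) / len(pupil[obj]), 2),
        }
        for obj in dwell
    }

# Hypothetical 10 Hz log: (timestamp_s, object_id, pupil_diameter_mm)
log = [
    (0.0, "poster", 3.1), (0.1, "poster", 3.4),
    (0.2, "doorway", 3.0), (0.3, "poster", 3.8),
]
print(summarize_gaze(log))
# -> {'poster': {'dwell_s': 0.2, 'mean_pupil_mm': 3.25},
#     'doorway': {'dwell_s': 0.1, 'mean_pupil_mm': 3.0}}
```

A production system would also need to correct pupil size for scene brightness, since pupils react to light far more strongly than to emotion.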

 

New Gameplay Mechanics and Interactions

Eye tracking will also give rise to a number of new interactions and gameplay mechanics that were never possible before. Virtual characters will now know when you’re looking at them, even going as far as to cross-examine what you’re looking at and why.

Interrogation scenes are about to get a lot more dynamic in VR.

Users will be able to aim with their eyes, make narrative choices by simply gazing at an object, and meaningfully change the world around them with almost subconscious gestures, opening up a number of new opportunities for creative storytelling and interaction design.
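A minimal version of the underlying check, the part that lets a character know it’s being watched, can be sketched in a few lines of Python. The 10-degree cone is an assumed tolerance; in practice an engine would get the gaze ray from the eye tracker’s API:

```python
import math

GAZE_CONE_DEG = 10.0  # assumed tolerance around the gaze ray

def is_looking_at(gaze_origin, gaze_dir, target_pos, cone_deg=GAZE_CONE_DEG):
    """True if target_pos lies within a small cone around the user's gaze
    ray. gaze_dir is assumed to be a normalized direction vector."""
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0.0:
        return True  # target sits exactly at the eye: trivially "seen"
    cos_angle = sum(d * t for d, t in zip(gaze_dir, to_target)) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= cone_deg

# An NPC stands slightly to the right, five metres ahead; the user
# looks straight forward, so the NPC knows it's being watched.
print(is_looking_at((0, 1.7, 0), (0, 0, 1), (0.2, 1.7, 5)))  # -> True
```

Layer a timer on top of this check and a character can tell the difference between a passing glance and a stare, exactly the kind of nuance those interrogation scenes would exploit.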

 



We’d like to thank Lucas Rizzotto for contributing this piece from his collection of work to our blog. See more of his articles here!

 

Here at Yulio, we take advantage of our heatmap feature to track our users’ gaze duration and where their attention truly lies within a scene. Want to try this feature out? Sign up for a free Yulio account and get full access to our feature set for your first 30 days!

Author


Rachel Chan

Rachel is a writer for Yulio, covering all things VR. With a keen interest in creativity and innovation, Rachel enjoys seeing how businesses use VR in their workflow, and how they have been transformed by it.