
How to do eye tracking studies in human-eye resolution VR

Read this post to learn how you can easily run eye tracking research in virtual reality without requiring any development skills.

Writer: Ville Leppälä

Ville is the vertical lead for Varjo’s research and healthcare business, working closely with customers and partners.

VR opens up incredible opportunities for researchers

Are you a researcher who studies human behavior by analyzing eye gaze interactions? These days, researchers studying gaze interactions in a museum don’t actually need to be there in person, as the same experience can be presented through a high-fidelity virtual reality headset.

And yet eye tracking in VR is often considered challenging compared to traditional methods such as remote or mobile eye tracking. This is understandable: working with 2D desktop applications is familiar to most of us, while virtual reality is still a relatively unknown technology to the majority of researchers.

Fortunately, the latest advancements in VR technology have closed the gap. Read this post to learn how you can easily run eye tracking studies in virtual reality with Varjo headsets without requiring any development skills.

Starting with the basics

Varjo & Unity: Dynamic Human-Eye Resolution VR Photogrammetry.

Virtual reality offers you a flexible way of creating custom environments – a VR headset can be carried anywhere you go and easily fits in your cabin-sized luggage together with a powerful laptop.

When starting with VR, the first step is always content. This is everything the user sees inside the headset, run by a 3D engine. The 3D engine renders the picture to the VR headset, whether it’s a game environment or an architectural model.

If you’re building something truly unique, you’ll most likely turn to a 3D engine like Unity or Unreal, as they offer a graphical interface for modeling environments in high detail.

Building these environments can be time-consuming, but once ready, they can be easily displayed in the VR headset for engaging, immersive research.

Additionally, one interesting method for showing realistic and life-like environments in virtual reality is to do a photogrammetry scan of a real-world location and show it as a digital twin in VR.

Innovations in VR research held back by technical barriers

App stores are full of VR applications but very few of them offer eye tracking support by default.

For example, if you’ve been hoping to use VR for studying human behavior in a museum, until now the only way has been to build the 3D environment yourself, usually with Unity or Unreal Engine.

This means that previously, to do eye tracking research in VR, you first had to find a professional competent with game engines to build the environment. Then you had to plug in the eye tracker API and set up the recording pipeline. This requires a lot of effort, so at this point many decide to return to traditional remote and mobile eye trackers.

Due to these significant technical barriers, most people have yet to discover the vast potential of eye tracking in VR.

Varjo users can now export screen recordings and eye tracking data from Varjo Base without needing access to the application source code. This improvement enables eye tracking analytics in any Varjo-SteamVR compatible application – even without native eye tracking support. This video capture is from the Smithsonian American Art Museum or the Museum of Fine Art.

Varjo offers a new, simpler way to utilize eye tracking with any VR app

Unmodified capture from the Smithsonian American Art Museum or the Museum of Fine Art. With Varjo, you can conduct eye tracking studies in VR without developer skills and post-process the recordings with your preferred analytics tools.

At Varjo, we’ve decided to make eye tracking in VR as easy as possible for users without the need to spend extra time and resources on content development.

Our 2.3 software release allows you to use our integrated 20/20 Eye Tracker in any Varjo-compatible VR application, even if that application doesn’t support eye tracking by default.

This means that all Varjo-compatible VR applications now automatically support eye tracking, including SteamVR and OpenVR content.

You’ll be able to choose from over 4,000 Steam Store VR applications and then run eye tracking experiments on top of them.

Going back to the museum use case, an example museum scene can easily be downloaded from the wide selection available – for example the Smithsonian American Art Museum or the Museum of Fine Art. The user sees the museum environment in ultra-high resolution (up to 40 pixels per degree / 4K rendering per eye), which adds to both the immersion of the experience and the precision of the eye tracking data.

Follow these simple steps to get started

1. Run any Varjo-compatible application
2. Turn the 20/20 Eye Tracker on in Varjo Base; user calibration starts automatically
3. Toggle eye tracking data recording on
4. Run the session

Once the session is over, Varjo Base saves a screen recording video file of what the user has seen inside the headset and a CSV file that includes the eye tracking data from the session. The data is in simple X, Y coordinates associated with the video.
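If you want to sanity-check a recording before importing it into an analytics suite, a few lines of scripting are enough to replay the gaze point on top of the screen recording. Below is a minimal Python sketch; the file names, the CSV column names (time_s, gaze_x, gaze_y) and the assumption of normalized 0–1 coordinates are placeholders for illustration, so check them against the header of your own export.

```python
# Minimal sketch: overlay recorded gaze samples on the Varjo Base screen recording.
# File names, column names and the 0..1 coordinate convention are assumptions.
import cv2
import pandas as pd

gaze = pd.read_csv("session_gaze.csv")           # eye tracking export (assumed name)
video = cv2.VideoCapture("session_capture.mp4")  # screen recording (assumed name)

fps = video.get(cv2.CAP_PROP_FPS)
width = int(video.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT))

out = cv2.VideoWriter("gaze_overlay.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))

frame_idx = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    t = frame_idx / fps
    # Pick the gaze sample closest in time to this video frame.
    sample = gaze.iloc[(gaze["time_s"] - t).abs().argmin()]
    if pd.notna(sample["gaze_x"]) and pd.notna(sample["gaze_y"]):
        px = int(sample["gaze_x"] * width)   # assuming normalized 0..1 coordinates
        py = int(sample["gaze_y"] * height)
        cv2.circle(frame, (px, py), 12, (0, 0, 255), 3)  # red ring at the gaze point
    out.write(frame)
    frame_idx += 1

video.release()
out.release()
```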

This data can then be easily analyzed by creating heatmaps and areas of interest, or by going deeper into scan paths and fixations. The data is also compatible with the established workflows of eye tracking analytics suites like iMotions.
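To give a feel for that first analysis step outside a dedicated suite, here is a small Python sketch that turns the exported X, Y samples into a simple gaze heatmap. The column names and the 0–1 coordinate range are, again, assumptions for illustration.

```python
# Minimal gaze heatmap from the exported CSV, assuming "gaze_x"/"gaze_y"
# columns with normalized 0..1 coordinates.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

gaze = pd.read_csv("session_gaze.csv").dropna(subset=["gaze_x", "gaze_y"])

# Bin the gaze samples into a 2D grid (rows = y, columns = x) and smooth it.
hist, _, _ = np.histogram2d(gaze["gaze_y"], gaze["gaze_x"],
                            bins=(108, 192), range=[[0, 1], [0, 1]])
heatmap = gaussian_filter(hist, sigma=3)

plt.imshow(heatmap, cmap="hot", extent=[0, 1, 1, 0])  # top-left origin, like screen coordinates
plt.title("Gaze density")
plt.colorbar()
plt.savefig("gaze_heatmap.png", dpi=150)
```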

Go more advanced

The simple way gets you started easily, but if you want more advanced, scene-specific optimization, you’ll need to edit the application source code. You’ll then be able to do pure 3D tracking, meaning that gaze interactions are mapped not to 2D images but to actual 3D objects, which is a clear benefit in dynamic environments. All the benefits of 3D tracking can also be easily captured with the help of tools like Cognitive3D.
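To illustrate what pure 3D tracking means conceptually, the sketch below intersects a gaze ray with labeled 3D objects instead of comparing gaze against a 2D image. It is an engine-agnostic Python toy with made-up objects and values; in a real scene, the gaze origin and direction would come from the headset’s eye tracking API inside your 3D engine or a tool like Cognitive3D.

```python
# Conceptual sketch of 3D gaze tracking: find which labeled 3D object a gaze
# ray hits. Objects, positions and gaze values below are invented examples.
import numpy as np

def first_hit(origin, direction, objects):
    """Return the name of the closest sphere the gaze ray hits, or None."""
    direction = direction / np.linalg.norm(direction)
    best_name, best_t = None, np.inf
    for name, center, radius in objects:
        oc = origin - center
        b = 2.0 * np.dot(direction, oc)
        c = np.dot(oc, oc) - radius ** 2
        disc = b * b - 4.0 * c
        if disc < 0:
            continue  # the ray misses this sphere
        t = (-b - np.sqrt(disc)) / 2.0
        if 0 < t < best_t:
            best_name, best_t = name, t
    return best_name

# Hypothetical exhibits in a museum scene: (name, center in meters, radius).
exhibits = [
    ("painting_a", np.array([0.0, 1.6, 3.0]), 0.5),
    ("sculpture_b", np.array([2.0, 1.2, 4.0]), 0.7),
]

gaze_origin = np.array([0.0, 1.7, 0.0])       # roughly eye height
gaze_direction = np.array([0.0, -0.03, 1.0])  # looking slightly down, straight ahead

print(first_hit(gaze_origin, gaze_direction, exhibits))  # -> "painting_a"
```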

Check our previous blog post on the benefits of VR-based eye tracking for research to learn more.

Get started with Varjo’s integrated 20/20 Eye Tracker

Varjo’s integrated eye tracking allows you to gain deep, reliable, and immediate insight into what your subjects see and experience.

Using our human-eye resolution VR/XR headsets, you can track what’s behind users’ reactions and behaviors as they interact with photorealistic objects, environments and virtually any stimuli – all with the full control and flexibility of virtual and mixed reality.

All of our headsets – VR-2, VR-2 Pro and XR-1 Developer Edition – come with built-in eye tracking.

You can read more about our 20/20 Eye Tracker in this blog post.

For more information on the technical specifications of Varjo 20/20 Eye Tracker, please visit our Developer Site.
