A better interaction method for VR

As a big fan of Virtual Reality, I believe there is huge room to improve its dominant interaction method, the laser. So I explored a more intuitive VR interaction: Virtual Reality Hands.

Laser interaction is limited

All current VR products use the laser interaction model: the controller emits a laser that leaves a dot on the target, and the dot serves as a cursor to navigate, select, and click.

Laser as an interaction method has three major problems:


1. Using a laser to interact with targets requires considerable effort to control the hands and wrists.

2. The cursor is inefficient in VR: the ratio between hand movement and cursor movement is large, which makes precise control of the cursor difficult and reduces efficiency.

3. The laser and the cursor are annoyingly shaky because of humans’ innate hand tremor.

The solution

VR Hands

Interact with VR content directly with the hands.

How does it work?

Why do so many companies choose lasers?

Lasers are needed to interact with content placed 1 meter away.

According to this theory, content in VR should be placed at least 100 cm away. But most people’s hands can only reach 60-80 cm in front of their eyes, so they need lasers to reach farther content.

Why is 1 meter the most comfortable distance?

Vergence-Accommodation Conflict (V-A Conflict) is an inherent problem of current VR display technology. It causes eye fatigue when the viewing distance is less than 1 meter, but it can be ignored when the viewing distance is greater than 1 meter.

Reframe the problem

Need a better interaction for faraway contents

This leads to two possible solution directions:
1. Technology breakthroughs to eliminate V-A conflict.  
2. Find better interaction method for faraway contents.

I chose the second direction because:
1. There is not much I can do about the technical limitations.
2. Interacting with faraway content is necessary even if the technology allowed direct interaction with near content, because it improves efficiency: people don’t need to move closer to faraway content to interact with it.


Focus on the experience first, don’t worry about the feasibility.

Design should consider feasibility, but at the ideation phase, design should never be limited by technology. So I decided to pursue the best design first, then consider whether the solution is feasible.

Break down VR interactions

There are several common types of interaction, such as scrolling, swiping, and zooming, but the most fundamental one is clicking. Once clicking is solved, the other interactions won’t be hard. So this design focuses primarily on clicking.

Breaking down clicking into smaller parts reveals three very specific design challenges: controller, selection, and manipulation.

Design details

# 1

Controller  ·  Selection  ·  Manipulation

Hands as controller

In the real world, people use their hands to perform all kinds of tasks: clicking, touching, probing, and so on. Even though they can’t click anything out of their hands’ reach, they have a closely related behavior: pointing at faraway items with their fingers. For “clicking” faraway objects, hands are the best controllers.

It is possible to track hand and finger motion using products like Leap Motion.

# 2

Controller  ·  Selection  ·  Manipulation

Selection is a tradeoff between intuition and accuracy.

All selection faces the same tradeoff between intuition and accuracy. Extreme intuition and extreme accuracy can’t coexist: accuracy comes at the price of extra attention, which lowers intuition, and if people want to operate more intuitively, they have to sacrifice some accuracy.

For VR, improving intuition is more important than improving accuracy.

Accuracy is the basis of selection, while intuition is what makes selection convenient and pleasurable. When designing VR interaction, we must focus on intuition first; otherwise, only bad ideas such as laser interaction result.

Abandon the cursor to improve intuition.

The key to improving selection intuition is to abandon the cursor, because a cursor always invites correction, and correction means higher accuracy at the cost of lower intuition.

A mouse uses a cursor to select extremely small targets.

VR interactions do not require cursor-level accuracy. Most content and controls in VR are already big enough for direct finger clicks, much like mobile phone UIs.

Ensure minimal selection accuracy (Technical stuff, TLDR)

Hands are not accurate enough for selection.
First, I ran a survey asking whether people believe they can point at something faraway accurately. Their answers were yes. Then I wanted to check objectively whether people can actually click something faraway accurately.

I ran an experiment tracking the direction of the fingertip to see whether people can really do remote clicking accurately, but it was not successful: I could not set up a precise enough environment to record and measure the data.

So I looked up related research and found that people can’t point at something faraway accurately (Mayer, Schwind et al. 2018). According to Mayer, at a viewing distance of 2 m, the offset between the index finger ray cast (IFRC) and the target is about 27.03 cm with a standard deviation of 18.95 cm. Scaling linearly, at a viewing distance of 1 m the offset is about 13.52 cm with a standard deviation of 9.48 cm, which is not accurate enough for VR operation.
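Incidentally, converting those linear offsets into visual angles (my own arithmetic, not taken from the paper) shows why scaling linearly with distance is reasonable: the angular pointing error stays the same, about 7.7°, at both distances.

```python
import math

def offset_angle_deg(offset_m: float, distance_m: float) -> float:
    """Visual angle of a linear pointing offset at a given viewing distance."""
    return math.degrees(math.atan(offset_m / distance_m))

# Offsets reported above: 27.03 cm at 2 m, 13.52 cm at 1 m.
print(round(offset_angle_deg(0.2703, 2.0), 1))  # ~7.7 degrees at 2 m
print(round(offset_angle_deg(0.1352, 1.0), 1))  # ~7.7 degrees at 1 m as well
```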

Use eyes to ensure accuracy.
Eyes are far more accurate than hands: their selection accuracy is limited only by the smallest thing they can see. So eyes are accurate enough for selection.

In the selection process, people naturally stare at (eye-select) a target before they click it; hand responses are about 90 msec slower than eye responses (Brueggemann 2007).
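This gaze-first, hand-confirms flow can be sketched as a simple pairing of timestamped gaze samples with click events. The function and data shapes below are my own illustration, not a real eye-tracker API; the only fact taken from the text is that the eyes reach the target before the hand does.

```python
def select_and_click(gaze_samples, click_events):
    """Pair each click with the target the eyes were on at that moment.

    gaze_samples: list of (timestamp_ms, target_id or None), sorted by time.
    click_events: list of timestamp_ms when a click was recognized.
    Since eyes lead hands by ~90 ms, the gaze target is already settled
    by the time the click arrives.
    """
    results = []
    for t_click in click_events:
        # Take the most recent gaze sample at or before the click time.
        target = None
        for t_gaze, tid in gaze_samples:
            if t_gaze <= t_click:
                target = tid
            else:
                break
        results.append((t_click, target))
    return results
```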

Technical Possibilities
Angular size, measured in degrees, is how size is measured in VR. At a distance of 1 meter, the comfortable readable font size is 6.04 cm tall, or 3.45° (Volodymyr Kurbatov 2017b).

Tobii Pro VR, one of the best eye-tracking products on the market, has an eye-tracking accuracy of about 0.5 degrees. The minimum readable text height is about 1.33 degrees, and actual clickable buttons and content in VR are far bigger than that. So Tobii Pro VR is fully capable of precisely locking onto targets in VR.
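As a sanity check on the numbers above (my own arithmetic, not taken from the cited sources), the visual angle of an object follows from basic trigonometry:

```python
import math

def angular_size_deg(height_m: float, distance_m: float) -> float:
    """Visual angle (in degrees) subtended by an object of a given height."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

def height_for_angle_m(angle_deg: float, distance_m: float) -> float:
    """Inverse: physical height needed to subtend a given visual angle."""
    return 2 * distance_m * math.tan(math.radians(angle_deg) / 2)

# Comfortable readable font: 6.04 cm at 1 m.
print(round(angular_size_deg(0.0604, 1.0), 2))        # ~3.46 deg, matching the 3.45° figure
# Minimum readable text of 1.33 deg at 1 m works out to only ~2.3 cm tall.
print(round(height_for_angle_m(1.33, 1.0) * 100, 1))  # ~2.3 cm
```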

Time latency: another important issue is making sure the eye tracker has enough time to lock onto the target before the hand reaches it. The Tobii Pro VR eye tracker’s latency is about 10 ms, and hand responses are about 90 ms slower than eye responses, so the equipment latency can safely be ignored.

Proper feedback for selection

Since there is no cursor hovering over the target to indicate the selection, other feedback is needed.

Show feedback when the index finger extends

The perfect timing for the feedback is after the user’s index finger has pointed at the target but before the final manipulation. That way the feedback is not too aggressive, and users can dismiss it at any time by retracting the index finger.

# 3

Controller  ·  Selection  ·  Manipulation

Use the velocity pattern to recognize the manipulation.

It is a little tricky to determine whether the user has performed the manipulation because there is no physical contact.

It is impossible to determine the manipulation by measuring the moving distance of the fingertip, because people click in different ways.

Inspired by the iPhone X, I decided to use the velocity pattern of the fingertip to recognize the click. Even though different clicks cover different distances, they all share the same pattern of sudden velocity change.

[Figures: how people click, and the velocity of the fingertip during a click]
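A minimal sketch of such a velocity-based click recognizer, assuming fingertip positions sampled at a fixed rate. The speed thresholds are illustrative guesses, not measured values, and the function names are my own.

```python
def detect_click(positions, dt=1 / 60, press_speed=0.6, rest_speed=0.1):
    """Recognize a click from a sudden spike then drop in fingertip speed.

    positions: list of (x, y, z) fingertip samples in meters, one per frame.
    dt: sampling interval in seconds (here 60 Hz, an assumed frame rate).
    Returns the sample index where the click completes, or None.
    """
    # Per-frame fingertip speeds (m/s) from consecutive position samples.
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        speeds.append(dist / dt)

    armed = False
    for i, v in enumerate(speeds):
        if v > press_speed:            # sudden fast motion: press phase begins
            armed = True
        elif armed and v < rest_speed: # abrupt stop after the press: click!
            return i
    return None
```

Because only the speed profile matters, a short shallow tap and a long deep poke both trigger the same spike-then-stop pattern, which is the point of the velocity-based approach.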


Elble, R. J. and W. C. Koller (1990). Tremor. Johns Hopkins University Press.

Oculus (2018a). VR Best Practices. [website] Available at: https://developer.oculus.com/design/latest/concepts/bp-vision/

Oculus (2018b). VR Best Practices. [website] Available at: https://developer.oculus.com/design/latest/concepts/bp-vision/

Kramida, G. (2016). "Resolving the Vergence-Accommodation Conflict in Head-Mounted Displays."

Oculus (2018c). VR Best Practices. [website] Available at: https://developer.oculus.com/design/latest/concepts/bp-vision/

Foley, J. and R. Held (1972). "Visually directed pointing as a function of target distance, direction, and available cues." Perception Psychophysics 12(3): 263-268.

Lin, L. and S. Jörg (2016). Need a hand?: how appearance affects the virtual hand illusion. Proceedings of the ACM Symposium on Applied Perception. Anaheim, California, ACM: 69-76.

Mayer, S., V. Schwind, R. Schweigert and N. Henze (2018). The Effect of Offset Correction and Cursor on Mid-Air Pointing in Real and Virtual Environments. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems - CHI '18. Montreal QC, Canada, ACM: 1-13.

Brueggemann, J. (2007). "The hand is NOT quicker than the eye." Journal of Vision 7(15): 54-54.

Tobii Pro (2018a). Tobii Pro VR Integration. [website] Available at: https://www.tobiipro.com/product-listing/vr-integration/

Tobii Pro (2018b). The Human Eye. [website] Available at: https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/the-human-eye/

Volodymyr Kurbatov (2017a). 10 Rules of Using Fonts in Virtual Reality [website] Available at: https://medium.com/inborn-experience/10-rules-of-using-fonts-in-virtual-reality-da7b229cb5a1

Volodymyr Kurbatov (2017b). 10 Rules of Using Fonts in Virtual Reality [website] Available at: https://medium.com/inborn-experience/10-rules-of-using-fonts-in-virtual-reality-da7b229cb5a1
