Having a look at the implementation of hand tracking on the Oculus Quest. Removing controllers from the equation can bring real benefits, especially when creating experiences with accessibility in mind. Although hand tracking is still in its infancy, I wanted to check out its current functionality and uses pre-Quest 2, using the Unity engine and C#.
Notes:
Pinch
Pinch is the basic interaction primitive for UI interactions using hands. A successful pinch of the index finger can be considered the same as a normal select or trigger action for a controller, i.e., the action that activates a button or other control on a UI.
To detect whether the finger is currently pinching and to check the pinch’s strength, call the GetFingerIsPinching() and GetFingerPinchStrength() methods from OVRHand.cs. Pass the relevant finger constant defined in the HandFinger enum for the finger that you want to query. The finger constants are: Thumb, Index, Middle, Ring, and Pinky.
The progression of a pinch gesture is indicated by the returned float value. For each finger pinch, the corresponding value ranges from 0 to 1, where 0 indicates no pinch and 1 indicates full pinch with the finger touching the thumb.
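The pinch checks above can be sketched as a small Unity component. This is a minimal example assuming the Oculus Integration package is installed; the `hand` field and the `PinchThreshold` constant are illustrative names, not part of the Oculus API.

```csharp
using UnityEngine;

// Sketch: polls the index-finger pinch on an OVRHand every frame.
// The OVRHand reference is assigned in the Inspector; the threshold
// value is an assumption, tune it for your own interactions.
public class PinchDetector : MonoBehaviour
{
    [SerializeField] private OVRHand hand;

    private const float PinchThreshold = 0.9f; // treat >= 0.9 as a firm pinch

    private void Update()
    {
        // True while the index finger and thumb are considered pinching.
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Ranges from 0 (no pinch) to 1 (finger fully touching the thumb).
        float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);

        if (isPinching && strength >= PinchThreshold)
        {
            Debug.Log($"Index pinch at strength {strength:F2}");
        }
    }
}
```

The same calls work for the other fingers by passing a different `HandFinger` constant, e.g. `OVRHand.HandFinger.Middle`.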
In addition to the pinch strength, OVRHand.cs also provides the GetFingerConfidence() method to measure the confidence level of a finger pose. The confidence is reported as either Low or High, indicating how much trust the tracking system has in its estimate of that finger's pose.
To retrieve the confidence level of a finger pose, call the GetFingerConfidence() method and pass the finger constant for which you want to track the confidence level.
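Combining the two ideas, a pinch can be ignored whenever tracking is unreliable. Again a hedged sketch assuming the Oculus Integration package; the component and field names are my own.

```csharp
using UnityEngine;

// Sketch: only react to a pinch when the tracking system reports
// High confidence for the index finger's pose. OVRHand and its nested
// TrackingConfidence enum come from the Oculus Integration package.
public class ConfidentPinchDetector : MonoBehaviour
{
    [SerializeField] private OVRHand hand;

    private void Update()
    {
        OVRHand.TrackingConfidence confidence =
            hand.GetFingerConfidence(OVRHand.HandFinger.Index);

        // Skip gesture handling while the pose estimate is Low confidence,
        // e.g. when the hand is partially occluded or at the edge of view.
        if (confidence == OVRHand.TrackingConfidence.High &&
            hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            Debug.Log("High-confidence index pinch detected.");
        }
    }
}
```

Gating on confidence like this helps avoid false activations when the hands are occluded or leave the tracking volume.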
| Handtracking | |
| --- | --- |
| Output/tested | Oculus Quest |
| Engine | Unity 2020.1.0f1 |
| Creative | Unity |
| Git | |