New Oculus Open Source Library and Pirates Demo App + Q&A with Developer Luca Mefisto on Hand Tracking Innovation

Oculus Developer Blog
Posted by Oculus VR
March 31, 2021
Hand Tracking

Hand tracking has great potential to create immersive and intuitive VR experiences, particularly by enhancing the natural expression of social interactions. That potential comes with room for improvement, as robust gestures can be difficult to implement.

Over the past half-year, we have been working with Luca Mefisto, a passionate and talented VR developer from Madrid, on solving one of the biggest pain points of building hand tracking interactions in VR. Together, we’re excited to release an open source library that will help developers easily create great hand-object interactions. We’re also launching Hand Posing Tool: Pirates Demo on App Lab. The app is set on a pirate ship and allows you to perform all types of interactions: take the wheel of the ship, load up the cannon and light the fuse, grab a book, open a bottle and grab the message inside, open the treasure trunk, and more. The app is fully open source and will serve as a great showcase and sample for developers getting started with the library.

We know there’s a need for advanced tooling for hand-based applications. In the case of hand-object interactions, there are two sets of challenges.

The first is: “what’s the best way to represent my hand as it interacts with an object in VR?” We found that the most compelling approach is snapping to canned poses when grabbing an object. The sacrifice in fidelity compared to dynamic hand-object interactions (e.g. managed by a physics or collision engine) is largely made up for by the greater robustness this method provides: hands stay stable when holding objects, and it feels great. Side note: this is a clear trend we’ve seen with hand tracking. It’s often better to sacrifice a little fidelity for more stability. Another illustration of this is the hand tracking loss recovery API, where we substitute canned animation for real tracking data when tracking is lost. This works well to maintain immersion even when hand tracking fails.
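The snapping idea can be sketched in a few lines. This is a minimal illustration, not the library's actual API: it treats a hand pose as a flat list of joint angles, picks the authored pose closest to the tracked one, and blends toward it (a real implementation would compare per-joint rotations as quaternions).

```python
def pose_distance(tracked, canned):
    """Sum of squared differences across joint angles (radians)."""
    return sum((t - c) ** 2 for t, c in zip(tracked, canned))

def snap_to_pose(tracked, canned_poses, blend=1.0):
    """Pick the closest authored pose and blend the tracked hand toward it.

    blend=1.0 snaps fully to the canned pose, hiding tracking jitter;
    lower values keep some of the live tracking data for extra fidelity.
    """
    best = min(canned_poses, key=lambda p: pose_distance(tracked, p))
    return [t + blend * (b - t) for t, b in zip(tracked, best)]
```

With a full blend the hand locks to the authored pose while the object is held, which is exactly the stability-over-fidelity trade-off described above.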

The second challenge is how to author the canned hand poses that hands “snap to” when grabbing an object. This is usually done by a hand animation artist who manually creates hand poses for each virtual object. This is painstaking work, as multiple poses need to be created per object. With this new library, these poses can be recorded in seconds using hand tracking itself. The library also generates additional poses automatically by making clever use of symmetry (e.g. there is no need to record the left and right hand separately anymore, or to record multiple poses along an axis).
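The symmetry trick for hand poses can be illustrated simply. The sketch below (an assumption about the approach, not code from the library) mirrors recorded joint positions across the YZ plane by negating x, so a pose captured with the right hand can be reused for the left; mirroring joint rotations as well would additionally require reflecting each quaternion, which is omitted here.

```python
def mirror_pose(joint_positions):
    """Mirror a hand pose across the YZ plane (negate x) so a pose
    recorded with one hand can be reused for the other.

    joint_positions is a list of (x, y, z) tuples in hand-local space.
    """
    return [(-x, y, z) for (x, y, z) in joint_positions]
```

Mirroring is an involution: applying it twice recovers the original pose, which makes it cheap to generate the opposite-hand variant on demand instead of storing it.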

We spoke with VR developer Luca Mefisto to learn more about his experience with hand tracking, the decision to open source, and more:

Tell us a little bit about yourself and how you got started developing for Oculus.

I am a software engineer by trade who started working with XR in 2010, and I’m still as excited about these technologies as I was back then!

I worked at a company doing training simulations and marketing applications that moved into VR as soon as the Oculus DK1 came out. When I tried it, I had an epiphany and became fully committed, eventually quitting my job and co-founding the VRManchester community, as well as collaborating on as many projects as I could, always keeping a balance between experimental experiences and more business-oriented projects.

Since then I have released some games, worked with big artists and NGOs, solved problems for big corporations and small indies, helped shape the community, and taught, among other things. There are way too many interesting problems in this space, and I like being involved in too many of them.

What was it like for you integrating hand tracking for the first time?

One of my current side projects is a neuro-rehabilitation tool for people with acquired brain injuries that I am developing in conjunction with a team of neuropsychologists.

Accessibility is key here and Hand Tracking was one of the obvious choices. I had used Hand Tracking solutions in the past for experimental projects and it was not the easiest ride.

But Oculus Quest, which has it all integrated into the headset, combined with the ability to swap between controllers and hands, plus an SDK that works well with the rest of the OVR code, turned this technology into a hassle-free solution, not just for me but for the therapists.

What was great about it? What pain points did you experience?

For our patients, not having to hold and understand the triggers of the controller was a great leap forward in accessibility. Moreover, the increased feeling of immersion made them focus more on the exercises, as they would take the hands “for granted.” It also improved the safety of the experience, since their hands are free if they lose balance (we also align a real table with the virtual one to give them somewhere to lean on).

Sadly, it was not all incredibly straightforward. Some brain injuries can make patients take the virtual world “too literally,” and seeing a hand disappear, grabbing something in an unrealistic way, or a lot of shaking when tracking quality is low can make them question not just the experience but their own proprioception. We had to put a lot of extra work into our design to alleviate these points, and part of that work is the open source tool that we shared with the community.

How did you come to build the Pirates Demo for hand poses?

Building the Pirates Demo experience and open-sourcing it turned out to be a great decision. First of all, it forced me to test the hand-posing tool in a different, and more common, scenario. It helped identify many pain points and led me to redesign some systems so they would be easier for other developers to use and extend, while also serving as a full-fledged example of how to use the tool.

It also allowed us to test the interactions with a much wider user base: not just developers using the internal tool, but also players eager to have some fun on the pirate ship. People are more aware of the existence of the tool and have more channels to communicate with us and provide feedback.

What was it like collaborating with the Oculus team to build this demo?

It has been wonderful to see that my vision for the tool is so well aligned with theirs. Since the very beginning they have guided (and supported) me to make the tool better while still leaving me in control of the general direction and reach of the project. Having worked as a “lone-wolf” for so many years, it has also been refreshing to be able to pick their talented brains and discuss design decisions. Couldn’t have hoped for a better companion!

What is featured in the hand posing open source repository?

The main feature of the tool is the ability to author high-quality grabbing poses and snap hands to them. Using hand tracking as a dev tool, and not just an interaction method, you can generate the poses for an object in mere seconds. It also supports snapping surfaces, pose mirroring, inversion, and many more quality-of-life features, so you only need to “mimic” the grabbing pose once and reuse it as much as you want. It doesn’t matter if the player is using a different hand skeleton, controllers, or other hand representations.

We built it with extensibility in mind. I did not want to force a workflow onto other developers, but rather leverage their current one, so they can add the poses and keep using their own hand prefabs, animations, or grabbers. Of course, we also provide a good implementation of these things so they don’t have to start from scratch! The Pirates Demo is a great example of what can be achieved with all the code provided.

Why did you decide to open source this library?

At first, I just wanted to have a snapping tool that felt natural for my patients but also featured a very optimised workflow, since I was the only developer on the team and I am definitely not an animator (they typically take care of these matters).

When I shared my first results on Twitter, I discovered that my problem was also the problem of pretty much every VR dev out there, and the tweet went viral quickly even though it was very technical and not at all fancy, making me realise I could greatly assist our community by open-sourcing it.

What are some of your key takeaways from building this demo?

The main takeaway is that even with an input that sounds as definitive as Hand Tracking, there is no one-size-fits-all solution. Once you decide to go for a Hand Tracking implementation, a lot of decision branches open up in front of you and you must understand the requirements of your users in order to choose wisely. Hand Tracking goes hand in hand with design, and can be one of the most powerful tools for immersion in your VR experience if done well.

What are you working on now?

I am currently working on some features to keep hands stable in poor-tracking scenarios. After that, I want to dive into Locomotion techniques using Hand Tracking and gestures, keep adding more samples to the Pirate Ship, and automate the poses when no authored ones are provided… So much to do!

How can the community get and build the new Hands API/HandPose Showcase?

Developers can get the code here.

To bring it into your project, you can use the Unity Package Manager, so it stays a separate package and does not interfere with your code. You don’t need to download the Pirates Demo, as the base package also comes with less fancy examples covering all the features.
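For readers unfamiliar with the Package Manager route, a git-hosted package is typically declared as a dependency in the project's `Packages/manifest.json`. The package name and URL below are placeholders for illustration, not the library's actual identifiers; use the ones from the repository linked above.

```json
{
  "dependencies": {
    "com.example.hand-posing": "https://github.com/example/hand-posing-tool.git"
  }
}
```

Unity resolves git dependencies on the next domain reload, keeping the package's code isolated from your project's `Assets` folder.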

What do you hope developers do with the open source repo + Pirates demo?

My hope is that developers use this even when they want their players to use controllers, as a system that not only supports both input methods with the same hand mesh, but also increases the responsiveness and immersion of their experience without many hours of work. My dream would be to see games in the Oculus Store implemented with this tool and to read reviews from players saying “Great hand interactions!” while the devs think: “Ah! We did not even break a sweat.”

I would also love to keep reading their feedback! If the tool is too difficult to use in some regard, or if they are missing a key feature or a fun interaction in the Pirate Demo, I’m all ears.

I want to thank Oculus for their ongoing support; they have been amazing. I’d also like to thank the whole VR dev community. We know how interesting our field is, even though some problems can be quite hard, so let’s keep solving them collaboratively as we have been doing for the last few years. I am very proud of our industry.

Share your thoughts on hand tracking, the open source library or the Pirates Demo in the comments or the Developer Forums. We’re looking forward to seeing the creative uses and continued innovation in hand tracking ahead.