Accurately tracked, responsive, and properly modeled hands make your application more fun, more immersive, and easier to use. However, poorly implemented hands are a common problem in many apps, and a significant cause of store rejections.
We’ve put together a collection of assets, samples, and SDKs to make it easy to add high-quality hands to your app. The hands in these samples meet the standards described in our Oculus Touch best practices, and satisfy store requirements for accurate hand tracking.
We provide support for high-quality hands in two ways: the Avatar SDK, and the custom hand samples for Unity and Unreal Engine 4. The most fully-featured of these is the Unity hand sample, which demonstrates hands that can grab and collide with objects.
The Avatar SDK provides quality hands or controllers for your app, with no need to create any assets yourself. Users will see the same hands in your app as they already see in Oculus Home and the Universal Menu.
It has its own extensive documentation and getting started guides. The documentation covers Unity, UE4, and custom engines, across both Touch and Gear VR.
Please note that the Avatar SDK covers only the visual representation of the hands. You can, however, look at the AvatarGrabSample in the Oculus Sample Framework for Unity to see how to add grabbing and throwing functionality to those hands.
Custom Hands and Controllers in Unity
In the Assets/SampleScenes/Hands directory of the Oculus Sample Framework for Unity, you'll find several scenes and prefabs that are ready to adapt to your application or to use as-is. The two scenes we'll discuss here are CustomControllers and CustomHands.
CUSTOM CONTROLLERS
This simple scene shows how to use the Touch controller prefabs in your app. Note that the controller models have been authored to be placed at the exact same position as the hand anchors, with no offset in position or orientation. Simply parent them onto the hand anchors, and you will have perfectly registered controllers.
The controller sample is simple and easy to drop into your project. TouchController.cs updates Animator variables according to the current input state, and the rest of the sample's logic lives inside the Animator.
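For a sense of what that update loop looks like, here is a minimal C# sketch that polls OVRInput each frame and forwards the values to an Animator. The parameter names are hypothetical placeholders chosen for illustration; see TouchController.cs in the Sample Framework for the actual variables.

```csharp
using UnityEngine;

// Minimal sketch: forward Touch input state to Animator parameters each
// frame. The parameter names are illustrative placeholders, not the
// ones used by TouchController.cs.
public class TouchControllerSketch : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.RTouch;
    private Animator m_animator;

    void Start()
    {
        m_animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Analog triggers drive blend weights; buttons and capacitive
        // touches drive booleans.
        m_animator.SetFloat("IndexTrigger",
            OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller));
        m_animator.SetFloat("GripTrigger",
            OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller));
        m_animator.SetBool("ButtonOnePressed",
            OVRInput.Get(OVRInput.Button.One, controller));
        m_animator.SetBool("ThumbstickTouched",
            OVRInput.Get(OVRInput.Touch.PrimaryThumbstick, controller));
    }
}
```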
CUSTOM HANDS
The CustomHands scene demonstrates how to use the custom hand prefabs in your app. While the controller sample is simply a couple of animated models, the sample hands provide the capability to grab objects.
The sample scene demonstrates that different objects may need different approaches to interaction. To allow tower-building, the boxes do not snap to your grip when you grab them: this would send other boxes in the stack flying. However, grabbing a ping pong ball without snapping it into a good grip pose leaves the ball floating, and looks distractingly bad. You can toggle the Snap Position property of these objects to see the results for yourself.
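The sample exposes this choice on its grabbable component. As an illustrative stand-in, a grabbable object might carry nothing more than a flag. This Grabbable class is a hypothetical simplification, used again by the grabber sketch below:

```csharp
using UnityEngine;

// Hypothetical stand-in for the sample's grabbable component. When
// snapPosition is true, the object jumps to the hand's grip point at
// grab time; when false, it keeps its current offset from the hand.
public class Grabbable : MonoBehaviour
{
    public bool snapPosition = true;
}
```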
The scene uses a straightforward algorithm to grab and release objects. Collider triggers attached to each hand keep track of all GameObjects with Grabbable components within them. When the user's grip moves from open to closed, the hand grabs the closest object within these volumes. When the user releases the grip trigger, the object is released, and the hand's velocity is imparted to the object. For more details on implementation subtleties, see the comments in OVRGrabber.cs and other source files included in the scene.
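A stripped-down version of that algorithm might look like the sketch below. This is a simplification for illustration, not the actual OVRGrabber.cs; it assumes the hand object carries a kinematic Rigidbody plus a trigger collider, and it uses the hypothetical Grabbable component from above.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Simplified grab/release loop: a trigger volume on the hand tracks
// nearby Grabbable objects; closing the grip grabs the closest one,
// and opening it releases the object with the hand's velocity.
public class GrabberSketch : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.RTouch;
    private readonly HashSet<Grabbable> m_candidates = new HashSet<Grabbable>();
    private Grabbable m_held;
    private float m_prevGrip;

    void OnTriggerEnter(Collider other)
    {
        var grabbable = other.GetComponentInParent<Grabbable>();
        if (grabbable != null) m_candidates.Add(grabbable);
    }

    void OnTriggerExit(Collider other)
    {
        var grabbable = other.GetComponentInParent<Grabbable>();
        if (grabbable != null) m_candidates.Remove(grabbable);
    }

    void Update()
    {
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        if (m_held == null && m_prevGrip < 0.5f && grip >= 0.5f)
            GrabClosest();
        else if (m_held != null && grip < 0.5f)
            Release();
        m_prevGrip = grip;
    }

    void GrabClosest()
    {
        Grabbable closest = null;
        float closestSqrDist = float.MaxValue;
        foreach (var candidate in m_candidates)
        {
            float sqrDist = (candidate.transform.position - transform.position).sqrMagnitude;
            if (sqrDist < closestSqrDist)
            {
                closest = candidate;
                closestSqrDist = sqrDist;
            }
        }
        if (closest == null) return;

        m_held = closest;
        m_held.GetComponent<Rigidbody>().isKinematic = true;
        if (m_held.snapPosition)
            m_held.transform.position = transform.position; // snap to grip point
        m_held.transform.SetParent(transform, worldPositionStays: true);
    }

    void Release()
    {
        var body = m_held.GetComponent<Rigidbody>();
        m_held.transform.SetParent(null, worldPositionStays: true);
        body.isKinematic = false;
        // Impart the hand's tracked velocity so throws feel natural. This
        // is tracking-space velocity; a full implementation transforms it
        // into world space first.
        body.velocity = OVRInput.GetLocalControllerVelocity(controller);
        m_held = null;
    }
}
```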
Note that the prefabs include cylinder collision authored to match the hands, which you can remove if you don’t require collision. The sample only enables collision (via Hand.cs) when the user is pointing, to allow some physical interaction without making it difficult to grab objects.
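Toggling that collision can be as simple as the following sketch (assuming the physics colliders are children of the hand object; the real Hand.cs does more than this):

```csharp
using UnityEngine;

// Sketch: enable the hand's physics colliders only while the user is
// pointing, so the hand can push objects without blocking grabs.
// Assumes the colliders live on children of the hand object.
public class HandCollisionSketch : MonoBehaviour
{
    public OVRInput.Controller controller = OVRInput.Controller.RTouch;
    private Collider[] m_colliders;

    void Start()
    {
        m_colliders = GetComponentsInChildren<Collider>();
    }

    void Update()
    {
        // Treat "pointing" as: index finger lifted off the trigger while
        // the grip is held (a hypothetical heuristic for this sketch).
        bool pointing =
            !OVRInput.Get(OVRInput.Touch.PrimaryIndexTrigger, controller) &&
            OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller) > 0.5f;

        foreach (var c in m_colliders)
        {
            if (!c.isTrigger)
                c.enabled = pointing;
        }
    }
}
```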
This sample scene uses an OVRPlayerController to allow the user to move around the scene, but many apps will want to use their own locomotion system instead. The hands have no dependency on the OVRPlayerController: feel free to pull the OVRCameraRig out of the hierarchy, and delete the OVRPlayerController.
Custom Hands and Controllers in Unreal Engine 4
Our Unreal Engine 4 sample is located in the Samples section of our Unreal Engine 4 branch on GitHub. Build and run the scene in VR Preview. Pressing the Menu button on the Touch controllers toggles between hands and controllers; play around with them for a minute to see the different supported animations.
The implementation consists of three primary parts, with all of the logic in blueprints (no C++ is used):
1. UE4's own Motion Controller component updates the transforms of the Touch controllers.
2. The hand and controller Child Actor Components use relatively simple blueprints (Left/Right Controller and Left/Right Hand), which listen for input events and update their member variables accordingly.
3. The corresponding animation blueprints (Left/Right HandAnimBP and Left/Right ControllerAnimBP) use the variables set in part 2 to blend the animations for the grip, point, and thumbs-up poses.
For your own implementation, you can use the VRCharacter included in the sample, or you can set up your own player character and just use the provided blueprints.
Tips for Modeling Your Own Props and Weapons
If you’re making your own tracked objects from scratch, you’ll want to make sure they’re just as well-registered as our sample assets.
A simple trick is to import our supplied Touch controller models into the scene you're working on in your modeling tool of choice. The Touch controller models are intended to be anchored exactly at the origin (0, 0, 0). If your custom object lines up properly with the Touch controller in your modeling tool, it will also look correct in your app when you export it.
As an example, in our UE4 sample you can display both the hand and controller models at the same time and see how well they line up.