In this post, David Farrell, Oculus Medium Software Engineer, provides updates on the Move Tool, an overview of how these updates work from a technical perspective, and an explanation of how they help users move and stretch 3D shapes quickly and easily.
If you haven't already, be sure to check out the immersive tool and all of its sculpting features on the Oculus Rift today!
Oculus Medium's Move Tool is a recently updated feature that enables users to move, twist and stretch virtual clay. Artists can easily translate, rotate and scale with their hands, allowing them to stay focused on the creation process while avoiding cumbersome modes or keyboard shortcuts.
I recently spoke at GDC about the Move Tool. The presentation deck is now available on GitHub and contains a great deal of information about the math used for the tool's deformation (be sure to check out the speaker notes for added context). You can also check out recently released sample code from Medium, which includes the Move Tool's nonelastic and elastic deformations. It is licensed under a permissive BSD license and compiles in C++ and GLSL.
Lastly, before diving in, it's worth noting that we've previously written about the Move Tool and the nuts and bolts that make it work in the following posts:
We hope you find this content interesting + helpful!
The New Elastic Move
In Medium 2.0, we added a mode to the Move Tool that we titled Elastic Move. With Elastic Move, the Move Tool uses a physically based simulation of an elastic material, so the artist feels like they are interacting with a real object. This simulation is tuned to use a volume preserving deformation. With Elastic Move, artists can adjust, reproportion, and pose their sculpts.
An effective simulation needs many details working well together, such as the visual and audio quality of the headset, precise hand tracking, and comfort. Elastic Move shows how improving simulations can lead to greater depths of immersion, as your brain recognizes that shapes are moving in a lifelike and predictable way. This provides a sense that you are within a realistic environment, not inside a sterile computer simulation.
Improvements to Nonelastic Move
We also updated the behavior of the Move Tool in nonelastic mode so that clay in the falloff region (between the Move Tool's inner and outer sphere) follows more of a swept curve pattern. The curve is controlled by how much a user's hand rotates between the start and end orientation of the move. This curve gives the user the ability to create a wide variety of shapes with the movement of their hand. The following video illustrates this difference in behavior:
We actually arrived at this behavior by accident, but we were more than pleased with the results, so we kept it in.
To explain what's taking place: the Move Tool's mesh deformation treats each vertex in the triangle mesh as a point and integrates that point through a time-dependent vector field. The vector field represents the motion from the start to the end of the move gesture. We use a fictitious time parameter, such that the start of the gesture is at t=0 and the end is at t=1. We then use a numerical integrator to find the final position of the point.
Both the nonelastic and elastic move tool use this framework of integrating points through a vector field, using different vector fields for the two modes. We'll now discuss how to construct the vector field for the nonelastic mode.
First, we record the pose of the Touch controller every frame while the user is moving the mesh. Each pose consists of the translation and orientation of the Touch controller, the scale (controlled with the thumbstick), and the time the pose was recorded. Only the first and last poses are used to calculate the Move Tool deformation; the in-between poses are thrown away.
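The two retained poses can then be reduced to a start point plus velocities. As a sketch (the Pose and Motion layouts below are assumptions based on the fields just described, not Medium's actual structs; the orientation-to-angular-velocity step is omitted for brevity):

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

// Assumed pose record: what the text says is captured each frame.
struct Pose {
    Vec3  position;   // Touch controller translation
    float scale;      // scale controlled with the thumbstick
    float time;       // when the pose was recorded
};

// Assumed motion summary consumed by a deformation builder.
struct Motion {
    Vec3  origin;         // start position of the move
    Vec3  linearVelocity; // translation per unit time
    float scaleFactor;    // total scale change over the gesture
    float time;           // start time of the gesture
    float dt;             // gesture duration
};

// Reduce the first and last recorded poses to a Motion; the
// in-between poses are discarded, as described in the text.
Motion motionFromPoses(const Pose& first, const Pose& last)
{
    Motion m;
    m.dt     = last.time - first.time;
    m.time   = first.time;
    m.origin = first.position;
    m.linearVelocity = { (last.position.x - first.position.x) / m.dt,
                         (last.position.y - first.position.y) / m.dt,
                         (last.position.z - first.position.z) / m.dt };
    m.scaleFactor = last.scale / first.scale;
    return m;
}
```

Orientation is handled analogously: the relative rotation between the two poses becomes an angular velocity.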
Next, we define the time-dependent vector field with this function (from the nonelastic sample):
INLINE Deformation buildDeformation(Motion motion)
{
    Deformation deformation;
    deformation.origin = motion.origin;
    deformation.linearVelocity = motion.linearVelocity;
    deformation.rotationTensor = skewSymmetric(motion.angularVelocity);
    deformation.strainTensor = identityMat3x3() * log(pow(motion.scaleFactor, 1/motion.dt));
    deformation.displacementGradientTensor = deformation.rotationTensor + deformation.strainTensor;
    deformation.time = motion.time;
    deformation.dt = motion.dt;
    return deformation;
}
INLINE vec3
NonElasticEvaluateODE(float t, vec3 x, Deformation deformer)
{
    // advect the center of the move
    float originLerp = t - deformer.time;
    vec3 originAdvected = deformer.origin + deformer.linearVelocity*originLerp;
    vec3 R = x - originAdvected;
    vec3 trans = deformer.linearVelocity;
    vec3 disp = deformer.displacementGradientTensor * R;
    return trans + disp;
}
Finally, we integrate each vertex in the triangle mesh with the vector field using numerical integration. Integrating a point through that vector field yields the same result as transforming it by the 3D relative transformation, but we can easily add a falloff effect and blend multiple deformers together by summing their individual vector fields. This means that points are smoothly blended across planes of symmetry, as when Medium's mirror mode is enabled. For an example of how we efficiently integrate each vertex, please refer to the following Sculpting and Simulations Sample Code.
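For illustration, a standard fourth-order Runge-Kutta (RK4) integration through such a field can be sketched as follows (the sample code's actual integrator may differ; the types and names here are illustrative):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Any time-dependent vector field v(t, x); NonElasticEvaluateODE()
// has this shape once a Deformation is bound to it.
using VectorField = Vec3 (*)(float t, Vec3 x);

// Classic RK4: carry a point through the field from t = 0 to t = tEnd
// (tEnd would be the 0-1 falloff factor in the scheme described above).
Vec3 integrateRK4(VectorField v, Vec3 x, float tEnd, int steps)
{
    float h = tEnd / steps;
    float t = 0.0f;
    for (int i = 0; i < steps; i++) {
        Vec3 k1 = v(t,           x);
        Vec3 k2 = v(t + 0.5f*h,  x + k1*(0.5f*h));
        Vec3 k3 = v(t + 0.5f*h,  x + k2*(0.5f*h));
        Vec3 k4 = v(t + h,       x + k3*h);
        x = x + (k1 + k2*2.0f + k3*2.0f + k4) * (h / 6.0f);
        t += h;
    }
    return x;
}
```

For a constant field (pure translation), the result reduces to the start position plus the translation, matching the affine-transform equivalence described above.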
The above method works by integrating from t=0 to t=falloff, where falloff is a 0-1 factor based on the distance of the point between the inner and outer spheres of the Move Tool. When developing the code, we used a different technique: we integrated from t=0 to t=1 and multiplied the displacement vector returned by NonElasticEvaluateODE() by the falloff factor. However, to exactly match an affine transformation, we should have also modified NonElasticEvaluateODE() so that the origin's offset was multiplied by the falloff.
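That modification would look roughly like the following (a sketch using simplified stand-ins for the sample's vec3/mat3 types, not Medium's actual code):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Mat3 { float m[3][3]; };
static Vec3 operator*(const Mat3& A, Vec3 v) {
    return { A.m[0][0]*v.x + A.m[0][1]*v.y + A.m[0][2]*v.z,
             A.m[1][0]*v.x + A.m[1][1]*v.y + A.m[1][2]*v.z,
             A.m[2][0]*v.x + A.m[2][1]*v.y + A.m[2][2]*v.z };
}

struct Deformation {
    Vec3  origin;
    Vec3  linearVelocity;
    Mat3  displacementGradientTensor;
    float time;
};

// Like NonElasticEvaluateODE(), but with the origin's offset scaled by
// the falloff factor, so that integrating over t = 0..1 and scaling the
// displacement by the falloff exactly matches an affine transformation.
// Dropping this scale is what produces the swept-curve behavior.
Vec3 NonElasticEvaluateODEFalloff(float t, Vec3 x, const Deformation& d, float falloff)
{
    float originLerp = t - d.time;
    Vec3 originAdvected = d.origin + d.linearVelocity * (originLerp * falloff);
    Vec3 R = x - originAdvected;
    return d.linearVelocity + d.displacementGradientTensor * R;
}
```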
Dropping the multiply happens to create a vector field that produces the curve demonstrated in the above video. This small change has no effect when the falloff term is 1 (for vertices in the Move Tool's inner sphere), but has a large effect when the falloff term is between 0 and 1 (for vertices in between the Move Tool's outer and inner sphere), generating shapes with a smooth curve.
The nonelastic and elastic modes of the Move Tool are used in complementary ways:
In nonelastic mode, the Move Tool is volume creating and adds new virtual clay to the sculpt.
In elastic mode, the Move Tool is volume preserving and stretches and contracts the virtual clay.
These two modes can be toggled in the Move Tool's UI. We also added a shortcut (double tapping the green button) to quickly toggle between them.
Medium artists have used these dual volume creating/preserving modes in many ways:
“Inspired by the way car designers drag out their tape drawings or scrape clay, I had a similar feeling using the move tool to pull geometry into more complex surfaces. The move tool with the double tap to switch modes is pretty unique to Medium and creates shapes that are tough to imagine in other programs." — Jay Evans
“I often try to create organic forms with elegant twists and contours, and I use the move tool in a variety of ways throughout my process, utilizing the full spectrum of the settings within the tool on any given sculpt. The move tool is so intuitive yet absolutely magical when considered against the constraints of traditional physical media, and also previous 2D sculpting software that lack motion controls and stereoscopic viewing.” - Will Atwood
"When I first started sculpting the OC5 logo I wasn’t quite sure how to pull off the organic flow of the shape, but once I started experimenting with the Move tool I knew that I was headed in the right direction." - Wyatt Savarese
Elastic Move Under The Hood
Under the hood, the Elastic Move Tool is based on Kelvinlets, which you can read more about in the following research paper. It describes how Kelvinlets simulate an elastic material as if it were embedded in an infinite 3D continuum. The Elastic Move Tool uses the Touch controllers to guide that simulation.
The Kelvinlets paper above discusses how to apply the equations of linear elasticity to a sculpting application. The resulting equations work extremely well for real-time applications: they are a closed-form solution, use only ALU (no texture fetches), and operate independently per vertex, so they are usable in a vertex shader. The equations do require solving an ordinary differential equation (ODE), but that's similar to a physics or particle simulation in a game. The technique works with any triangle mesh and needs no mesh precomputation. Similar results can be achieved using the finite element method (FEM), but FEM requires that the mesh be tetrahedralized and needs an expensive solve.
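As a sketch of that closed form, the regularized Kelvinlet displacement for a point force can be written as follows (based on the paper's formulation; mu is the shear modulus, nu the Poisson's ratio, eps the regularization radius, and this is an illustration rather than Medium's exact code):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Displacement at offset r from the brush center for a point force f,
// regularized with radius eps. The material constants are
// a = 1/(4*pi*mu) and b = a/(4*(1 - nu)); with nu = 0.5 the
// deformation is volume preserving, which is what Elastic Move uses.
Vec3 kelvinlet(Vec3 r, Vec3 f, float mu, float nu, float eps)
{
    const float pi = 3.14159265358979f;
    float a = 1.0f / (4.0f * pi * mu);
    float b = a / (4.0f * (1.0f - nu));
    float rEps = std::sqrt(dot(r, r) + eps * eps);
    // Closed form: no mesh precomputation, no texture fetches,
    // evaluable independently per vertex.
    Vec3 term1 = f * ((a - b) / rEps);
    Vec3 term2 = r * (b * dot(r, f) / (rEps * rEps * rEps));
    Vec3 term3 = f * (a * eps * eps / (2.0f * rEps * rEps * rEps));
    return term1 + term2 + term3;
}
```

Note how raising mu (the stiffness) scales every displacement down, which is the behavior the Move Tool Strength slider exploits.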
Although Kelvinlets can be applied to any triangle mesh, the deformed mesh may become poorly tessellated. A common way to address this problem is dynamic remeshing, where the length of each triangle's edge is kept within some constant length. Medium doesn't use dynamic remeshing, and instead, each move operation converts the triangle mesh to an SDF and tessellates it back to a mesh. This is covered in the following post: Medium Under the Hood: Part 2: Move Tool Implementation.
Kelvinlets present a set of tradeoffs to consider. The equations make the assumption that the object being deformed is embedded in an infinite continuum, which is a good approximation for many things, but not exactly how reality works. The tradeoff is that we ignore dealing with boundaries or friction between objects, which limits what can be done with the simulation. On the other hand, the simulation can be computed at high framerates, and Kelvinlets can be computed in a vertex shader at VR framerates. This level of physical simulation is useful in many situations, but be aware of the tradeoffs made with this technique. Ultimately, for digital sculpting applications like Oculus Medium, Kelvinlets work very well.
The Kelvinlet equations are linear in terms of the force vector and force matrix, which is one reason they are so efficient. Another benefit of linearity is that multiple Kelvinlets can be simply summed together. In Medium, this means that we can support sculpting across a symmetry plane by simply summing Kelvinlets of multiple deformers.
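The superposition idea can be illustrated like this (with a toy radial kernel standing in for the full Kelvinlet expression; the Deformer struct and kernel here are illustrative assumptions, but any kernel that is linear in the force behaves the same way):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Deformer { Vec3 center; Vec3 force; };

// Stand-in radial kernel: linear in the force, like a Kelvinlet.
Vec3 kernel(Vec3 r, Vec3 f, float eps)
{
    return f * (1.0f / (std::sqrt(dot(r, r)) + eps));
}

// Superposition: because the kernel is linear, the total field is
// simply the sum of the per-deformer fields.
Vec3 totalDisplacement(Vec3 p, const std::vector<Deformer>& ds, float eps)
{
    Vec3 u{0.0f, 0.0f, 0.0f};
    for (const Deformer& d : ds)
        u = u + kernel(p - d.center, d.force, eps);
    return u;
}

// Mirror a deformer across the x = 0 symmetry plane, as when a
// symmetry mode pairs every brush with its reflection.
Deformer mirrorX(const Deformer& d)
{
    return { {-d.center.x, d.center.y, d.center.z},
             {-d.force.x,  d.force.y,  d.force.z} };
}
```

Summing a deformer with its mirror yields a displacement field that is itself symmetric about the plane, which is why sculpting across a symmetry plane "just works" with linear kernels.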
Stiffness and Compressibility
There are two parameters that affect the material's response to a force in the Kelvinlet equations: the material stiffness and material compressibility. Stiffness is related to Young's Modulus, and defines the ratio of stress to strain of the elastic material. Compressibility is related to Poisson's Ratio, and defines how a material expands or contracts in directions perpendicular to the direction of stretching.
Elastic Move uses material stiffness for the Move Tool Strength. The Move Tool has a slider in its UI that controls its strength. The slider goes from 0% to 100%, and maps to stiffness values of 5.0 to 1.0. When the strength is 100% (stiffness value of 1.0), then the deformation at the Move Tool's center will exactly follow the movement of the tool. As the Move Tool's strength percentage lowers, we increase the stiffness value, so that the deformation at the Move Tool's center will be less influenced by the movement. The upper stiffness value is unbounded, so Medium's use of 5.0 as the largest stiffness value is arbitrary. However, stiffness values much larger than 5.0 make the virtual clay practically unmovable, so mapping the Move Tool strength percentage to the stiffness range 5.0 to 1.0 feels right.
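As a sketch, the slider-to-stiffness mapping might look like this (the post does not specify the interpolation curve, so a linear mapping is assumed here):

```cpp
#include <cassert>

// Map the Move Tool strength slider (0-100%) to a stiffness value in
// the range 5.0 (clay barely moves) down to 1.0 (the deformation at
// the tool's center exactly follows the tool). The 5.0 cap is
// arbitrary, as noted in the text; a linear mapping is an assumption.
float stiffnessFromStrength(float strengthPercent)
{
    float t = strengthPercent / 100.0f;  // 0..1
    return 5.0f + (1.0f - 5.0f) * t;     // 0% -> 5.0, 100% -> 1.0
}
```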
The stiffness parameter compensates for the weightlessness of the Touch controllers, and makes Medium artists feel more in control of the virtual clay. The result is that the clay feels as if it resists movement, just as a real-world material does.
Elastic Move uses a compressibility setting of 0.5. Most real-world materials have Poisson's Ratio values between 0.0 and 0.5. Medium's use of 0.5 means that the simulation is volume preserving, such that no virtual clay is added or subtracted from the scene as it is stretched with Elastic Move. We don't expose this setting to the user, because for sculpting in Medium, we don't want to overload the user with too many tunable parameters. The volume preserving behavior of 0.5 compressibility is the most useful behavior, so the simulation is hardcoded to that.
We can imagine a content creation application or game that lets the user define material parameters based on real-world values. The user could pick a layer or object, set the material type (for example: foam, clay, sand, etc.) or set the stiffness and compressibility directly, and those settings affect all interactions with the material.
Here is a video of various stiffness settings:
Visual Quality During Preview
It's worth looking at how Medium handles shading and level-of-detail with the Move Tool.
The Move Tool is split into two phases: the preview phase, where the sculpt's geometry is immediately affected by the Move Tool, and the application phase, which is a more computationally intensive process that converts the deformed geometry back to a signed distance field.
During the preview phase, the vertex normals should be deformed along with the vertex positions. However, correctly computing the vertex normals requires averaging the face normals of the triangles neighboring each vertex, which is somewhat computationally expensive. Instead, during the preview phase, Medium switches to flat shading with the normal of the triangle's face. To do that, we compute the face normal procedurally in the pixel shader, by taking the cross product of the derivative of the world position in the x direction with the derivative of the world position in the y direction (in GLSL, this is cross(dFdx(worldPosition), dFdy(worldPosition))).
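A CPU analog makes it clear why this works (illustrative sketch): in the shader, the screen-space derivatives of the interpolated world position are two non-parallel vectors lying in the triangle's plane, so their cross product points along the face normal, just like the cross product of two triangle edges.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Face normal from two edges of the triangle; dFdx/dFdy of world
// position in a pixel shader give two in-plane vectors, so
// cross(dFdx(p), dFdy(p)) points in this same direction.
Vec3 faceNormal(Vec3 a, Vec3 b, Vec3 c)
{
    return normalize(cross(b - a, c - a));
}
```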
Because Medium's triangle meshes are quite dense, using the triangle face's normal during the preview phase is not objectionable. Once the application phase is complete, the vertex normals are recomputed from the signed distance field, and we return to smooth shading.
Another challenge with the Move Tool is that it is more performance-intensive to render in the preview phase than during normal sculpting. Medium dynamically adjusts the LOD of the mesh during the preview phase as necessary to maintain framerate. Medium generates four level-of-detail meshes using an algorithm called Transvoxel, and we bias the sculpt's LOD level during the Move Tool's preview phase. This can result in a visible snap at the start of a move when Medium changes LOD levels, but because the application phase computes the move deformation at the highest level of detail, the artifacts during the preview phase are, again, not objectionable.
GDC Talk + Sample Code
As noted above, at this year's GDC I gave a presentation titled Sculpting And Simulations With 6DOF Controllers. The talk covered the math that we use in the Move Tool, going from the position and orientation of a Touch controller to the equations behind nonelastic and elastic deformation, and concluding with a few numerical integration techniques. See the following GitHub link for the presentation: Sculpting And Simulations With 6DOF Controllers, and be sure to read the speaker notes in those slides, as they contain the bulk of the information.
Additionally, we released sample code from Medium of the Move Tool's nonelastic and elastic deformations. This code is very close to what we use in Medium; it is licensed under the permissive BSD license and compiles in C++ and GLSL: Sculpting and Simulations Sample. Any additional code snippets in this article that are not found in the above-mentioned presentation or GitHub repository are subject to Oculus' Examples License.
The Future
We very much look forward to seeing what is created with the application. If you'd like to contact me about these topics, feel free to reach out: @nosferalatu
Thanks for reading!
- David Farrell