Medium Under The Hood: Part 1 - Developing the Move Tool
David Farrell
Oculus Medium is an immersive VR experience designed for Touch that lets you sculpt, model, paint, and create tangible objects. We recently added the Move Tool, which allows users to grab, move, and reproportion parts of their sculpt. The Move Tool was one of the primary features of Medium's "Summer of Move" (version 1.2) update and a long-standing request from Medium users.
The Move Tool is powerful in VR because it allows you to translate and rotate the 3D object at the same time. Non-VR applications require you to make those changes as a series of separate mouse-and-keyboard operations, which is cumbersome and less intuitive, and usually restricted to a 2D plane such as screen space or a plane perpendicular to the surface.
Over the course of the last year, we prototyped three versions of the Move Tool. The first two were abandoned for various reasons we'll discuss later in this post. The third and final version (for now) of the tool ended up using a new, more powerful way of working with the sculpt data. In this post, I’ll cover how we use the Touch controllers to create a displacement field, how we apply that field to a 3D object, and discuss each of the three prototypes that we developed. In future posts, we'll be taking a closer look at how Medium converts a triangle mesh to a signed distance field and how that's been useful for other types of sculpting operations.
6DOF Input
The Touch controllers provide a highly accurate, six-degree-of-freedom (6DOF) input for each hand. 6DOF means you can control both translation and rotation in 3D with your hand. This is an entirely different experience than using a two-degree-of-freedom mouse on a flat screen. You’re free to think directly in 3D instead of performing mental gymnastics to map a mouse’s movement to a 3D space, and there are some adjustments to the sculpt that are only possible with the ability to both translate and rotate.
This is a unique feature that distinguishes Medium's Move Tool from similar tools in other 2D programs. When using Medium, it's easy to take for granted that you can simply move your hand to specify a position and orientation. When you go back to a flat screen application with a mouse, though, the mouse's two degrees of freedom limit how you can express a transform. VR applications like Medium provide a straightforward way to both translate and rotate at the same time.
One way to interpret a 6DOF input is to define a transformation matrix. With your hand controlling a transformation matrix, you can easily manipulate 3D objects. The Move Tool uses input from the Touch controllers to create a pair of transform matrices: a start transform matrix that defines what the move affects, and an end transform matrix that defines the final position and orientation of the sculpt. This pair of matrices is fed into a deformation function that produces a displacement field.
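To make that concrete, here's a minimal sketch of turning a 6DOF controller pose into a transform matrix, assuming a GLM-style math library (the function and variable names are illustrative, not Medium's actual API):

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Build a 4x4 transform from a controller's 6DOF pose:
// rotate by its orientation, then translate to its position.
glm::mat4 poseToMatrix(const glm::vec3& position, const glm::quat& orientation)
{
    return glm::translate(glm::mat4(1.0f), position) * glm::mat4_cast(orientation);
}

// Captured once when the trigger is pressed, and again as the hand moves:
// glm::mat4 startTransform = poseToMatrix(startPos, startRot);
// glm::mat4 newTransform   = poseToMatrix(currentPos, currentRot);
```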
Here’s a visualization of the start and end transformation matrices:
Sculpting with Displacement Fields
All three versions of the Move Tool used the concept of a displacement field that’s applied to the sculpt. A displacement field defines a vector at each point in the field that can be used to displace (deform) an object embedded in it. It’s a straightforward way to turn hand gestures from the Touch controllers into something that can be applied to a 3D object.
When you first hold the trigger down, we capture the current transform matrix as the start of the deformation. As you move your hand around, we capture a second transform matrix as the destination of the deformation. When you release the trigger, we apply the deformation to the mesh (prior to that, we show the user a quick preview of what the deformation will look like-- more on that later). While the trigger is down, your other hand can still grab and manipulate the sculpt's location, but the Move Tool's location stays in the same place. This two-handed gesture is often physically easier than doing all the movement with one hand.
From a more mathematical point of view: when you first hold the trigger down, we set the StartTransformMatrix, and when you release the trigger, we set the NewTransformMatrix. We also let the user control the size of the sphere that the tool applies to, with an inner sphere inside it where everything is displaced at 100% (we smoothly interpolate the influence for points that fall in between). Given all of that, we can compute each displaced point with this formula:
displaced(p) = p + falloff(p) * (NewTransformMatrix * inv(StartTransformMatrix) * p - p)
where
p = a 3D point
falloff(p) = smoothstep(outerRadius, innerRadius, distance(p, StartTransformMatrix.translation))
and smoothstep() is the same as HLSL/GLSL's smoothstep(), so the falloff is 1.0 inside the inner radius and fades to 0.0 at the outer radius.
This function creates a displacement field that we apply to the sculpt.
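Here's a minimal C++ sketch of that displacement function using GLM (names are illustrative; Medium's actual implementation may differ):

```cpp
#include <glm/glm.hpp>

glm::vec3 displace(const glm::vec3& p,
                   const glm::mat4& startTransform,
                   const glm::mat4& newTransform,
                   float innerRadius, float outerRadius)
{
    // Rigid delta that carries points from the start pose to the end pose.
    glm::mat4 delta = newTransform * glm::inverse(startTransform);
    glm::vec3 moved = glm::vec3(delta * glm::vec4(p, 1.0f));

    // Full strength inside innerRadius, fading to zero at outerRadius.
    glm::vec3 center = glm::vec3(startTransform[3]);
    float falloff = glm::smoothstep(outerRadius, innerRadius,
                                    glm::distance(p, center));

    return p + falloff * (moved - p);
}
```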
Here's what the displacement field looks like. The video demonstrates translation, rotation, swirling, and scaling. Other displacement fields could be used — for example, we're thinking about how to use this for a pinch/bulge tool.
Medium’s Surface Representation
Before I explain the development of the three versions of the Move Tool, I’ll explain some details about how Medium represents the sculpt data.
Medium defines 3D objects (sculpts) using an implicit surface. The surface is stored as a signed distance field (SDF) in a 3D grid of voxels. An SDF has many useful properties that distinguish it from triangle meshes, such as quick and robust CSG operations like add, subtract, and intersect. Also, an SDF is never self-intersecting; it’s always clear whether a point is inside or outside. Triangle meshes, on the other hand, are an explicit method of storing the surface. They work well when you want to translate or displace a point, but they can self-intersect and have other numerical issues. CSG operations are notoriously difficult with triangle meshes.
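To see why CSG is so quick and robust on an SDF, here are the standard distance-field operators, where a and b are distance values sampled from two fields (this is textbook SDF math, not Medium's specific code):

```cpp
#include <algorithm>

float sdfAdd(float a, float b)       { return std::min(a, b); }  // union
float sdfSubtract(float a, float b)  { return std::max(a, -b); } // a minus b
float sdfIntersect(float a, float b) { return std::max(a, b); }
```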
A 3D array of distance values quickly becomes quite large. To manage this large data set, we store only distance values close to the surface. This region is called the narrow band of the level set. Medium's narrow band has a width of two, meaning any distance value is clamped to the range [-2.0, +2.0]. Medium stores the distance data sparsely: it divides the 3D array into 8x8x8 blocks of data, and only blocks containing solid parts of the object are stored in a hash table.
On the left is a sculpt; on the right is a visualization of a slice of the SDF. The narrow band is the light blue area near the surface.
On the left is the SDF without the surface. Blue is outside the surface, and orange is inside. On the right is a closer view with the signed distance numbers. Notice how data is clamped to the narrow band.
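A hypothetical sketch of what this sparse block storage might look like (the structure and names are illustrative only):

```cpp
#include <cstdint>
#include <unordered_map>

struct BlockCoord {
    int32_t x, y, z;
    bool operator==(const BlockCoord& o) const { return x == o.x && y == o.y && z == o.z; }
};

struct BlockCoordHash {
    size_t operator()(const BlockCoord& c) const {
        // Simple spatial hash mix; a production hash would be chosen with more care.
        return (size_t(uint32_t(c.x)) * 73856093u) ^
               (size_t(uint32_t(c.y)) * 19349663u) ^
               (size_t(uint32_t(c.z)) * 83492791u);
    }
};

// One 8x8x8 block of narrow-band distances, clamped to [-2.0, +2.0].
struct SdfBlock {
    float distance[8 * 8 * 8];
};

struct SparseSdf {
    std::unordered_map<BlockCoord, SdfBlock, BlockCoordHash> blocks;

    float sample(int x, int y, int z) const {
        BlockCoord c{x >> 3, y >> 3, z >> 3}; // block containing this voxel
        auto it = blocks.find(c);
        if (it == blocks.end())
            return 2.0f; // unstored blocks are treated as "far outside"
        // Index within the 8x8x8 block.
        return it->second.distance[(x & 7) + ((y & 7) << 3) + ((z & 7) << 6)];
    }
};
```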
Move Tool Prototype 1: Level Set Advection
Our first version of the Move Tool used the level set method to advect the SDF through a displacement field. Level set advection is a technique where a simple partial differential equation describes movement through a velocity field. As the equation is repeatedly applied to the 3D grid of SDF data, the surface is moved.
Our implementation of level set advection is based on an Eulerian simulation. It begins with a signed distance field, steps the simulation by applying the advection partial differential equation to the field, and re-initializes the field to have distance values. To implement the Move Tool on top of this, we used the displacement field to advect the sculpt surface.
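For illustration, here's a minimal sketch of one first-order upwind advection step on a dense grid. This is a textbook formulation with unit grid spacing and a caller-supplied velocity function, not Medium's implementation:

```cpp
#include <vector>
#include <algorithm>

struct Vec3 { float x, y, z; };

void advectStep(std::vector<float>& phi, int W, int H, int D, float dt,
                Vec3 (*velocity)(int, int, int))
{
    auto at = [&](int x, int y, int z) {
        x = std::clamp(x, 0, W - 1);
        y = std::clamp(y, 0, H - 1);
        z = std::clamp(z, 0, D - 1);
        return phi[x + y * W + z * W * H];
    };
    std::vector<float> out(phi.size());
    for (int z = 0; z < D; ++z)
    for (int y = 0; y < H; ++y)
    for (int x = 0; x < W; ++x) {
        Vec3 v = velocity(x, y, z);
        // Upwind differencing: difference in the direction the data comes from.
        float dx = v.x > 0 ? at(x,y,z) - at(x-1,y,z) : at(x+1,y,z) - at(x,y,z);
        float dy = v.y > 0 ? at(x,y,z) - at(x,y-1,z) : at(x,y+1,z) - at(x,y,z);
        float dz = v.z > 0 ? at(x,y,z) - at(x,y,z-1) : at(x,y,z+1) - at(x,y,z);
        // dt must satisfy the CFL condition: the surface moves at most one
        // grid cell per step, which is why a single move takes many steps.
        out[x + y * W + z * W * H] = at(x,y,z) - dt * (v.x*dx + v.y*dy + v.z*dz);
    }
    phi.swap(out);
}
```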
This version mostly worked the way we expected, but two issues left us unhappy with the results: a loss of detail over time, and a high performance cost. The loss of detail is due to small rounding errors caused by approximating the surface with distance samples on a grid. When high-frequency details fall between grid sample points, the details are lost. These errors accumulate with each step of the simulation and are known as numerical dissipation. The second issue with the advection technique is the high performance cost. The step size of the simulation must be limited so that the surface moves no more than one grid point per step, so a single move requires hundreds of simulation steps.
Even though this advection approach didn't work well for the Move Tool, it is a useful technique for other tools, as it automatically handles topological changes. We use it in Medium's Swirl Tool, which also suffers from numerical dissipation, but swirl quickly mixes everything together anyway, so the loss of detail is not a huge problem. However, for the Move Tool, detail preservation is important-- when you move an arm and hand, you expect that the fingers on the hand don't evaporate due to numerical dissipation.
Because of the loss of detail and performance issues, we decided to not release this version of the tool. Here’s a video of what it looked like:
sad move tool version 1.0. you can see the loss of detail with even a simple sculpt.
Move Tool Prototype 2: Inverse Displacement
The second version of the Move Tool was written by Software Engineer Aaron Lieberman and worked by doing the following:
1. On trigger down, copy the current state of the SDF data off to the side.
2. As the user moves the Touch controller around, imagine that the tool is a capsule. At each grid point in the capsule, sample the cached SDF by transforming that point by the inverse of the move deformation (to find its original distance value).
In contrast to the first version of the Move Tool, this version kept a lot of the detail as you moved the tool around. However, there were still a couple of issues with this approach.
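Before getting to those issues, here's a rough sketch of that sampling step, again assuming GLM (the names and the callable sampler are illustrative, not Medium's API; falloff blending is omitted for simplicity):

```cpp
#include <glm/glm.hpp>
#include <functional>

// sampleSdf: trilinear sample of the cached (copied) SDF at a position.
float inverseDisplacedDistance(const glm::vec3& gridPoint,
                               const glm::mat4& startTransform,
                               const glm::mat4& newTransform,
                               const std::function<float(const glm::vec3&)>& sampleSdf)
{
    // The forward move carries points by NewTransform * inv(StartTransform),
    // so to find where a grid point "came from", apply the inverse product.
    glm::mat4 inverseMove = startTransform * glm::inverse(newTransform);
    glm::vec3 source = glm::vec3(inverseMove * glm::vec4(gridPoint, 1.0f));
    return sampleSdf(source);
}
```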
The first issue is that you can only apply a rigid transformation to a signed distance field. Only translation and rotation preserve the Euclidean distances held by the signed distance field. When the displacement field stretches the distance data (applies a nonuniform scale), the resulting values are no longer distances. This breaks many of the assumptions Medium makes about properties of the signed distance field. We considered running a signed distance field reinitialization (also called redistancing) over the resulting field, but SDF reinitialization tends to smear the detail of the object a bit, which we were trying to avoid. And in some cases, the distances were so incorrect that even reinitialization wouldn't help.
The second issue is that this version was still computationally expensive. As you moved the tool around, you would have to wait a bit before you saw the sculpt in the new position. This lag was because we were re-meshing the surface every time you moved the Touch controller.
We decided to abandon this approach because the distance field was incorrect and because the performance of Move Tool 2.0 wasn’t able to deliver a visceral feeling of grabbing and manipulating a sculpt.
sad move tool version 2.0. we had to dig a bit to find a video that showed this version, and this video actually makes it look decent. however, you can see the challenges there would have been with performance-- this sculpt is the same size as Medium’s initial sphere.
Move Tool Prototype 3: Final Version
Earlier this year, Lydia Choy, Medium Design Lead, and I revisited the problem of the Move Tool, and brainstormed ways to implement something we'd be happy with. We looked at move tools and grab brushes from other triangle-based 3D content creation applications. Those tools worked by displacing the vertices of a triangle mesh. We lamented that it was easy to do those kinds of operations on a triangle mesh, but difficult with a signed distance field representation.
After thinking about it further, we came up with a scheme that sounded unorthodox at the time, but ended up working out very well. Instead of trying to work directly with the SDF, what if we use a triangle mesh, deform that mesh, and then convert back to SDF? Medium already has a triangle mesh sitting around: we continuously extract a triangle mesh from the isosurface using Transvoxel, a variant of the Marching Cubes algorithm. That mesh is what the GPU uses for rendering (we could also raymarch into the SDF on the GPU, but we use triangles for various performance-related reasons). So, we developed a plan to take a copy of that mesh, deform it with the displacement field, and convert the new triangle mesh back to SDF data.
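Here's a rough C++ sketch of the deformation step of that plan, reusing the displace() function sketched earlier (illustrative, not Medium's code; the preview in Medium actually runs in the vertex shader, as described below):

```cpp
#include <glm/glm.hpp>
#include <vector>

// The displacement-field function sketched earlier in this post.
glm::vec3 displace(const glm::vec3& p,
                   const glm::mat4& startTransform,
                   const glm::mat4& newTransform,
                   float innerRadius, float outerRadius);

void deformMesh(std::vector<glm::vec3>& vertices,
                const glm::mat4& startTransform,
                const glm::mat4& newTransform,
                float innerRadius, float outerRadius)
{
    // Push every vertex of the extracted Transvoxel mesh through the
    // displacement field; the deformed mesh is later converted back to SDF.
    for (glm::vec3& v : vertices)
        v = displace(v, startTransform, newTransform, innerRadius, outerRadius);
}
```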
We expected that the conversion from triangle mesh to SDF would take some time for the computer to calculate, and we weren't sure how fast it would be. We decided to implement the new move tool in two phases: the first phase would show a preview of the move by displacing vertices in the GPU vertex shader and the second phase would be a more lengthy conversion process. This followed the design of a few of our other tools, such as lengthy layer operations like “Increase Resolution” where we display the equivalent of the Windows hourglass or OSX spinner while the operation completes.
Lydia developed the first phase by modifying the sculpt vertex shader code. Once we were happy with that, we found that deforming our triangle meshes worked very well. Even at our default level of resolution, you can perform moves without seeing objectionable artifacts due to under-tessellation. The meshes generated by Transvoxel (and most Marching Cubes-style algorithms) have two properties that make them work well with this style of deformation: they are very dense, with an even distribution of vertices across the surface, and they are always two-manifold (watertight), which we take advantage of when converting the triangle mesh back to SDF. (We'll discuss the details of that conversion process in the next post in this series.)
Below, you can see the two phases of the Move Tool at work. In the first phase, the user manipulates the sculpt with the Touch controller to reposition the turtle’s head. In the second phase, we compute the new SDF data. When an operation takes longer than one second to complete, we display the animated Medium logo to indicate to the user that Medium is processing-- but we’re still rendering at VR framerates.
What's Next?
Since the “Summer of Move” update was released, the Move Tool has been a great success with the Medium community. Some positive feedback we’ve received from Medium users:
“I love the Move Tool so so much! Makes it incredibly easy to define and tweak 3D shapes.”
“Love the move tool! Made it possible to add a little snarl to my critter with one gesture.”
"Not only removing the deficit of a move tool, but making the best move tool I’ve ever seen”.
In the upcoming parts of this series, I’ll dive into how we convert the triangle mesh back to an SDF, how this technique has become useful in other tools, and how we do all of this while still maintaining VR framerates.