We occasionally post insights from the Oculus Developer community so that VR pioneers can share their best practices and help drive the industry forward. The post below is from our friends behind Virtual Reality Toolkit (VRTK), who are set to release version 4.0 of the toolkit, an update the VR development community has been eagerly awaiting.
The long-awaited VRTK 4 has just been released into beta as 4.0-beta on GitHub! VRTK v4 is a re-imagining of the spirit of VRTK: a free and open source toolkit to help accelerate development and evolve the understanding of what works and what doesn't when developing for spatial computing. VRTK v4 has been rewritten from the ground up to offer a better and more versatile way of creating content. It's still easy to pick up as a beginner: simply drag and drop some things into a scene to build your own virtual world. We're excited and passionate about that accessibility, but the real power behind this new structure of VRTK is that as development skill and knowledge increase, so does the potential for what's possible.
VRTK v4 takes a decoupled approach to solving common problems faced when building for spatial computing. It no longer relies on knowledge of any underlying hardware SDK; if you have used VRTK v3, you will be glad to hear the SDK Manager is a thing of the past!
VRTK v3 had the upside of being especially accessible and easy to understand, but this came with drawbacks: the behind-the-scenes magic meant that any relatively complex change required custom code, extended classes, and a magical runtime tango with your VRTK components. This has all been addressed in the new structure of VRTK v4. With the magic removed, everything is available at edit time, so you know exactly what is going on before you run the scene.
The decoupled components found within the guts of VRTK v4 are loosely connected: dependencies are either injected at edit time, so you know exactly how things communicate, or messages are passed between components via UnityEvents. This event-based message passing makes it easier to decouple functionality and reuse the same logic in many different places and ways without rewriting the same code over and over again. It also means functionality can be customized without changing or adding any code: pretty much every component can be updated and manipulated via its UnityEvent listeners through the Unity Inspector, or even through visual scripting with the use of third-party visual scripting tools.
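At its core this is the observer pattern: a component exposes an event, and listeners are wired to it externally, so the emitter never needs to know who is listening. Here is a minimal sketch of the idea in plain Python (an illustration only, not actual VRTK or Unity API; in the real toolkit, UnityEvents play this role and the wiring happens in the Inspector):

```python
class Event:
    """Minimal stand-in for a UnityEvent: listeners are attached
    from the outside, and the emitter never knows who is listening."""
    def __init__(self):
        self._listeners = []

    def add_listener(self, fn):
        self._listeners.append(fn)

    def invoke(self, *args):
        for fn in self._listeners:
            fn(*args)


class Button:
    """Emits an event when pressed; holds no reference to any consumer."""
    def __init__(self):
        self.pressed = Event()

    def press(self):
        self.pressed.invoke()


# Wiring happens outside both components, analogous to hooking up
# UnityEvent listeners in the Unity Inspector.
log = []
button = Button()
button.pressed.add_listener(lambda: log.append("door opened"))
button.pressed.add_listener(lambda: log.append("sound played"))
button.press()
print(log)  # ['door opened', 'sound played']
```

Because the `Button` never references the door or the sound, either listener can be swapped out, reused elsewhere, or removed entirely without touching the button's code — which is exactly the reuse the paragraph above describes.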
The way this change is reflected in VRTK v4 is that there are no longer single drag-and-drop scripts/components magically creating dependencies at edit time or runtime; instead, pre-built objects (known as Unity prefabs) contain a collection of generic components working together to deliver the required common solution. These prefabs are simply dragged and dropped into the scene to provide the required functionality, with the added benefit that the functionality can be tweaked and changed by simply adjusting the parameters of the internal components, without needing to write any code!
A great example is the new interactable object prefab, which is totally decoupled from requiring a VR controller to touch and grab it. Instead, it simply requires a generic interactor component to initiate the interaction. These interactors can be attached to anything: a VR controller, the end of a pointer beam, even a robotic arm in the scene. By default, the standard interactable prefabs offer a few different ways of dealing with interactions between two controllers, such as the first hand grabbing the object while the second hand controls its direction. These can be set up and added to a scene with a few simple drag and drops around the editor, but building something a bit more complex is where it gets really exciting. In the farmyard example scene there is a pump-action confetti shooter which can be picked up with one hand, while the direction of the weapon is controlled when grabbed with the other hand. There's a little twist to how this weapon actually works, as it won't fire until the stock is pumped. This was all made possible simply by re-wiring the UnityEvents on the internal components, without any custom code whatsoever!
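The interactor/interactable split can be sketched in a few lines. The following is a simplified plain-Python illustration of the concept, not VRTK's actual API: the interactable knows nothing about controllers, only about generic interactors, and the first grab holds the object while a second grab steers it (all class and method names here are hypothetical):

```python
class Interactor:
    """Anything that can initiate an interaction: a VR controller,
    the tip of a pointer beam, a robotic arm, and so on."""
    def __init__(self, name):
        self.name = name


class Interactable:
    """Decoupled from any specific hardware; it only deals with
    generic interactors. The first grab holds the object, and a
    second simultaneous grab controls its direction."""
    def __init__(self):
        self.holder = None      # interactor currently holding the object
        self.director = None    # interactor currently steering it

    def grab(self, interactor):
        if self.holder is None:
            self.holder = interactor
        elif self.director is None:
            self.director = interactor

    def ungrab(self, interactor):
        if self.director is interactor:
            self.director = None
        elif self.holder is interactor:
            self.holder, self.director = self.director, None


# The interactable does not care what grabs it.
left_hand = Interactor("left controller")
pointer_tip = Interactor("pointer beam")

shooter = Interactable()
shooter.grab(left_hand)    # first grab: holds the object
shooter.grab(pointer_tip)  # second grab: controls the direction

print(shooter.holder.name)    # left controller
print(shooter.director.name)  # pointer beam
```

Swapping `left_hand` for a robotic arm object would change nothing in `Interactable` — that independence from the grabbing hardware is the point of the decoupling.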
The way input is dealt with in VRTK v4 has also totally changed. VRTK v3 monitored the known SDK controller inputs and emitted an event when an input's state changed. This was much better than having to poll SDK inputs in an Update loop, but it was always limited to whichever SDK inputs were available to VRTK. In VRTK v4 there is a new generic action system that simply wraps input data from whatever SDK can provide input, then exposes a known action to any VRTK component that needs to be notified of a change. This means any external hardware can be hooked into a VRTK component with relative ease; if you want to support the latest VR treadmill, no problem!
There are many more changes throughout VRTK v4, and to complement this new experience we're also working on a new educational curriculum aimed at teaching people of all skill levels how to get the best out of VRTK when building for spatial computing. The VRTK Academy is another open source initiative that aims to bring as many helpful guides and tutorials as possible to people interested in building for spatial computing, so they can concentrate on turning their wonderful ideas into reality. Because it is open source, we will be encouraging anyone who has a good lesson to teach to create a tutorial to go along with the existing collection of guides. Our goal is to build an extremely useful resource for complete beginners, seasoned developers, educational institutions, or anyone with a thirst for knowledge.
With the release of VRTK v4.0 beta, we’re just at the beginning of our journey. We now have a platform we’re confident can support the ever-changing landscape of spatial computing and we would love to bring more features and common solutions for people to implement in their own experiences.
While VRTK is currently available on Unity, we'd love to see VRTK brought to even more platforms, like Unreal and WebXR, in the future, so that once you've learned the techniques of VRTK you can apply them to the engine that suits your needs best. We want to help this community find out what works and what doesn't, which will only lead to greater user experiences in this domain. We hope the mission of VRTK will help shape the way we build and use spatial computing experiences in all aspects of life!
Enjoy VRTK v4!