VR Playtest Facilitation + Results Reporting

Most VR professionals can facilitate a playtest, but it takes a great deal of practice to truly specialize in this role and ensure you maximize the value of your results. Here are a number of recommendations for how best to facilitate + moderate a VR playtest, followed by best practices for analyzing and reporting on your test results.

How to effectively facilitate a VR playtest

The individual assigned to facilitate your playtest (sometimes called the moderator) will be in charge of guiding the tester through the playtest from start to finish. The environment, whether the test is remote or in-person, and the goals of the test will all have a great deal of impact on how best to execute your playtest. Communication is key, but there are a number of other tactics that will ensure your tester provides the feedback you need.

Before you start your playtest

While the information provided in the research planning phase will help a great deal, it’s highly recommended that you test your test. Do a few practice sessions with your script, and if possible, do so with a teammate or peer. This will enable you to deliver the script and manage the conversation with confidence.

As noted in the script writing section, be sure that the tester reviews + approves your NDA, or any other legal consent forms, prior to kicking off your test. The importance of this step cannot be overstated.

Minimize tension with casual, calm communication

In order to acquire valuable feedback, you want your tester to be generally at ease throughout the test. Start the test by noting that there is no such thing as bad feedback, the game is WIP, and they should speak freely throughout.

While you should practice your script, do your best not to read from it verbatim at any time. Be calm, casual, and never critical.

Working with testers who are new to VR and slowly ramping up intensity

Along with your introduction, it may help to provide them with an overview of the Oculus hardware. Guide them through each piece of hardware, how it works, and how to hold/wear it.

If the test is in-person, we recommend the following steps to help introduce new VR users to the hardware:

  • Help them put on the controllers via the controller straps.
  • Hand them the headset for them to put on.
  • Tell them you will help with the side and top strap, then do so.

If your app has the flexibility, it is recommended to start a newer VR player in a calm environment rather than one that has high intensity in the first 2-3 minutes. This will give them a chance to get a feel for the overall visual design, locomotion style, and how they are embodied in the virtual environment.

There is no “right way” to play a VR app

Even if your app is generally linear, enable the tester to experience your app however they believe it should be experienced. If the user asks how they should be navigating your app, do your best to let them think for themselves. Avoid corralling them into the path you had intended as you may discover important learnings about what they consider fun, their expectations for interactivity, and numerous other insights that can only be discovered with free play.

Encourage the tester to think aloud: small ideas can lead to major product enhancements

While giving your tester a certain amount of freedom, encourage them to think aloud as they play your app. The team at MIRAGESOFT made a valuable discovery through an offhand comment when testing their game, Real VR Fishing:

During the playtest, a user remarked that if the UI system could be turned off that it would add to the immersion. Throughout development our team had never thought of it this way, and was in awe by the different points of view that helped us to further improve the game.

  – Mark Choi, MIRAGESOFT

Listen and follow up: Essential to successful playtesting

Enable the tester to speak freely whenever possible, always avoid interruption, and get comfortable with long pauses. Collectively, these tactics will help get your tester to speak openly and honestly.

If your tester is simply quiet and does not provide the volume of feedback you’re looking for, ask follow-up questions about what they’ve already told you:

  • Why do you feel that way?
  • Why did you decide to take this path?
  • What else would you like to see?
  • Tell me more about that.
  • Anything else?

Finally, internalize the practice of answering questions with a question. For example:

Tester: What am I supposed to do here?

Researcher (you): What do you think you are supposed to do there?

This will help your tester more openly think through their experience, their challenges, and assumptions.

More tips for facilitating in-person playtests

  • Clean/sanitize your hardware in front of the tester to build rapport.
  • Show your tester the steps to calibrate their screen with the home button.
  • During the playtest, avoid any unnecessary physical contact w/ the tester. If you foresee needing to “nudge” them, be sure to request approval to do so prior to starting the test.
  • Limit the number of people in the room, and let the tester know if more people enter.
  • Limit any conversation that’s not directed to the tester.

Considerations for what to analyze during your playtest

During your playtest, always be looking for visual and audible cues that might help you learn even more about the tester’s experience. These include, but are not limited to:

  • Body language / non-verbal cues
    • If you notice the tester reacting or commenting non-verbally, feel free to ask them outright about what they experienced.
  • What they focus on or notice in a specific area, or completely disregard.
  • Any signs of discomfort:
    • Eye strain
    • Nausea
    • Fatigue
  • Areas in your app where they may be unable to move forward.
  • Expressions of excitement or some other emotion of note.

Best practices for after the test is complete

Following the playtest, you will want to discuss the tester’s high-level thoughts on the experience; this is where your survey/script will come in handy. Be sure to remember the recommendations above for effective communication throughout user research.

While talking through your questionnaire, do your best not to take notes on every response. You should be recording this process; avoiding detailed note-taking assures the tester that you’re giving them your undivided attention. Once the full interview is complete and they have left the premises, feel free to write down any other notes you had in mind.

If you are doing multiple tests, be sure to wipe down the headset and controllers with hypoallergenic wipes. It’s also good to change out the headset padding if you have a spare available.

Data analysis best practices

You’ve completed your tests: you have notes, survey answers, recorded audio/video, and potentially some testing automation data. Now it’s time to dive in to find, record, and share your insights. This phase is not to be taken lightly; there is plenty of work to be done post-test. While this phase (like the others) will be shaped by your goals + available resources, here are a few best practices that will help streamline your process.

Be prepared to prioritize: You will have bugs, UX and gameplay feedback, and a finite amount of resources. Be sure that you and your team are prepared to analyze your data and prioritize where to focus your time and effort going forward.

Review your results as a team: If possible, get the full team together to analyze the results and key takeaways. This is a great opportunity to share personal notes, review specific recordings, and ensure that your full team understands why you will be spending the time to make certain adjustments to your app going forward.

A single comment does not have any less impact: While receiving the same feedback more than once adds to its validity, do not discount a comment made only once. You put in the effort to recruit your target audience, so all comments should be taken into consideration, especially in the early phases of your testing process.

Tips for data analysis + visualization from the Oculus UXR Team

With the example data set visualized below, our XR User Research Team has provided the following points to help guide your playtesting data analysis + visualization:

Tips for when your data set is <25 testers: Don’t use averages with a sample size that small. You may want to use some sort of score to track, but for usability you need to think in terms of people succeeding or failing. Use terms like some and most. Remember, one fail out of five is significant: if you recruited correctly, that could mean thousands or millions of people failing.
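The count-based reporting described above can be sketched in a few lines. This is a minimal illustration, not an Oculus tool; the task data and the thresholds behind "some" and "most" are assumptions for the example.

```python
# Sketch: report small-sample usability results as counts with qualitative
# wording ("some", "most") instead of averages. Thresholds are illustrative.

def describe_results(outcomes):
    """Summarize a list of pass/fail outcomes for one task."""
    passed = sum(outcomes)
    total = len(outcomes)
    if passed == total:
        word = "all"
    elif passed == 0:
        word = "none of the"
    elif passed > total / 2:
        word = "most"
    else:
        word = "some"
    return f"{word} testers succeeded ({passed} of {total})"

# Hypothetical task: one fail out of five is still significant.
grab_tool_outcomes = [True, True, False, True, True]
print(describe_results(grab_tool_outcomes))  # most testers succeeded (4 of 5)
```

Reporting "4 of 5" keeps the single failure visible, where an average score would bury it.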

Tactics to effectively record and analyze open-ended questions: Open-ended questions should be qualified and coded. A simple way to do this is to review each piece of feedback and assign a numerical code to each comment in a new document. As an example:

User feedback: He ran really fast, I was able to beat enemies really well.

This would be recorded as two pieces of feedback: #1: Running felt fast. #2: Fast running helps players be successful.

Keep a tally as more playtesters mention these points, and sort them by the assigned codes. If you notice certain feedback trending throughout, be sure to save specific quotations so you can include them in the final report.
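The coding and tallying workflow above can be sketched as follows. The code numbers and session data are hypothetical examples; in practice you would assign the codes by hand while reviewing each transcript.

```python
from collections import Counter

# Sketch: tally coded open-ended feedback across playtest sessions.
# Hypothetical codebook built while reviewing feedback (see example above).
FEEDBACK_CODES = {
    1: "Running felt fast",
    2: "Fast running helps players be successful",
}

# One list per tester: the codes assigned to that tester's comments.
session_codes = [
    [1, 2],  # "He ran really fast, I was able to beat enemies really well."
    [1],     # this tester only mentioned the running speed
    [2, 1],
]

tally = Counter(code for session in session_codes for code in session)

# Sort by frequency so trending feedback surfaces first in the report.
for code, count in tally.most_common():
    print(f"{count}x  #{code}: {FEEDBACK_CODES[code]}")
```

Sorting with `most_common()` makes it easy to see which coded points are trending and deserve supporting quotations in the final report.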

Layout should be constructed based on your goals/objectives: The visualization of your results should be laid out based on the different aspects you wanted to test. Go through all of your notes and transfer them into sections in a separate document. From there, you can organize and pull out your key findings and action items.

You may want to visualize your data in a flow from round to round: In the example below, notice how the data is visualized on the horizontal axis. The testers’ comments were used to qualify how they felt about the experience after every two rounds of play; we then assigned the attributes Fun, Challenge, and Frustration based on their open-ended responses.

Example playtest visualization

Visualization is essential to communicating your results to your cross-functional teams, ensuring your research is actionable. If you are new to data visualization, you may want to do some independent research before reporting your findings.
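One way to prepare data for a round-to-round flow like the one described above is to tally the assigned attributes per two-round block before charting them. This is a sketch with made-up assignments, not the actual data behind the Oculus example.

```python
# Sketch: tally qualitative attributes (Fun, Challenge, Frustration) assigned
# from open-ended responses after each two-round block. Data is hypothetical.

ATTRIBUTES = ("Fun", "Challenge", "Frustration")

# One attribute per tester per block, assigned while coding their comments.
responses = {
    "rounds 1-2": ["Fun", "Fun", "Challenge"],
    "rounds 3-4": ["Challenge", "Frustration", "Challenge"],
    "rounds 5-6": ["Frustration", "Frustration", "Fun"],
}

table = {
    block: {attr: assigned.count(attr) for attr in ATTRIBUTES}
    for block, assigned in responses.items()
}

# Print the flow with round blocks on the horizontal axis.
print("Attribute    " + "  ".join(f"{block:>10}" for block in table))
for attr in ATTRIBUTES:
    row = "  ".join(f"{table[block][attr]:>10}" for block in table)
    print(f"{attr:<12} {row}")
```

A table in this shape drops directly into a spreadsheet or charting tool for the horizontal-axis visualization.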

