The Oculus Remote Monitor client connects to VR applications running on remote devices to capture, store, and analyze data streams.
Oculus Remote Monitor is compatible with any Oculus mobile application built with Unity, Unreal Engine, or Native development tools.
Download the Oculus Remote Monitor client:
To enable the Capture Server on a device:
After completing those steps, the Capture Server runs automatically whenever a VR application runs.
Oculus Remote Monitor uses a UDP-broadcast-based auto-discovery mechanism to locate remote hosts, and then a TCP connection to access the capture stream. For this reason, the host and the client must be on the same subnet, and the network must not block or filter UDP broadcasts or TCP connections.
If you are on a large corporate network that may have such restrictions, we recommend setting up a dedicated network or tethering directly to your device. Furthermore, frame buffer capturing is extremely bandwidth intensive. If your signal strength is low or you have a lot of interference or traffic on your network, you may need to disable Capture Frame Buffer before connecting to improve capture performance.
Oculus Remote Monitor uses UDP port 2020 and TCP ports 3030 through 3040.
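As a rough illustration of the discovery mechanism, the sketch below broadcasts a probe on the discovery port and collects the addresses of any hosts that reply. Only the port numbers come from the text above; the probe payload and reply format are hypothetical, since the real wire protocol is internal to VrCapture and the Monitor client.

```python
import socket

OVR_DISCOVERY_PORT = 2020              # UDP auto-discovery port
OVR_CAPTURE_PORTS = range(3030, 3041)  # TCP capture-stream ports

def discover_hosts(timeout=1.0, probe=b"OVR_DISCOVER"):
    """Broadcast a probe and collect responder addresses.

    The probe payload is an illustrative assumption, not the real protocol.
    Returns a (possibly empty) list of host IP addresses.
    """
    hosts = []
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    try:
        sock.sendto(probe, ("255.255.255.255", OVR_DISCOVERY_PORT))
        while True:
            _data, (addr, _port) = sock.recvfrom(1024)
            hosts.append(addr)
    except (socket.timeout, OSError):
        pass  # no (more) replies, or broadcasts blocked on this network
    finally:
        sock.close()
    return hosts
```

This also makes the network requirement concrete: if the subnet filters UDP broadcasts, the `sendto` never reaches any host and discovery silently returns nothing.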
Because we use a direct network connection, the following permission is required in your application's AndroidManifest.xml:
```xml
<!-- Network access needed for OVRMonitor -->
<uses-permission android:name="android.permission.INTERNET" />
```
If the host and client are on the same subnet and the network is configured correctly (see Network Setup above), Oculus Remote Monitor automatically discovers any compatible applications running on the network.
To begin capturing and viewing data:
Each time you connect to a host, the Oculus Remote Monitor automatically compresses and saves the incoming data stream to disk under a unique filename in the format package-YYYYMMDD-HHMMSS-ID.dat. The default recordings directory for these files is the OVRMonitorRecordings sub-folder of your Documents folder. You can change this directory in the Client Settings panel.
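If you need to script over a recordings directory, the filename convention above can be parsed with a small helper. This is a sketch: `parse_capture_name` and its return shape are our own, not part of the Monitor tooling.

```python
import re
from datetime import datetime

# Recording filenames follow package-YYYYMMDD-HHMMSS-ID.dat
CAPTURE_NAME = re.compile(
    r"^(?P<package>.+)-(?P<date>\d{8})-(?P<time>\d{6})-(?P<id>\w+)\.dat$")

def parse_capture_name(filename):
    """Split a capture filename into (package, timestamp, id), or None."""
    m = CAPTURE_NAME.match(filename)
    if not m:
        return None
    stamp = datetime.strptime(m["date"] + m["time"], "%Y%m%d%H%M%S")
    return m["package"], stamp, m["id"]
```

The greedy `.+` for the package group means package names containing hyphens still parse correctly, because the trailing date, time, and ID fields anchor the split.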
To open saved capture files, click the Open Capture File icon.
The Frame Buffer Viewer provides a mechanism for inspecting the frame buffer as the data is received in real-time, which is particularly useful for monitoring play test sessions.
To view the most recent frame buffer:
When enabled, the Capture library will stream a downscaled pre-distortion eye buffer across the network. We use downscaling rather than using a higher-quality compression scheme to reduce overhead on the host device as much as possible. This reduces image quality substantially, but still provides valuable visual context as to what is happening on screen at any given moment.
The current default is 192x192 compressed. The Monitor application recompresses the Frame Buffer further to save memory and disk space when dealing with large capture sets.
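The trade-off described above (cheap downscaling on the host instead of an expensive high-quality compression scheme) can be illustrated with a minimal nearest-neighbor reduction. The actual filter VrCapture applies is not specified here; this is only a sketch of the idea.

```python
def downscale(pixels, src_w, src_h, dst_w=192, dst_h=192):
    """Nearest-neighbor downscale of a flat, row-major pixel buffer.

    Deliberately cheap: one index computation per output pixel, no
    filtering, mirroring the "minimize host overhead" rationale.
    """
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h                 # source row for this output row
        row = pixels[sy * src_w:(sy + 1) * src_w]
        out.extend(row[x * src_w // dst_w] for x in range(dst_w))
    return out
```

For example, reducing a 4x4 buffer to 2x2 keeps one sample from each quadrant, discarding the rest of the detail, which is exactly why the streamed eye buffer loses image quality but remains useful for visual context.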
The Performance Overview provides a high-level summary of application performance, plotting the VrAPI messages and error conditions against a timeline.
To view the performance overview:
Move the pointer over the performance overview to reveal details of the collected data. Double-click anywhere in the overview to open the Profiler Data view at that precise point in the timeline.
Screen captures of the pre-distorted frame buffer. Move the pointer over this section to view the screenshots captured at that point in time.
|Stat|Description|
|---|---|
|Frames per Second|For more information, see Basic Performance Stats through Logcat.|
|Head-Pose Prediction Latency (ms)|The number of milliseconds between the latest sensor sampling for tracking and the anticipated display time of new eye images.|
|CPU/GPU Clock Levels|The CPU and GPU clock levels and associated clock frequencies, set by the application. Lower clock levels result in less heat and less battery drain.|
|Thermal (°C)|Temperatures in degrees Celsius. Well-optimized applications do not cause the temperature to rise quickly. There is always room for more optimization, which allows lower clock levels to be used, which, in turn, reduces the amount of heat generated.|
|Available Memory (GB)|The amount of available memory, displayed every second. It is important to keep a reasonable amount of memory available to prevent Android from killing backgrounded applications, like Oculus Home.|
VrAPI reports various messages and error conditions to Android's logcat as well as to the Oculus Remote Monitor, which provides the thread and timestamp of each message. The Logging Viewer provides raw access to this data.
To view the log:
Warnings and errors are color-coded to stand out more clearly, and unlike logcat, thread IDs are tracked, so you know exactly when and where each message occurred.
Applications may expose user-adjustable parameters and variables in their code. Nearly any constant in your code may be turned into a knob that can be updated in real-time during a play test.
To view and adjust the available remote variables:
VrApi exposes CPU and GPU Levels to allow developers to quickly identify their required clock rates. Applications are also free to expose their own options that you can select or adjust.
ShowTimeWarpTextureDensity: This experimental feature toggles a visualization mode that colors the screen based on the texel:pixel ratio when running TimeWarp (green indicates 1:1 ratio; dark green < 1:1 ratio; red > 2:1 ratio).
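The color mapping for that visualization can be summarized in a short sketch. The exact threshold between the green and red bands is an assumption here; the text above only pins down the behavior below 1:1, at 1:1, and above 2:1.

```python
def density_color(texels_per_pixel):
    """Map a texel:pixel ratio to the visualization colors described above.

    The 2.0 cutoff for the green band is an assumed boundary; the doc
    specifies dark green below 1:1, green at 1:1, and red above 2:1.
    """
    if texels_per_pixel < 1.0:
        return "dark green"   # undersampled: fewer texels than pixels
    if texels_per_pixel <= 2.0:
        return "green"        # at or near the ideal 1:1 sampling ratio
    return "red"              # oversampled: more than 2 texels per pixel
```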
To view and adjust the settings:
|Setting|Description|
|---|---|
|ADB Path|Configurable at any time. If it does not point to a valid executable when Monitor runs, Monitor attempts to locate a valid copy of adb by checking the environment; if that fails, Monitor searches under ANDROID_HOME.|
|Recordings Directory|Specifies the location where Monitor automatically stores capture files when connected to a remote host. The default is the OVRMonitorRecordings sub-folder of the current user's Documents directory.|
|Frame Buffer Compression Quality|Used for client-side recompression of the frame buffer. This offloads compression work from the host while allowing significant savings in memory usage on the client. Lower-quality settings provide greater memory savings, but may result in blocking artifacts in the Frame Buffer Viewer.|
|Device Settings|Allows you to toggle capture support without manually editing your .oculusprefs file.|
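The adb lookup order described for the ADB Path setting can be approximated as follows. The specific fallback locations, such as a `platform-tools` directory under ANDROID_HOME, are conventional SDK-layout assumptions rather than documented Monitor behavior.

```python
import os
import shutil

def locate_adb(configured_path=None):
    """Approximate the lookup order: configured path, then the PATH
    environment, then a conventional location under ANDROID_HOME.
    """
    # 1. The explicitly configured path, if it is a valid executable.
    if configured_path and os.access(configured_path, os.X_OK):
        return configured_path
    # 2. Whatever the environment's PATH resolves for "adb".
    found = shutil.which("adb")
    if found:
        return found
    # 3. The conventional SDK layout under ANDROID_HOME (an assumption).
    home = os.environ.get("ANDROID_HOME")
    if home:
        candidate = os.path.join(home, "platform-tools", "adb")
        if os.access(candidate, os.X_OK):
            return candidate
    return None
```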
The Profiler Data view provides both real-time and offline inspection of the following data streams on a single, contiguous timeline:
VrAPI also has a number of other events embedded to help diagnose VR-specific scheduling issues.
To view the data streams:
The Profiler Data view has a number of controls to help you analyze the timeline.
The Performance Data Viewer screen shows a selected portion of the application timeline:
|Row|Description|
|---|---|
|Frame Buffer|Screen captures of the pre-distorted frame buffer, timestamped the moment immediately before the frame was handed off to the TimeWarp context. The left edge of each screenshot represents the point in time at which it was captured from the GPU.|
|V-Sync|Displays notches on every driver v-sync event.|
|GPU Context|GPU Zones inserted into the OpenGL command stream via Timer Queries are displayed in the same manner as CPU events. Each row corresponds to a different OpenGL context. Typical VR applications have two contexts: one for TimeWarp, and one for application rendering. Note that on tiler GPUs, these events should be regarded as rough estimates rather than absolute data.|
|CPU Thread|Hierarchical visualization of the wall-clock time of various functions inside VrAPI, along with OpenGL draw calls inside the host application. Log messages are displayed on their corresponding CPU thread as icons. Mouse over each icon to display the corresponding message (blue circles), warning (yellow squares), or error (red diamonds).|
|Sensor|General sensor data visualizer. CPU and GPU clocks are visualized in the screenshot shown above, but other data may be displayed here, such as thermal sensors, IMU data, et cetera.|
Tearing occurs in VR applications any time TimeWarp rendering fails to render ahead of scanout. VrApi attempts to detect this with a GPU Sync Object to determine when the GPU completes rendering distortion for a given eye. If for any reason it does not complete in time, VrApi prints a warning to logcat, which Oculus Remote Monitor picks up.
```
V-sync %d: Eye %d, CPU latency %f, GPU latency %f, Total latency %f
```
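A warning in that format can be picked out of a capture or logcat stream with a small parser. This is a sketch; the real warning's float formatting may carry more digits, so the pattern is kept permissive.

```python
import re

# Pattern for the VrApi tearing warning quoted above:
#   V-sync %d: Eye %d, CPU latency %f, GPU latency %f, Total latency %f
TEAR_RE = re.compile(
    r"V-sync (\d+): Eye (\d+), CPU latency ([\d.]+), "
    r"GPU latency ([\d.]+), Total latency ([\d.]+)")

def parse_tear_warning(line):
    """Extract the numeric fields from a tearing warning, or None."""
    m = TEAR_RE.search(line)
    if not m:
        return None
    return {
        "vsync": int(m.group(1)),
        "eye": int(m.group(2)),
        "cpu": float(m.group(3)),
        "gpu": float(m.group(4)),
        "total": float(m.group(5)),
    }
```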
If you are running on a Samsung GALAXY S6, you may also see tearing events by looking at the GPU Context that is running WarpToScreen.
In this example, because the refresh rate of the display is 60 Hz, the ideal running time of WarpToScreen is 16.66ms, but a scheduling/priority issue in the application caused the second eye to be executed 10 ms late, pushing WarpToScreen to run for 26.81 ms. The actual eye distortion draw calls are barely visible as two distinct notches under each WarpToScreen event on the GPU.
VrApi reports the Frame Index that was submitted inside vrapi_SubmitFrame, as well as the Frame Index that is currently being TimeWarped. This allows you to easily identify missed frames, and to easily track latency between app render and distortion.
Every time a Frame Index is reported from VrApi, Oculus Remote Monitor marks it on the timeline for the associated thread with a vertical grey line. If a Frame Index arrives out of order, the line is drawn red instead, helping you quickly identify problem areas.
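The red-line check can be sketched as follows. The Monitor's exact heuristic is not specified, so this simply flags any Frame Index that is not exactly one greater than its predecessor, which covers both repeated and skipped frames.

```python
def flag_out_of_order(frame_indices):
    """Flag frame indices the Monitor would likely draw red.

    Returns (position_in_stream, frame_index) pairs for every entry
    that is not exactly one greater than the previous entry.
    """
    flagged = []
    prev = None
    for pos, fi in enumerate(frame_indices):
        if prev is not None and fi != prev + 1:
            flagged.append((pos, fi))   # repeat or skip: non-consecutive
        prev = fi
    return flagged
```

In the double-sample scenario described below, a repeated index produces one flag and the subsequent two-frame jump produces another, matching the two red lines in the example.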
In the above example, the TimeWarp thread has two out-of-order frames visible. This typically happens when the GPU was unable to finish rendering a frame from the application in time for TimeWarp to begin sampling from it. Zooming in, we can investigate the cause by looking at the actual Frame Index value.
As suspected, the same Frame Index is sampled twice, producing the first red line. The next frame then catches up by jumping ahead two frames, producing the second red line along with a frame that is never displayed. This was most likely caused by the application failing to complete its GPU work on time.
Oculus Remote Monitor is capable of capturing OpenGL calls across the entire process (enabled with the Graphics API option), so application-specific performance issues can sometimes be spotted. In the example below, an application was mapping the same buffer several times a frame for a particle effect. On a Note 4 running KitKat, the GL driver triggered a sync point on the second glUnmapBuffer call, causing it to take 2.73 ms; without the sync point, the same call takes around 0.03 ms. After spotting this issue, the developer was able to quickly fix the buffer usage and reclaim that CPU time.
VrApi attempts to lock the CPU and GPU clocks at particular frequencies to ensure some level of execution speed and scheduling guarantees. These are configurable via the CPULevel and GPULevel available in the API.
When in VR Developer Mode, the clocks may occasionally unlock when the device is out of the headset for too long. When this happens, the CPU/GPU Clock Sensors go from extremely flat to extremely noisy, typically causing performance issues such as tearing and missed frames, as seen below:
Note that on a Samsung GALAXY S6, the clocks are allowed to boost slightly under certain conditions, but only by a small amount in typical cases, and they should never drop below the requested level. It is also fairly common for some cores to go offline and come back online occasionally.
VrCapture internally uses a fixed-size FIFO to buffer events before they are streamed over the network. If this buffer fills faster than its contents can be streamed, the capture pipeline has no choice but to apply back-pressure: if your network connection stalls long enough, for any reason, it eventually causes the host application to stall as well.
This is easy to spot in Oculus Remote Monitor: look for roughly two seconds of extremely long events on the OVR::Capture thread, followed by other threads stalling as well. The internal buffer is large, so a few network hitches should not affect your application, but if they persist long enough, a random event inside your application will eventually stall until the Capture thread is able to flush the buffer.
In the example below, several seconds of poor network connectivity (seen as long AsyncStream_Flush events) eventually caused the MapAndCopy event on the application's render thread to stall until it was released by the Capture thread:
If you find it difficult to capture reliably because of this issue, we recommend disabling Frame Buffer Capturing before connecting, as this feature consumes the bulk of the bandwidth required.
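The back-pressure behavior described above can be modeled with a small sketch, with Python's `queue.Queue` standing in for VrCapture's internal C++ FIFO. The real buffer's capacity and blocking semantics are internal details; `queue.Full` here stands in for the point where the app thread would actually stall.

```python
import queue

# Model of the capture pipeline's fixed-size event FIFO. Its real
# capacity is an internal detail; 4 slots keeps the example small.
fifo = queue.Queue(maxsize=4)

# A short network hitch: the consumer (Capture thread) stops draining,
# but the buffer absorbs the backlog of application events.
for event in range(4):
    fifo.put_nowait(event)

# A longer stall: the FIFO is full, so the next event cannot be
# enqueued. In the real pipeline, the application thread would block
# right here until the Capture thread flushes the buffer over the
# network.
try:
    fifo.put_nowait(4)
    stalled = False
except queue.Full:
    stalled = True
```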
If the CPU/GPU Levels your application requests are too high, the internal SoC and battery temperatures will rise slowly but uncontrollably until they hit the thermal limit. When this happens, GearVR Service terminates the application and displays a thermal warning until the device cools down.
It may take quite a long time to encounter this scenario during testing, so monitoring thermals in Oculus Remote Monitor is a great way to quickly check whether your application causes the device temperature to rise perpetually. Mouse over the Sensor graph for a precise readout at any given time. We recommend keeping an eye on it.
If the temperature exceeds the device's first thermal trip point, the graph turns bright red. This typically occurs a few minutes before GearVR Service shuts the application down, and should serve as a warning to lower your CPU/GPU Levels.
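A simple way to screen a captured temperature series for the perpetual rise described above is to check that the readings never cool off. The thresholds and sampling cadence here are illustrative; actual trip points are device-specific.

```python
def rises_perpetually(temps_c, tolerance=0.0):
    """True if a series of Celsius readings never plateaus or cools.

    `tolerance` allows small sensor jitter between samples; both it and
    the sampling interval are illustrative choices, not device values.
    """
    if len(temps_c) < 2:
        return False
    monotonic = all(b >= a - tolerance
                    for a, b in zip(temps_c, temps_c[1:]))
    return monotonic and temps_c[-1] > temps_c[0]
```

A well-optimized application's series should fail this check by flattening out; a series that passes it over several minutes is the pattern that ends in a thermal shutdown.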