

The outcome: a user puts on a Quest, opens a URL in the headset’s browser, taps “Enter VR,” and is inside your Unreal project — no APK install, no app store, no native build.

What works, what doesn’t

Streampixel streams pixels, and WebXR puts those pixels onto a stereo display. Combined, they let any WebXR-capable browser become a VR client.
| Device | Status |
| --- | --- |
| Meta Quest 2 / 3 / Pro (built-in browser) | Supported |
| Apple Vision Pro (Safari) | Supported |
| Pico 4 / Neo 3 (built-in browser) | Supported |
| Desktop VR via Chrome/Edge + WebXR (Index, Vive, Rift) | Supported |
| Native APK / sideloaded app | Out of scope — Streampixel is browser-first |
| iOS Safari (non-Vision Pro) | No WebXR; falls back to flat 2D stream |
Streampixel does not ship a native VR runtime. WebXR is the bridge — if a device’s browser exposes a WebXR session, the SDK can hand the stream to it.

Step 1 — Prepare the Unreal project

VR streaming is not a runtime toggle on the Streampixel side; it is a build-time choice in Unreal. Your project must:
  • Render in stereo (single-pass instanced is recommended).
  • Target a stable framerate matching the headset (72, 90, or 120 Hz).
  • Use a forward renderer for the best performance on standalone headsets.
Walk through Prepare your Unreal Engine project for VR before you continue.
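The three requirements above mostly reduce to a few project settings. A minimal DefaultEngine.ini sketch — the console variable names are standard Unreal Engine settings, but verify the exact keys against your engine version and the Prepare-UE-for-VR guide before relying on them:

```ini
[/Script/Engine.RendererSettings]
; Render both eyes in a single instanced pass
vr.InstancedStereo=True
; Forward renderer: lower per-pixel cost, better suited to VR framerates
r.ForwardShading=True

[SystemSettings]
; Cap the frame rate at the headset's refresh (72, 90, or 120)
t.MaxFPS=90
```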

Step 2 — Enable XR input in the SDK

The SDK has an xrInput config flag. Enable it on the page that will request the WebXR session:
import { StreamPixelApplication } from 'streampixelsdk';

const { appStream, pixelStreaming, UIControl } = await StreamPixelApplication({
  appId: PROJECT_ID,
  AutoConnect: true,
  xrInput: true,           // route XR controller / hand input to UE
  forceTurn: true,
});
xrInput: true tells the SDK to listen for WebXR controller, hand, and pose events and forward them to Unreal as input. Without it, the headset will display the stream but controllers will not function inside the UE world.

Step 3 — Trigger the WebXR session

Browsers require a user gesture (a tap or click) to enter immersive mode. Wire up a button:
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <title>Streampixel VR</title>
    <style>
      html, body { margin: 0; height: 100%; background: #000; color: #fff;
        font-family: ui-sans-serif, system-ui; }
      #stream { width: 100vw; height: 100vh; }
      #vr-button {
        position: fixed; bottom: 24px; left: 50%; transform: translateX(-50%);
        padding: 14px 28px; font-size: 16px; font-weight: 600;
        background: #4e9cff; color: #000; border: 0; border-radius: 8px;
        cursor: pointer; z-index: 10;
      }
      #vr-button:disabled { opacity: 0.4; cursor: not-allowed; }
    </style>
  </head>
  <body>
    <div id="stream"></div>
    <button id="vr-button" disabled>Enter VR</button>

    <script type="module">
      import { StreamPixelApplication } from 'streampixelsdk';

      const PROJECT_ID = 'YOUR_PROJECT_ID';
      const button = document.getElementById('vr-button');

      const { appStream, pixelStreaming } = await StreamPixelApplication({
        appId: PROJECT_ID,
        AutoConnect: true,
        xrInput: true,
        forceTurn: true,
      });

      appStream.onVideoInitialized = async () => {
        document.getElementById('stream').append(appStream.rootElement);

        // Only enable the button if WebXR immersive-vr is supported.
        if (!('xr' in navigator)) {
          button.textContent = 'WebXR not available';
          return;
        }
        const supported = await navigator.xr.isSessionSupported('immersive-vr');
        if (supported) {
          button.disabled = false;
        } else {
          button.textContent = 'VR not supported on this device';
        }
      };

      button.addEventListener('click', async () => {
        try {
          const session = await navigator.xr.requestSession('immersive-vr', {
            requiredFeatures: ['local-floor'],
            optionalFeatures: ['hand-tracking', 'bounded-floor'],
          });

          // Hand the session to the SDK so it can drive xrInput.
          if (typeof pixelStreaming.startWebXR === 'function') {
            pixelStreaming.startWebXR(session);
          }

          session.addEventListener('end', () => {
            button.disabled = false;
            button.textContent = 'Enter VR';
          });

          button.disabled = true;
          button.textContent = 'In VR';
        } catch (err) {
          console.error('Failed to enter VR:', err);
        }
      });
    </script>
  </body>
</html>
Always feature-detect with navigator.xr.isSessionSupported('immersive-vr') before showing the button. Without that check, desktop visitors will see a button that throws on click.

Step 4 — Pick the right codec

Standalone headsets do most decoding in fixed-function silicon. The decoder’s codec support is what determines whether your stream plays smoothly or stutters.
| Codec | Quest 2 / 3 / Pro | Vision Pro | Pico | Desktop VR |
| --- | --- | --- | --- | --- |
| H264 | Hardware-decoded, recommended | Hardware-decoded | Hardware-decoded | Yes |
| VP8 | Software fallback, drops frames | Limited | Limited | Yes |
| VP9 | Software fallback | Software | Software | Yes |
| AV1 | No (not in browser) | No | No | Chrome desktop only |
Set H264 as your default in the dashboard (Codec settings) for any project that targets headsets. Other codecs may negotiate but will burn battery and miss frame deadlines.
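One way to confirm what a given headset actually negotiated is to read the inbound video codec from WebRTC stats. A sketch, assuming you can reach the stream's underlying RTCPeerConnection — the `pc` parameter here is an assumption, not a documented Streampixel SDK export:

```javascript
// Return the negotiated inbound video codec, e.g. "video/H264",
// or null if no inbound video is flowing yet.
// `pc` is assumed to be the stream's underlying RTCPeerConnection.
async function negotiatedVideoCodec(pc) {
  const stats = await pc.getStats();
  for (const report of stats.values()) {
    if (report.type === 'inbound-rtp' && report.kind === 'video') {
      // codecId points at a separate "codec" entry in the same stats map.
      const codec = stats.get(report.codecId);
      return codec ? codec.mimeType : null;
    }
  }
  return null;
}
```

If this returns `video/VP9` on a Quest, the dashboard codec setting has not taken effect and you should expect software decode with dropped frames.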

Step 5 — Tune for the device

| Headset | Native refresh | Recommended target | Per-eye resolution |
| --- | --- | --- | --- |
| Quest 2 | 72 / 90 Hz | 72 Hz | 1832×1920 |
| Quest 3 | 90 / 120 Hz | 90 Hz | 2064×2208 |
| Quest Pro | 90 Hz | 90 Hz | 1800×1920 |
| Vision Pro | 90 Hz (varies) | 90 Hz | High DPI per eye |
| Pico 4 | 72 / 90 Hz | 90 Hz | 2160×2160 |
Streampixel can deliver up to ~1080p per eye comfortably; pushing higher requires more bitrate and more decoder headroom. Start at 1920×1080 per eye, 25–30 Mbps, H264, and tune from there. See Performance tuning for bitrate guidance.

Latency

VR is unforgiving of lag. A delay between head movement and rendered frame that you would never notice on a 2D stream becomes nausea-inducing in stereo. Things that move latency in the right direction:
  • Closest region. A 30 ms RTT improvement is more valuable than any encoder tweak.
  • Wired or 6 GHz Wi-Fi. Standalone headsets on 2.4 GHz Wi-Fi are the most common cause of “this feels laggy” complaints. Quest 3 and Vision Pro on Wi-Fi 6E typically work well.
  • H264 over VP9. H264 hardware decode shaves several ms compared to software-decoded VP9.
  • Raise minQP. Letting the encoder compress harder in the dashboard’s adaptive settings costs some quality at the same bitrate, but frames are smaller and arrive faster.
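To see how much headroom the first bullet actually buys you, sample the live round-trip time from WebRTC stats. A sketch, again assuming access to the stream's underlying RTCPeerConnection — `pc` is an assumption, not a documented SDK export:

```javascript
// Return the current RTT in milliseconds for the active ICE candidate
// pair, or null if no pair has been nominated yet.
// `pc` is assumed to be the stream's underlying RTCPeerConnection.
async function currentRttMs(pc) {
  const stats = await pc.getStats();
  for (const report of stats.values()) {
    if (report.type === 'candidate-pair' && report.nominated &&
        report.currentRoundTripTime !== undefined) {
      return report.currentRoundTripTime * 1000; // spec reports seconds
    }
  }
  return null;
}

// Example: sample once a second; a sustained reading above ~60 ms is a
// strong hint to try a closer region before touching encoder settings.
// setInterval(async () => console.log('RTT', await currentRttMs(pc), 'ms'), 1000);
```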

Gotchas

localhost over HTTP works during development on Quest browsers, but Safari on Vision Pro will not surface the WebXR API on insecure origins. Use a local HTTPS dev server (mkcert, Caddy, or vite --https) to test there.
navigator.xr.requestSession only resolves if it is called synchronously from a click or tap event. Wrapping it in a Promise chain that awaits something else first (like the SDK’s onVideoInitialized) breaks the gesture link. Always trigger from the button click handler directly.
Pass optionalFeatures: ['hand-tracking'] if your UE project expects hand input. If you require it, use requiredFeatures instead — the session will fail to start on devices without hand tracking, which is the right behavior for hand-only experiences.
Quest and Pico route audio to the headset speakers by default. Some users with paired Bluetooth headphones experience added audio latency that does not match the visual stream. Document this for your testers; there is no fix on the streaming side.
A standalone headset decoding a 30 Mbps H264 stream while running its display at 90 Hz drains the battery in roughly 90–120 minutes. For longer experiences, recommend tethered power or build in checkpoints.

Next steps

Prepare UE for VR

Project settings, plugins, and rendering paths for VR builds.

Codec settings

Lock H264 as the default for headset audiences.

Performance tuning

Bitrate and resolution recipes that ship without dropped frames.

Regions

Pick the closest region — VR latency is dominated by RTT.