

A Streampixel session is a WebRTC peer connection between the user’s browser and a GPU worker. It carries an encoded video track, an optional audio track, and a bidirectional data channel for input and JSON messaging. Almost every knob you can tune — codec, resolution, bitrate, AFK behavior — maps to one of those three streams.

WebRTC in 60 seconds

WebRTC is the browser-native protocol for real-time audio, video, and data. Streampixel uses it because it’s already in every modern browser, has sub-100 ms end-to-end latency on a good network, and supports adaptive bitrate.
| Concept | What it does |
| --- | --- |
| Peer connection | The end-to-end channel between browser and worker. Carries video, audio, and data. |
| ICE | The negotiation that picks the best path between peers (direct, NAT-pierced, or relayed). |
| STUN | Helps each peer learn its public IP so direct connections can be attempted. |
| TURN | Fallback relay used when firewalls or symmetric NATs block direct peer-to-peer. Streampixel runs TURN per region. |
| SDP | The offer/answer messages that describe what codecs and tracks each side supports. |
| Data channel | A bidirectional binary/text channel. Streampixel uses it for input, JSON messaging, and SDK metadata. |
STUN and TURN are handled automatically. You don’t configure ICE servers — the SDK pulls them at session start from signalling.
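For context, the SDK does the equivalent of the sketch below at session start. The signalling payload shape (`stunUrl`, `turn` fields) is hypothetical and purely illustrative, not Streampixel's actual response format:

```javascript
// Hypothetical signalling payload — field names are illustrative only.
const signallingResponse = {
  stunUrl: "stun:stun.eu-west.example.com:3478",
  turn: {
    url: "turn:turn.eu-west.example.com:3478",
    username: "session-abc",
    credential: "temp-token",
  },
};

// Build an RTCConfiguration-shaped object that the browser's
// RTCPeerConnection constructor would accept.
function buildIceConfig(resp) {
  return {
    iceServers: [
      { urls: resp.stunUrl },
      {
        urls: resp.turn.url,
        username: resp.turn.username,
        credential: resp.turn.credential,
      },
    ],
  };
}

const config = buildIceConfig(signallingResponse);
// In a browser: new RTCPeerConnection(config)
```

Because the SDK fetches this configuration per session, TURN credentials can be short-lived and region-specific without any change on your side.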

Codecs

The worker encodes Unreal’s framebuffer with one of four codecs. The browser advertises which codecs it supports during the SDP exchange and Streampixel falls back automatically when the requested codec isn’t available.
| Codec | Browser support | Quality | CPU/GPU cost | Notes |
| --- | --- | --- | --- | --- |
| H264 | Universal | Good | Low | Safe default. Hardware-accelerated almost everywhere. |
| VP8 | Universal | Good | Medium | Older; software-decoded in many browsers. |
| VP9 | Chrome, Firefox, Edge | Better than H264 at the same bitrate | Higher | Good for high-detail content over constrained links. |
| AV1 | Chrome desktop only | Best per-bit | Highest | Falls back automatically when unsupported. |
AV1 only decodes on Chrome desktop today. If you select AV1 as your preferred codec, every other browser silently falls back to a supported alternative. Test in your real browser mix before switching defaults.
Pick a preferred codec from project settings. See codec settings for the full configuration.
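The fallback behavior can be sketched as a simple preference check. The function and its `browserCodecs` input are hypothetical simplifications; in a real browser, the supported list comes from `RTCRtpReceiver.getCapabilities("video")` (which reports MIME types such as `video/H264`) and the actual negotiation happens in the SDP exchange:

```javascript
// Codecs the worker can encode, per the table above.
const WORKER_CODECS = ["H264", "VP8", "VP9", "AV1"];

// Use the preferred codec if the browser advertised it, otherwise fall
// back to the first codec both sides support; null if none overlap.
function negotiateCodec(preferred, browserCodecs) {
  if (browserCodecs.includes(preferred)) return preferred;
  return WORKER_CODECS.find((c) => browserCodecs.includes(c)) ?? null;
}

// A browser without AV1 decode: preferred AV1 silently falls back.
negotiateCodec("AV1", ["H264", "VP8", "VP9"]); // → "H264"
```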

Resolution and bitrate

The encoder targets a resolution and a bitrate. Both can be fixed (locked to a value you pick) or adaptive (auto-tuned to the user’s network).
In adaptive mode, the encoder shrinks resolution and drops bitrate when the network gets congested, then ramps back up when conditions improve. Adaptive is best for the broadest audience and varied network quality.
| Setting | Common values | Effect |
| --- | --- | --- |
| Resolution | 720p, 1080p, 1440p, 4K | Higher resolution = more pixels to encode and decode. Bandwidth grows roughly with pixel count. |
| Bitrate | 4 Mbps – 50 Mbps | Higher bitrate = better quality at a given resolution. Capped by the user's downlink. |
| Frame rate | 30, 60, 90, 120 fps | Higher frame rate = smoother motion, more encoding work, more bandwidth. |
Configure these from resolution settings.
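To make "bandwidth grows roughly with pixel count" concrete, here is a back-of-the-envelope estimator. The baseline numbers are illustrative rules of thumb, not Streampixel's actual encoder targets:

```javascript
// Rough rule of thumb: bitrate scales roughly linearly with pixel
// count and frame rate. Baseline values below are assumptions chosen
// for illustration (720p60 at 6 Mbps).
const BASELINE = { width: 1280, height: 720, fps: 60, mbps: 6 };

function estimateMbps(width, height, fps) {
  const pixelRatio = (width * height) / (BASELINE.width * BASELINE.height);
  const fpsRatio = fps / BASELINE.fps;
  return BASELINE.mbps * pixelRatio * fpsRatio;
}

estimateMbps(1920, 1080, 60); // → 13.5 (Mbps for 1080p60)
```

The estimate is most useful for sanity-checking a fixed-bitrate setting against your users' expected downlink before locking it in.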

Latency

The “input lag” a user feels is the sum of several components. Cutting any one of them helps; cutting the worst offender helps most.
| Component | Typical contribution | What controls it |
| --- | --- | --- |
| Encode | 5–15 ms | Codec choice, GPU load, frame rate. |
| Network RTT | 20–150 ms | User-to-worker distance — pick the right region. |
| TURN relay (when used) | +10–30 ms | Network conditions; can't be avoided behind restrictive firewalls. |
| Decode | 5–20 ms | Codec, browser, hardware acceleration availability. |
| Render frame | 8–16 ms (60 fps) | Browser compositor, monitor refresh. |
The biggest single win for most users is putting them on the closest region. After that, prefer H264 with hardware acceleration, and avoid forcing very high resolutions on weak GPUs.
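A quick way to reason about the budget is to sum the components and find the dominant one. The values below are mid-range picks from the table, not measurements from any real session:

```javascript
// Illustrative per-frame latency budget in ms (mid-range table values).
const budget = {
  encode: 10,
  networkRtt: 80,
  decode: 10,
  render: 16,
};

// Total perceived input lag is the sum of all components.
function totalLatency(b) {
  return Object.values(b).reduce((sum, ms) => sum + ms, 0);
}

// The component most worth attacking is the largest one.
function worstOffender(b) {
  return Object.entries(b).reduce((worst, cur) =>
    cur[1] > worst[1] ? cur : worst
  )[0];
}

totalLatency(budget); // → 116 (ms end to end)
worstOffender(budget); // → "networkRtt" — region choice matters most
```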

Input flow

Every input event the user generates is sent over the WebRTC data channel — not as separate HTTP calls. That keeps input latency on the same path as the video.
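As a sketch of what travels on that channel, here is a hypothetical mouse-move frame. The real Streampixel wire format is SDK-internal; the normalization shown (0–65535 coordinates, common in pixel-streaming protocols so the worker's render resolution can differ from the browser element's size) is an assumption for illustration:

```javascript
// Map a pixel coordinate into a resolution-independent 0–65535 range.
function normalize(value, max) {
  return Math.round((value / max) * 65535);
}

// Hypothetical input frame — field names are illustrative only.
function encodeMouseMove(px, py, elemWidth, elemHeight) {
  return {
    type: "mouseMove",
    x: normalize(px, elemWidth),
    y: normalize(py, elemHeight),
  };
}

// In a browser this would be serialized and sent with
// dataChannel.send(JSON.stringify(msg)) — no HTTP round trip.
encodeMouseMove(640, 360, 1280, 720); // → { type: "mouseMove", x: 32768, y: 32768 }
```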

Mouse

Position, buttons, wheel. Lock and confine modes for FPS-style controls. See mouse settings.

Keyboard

Key down, key up, character input. Browser shortcuts can be intercepted on demand.

Touch

Multi-touch points are forwarded to UE as touch events for mobile and tablet.

Gamepad

Standard Gamepad API devices map to UE input axes and buttons.

XR

WebXR sessions forward head and controller pose to UE for VR builds.

JSON messages

App-defined messages between page and UE — for menus, state sync, and custom UI.
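On the page side, app-defined messages usually end up in a small dispatch table. The handler names and message envelope below are hypothetical, not Streampixel's SDK API:

```javascript
// Hypothetical handlers for messages arriving from UE over the data
// channel; keys and payload shapes are illustrative only.
const handlers = {
  playerState: (p) => `health=${p.health}`,
  menuOpened: (p) => `menu=${p.id}`,
};

// Parse an incoming data-channel string and route it to its handler.
function dispatch(raw) {
  const { type, payload } = JSON.parse(raw);
  const handler = handlers[type];
  return handler ? handler(payload) : null; // ignore unknown types
}

dispatch('{"type":"playerState","payload":{"health":87}}'); // → "health=87"
```

Ignoring unknown message types (rather than throwing) keeps the page tolerant of UE builds that ship new messages before the page is updated.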

Audio

Audio is bidirectional. Game audio flows out of UE to the browser; mic input flows from the browser to UE for voice-driven interactions.
| Direction | Source | Codec |
| --- | --- | --- |
| Worker → Browser | Unreal audio output | Opus (WebRTC default). Negotiated automatically. |
| Browser → Worker | User microphone | Opus. Requires microphone in the iframe allow attribute. |
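If you embed the stream in an iframe, the microphone requirement looks like the fragment below. The `src` URL is a hypothetical placeholder; only the `microphone` entry in `allow` is the point:

```html
<!-- Hypothetical embed URL. Without "microphone" in allow, the browser
     blocks getUserMedia inside the iframe and mic input never starts. -->
<iframe
  src="https://example.com/your-stream-embed"
  allow="microphone; autoplay; fullscreen">
</iframe>
```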
For multi-user voice and text chat (separate from the pixel-streaming pipe), Streampixel uses LiveKit rooms. See built-in voice and text chat.

Next steps

Codec settings

Pick the codec that fits your audience and content.

Resolution settings

Lock or adapt resolution and bitrate.

Session lifecycle

See how the peer connection opens, runs, and ends.

JSON message communication

The data channel carries more than input — it’s a full app-to-UE bus.