The analytics dashboard summarises every session your project has served. This page explains what each metric measures, what a healthy value looks like, and how to spot common problems.
Where to find analytics
The dashboard splits into Sessions (top), Audience (middle), and Quality (bottom).
Sessions
Session count
The number of unique connections in the selected window. One viewer who reloads the page counts twice — Streampixel measures connections, not unique users.

| Reading | Interpretation |
|---|---|
| Steady day-over-day | Healthy baseline |
| Spike with no marketing change | Investigate — possibly a viral share or a bot |
| Drop to zero | Build is failing to start; check disconnect codes |
Average duration
Mean session length in minutes. Useful as an engagement signal.

- Under 30 seconds — Most viewers are bouncing. Often a loading-time or quality issue.
- 1-3 minutes — Typical for casual demos.
- 5+ minutes — Strong engagement; viewers are exploring.
Concurrent sessions
The number of viewers connected at the same time, plotted over the selected window. Two values matter:

- Peak — the highest concurrent count. Use this to size your plan.
- Current — live count right now.
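If you pull raw session data (Streampixel can provide CSV or JSON exports via support), the peak can be recomputed with a sweep line over connect/disconnect timestamps. The `(connect, disconnect)` pair shape below is a hypothetical export format, not the dashboard's actual schema:

```python
from typing import List, Tuple

def peak_concurrency(sessions: List[Tuple[float, float]]) -> int:
    """Return the highest number of simultaneously connected sessions.

    Each session is a (connect_ts, disconnect_ts) pair in epoch seconds.
    Sweep line: +1 at each connect, -1 at each disconnect, track the max.
    """
    events = []
    for start, end in sessions:
        events.append((start, 1))
        events.append((end, -1))
    # Process disconnects before connects at the same instant so a
    # back-to-back reconnect does not count as two concurrent viewers.
    events.sort(key=lambda e: (e[0], e[1]))
    current = peak = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

sessions = [(0, 300), (60, 240), (120, 600), (600, 900)]
print(peak_concurrency(sessions))  # 3 — three viewers overlap between t=120 and t=240
```

Sorting disconnects before connects at equal timestamps is a deliberate choice: a viewer who reconnects in the same second should not inflate the peak.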
Audience
Region distribution
Breakdown by client region — typically US, EU, APAC, plus a long tail. Use it to choose which Streampixel regions to deploy to:

- Heavy EU traffic → enable Europe if you haven’t already.
- Heavy APAC traffic → enable Asia Pacific.
OS / browser distribution
Breakdown of viewer operating systems and browsers. Useful for:

- Prioritising browser-specific bug fixes (a Safari-only crash matters more if 40% of your audience is on iOS).
- Validating your Web SDK testing matrix matches the real audience.
Quality
These metrics are reported by the client over WebRTC. They reflect what the viewer actually experienced.

FPS

Frames per second the client received and rendered.

| FPS | Quality |
|---|---|
| >= 60 | Ideal — smooth for fast-paced content |
| 30-59 | Acceptable — most viewers won’t notice |
| 15-29 | Visibly choppy |
| < 15 | Unwatchable; viewers will bounce |
Bitrate
Client-reported video bitrate in kbps.

| Bitrate | Quality |
|---|---|
| > 5000 kbps | High quality, suitable for 1080p+ |
| 2000-5000 kbps | Good quality, suitable for 720p |
| 500-2000 kbps | Watchable but compressed |
| < 500 kbps | Heavy compression; expect blockiness |
Latency
Round-trip time (RTT) in milliseconds between the client and the streaming server.

| Latency | Feel |
|---|---|
| < 50 ms | Imperceptible — feels native |
| 50-100 ms | Excellent — most users won’t notice |
| 100-150 ms | Acceptable for most experiences |
| 150-250 ms | Noticeable input lag |
| > 250 ms | Frustrating for interactive content |
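When triaging exported stats in bulk, the FPS, bitrate, and latency thresholds above can be applied programmatically. A minimal sketch — the bucket labels and function name are illustrative, not part of any Streampixel API:

```python
def grade_sample(fps: float, bitrate_kbps: float, rtt_ms: float) -> dict:
    """Bucket one client-reported sample using the quality tables above."""
    if fps >= 60:
        fps_q = "ideal"
    elif fps >= 30:
        fps_q = "acceptable"
    elif fps >= 15:
        fps_q = "choppy"
    else:
        fps_q = "unwatchable"

    if bitrate_kbps > 5000:
        br_q = "high"
    elif bitrate_kbps >= 2000:
        br_q = "good"
    elif bitrate_kbps >= 500:
        br_q = "compressed"
    else:
        br_q = "blocky"

    if rtt_ms < 50:
        lat_q = "imperceptible"
    elif rtt_ms < 100:
        lat_q = "excellent"
    elif rtt_ms < 150:
        lat_q = "acceptable"
    elif rtt_ms <= 250:
        lat_q = "noticeable lag"
    else:
        lat_q = "frustrating"

    return {"fps": fps_q, "bitrate": br_q, "latency": lat_q}

print(grade_sample(58, 3200, 45))
# {'fps': 'acceptable', 'bitrate': 'good', 'latency': 'imperceptible'}
```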
Disconnects by code
Distribution of disconnect reasons. Each code has a meaning — a few common ones:

| Code | Meaning |
|---|---|
| 1000 | Normal close — user navigated away or closed the tab |
| 1006 | Abnormal closure — usually a network drop or TURN failure |
| 4001 | AFK / idle timeout |
| 4003 | Queue timeout |
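Given a list of raw disconnect codes (for example, from a support-provided export), the distribution shown in the dashboard can be reproduced with a tally. The code→meaning map below covers only the codes in the table above; see the full disconnect code reference for the rest:

```python
from collections import Counter

# Meanings from the table above; the full reference lists more codes.
CODE_MEANINGS = {
    1000: "Normal close",
    1006: "Abnormal closure",
    4001: "AFK / idle timeout",
    4003: "Queue timeout",
}

def disconnect_breakdown(codes):
    """Tally disconnect codes, most common first, with their meanings."""
    counts = Counter(codes)
    return {
        code: {"count": n, "meaning": CODE_MEANINGS.get(code, "See full reference")}
        for code, n in counts.most_common()
    }

breakdown = disconnect_breakdown([1000, 1000, 1006, 4001, 1000, 1006])
print(breakdown)
# 1000 appears 3 times, 1006 twice, 4001 once
```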
Spotting issues
Use the patterns below to triage anomalies.

Spike in 1006 disconnects
Indicates abnormal closure — the WebRTC connection died unexpectedly. Common causes:
- TURN server overloaded — viewers behind strict NATs can’t relay traffic. Check region distribution; if one region spikes, the local TURN may need attention.
- Network instability on the client — usually correlates with mobile traffic.
- Build crash mid-session — pair with FPS dropping to zero just before disconnect.
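One rough way to automate this triage on exported data is to compare today's share of 1006 closures against a baseline window. The ratio and floor thresholds here are illustrative choices, not product defaults:

```python
def abnormal_closure_spike(codes_today, codes_baseline, ratio=2.0, floor=0.05):
    """Flag a 1006 spike: today's share of abnormal closures exceeds
    `ratio` times the baseline share and a minimum `floor`.

    Both arguments are flat lists of disconnect codes.
    """
    def share(codes):
        return codes.count(1006) / len(codes) if codes else 0.0

    today, base = share(codes_today), share(codes_baseline)
    return today >= floor and today > ratio * base

# 30% abnormal closures today vs 5% in the baseline window → flagged.
print(abnormal_closure_spike([1006] * 3 + [1000] * 7, [1006] + [1000] * 19))  # True
```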
FPS drops with stable bitrate
Bitrate is fine, but frames are not rendering. The client device cannot decode fast enough.
- Check OS/browser distribution — older mobile devices often hit decode limits.
- Lower the build’s resolution or codec complexity.
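This pattern can also be spotted in a per-session stats export by comparing the start and end of the session. A sketch under illustrative thresholds and an assumed data shape (parallel lists of FPS and bitrate samples):

```python
def decode_bottleneck(fps_series, bitrate_series, fps_drop=0.5, br_tolerance=0.15):
    """Flag the 'FPS drops, bitrate stable' pattern in one session.

    Compares the mean of the last quarter of samples to the first quarter:
    FPS must fall by more than `fps_drop` while bitrate stays within
    `br_tolerance` of its starting level. Thresholds are illustrative.
    """
    def mean(xs):
        return sum(xs) / len(xs)

    n = max(1, len(fps_series) // 4)
    fps_dropped = mean(fps_series[-n:]) < (1 - fps_drop) * mean(fps_series[:n])
    br_stable = abs(mean(bitrate_series[-n:]) - mean(bitrate_series[:n])) \
        <= br_tolerance * mean(bitrate_series[:n])
    return fps_dropped and br_stable

fps = [60, 60, 58, 59, 40, 25, 20, 18]
bitrate = [4000, 4100, 3900, 4000, 4050, 3950, 4000, 4100]
print(decode_bottleneck(fps, bitrate))  # True — frames collapsed, bandwidth didn't
```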
Bitrate drops with stable FPS
WebRTC’s adaptive logic chose to keep frames smooth at the cost of pixel quality. The client’s network is the bottleneck.
- Check region distribution — a region without a local edge will show this.
- If the audience is mobile-heavy, this is normal during commute hours.
Average duration drops while session count stays flat
Same number of viewers, but each one leaves sooner. Typically a regression in load time, first-frame quality, or onboarding.
- Compare to the date of your last build upload. Roll back if the timing matches.
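A sketch of this check on exported session durations, comparing two equal windows before and after a suspected regression (thresholds are illustrative):

```python
def duration_regression(prev, recent, drop=0.25, count_tolerance=0.15):
    """Flag the 'duration drops, count flat' pattern.

    `prev` and `recent` are lists of session durations in seconds for two
    comparison windows. Flags when mean duration falls by more than `drop`
    while the session count stays within `count_tolerance` of the baseline.
    """
    if not prev or not recent:
        return False
    count_flat = abs(len(recent) - len(prev)) / len(prev) <= count_tolerance
    mean_prev = sum(prev) / len(prev)
    mean_recent = sum(recent) / len(recent)
    return count_flat and mean_recent < (1 - drop) * mean_prev

prev = [180, 200, 220, 190, 210]    # mean 200 s
recent = [90, 100, 110, 95, 105]    # mean 100 s, same number of sessions
print(duration_regression(prev, recent))  # True
```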
Concurrent peak hits plan limit
New viewers are being queued or rejected at peak times.
- Upgrade your plan, or
- Use the queue messaging in branding to set viewer expectations during peaks.
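A small headroom check makes this easy to monitor from exported peak figures — the 80% warning threshold is an illustrative choice, not a product default:

```python
def plan_headroom(peak_concurrent: int, plan_limit: int, warn_at: float = 0.8):
    """Compare peak concurrency to the plan limit.

    Returns 'at_limit' when the peak reaches the limit (viewers queue),
    'approaching' above the `warn_at` fraction, else 'ok'.
    """
    if peak_concurrent >= plan_limit:
        return "at_limit"
    if peak_concurrent >= warn_at * plan_limit:
        return "approaching"
    return "ok"

print(plan_headroom(42, 50))  # 'approaching' — 84% of the limit; consider upgrading
```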
Per-session detail
Click any row in the Recent sessions table to see metrics for that single session — FPS over time, bitrate over time, disconnect reason, region, and OS/browser. Useful for debugging individual viewer reports.

Exporting data
The dashboard view covers most needs. For raw session data — to ingest into your own analytics warehouse — contact support. Streampixel can provide CSV or JSON exports on request.

Data retention
Recent session data is kept for the duration of your account. Historical depth depends on your plan; if you need a longer retention guarantee for compliance, contact support before relying on the dashboard for long-term records.

Deleting a project also deletes its analytics. Export anything you need before removing a project.
Next steps
Disconnect codes
Full reference for every disconnect code shown in analytics.
Stream statistics
Read live FPS, bitrate, and latency from the Web SDK.
Webhooks
Mirror events into your own monitoring stack.
Billing & subscriptions
Compare peak concurrent usage against your plan limit.