

The outcome: every push to main packages your Unreal project, uploads the .zip to S3, registers the build with Streampixel, waits for the build.approved webhook, then distributes it to streaming servers. No manual dashboard steps.

Architecture

The CI job is fire-and-forget. Approval and distribution are handled out-of-band by a tiny webhook listener — that decoupling is what lets the build pipeline take 5 minutes or 50 minutes without holding a runner open.

Prerequisites

Requirement               Notes
Streampixel project       Created in the dashboard. Note the projectId.
API key + user ID         From API authentication and Finding your IDs.
S3 bucket (or equivalent) Must serve direct-download HTTPS URLs. See file URL requirements.
AWS credentials           An IAM user with s3:PutObject and s3:GetObject on the bucket. GetObject matters because presigned download URLs inherit the signer's permissions.
GitHub repo               With your Unreal project.
Webhook host              A public HTTPS endpoint to receive build.approved. ngrok works for development.
This recipe uses GitHub Actions and AWS S3, but the same pattern works on any CI (GitLab CI, CircleCI, Buildkite) and any storage that returns a direct-download URL (GCS, R2, DigitalOcean Spaces).
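The IAM user from the prerequisites can be scoped tightly. A minimal policy sketch — the bucket name is a placeholder, and the builds/ prefix matches the workflow below; s3:GetObject is included because the presigned download URL generated in Step 2 is signed with these same credentials:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StreampixelCiUpload",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::YOUR-BUILD-BUCKET/builds/*"
    }
  ]
}
```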

Step 1 — Store secrets

In your GitHub repo, add these as Settings → Secrets and variables → Actions:
Secret                   Value
STREAMPIXEL_API_KEY      Your Streampixel API key
STREAMPIXEL_USER_ID      Your Streampixel user ID
STREAMPIXEL_PROJECT_ID   Target project ID
AWS_ACCESS_KEY_ID        IAM access key
AWS_SECRET_ACCESS_KEY    IAM secret
AWS_REGION               e.g. us-east-1
S3_BUCKET                Bucket name, no s3:// prefix
Never commit these to the repo. Never echo them in CI logs. The Actions runner masks known secrets, but custom log lines that concatenate strings can leak them — keep set -x off in scripts that touch secrets.

Step 2 — GitHub Actions workflow

Create .github/workflows/streampixel-deploy.yml:
name: Build and ship to Streampixel

on:
  push:
    branches: [main]
  workflow_dispatch:

concurrency:
  group: streampixel-deploy
  cancel-in-progress: false

jobs:
  build-and-upload:
    runs-on: [self-hosted, windows, ue5]
    timeout-minutes: 120

    env:
      UE_ROOT: 'C:\Program Files\Epic Games\UE_5.4'
      PROJECT_NAME: MyProject
      BUILD_DIR: ${{ github.workspace }}\Build
      ARCHIVE_NAME: ${{ github.sha }}.zip

    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          lfs: true

      - name: Package Unreal build
        shell: cmd
        run: |
          "%UE_ROOT%\Engine\Build\BatchFiles\RunUAT.bat" ^
            BuildCookRun ^
            -project="%GITHUB_WORKSPACE%\%PROJECT_NAME%.uproject" ^
            -platform=Win64 ^
            -clientconfig=Shipping ^
            -cook -build -stage -package -archive ^
            -archivedirectory="%BUILD_DIR%" ^
            -pak -nodebuginfo -utf8output

      - name: Zip the staged build
        shell: pwsh
        run: |
          # UE5 stages Win64 builds under "Windows"; "WindowsNoEditor" was UE4 naming.
          Compress-Archive `
            -Path "$env:BUILD_DIR\Windows\*" `
            -DestinationPath "$env:GITHUB_WORKSPACE\$env:ARCHIVE_NAME" `
            -CompressionLevel Optimal

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Upload to S3
        shell: bash
        run: |
          aws s3 cp \
            "$GITHUB_WORKSPACE/$ARCHIVE_NAME" \
            "s3://${{ secrets.S3_BUCKET }}/builds/$ARCHIVE_NAME" \
            --no-progress

      - name: Generate signed download URL
        id: presign
        shell: bash
        run: |
          URL=$(aws s3 presign \
            "s3://${{ secrets.S3_BUCKET }}/builds/$ARCHIVE_NAME" \
            --expires-in 3600)
          echo "::add-mask::$URL"
          echo "url=$URL" >> "$GITHUB_OUTPUT"

      - name: Submit build to Streampixel
        shell: bash
        env:
          FILE_URL: ${{ steps.presign.outputs.url }}
          # Pass secrets via env rather than interpolating ${{ secrets.* }}
          # into the script body, so they never appear in the rendered script.
          SP_API_KEY: ${{ secrets.STREAMPIXEL_API_KEY }}
          SP_USER_ID: ${{ secrets.STREAMPIXEL_USER_ID }}
          SP_PROJECT_ID: ${{ secrets.STREAMPIXEL_PROJECT_ID }}
        run: |
          RESPONSE=$(curl -sS -X POST \
            https://api.streampixel.io/pixelStripeApi/projects/upload-file \
            -H "Content-Type: application/json" \
            -d @- <<EOF
          {
            "apiKey": "$SP_API_KEY",
            "userId": "$SP_USER_ID",
            "projectId": "$SP_PROJECT_ID",
            "fileUrl": "$FILE_URL",
            "autoRelease": false
          }
          EOF
          )

          UPLOAD_ID=$(echo "$RESPONSE" | jq -r '.uploadId')
          if [ "$UPLOAD_ID" = "null" ] || [ -z "$UPLOAD_ID" ]; then
            echo "Upload failed: $RESPONSE"
            exit 1
          fi

          echo "Upload accepted, uploadId=$UPLOAD_ID"
          echo "$UPLOAD_ID" > upload-id.txt

      - name: Persist uploadId as artifact
        uses: actions/upload-artifact@v4
        with:
          name: streampixel-upload-id
          path: upload-id.txt
          retention-days: 7
A few things worth pointing out:
  • autoRelease: false — we don’t want Streampixel to deploy automatically. The webhook listener (Step 3) decides when to call distribute.
  • concurrency.group — only one shipping job at a time. Distribution is rate-limited to 1 call per 2 minutes per user; queueing builds at the CI level keeps you under that ceiling without retries.
  • Presigned URL with 1-hour expiry — long enough for Streampixel to download a 30 GB ZIP, short enough that the URL is useless if it leaks. The ::add-mask:: annotation prevents the URL from showing in subsequent logs.
  • Self-hosted Windows runner — packaging Unreal on GitHub-hosted runners is impractical for non-trivial projects. Use a runner with the engine pre-installed.

Step 3 — Webhook listener

The CI job ends after upload. Streampixel processes the build asynchronously and emits webhook events: build.uploaded, downloading, extracting, saving, then approved or rejected. When build.approved fires, you call distribute. Here is a minimal Express listener:
// webhook-listener.js
import express from 'express';
import axios from 'axios';

const app = express();
app.use(express.json({ limit: '1mb' }));

const STREAMPIXEL_API_KEY = process.env.STREAMPIXEL_API_KEY;
const STREAMPIXEL_USER_ID = process.env.STREAMPIXEL_USER_ID;
const STREAMPIXEL_PROJECT_ID = process.env.STREAMPIXEL_PROJECT_ID;
const WEBHOOK_PATH = process.env.WEBHOOK_PATH; // e.g. "/webhooks/streampixel/9f3a...c2"
if (!WEBHOOK_PATH) throw new Error('WEBHOOK_PATH env var is required');

// Idempotency: don't distribute the same uploadId twice.
const distributed = new Set();

app.post(WEBHOOK_PATH, async (req, res) => {
  // Respond fast — Streampixel times out after 10s and does not retry.
  res.status(200).end();

  const { event, data } = req.body || {};
  if (!event || !data) return;

  // Defense-in-depth: confirm this event is for our project.
  if (data.projectId && data.projectId !== STREAMPIXEL_PROJECT_ID) {
    console.warn('Ignoring event for unknown project:', data.projectId);
    return;
  }

  console.log(`[${event}]`, data.uploadId || '');

  if (event !== 'build.approved') return;
  if (distributed.has(data.uploadId)) return;
  distributed.add(data.uploadId);

  try {
    const resp = await axios.post(
      'https://api.streampixel.io/pixelStripeApi/projects/distribute-file',
      {
        apiKey: STREAMPIXEL_API_KEY,
        userId: STREAMPIXEL_USER_ID,
        projectId: STREAMPIXEL_PROJECT_ID,
        uploadId: data.uploadId,
      },
    );
    console.log('Distribute OK:', resp.data);
  } catch (err) {
    // Common cause: rate limit (1 call / 2 min / user). Back off and retry.
    distributed.delete(data.uploadId);
    console.error('Distribute failed:', err.response?.data || err.message);
  }
});

app.get('/healthz', (_req, res) => res.status(200).send('ok'));

app.listen(8080, () => console.log('listening on :8080'));
Run it:
STREAMPIXEL_API_KEY=... \
STREAMPIXEL_USER_ID=... \
STREAMPIXEL_PROJECT_ID=... \
WEBHOOK_PATH=/webhooks/streampixel/9f3a4c1ee21b4f2c \
node webhook-listener.js
Then register https://your-host.example.com/webhooks/streampixel/9f3a4c1ee21b4f2c as your project’s webhook URL in the dashboard.

Handling rate limits

The distribute endpoint is rate-limited to one call per two minutes per user. The listener above handles this naively: if you ever ship two builds within two minutes (rare in production, common during testing), the second call will fail and that build will stay undistributed. A more robust approach is a cooldown queue:
const distributeQueue = [];
let lastDistributeAt = 0;
const COOLDOWN_MS = 2 * 60 * 1000 + 5_000; // 2 min + 5s safety

setInterval(async () => {
  if (distributeQueue.length === 0) return;
  if (Date.now() - lastDistributeAt < COOLDOWN_MS) return;

  const uploadId = distributeQueue.shift();
  lastDistributeAt = Date.now();

  try {
    await axios.post(
      'https://api.streampixel.io/pixelStripeApi/projects/distribute-file',
      {
        apiKey: STREAMPIXEL_API_KEY,
        userId: STREAMPIXEL_USER_ID,
        projectId: STREAMPIXEL_PROJECT_ID,
        uploadId,
      },
    );
  } catch (err) {
    // Put it back at the front and try again next tick.
    console.error('Distribute failed, will retry:', err.response?.data || err.message);
    distributeQueue.unshift(uploadId);
    lastDistributeAt = 0;
  }
}, 5_000);
Push uploadId onto distributeQueue from the webhook handler instead of calling distribute inline.
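The handler change might look like this sketch — handleApproved is an illustrative name, not part of any API, and distributeQueue is the same array the cooldown loop above drains:

```javascript
// Sketch: enqueue approved builds instead of calling distribute inline.
const distributeQueue = [];
const distributed = new Set(); // same idempotency guard as before

function handleApproved(uploadId) {
  if (!uploadId || distributed.has(uploadId)) return false; // skip dupes
  distributed.add(uploadId);
  distributeQueue.push(uploadId);
  return true;
}

// In the webhook handler:
//   if (event === 'build.approved') handleApproved(data.uploadId);
```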

Tips

Tag your S3 objects with the commit SHA. If a build misbehaves in production, you can run aws s3api get-object-tagging against the object to identify the exact source revision without scanning logs.
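For example, a step you could append to the workflow after the S3 upload (the tag key name commit is arbitrary):

```yaml
      - name: Tag S3 object with commit SHA
        shell: bash
        run: |
          aws s3api put-object-tagging \
            --bucket "${{ secrets.S3_BUCKET }}" \
            --key "builds/$ARCHIVE_NAME" \
            --tagging 'TagSet=[{Key=commit,Value=${{ github.sha }}}]'
```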
Use a single Slack channel as your “build feed.” Wire each webhook event into a Slack incoming webhook. The signal — “build approved 12 minutes after push” — is the fastest way to spot regressions in package size or upload throughput.
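A sketch of how the listener could forward events — the message format and the formatSlackMessage helper are mine, and SLACK_WEBHOOK_URL is assumed to come from a Slack incoming-webhook integration:

```javascript
// Sketch: turn a Streampixel webhook event into a Slack incoming-webhook payload.
function formatSlackMessage(event, data = {}) {
  const icon = event === 'build.approved' ? ':white_check_mark:'
             : event === 'build.rejected' ? ':x:'
             : ':hourglass:';
  return { text: `${icon} ${event} (upload ${data.uploadId ?? 'n/a'})` };
}

// In the webhook handler, after logging the event:
//   await axios.post(process.env.SLACK_WEBHOOK_URL, formatSlackMessage(event, data));
```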
Keep autoRelease: false even if you don’t gate distribution on anything. It costs nothing, lets you intercept bad builds, and means manual dashboard distribution still works as an emergency rollback path.

Gotchas

Google Drive share links, Dropbox preview pages, and any URL that returns HTML before the file will fail upload. The endpoint streams the response body as a ZIP and gives up if the first bytes are not a valid archive header. S3 presigned URLs and public bucket URLs work; HTML wrappers do not. See file URL requirements.
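You can sanity-check a candidate URL yourself before handing it to the API: a ZIP starts with the bytes PK\x03\x04 (PK\x05\x06 for an empty archive). A sketch using Node 18+ global fetch; looksLikeZip and isZipHeader are illustrative helper names:

```javascript
// Check whether the first bytes of a buffer are a ZIP header.
function isZipHeader(buf) {
  return buf.length >= 4 && buf[0] === 0x50 && buf[1] === 0x4b &&
    ((buf[2] === 0x03 && buf[3] === 0x04) ||  // local file header
     (buf[2] === 0x05 && buf[3] === 0x06));   // empty archive
}

// Fetch only the first 4 bytes of a URL (S3 honors Range) and test them.
async function looksLikeZip(url) {
  const res = await fetch(url, { headers: { Range: 'bytes=0-3' } });
  const buf = Buffer.from(await res.arrayBuffer());
  return isZipHeader(buf.subarray(0, 4));
}
```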
During development, use ngrok or Cloudflare Tunnel:
ngrok http 8080
Register the https://...ngrok.app/webhooks/streampixel/<token> URL in the dashboard. Keep the tunnel up while you test — Streampixel does not retry failed deliveries.
Streampixel does not currently sign webhook payloads. Treat your webhook URL as a shared secret: include a long random token in the path, and reject requests at any other path. See Security hardening for more.
build.approved should fire once per upload, but make your handler idempotent anyway. The distributed Set above is fine for a single-process listener; use a database or Redis if you run multiple replicas.
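With multiple replicas the claim has to live in shared storage, and the core operation is an atomic set-if-absent (with Redis, SET ... NX). A sketch with the store injected so the logic is testable in memory — makeClaimer and memoryStore are illustrative names, and the node-redis call in the comment is async in real use:

```javascript
// Atomic "claim" so only one replica distributes a given uploadId.
// With node-redis this maps roughly to:
//   client.set(`distributed:${id}`, '1', { NX: true, EX: 86400 })
function makeClaimer(store) {
  return (uploadId) => store.setIfAbsent(`distributed:${uploadId}`);
}

// In-memory stand-in, equivalent to the single-process Set above.
function memoryStore() {
  const keys = new Set();
  return {
    setIfAbsent(key) {
      if (keys.has(key)) return false; // someone already claimed it
      keys.add(key);
      return true;
    },
  };
}
```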
Streampixel will download builds up to ~30 GB. A 1-hour presigned URL is usually plenty, but if you ship very large builds during peak hours when ingestion is queued, bump it to 6 hours.
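A quick back-of-envelope for picking an expiry — the 15 MB/s throughput and the 3x slack factor are assumptions, not documented numbers:

```javascript
// Rough presign-expiry estimate: transfer time at an assumed throughput,
// multiplied by a slack factor to cover queued ingestion.
function presignExpirySeconds(sizeGB, throughputMBps = 15, slackFactor = 3) {
  const downloadSec = (sizeGB * 1024) / throughputMBps;
  return Math.ceil(downloadSec * slackFactor);
}

// 30 GB at 15 MB/s is ~2048 s (~34 min) of raw transfer; tripled for slack
// that is still well under the 6-hour ceiling suggested above.
```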

Next steps

Upload File API

Full request/response reference for the upload endpoint.

Webhooks

All seven webhook events and their payloads.

Distribute File API

Push an approved build to streaming servers.

Security hardening

Lock down API keys, webhooks, and embeds.