Quickstart

This workflow proves out the local ReplayLab MVP without network access, real provider packages, or API keys. The app in examples/dogfood_mvp imports local fake OpenAI-style and requests-style modules, and ReplayLab captures and replays those calls through the same provider adapter path used by real applications.
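The fake modules themselves are not shown in this quickstart. As a rough sketch of the idea, an OpenAI-style fake might look like the following (all names here are hypothetical, not the actual examples/dogfood_mvp code):

```python
# Hypothetical sketch of a fake OpenAI-style client, standing in for the real
# examples/dogfood_mvp modules. It mimics the chat-completions call shape
# without any network access, so capture works fully offline.

class FakeChatCompletions:
    def create(self, model, messages):
        # Return a canned, OpenAI-shaped response echoing the last user message.
        last = messages[-1]["content"]
        return {
            "model": model,
            "choices": [
                {"message": {"role": "assistant", "content": f"echo: {last}"}}
            ],
        }

class _Chat:
    def __init__(self):
        self.completions = FakeChatCompletions()

class FakeOpenAI:
    def __init__(self):
        self.chat = _Chat()

client = FakeOpenAI()
resp = client.chat.completions.create(
    model="fake-model",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp["choices"][0]["message"]["content"])  # echo: hello
```

Because the fake exposes the same call path as a real client, ReplayLab's openai integration can intercept it the same way it would a real provider call.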

1. Capture

uv run replaylab run \
  --project-name dogfood-mvp \
  --auto-patch-integrations openai,requests \
  --capture-payload-policy full \
  -- python examples/dogfood_mvp/app.py

This writes a wrapper capsule and a separate child provider capsule under .replaylab/capsules/. The child capsule contains the OpenAI and HTTP boundaries.

2. Find The Provider Capsule

uv run replaylab capsule list --local-store-root .replaylab

Pick the capsule whose integrations include openai, requests, and auto_patch. The wrapper capsule usually has no provider boundaries and is not the right replay target.

3. Inspect

uv run replaylab capsule inspect <child_capsule_id> --local-store-root .replaylab

Inspection prints run status, boundary counts, providers, integrations, and payload counts without printing payload file contents.

4. Replay

uv run replaylab replay <child_capsule_id> \
  --local-store-root .replaylab \
  --auto-patch-integrations openai,requests \
  --report-id replay_dogfood_mvp \
  -- python examples/dogfood_mvp/app.py

Replay serves recorded provider responses from the capsule. The fake provider modules are still imported, but live provider calls are not made when replay matches the capsule.

5. Inspect And Compare The Report

uv run replaylab report inspect .replaylab/replays/replay_dogfood_mvp/report.json
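The report is plain JSON, so you can also skim it directly. A small sketch that prints the top-level keys without assuming any particular schema (the helper name is ours, not part of ReplayLab):

```python
import json

def summarize_report(path):
    """Print the top-level keys of a replay report without assuming its schema."""
    with open(path) as fh:
        report = json.load(fh)
    for key, value in sorted(report.items()):
        print(f"{key}: {type(value).__name__}")
    return report

# Usage with the path from this quickstart:
# summarize_report(".replaylab/replays/replay_dogfood_mvp/report.json")
```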

uv run replaylab report compare \
  <child_capsule_id> \
  .replaylab/replays/replay_dogfood_mvp/report.json \
  --local-store-root .replaylab

report compare exits 0 only when every expected boundary was replayed and there were no blocked, mismatched, extra, missing, or payload-unavailable results.
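Because the exit code is the contract, report compare slots directly into CI or other automation. A sketch that builds the comparison command and gates on its exit status (the flag values are copied from this quickstart; the helper names are ours):

```python
import subprocess

def compare_command(capsule_id, report_path, store_root=".replaylab"):
    # Mirrors the quickstart invocation of `replaylab report compare`.
    return [
        "uv", "run", "replaylab", "report", "compare",
        capsule_id, report_path,
        "--local-store-root", store_root,
    ]

def replay_is_clean(capsule_id, report_path):
    # Exit code 0 means every expected boundary replayed with no blocked,
    # mismatched, extra, missing, or payload-unavailable results.
    proc = subprocess.run(compare_command(capsule_id, report_path))
    return proc.returncode == 0
```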

6. Generate A Regression Test

uv run replaylab generate-test <child_capsule_id> \
  --output tests/regression/test_dogfood_replay.py \
  --fixture-root tests/fixtures/replaylab/capsules \
  --app-root . \
  --auto-patch-integrations openai,requests \
  -- python examples/dogfood_mvp/app.py

The generator copies the capsule fixture and writes a deterministic pytest test that calls replaylab replay.

7. Run The Generated Test

uv run pytest tests/regression/test_dogfood_replay.py

The generated test should pass without network access. Do not check generated local replay output under .replaylab/ into source control.