BallScope is an AI-assisted multi-camera football tracking project. It captures live video from two cameras, runs YOLO-based detection, selects the best camera view, and serves preview + recording controls through a web interface.
- School project (Grade 9, upper secondary / Sek II)
- Region: Wasseramt Ost
- Authors: Rafael Reverberi, Benjamin Flury
- Run two live camera feeds in parallel.
- Track the ball (or any configured class) with YOLO.
- Auto-switch to the camera with the better detection.
- Apply smoothed crop/zoom on the selected camera.
- Stream live MJPEG preview in the browser.
- Record processed output and raw camera streams.
- Analyze uploaded videos in the web UI.
- Run post-analysis on separate left/right camera videos in the web UI.
- Preview and tune dual-camera stitching in the Analysis workspace, then save the stitch values for the current browser session.
- Tune both cameras in a dedicated Camera Settings workspace, keep those settings for the current session, and save named camera presets for later reuse.
Supported:
- Apple Silicon macOS (`arm64`, M1/M2/M3/M4)
- NVIDIA Jetson (Linux `aarch64`, e.g. Orin Nano)

Not supported:
- Intel macOS (`x86_64`)
```sh
git clone https://github.com/rafaelreverberi/ballscope
cd ballscope
./setup.sh
```

Jetson note:

```sh
sudo ./setup.sh
```

(`setup.sh` normalizes repository ownership back to your user at the end when run via `sudo`.)

```sh
source .venv/bin/activate
python main.py
```

Or from project root:

```sh
./start.sh
```

Jetson note (if device permissions require it):

```sh
sudo ./start.sh
```

Open:
- Local: http://localhost:8000
- Jetson on LAN: http://<jetson-ip>:8000

```sh
git clone https://github.com/rafaelreverberi/ballscope && cd ballscope
./setup.sh
./start.sh
```

Dependencies are split by role:
- `requirements.txt`: shared Python dependencies
- `requirements-mac-apple-silicon.txt`: shared + Apple Silicon PyTorch (MPS)
- `requirements-jetson.txt`: shared dependencies for Jetson (installed after PyTorch is ready)
setup.sh does all of this:
- Detects platform (Apple Silicon Mac vs Jetson)
- Creates/uses `.venv`
- Installs platform-specific dependencies
  - macOS: Homebrew packages including `ffmpeg`, `gstreamer`, common GStreamer plugins, `node`, and `uvcc` (used for recording/audio support and BRIO camera controls)
- Downloads model files (`models/*.pt`, including `models/ballscope-ai.pt`) from Hugging Face: https://huggingface.co/RafaelReverberi/ballscope-assets/tree/main/models
- On Jetson: ensures PyTorch CUDA is installed before other Python packages
- Installs both YOLO and RF-DETR analysis dependencies into the local `.venv`
- At the end: optional autostart setup (`systemd` on Jetson, `launchd` on macOS)
- If run with `sudo`: resets repository ownership back to the invoking user
- Verifies key imports and runtime device resolution
- Writes install logs to `logs/setup_*.log`
On Jetson, setup.sh asks how to provide PyTorch first:
- Manual mode: setup stops after creating `.venv` so you can install CUDA PyTorch manually.
- Hugging Face wheel mode: downloads `.whl` files from https://huggingface.co/RafaelReverberi/ballscope-jetson-wheels/tree/main/wheels into a local `wheels/` directory and installs them.
- Preinstalled mode: verifies that the existing PyTorch in `.venv` has CUDA enabled and continues.
At the end of setup.sh, you can enable autostart:
- Jetson: installs/overwrites `ballscope.service` (systemd) and enables it; also installs a `sudoers` rule for BallScope's power-control API endpoints (reboot/shutdown)
- macOS: installs/overwrites `com.ballscope.start.plist` (launchd)
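For orientation, a minimal systemd unit of this shape could drive the Jetson autostart. All paths, the user name, and the unit contents below are assumptions for illustration; the actual unit is whatever setup.sh installs:

```ini
# Hypothetical ballscope.service sketch -- see the unit installed by setup.sh
# for the real paths and options.
[Unit]
Description=BallScope camera tracker
After=network.target

[Service]
WorkingDirectory=/home/jetson/ballscope
ExecStart=/home/jetson/ballscope/.venv/bin/python main.py
Restart=on-failure
User=jetson

[Install]
WantedBy=multi-user.target
```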
Default is `BALLSCOPE_AI_DEVICE=auto`. When `auto` is used, BallScope resolves the device as follows:
- Jetson: prefer `cuda:0`
- Apple Silicon Mac: prefer `mps`
- Fallback: `cpu`
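The resolution order above can be sketched as a small helper. This is an illustrative sketch only; `resolve_device` and its arguments are invented names, not the actual helpers in `ballscope/runtime_device.py`:

```python
import os

def resolve_device(has_cuda: bool, has_mps: bool) -> str:
    """Pick an AI device following the precedence described above."""
    requested = os.environ.get("BALLSCOPE_AI_DEVICE", "auto")
    if requested != "auto":
        return requested      # explicit override always wins
    if has_cuda:
        return "cuda:0"       # Jetson: prefer CUDA
    if has_mps:
        return "mps"          # Apple Silicon Mac: prefer Metal (MPS)
    return "cpu"              # last-resort fallback
```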
You can override manually:

```sh
export BALLSCOPE_AI_DEVICE=cpu
python main.py
```

Default camera sources:
- Jetson: `/dev/video0`, `/dev/video2`
- Mac: `0`, `1`
In the Camera Settings workspace, you can change source values and save BRIO camera controls for the current app session. You can also store the full left/right camera setup as a named preset and reload it later with one click. Session settings and saved presets are reused by recording and live previews.
- `main.py`: app entrypoint
- `ballscope/web/app.py`: FastAPI app + frontend UI
- `ballscope/camera/`: camera workers / capture loop
- `ballscope/ai/`: detection + camera switching logic
- `ballscope/recording/`: recording pipelines
- `ballscope/runtime_device.py`: platform/device resolution helpers
- `docs/setup.md`: full setup instructions
- `docs/jetson_notes.md`: Jetson-specific notes
- `docs/architecture.md`: architecture and data flow
- `docs/architecture_video_analysis_2026-04-17.md`: detailed redesign plan for the dual-camera offline analysis pipeline
- Open `Camera Settings` from the home screen to tune the left and right cameras side by side.
- The page shows a live result preview at the top and both raw camera previews below it.
- BRIO-relevant controls are exposed on both macOS and Jetson, including HDR/backlight, auto exposure, manual exposure, white balance, focus, zoom, pan, and tilt.
- On macOS, advanced controls use `uvcc`; on Jetson, they use `v4l2-ctl`.
- Named camera presets save the full current left/right camera configuration, including source, quality preset, and manual control values.
- Saved camera presets are stored in `camera_presets.json` by default; override the path with `BALLSCOPE_CAMERA_PRESET_FILE`.
- In the Camera Settings page, you can enable automatic loading of the last used preset on app startup.
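The preset schema is not documented here; as a rough idea, a `camera_presets.json` entry covering the fields mentioned above (source, quality preset, manual control values) might look like the hypothetical sketch below. All key names and values are invented for illustration:

```json
{
  "match-day": {
    "left":  { "source": "/dev/video0", "quality": "1080p30",
               "controls": { "focus_auto": 0, "focus_absolute": 30 } },
    "right": { "source": "/dev/video2", "quality": "1080p30",
               "controls": { "focus_auto": 0, "focus_absolute": 28 } }
  }
}
```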
- If setup fails, inspect the latest installer log in `logs/`.
- If a camera cannot be opened on Mac, use numeric sources (`0`, `1`) instead of `/dev/videoX`.
- If Jetson reports no CUDA, verify that your Torch wheel matches your JetPack/L4T version.
- If Jetson shows NumPy ABI warnings at startup, rerun `./setup.sh` so Jetson dependencies are re-resolved with `numpy<2`.

- The analysis page now always uses the high-quality dual-camera offline pipeline by default.
- The `Analysis` page accepts separate `Left Camera` and `Right Camera` uploads.
- The `Set Up Stitching` action opens a stitched preview player for the selected left/right uploads so overlap, blend width, and vertical crop alignment can be tuned before starting analysis.
- The default stitch uses a wider feathered seam so players crossing the midfield overlap do not hit a hard 1-camera cut.
- Saved stitching values stay active for the current browser session and are sent into the same dual-camera analysis pipeline on both Apple Silicon macOS and Jetson.
- Dual-source analysis advances each upload by playback time, not by matching decoded frame numbers, so variable-frame-rate left/right files do not drift apart during longer analysis runs.
- The Stitching modal also exposes a manual `Right Delay (sec)` control. Positive values mean the right clip starts later and should be shifted forward to match the left clip.
- Final analysis export trims the chosen audio source to the same effective start as the rendered video instead of remuxing unaligned audio from the raw upload.
- Analysis can optionally be limited to the first N minutes of the uploaded files for fast debugging on long recordings.
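The feathered seam mentioned above can be illustrated with a one-row sketch: a linear alpha ramp across the overlap, so the left camera fades out as the right camera fades in. This is a minimal illustration; BallScope's actual blend weights, seam width, and per-channel handling may differ:

```python
def feather_blend(left_strip, right_strip):
    """Blend two equal-length rows of overlap pixels with a linear ramp:
    full weight for the left image at the seam's left edge, zero at the
    right edge, so there is no hard one-camera cut in between."""
    w = len(left_strip)
    blended = []
    for i, (l, r) in enumerate(zip(left_strip, right_strip)):
        alpha = 1.0 - i / (w - 1)            # 1.0 -> 0.0 across the overlap
        blended.append(alpha * l + (1.0 - alpha) * r)
    return blended
```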
- BallScope detects whether a model is a YOLO checkpoint or an RF-DETR checkpoint and uses the matching runtime automatically.
- `models/ballscope-ai.pt` is detected as RF-DETR and is the default analysis model.
- Dual-camera analysis is per-camera first: left/right detections stay separate, are fused in master-canvas space, and the final broadcast crop is rendered from the master canvas.
- Full-frame global reacquire scans keep the original frame resolution so small-ball detectability is not destroyed by downscaling.
- Short ball losses are handled with bounded prediction and phased camera behavior (`TRACKED`, `HOLD_SHORT`, `LOST_SHORT`, `LOST_LONG`, `UNKNOWN`) so the view widens gradually instead of snapping away.
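The phased behavior can be pictured as a mapping from time-since-last-detection to the phases named above. The thresholds below are invented for this sketch, not BallScope's actual tuning:

```python
def camera_phase(frames_lost: int, fps: int = 30) -> str:
    """Map frames since the last ball detection to a tracking phase.
    Thresholds are illustrative assumptions."""
    if frames_lost == 0:
        return "TRACKED"                 # fresh detection this frame
    if frames_lost <= int(0.3 * fps):
        return "HOLD_SHORT"              # hold the crop, bounded prediction
    if frames_lost <= int(1.5 * fps):
        return "LOST_SHORT"              # begin widening the view
    if frames_lost <= int(6.0 * fps):
        return "LOST_LONG"               # full-frame global reacquire scans
    return "UNKNOWN"                     # widest view, no prediction
```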
MIT License (LICENSE)