```
+--------------------------+
|    CAREER QUEST MODE     |
|       PRESS START        |
+--------------------------+
```
This guide walks you through getting BaoBuildBuddy running locally for the first time. Think of it as the tutorial level -- follow each checkpoint in order.
| I want to... | Go here |
|---|---|
| Understand the app in simple terms | ELI5 System Walkthrough |
| Set up local AI with Ollama | Local AI Setup Guide |
| Run the full proof pass after setup | Verification Runbook |
| Do the full first-time local setup | Keep reading below |
| Read the full technical reference | README.md |
BaoBuildBuddy is a monorepo with two runtime services:
- API server in `packages/server` (Bun + Elysia, port 3000)
- Frontend app in `packages/client` (Nuxt SSR, port 3001)
One command starts both:
```bash
bun run dev
```

This runs `scripts/dev-stack.ts` to orchestrate startup. It's the recommended path.
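The orchestration pattern `scripts/dev-stack.ts` implements can be sketched in shell: start both dev commands concurrently, and tear both down together when the shell exits. This is an illustrative sketch, not the real script; the `start_stack` name and trap mechanics are assumptions.

```bash
# Hypothetical sketch of a two-service dev stack: run both commands
# concurrently and kill both when the shell exits.
start_stack() {
  sh -c "$1" & a=$!
  sh -c "$2" & b=$!
  trap 'kill "$a" "$b" 2>/dev/null' EXIT
  wait "$a" "$b"
}

# Usage with the commands from this guide:
# start_stack "bun run dev:server" "bun run dev:client"
```

The real script adds niceties like prefixed log output, but the lifecycle idea is the same: one process supervises both services.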
| Tool | macOS (Homebrew) | Ubuntu / Debian | Windows (winget) |
|---|---|---|---|
| Bun (`bun@1.3.11`) | `brew install oven-sh/bun/bun` | `curl -fsSL https://bun.sh/install \| bash` | `winget install --id Oven-sh.Bun -e` |
| Git | `brew install git` | `sudo apt-get update && sudo apt-get install -y git` | `winget install --id Git.Git -e` |
| Rust (for desktop builds) | `brew install rustup-init && rustup-init` | `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs \| sh` | `winget install --id Rustlang.Rustup -e` |
Playwright bundles its own Chromium -- no separate Chrome install needed.
- `curl` and `jq` for diagnostics
- At least one AI provider API key (HuggingFace, OpenAI, Gemini, or Claude)
- Ollama for local AI: Download | Quickstart
HuggingFace free tier now requires an API token. Create one at https://huggingface.co/settings/tokens.
```bash
bun --version
git --version
rustc --version  # only needed for desktop builds
```

Check the Bun version matches the workspace manifest:
```bash
bun pm pkg get packageManager
# -> "bun@1.3.11"
```

```bash
git clone https://github.com/d4551/baobuildbuddy.git
cd baobuildbuddy
```

Already have it? Just pull the latest:

```bash
git pull --ff-only
```

macOS / Linux:
```bash
bash scripts/setup.sh
```

Windows (PowerShell):

```powershell
powershell -ExecutionPolicy Bypass -File scripts\setup.ps1
```

The setup script handles:
- Checking required tools
- Validating Bun version against the workspace manifest
- Installing workspace dependencies
- Installing Playwright Chromium (unless skipped)
- Creating `.env` from `.env.example`
- Generating and pushing DB schema
- Running `typecheck`, `lint`, and `test` (unless skipped)
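The Bun-version validation step boils down to a string comparison against the `packageManager` pin. A minimal sketch (the `bun_version_ok` function name is hypothetical, not part of the repo):

```bash
# Hypothetical check: does the installed Bun match the workspace pin?
bun_version_ok() {
  installed="$1"   # e.g. "$(bun --version)" -> "1.3.11"
  pin="$2"         # e.g. "bun@1.3.11" from the packageManager field
  [ "bun@$installed" = "$pin" ]
}

# Usage:
# bun_version_ok "$(bun --version)" "bun@1.3.11" || echo "wrong Bun version"
```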
Setup flags:
| Flag | Bash | PowerShell | Effect |
|---|---|---|---|
| Skip checks | `--skip-checks` | `-SkipChecks` | Skip validation after setup |
| Skip browser install | `--skip-browser-install` | `-SkipBrowserInstall` | Skip Playwright Chromium |
| Include build | `--include-build` | `-IncludeBuild` | Run `bun run build` after setup |
| Include desktop build | `--include-desktop-build` | `-IncludeDesktopBuild` | Run Tauri desktop build |
| Help | `--help` | `-Help` | Print usage |
After a successful run, you should see:
```
✅ Bun workspace install complete
✅ Playwright Chromium installed for automation scripts
✅ SQLite schema generated and ready for local runs
✅ Initial validation checks pass
```
```bash
bun install
bun run automation:browsers:install
cp .env.example .env   # Windows: copy .env.example .env
bun run db:generate
bun run db:push
```

Edit `.env` with these minimum values:
```ini
PORT=3000
DB_PATH=~/.bao/bao.db
NUXT_PUBLIC_API_BASE=/
NUXT_PUBLIC_WS_BASE=/
NUXT_PUBLIC_I18N_DEFAULT_LOCALE=en-US
NUXT_PUBLIC_I18N_FALLBACK_LOCALE=en-US
NUXT_PUBLIC_I18N_LOCALE_COOKIE_KEY=bao-locale
```
**Important:** Do NOT set `NUXT_PUBLIC_I18N_SUPPORTED_LOCALES` in `.env`. Nuxt replaces the parsed array with a raw string, breaking the i18n plugin. The default is handled in `nuxt.config.ts`.
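A quick way to confirm your `.env` carries the minimum keys listed above is a small grep loop. The `require_env_keys` helper is a hypothetical sketch, not something shipped in the repo:

```bash
# Hypothetical helper: report any required key missing from an env file.
require_env_keys() {
  file="$1"; shift
  status=0
  for key in "$@"; do
    grep -q "^${key}=" "$file" || { echo "missing: $key"; status=1; }
  done
  return $status
}

# Usage:
# require_env_keys .env PORT DB_PATH NUXT_PUBLIC_API_BASE NUXT_PUBLIC_WS_BASE
```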
When you're ready, add:
- `BAO_DISABLE_AUTH=true` to skip auth gating in local dev, or
- `BAO_AUTH_SETUP_TOKEN=your-operator-token` if you want auth enabled during first-run setup
- `LOCAL_MODEL_ENDPOINT=http://localhost:11434/v1` for local AI (leave `LOCAL_MODEL_NAME` blank for auto-detect)
- `OPENAI_API_KEY`, `GEMINI_API_KEY`, `CLAUDE_API_KEY`, `HUGGINGFACE_TOKEN` as needed
- `AUTOMATION_STDIO_BUFFER_LIMIT=2000` for large scraper outputs
AI provider keys can also be set in the UI via Settings > AI Providers.
```bash
bun run dev
```

Or run the two services in separate terminals.

Terminal 1:

```bash
bun run dev:server
```

Terminal 2:

```bash
bun run dev:client
```

From a terminal:
```bash
curl -fsS http://localhost:3000/api/health
curl -fsS http://localhost:3000/api/auth/status
curl -fsS "http://localhost:3000/api/jobs?limit=1"
```

Then open http://localhost:3001 in your browser and confirm:
- Home page loads without errors.
- Settings page is reachable.
- An API-backed feature returns data (jobs or resumes).
- Dashboard quick actions reflect your pipeline status.
- Browser dev tools show no hard errors.
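The curl checks above assume the server is already listening; in scripts it helps to retry until it is. A minimal sketch (the `wait_for` name and the attempt count are assumptions):

```bash
# Hypothetical retry wrapper: run a probe command until it succeeds
# or the attempt budget runs out, sleeping 1s between tries.
wait_for() {
  tries="$1"; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@" >/dev/null 2>&1; then echo "ready"; return 0; fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out"; return 1
}

# Usage:
# wait_for 30 curl -fsS http://localhost:3000/api/health
```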
- Open Settings.
- Configure AI: local model endpoint (see Local AI Setup) or a provider API key.
- Save settings.
- Open Resume and create your first resume.
- Open Jobs and run a search to confirm the ingestion pipeline.
- Open AI Chat and send a test message.
- Open Automation > Job Apply and test a non-sensitive sample flow.
```bash
bun run typecheck
bun run lint
bun run test
bun run build
```

For the full validation sequence, see README.md > Validation & Quality Gates.
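If you want one command that runs these gates in order and stops at the first failure, a tiny wrapper works. The `run_gates` function is a hypothetical sketch; the gate commands are the ones above:

```bash
# Hypothetical sketch: run each quality gate, bail on the first failure.
run_gates() {
  for gate in "$@"; do
    echo "==> $gate"
    sh -c "$gate" || { echo "FAILED: $gate"; return 1; }
  done
  echo "all gates passed"
}

# Usage:
# run_gates "bun run typecheck" "bun run lint" "bun run test" "bun run build"
```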
For the complete post-setup proof flow, including page screenshots, export checks, runtime verification, and desktop artifact verification, use VERIFICATION_RUNBOOK.md.
If port 3001 is already in use, run page verification against an alternate port:
```bash
PORT=4105 bun run --cwd packages/client preview
VERIFY_HOST=127.0.0.1 VERIFY_PORT=4105 bun run verify:pages
```

If you want a desktop window instead of a browser tab, see README.md > Desktop Packaging.
Quick start:

```bash
bun run dev:desktop
```

Build an installer:

```bash
bun run build:desktop
```

Requires the Rust toolchain (`rustc` + `cargo`).
| Problem | Fix |
|---|---|
| Server starts but UI can't connect | Check `NUXT_PUBLIC_API_BASE` / `NUXT_PUBLIC_WS_BASE` |
| Port conflict | Change `PORT` in `.env` |
| Playwright browser missing | Run `bun run automation:browsers:install` |
| `curl` health checks fail | Confirm the server logs `Listening on ...` and no startup errors |
| RPA automation unavailable | Install Playwright Chromium and retry |
| Locales missing or duplicated | Verify `.env` locale keys match `packages/client/locales` |
For more, see README.md > Troubleshooting.
| Topic | Guide |
|---|---|
| Understand the system architecture | ELI5 System Walkthrough |
| Set up local AI with Ollama | Local AI Setup Guide |
| Learn the automation flows | Automation Guide |
| Deep architecture reference | README.md |
- Core pages use tokenized layout primitives (`PageScaffold`, `PageHeaderBlock`, `SectionGrid`) from `packages/client/constants/ui-layout.ts`.
- Modal flows use `AppModalFrame` with `aria-modal` + `aria-labelledby`.
- Cross-page CTAs come from `packages/client/constants/flow-engine.ts` via `useFlowEngine.ts`.
- Automation pages follow the same tokenized layout and are enforced by `validate:ui-layout-tokens`.
- AI provider display is locale-driven (`aiProviderCatalog.*`) with `AIProviderIcon`.
- Interview role recommendations are derived from profile, readiness rankings, pathway scores, and live job signals.
- Skills readiness copy is locale-driven from typed server IDs (`feedbackId`, `improvementSuggestions`, `nextSteps`).