4 changes: 3 additions & 1 deletion .gitignore
@@ -63,4 +63,6 @@ coverage.xml

# Internal planning docs
specs/
.env

# Scraped data
scraping/*.csv
273 changes: 101 additions & 172 deletions README.md
@@ -1,268 +1,197 @@
# SpareRoom Chat Assistant
# London Flat Finder

A chat interface for finding rooms to rent in London. Uses OpenAI for conversational AI, Redis for vector search (RAG), and Supabase for authentication and data persistence.
A chat-based interface for finding rooms to rent in London. Uses OpenAI for conversational AI, Supabase Edge Functions for backend logic, and pgvector for semantic search.

## Architecture

- **Frontend**: React + TypeScript + Material UI
- **Backend**: FastAPI + Python (stateless API)
- **Backend**: Supabase Edge Functions (Deno)
- **Auth**: Supabase Auth (magic links)
- **Database**: Supabase PostgreSQL with RLS
- **Vector Search**: Redis with OpenAI embeddings
- **Database**: Supabase PostgreSQL with RLS + pgvector
- **AI**: OpenAI (chat, embeddings, vision-based scoring)

## Prerequisites

- Python 3.12+
- Node.js 18+
- [uv](https://github.com/astral-sh/uv) (Python package manager)
- [Supabase CLI](https://supabase.com/docs/guides/cli)
- Docker (for local Supabase)
- OpenAI API key
- Redis credentials (for vector search)

## Quick Start

### 1. Start Supabase (Local Development)

```bash
# Start local Supabase (requires Docker)
supabase start

# This will output your local credentials:
# Outputs local credentials:
# - API URL: http://127.0.0.1:54321
# - anon key: eyJ...
# - Inbucket (email testing): http://127.0.0.1:54324
```

### 2. Backend Setup
### 2. Set Up Edge Functions

```bash
cd backend

# Create .env file
cp .env.example .env
# Edit .env with your credentials:
# - OPENAI_API_KEY=sk-...
# - REDIS_HOST, REDIS_PORT, REDIS_PASSWORD

# Install dependencies and run
uv sync
uv run uvicorn main:app --reload --port 8000
supabase secrets set OPENAI_API_KEY=sk-...
supabase functions serve
```
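Each Edge Function is served as an HTTP handler. A minimal sketch of that shape (illustrative only: `buildChatResponse` is a made-up helper, and the real `chat` function does far more than echo; in deployment the handler is passed to `Deno.serve`):

```typescript
// Illustrative handler shape for a Supabase Edge Function.
// A real chat function would call OpenAI here; this just echoes the message.
function buildChatResponse(body: { message: string }): Response {
  return new Response(JSON.stringify({ echo: body.message }), {
    headers: { "Content-Type": "application/json" },
  });
}

async function handler(req: Request): Promise<Response> {
  // Parse the JSON request body and build the response.
  return buildChatResponse(await req.json());
}
```

Locally, `supabase functions serve` exposes each handler under `http://127.0.0.1:54321/functions/v1/<name>`.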

Backend runs at http://localhost:8000

### 3. Frontend Setup

```bash
cd frontend

# Create .env.local for local development
cat > .env.local << 'EOF'
VITE_SUPABASE_URL=http://127.0.0.1:54321
VITE_SUPABASE_ANON_KEY=<your-anon-key-from-supabase-start>
VITE_BACKEND_URL=http://localhost:8000
VITE_SUPABASE_ANON_KEY=<your-anon-key>
VITE_APP_URL=http://localhost:5173
EOF

# Install dependencies and run
npm install
npm run dev
```

Frontend runs at http://localhost:5173
### 4. Magic Link Emails

### 4. View Magic Link Emails

For local development, emails are captured by Inbucket:
- Open http://127.0.0.1:54324
- Find your magic link email and click to authenticate
For local development, emails are captured by Inbucket at http://127.0.0.1:54324

## Database Schema

The schema is defined in `supabase/schema.sql`. Tables:
Simplified schema with 4 core tables (defined in `supabase/schema.sql`):

| Table | Purpose |
|-------|---------|
| `profiles` | User profiles (auto-created on signup) |
| `conversations` | Chat conversations per user |
| `messages` | Messages within conversations |
| `user_rules` | Extracted search filters per conversation |
| `saved_listings` | Shortlisted/blacklisted listings |

All tables have Row Level Security (RLS) - users can only access their own data.
| `profiles` | User profiles (auto-created on signup via trigger) |
| `user_listings` | User's property ratings - likes/dislikes with optional cached listing data |
| `user_journey_state` | Onboarding progress tracking |
| `email_alerts` | Email notification configuration |

## Database Commands
The `listings` table (created via migration) stores property data with pgvector embeddings for semantic search.
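pgvector ranks rows by embedding distance in SQL (e.g. the cosine-distance operator). As an illustration of what that ranking computes (not the actual query; the `cosineSimilarity` helper and the inline vectors are hypothetical):

```typescript
// Cosine similarity between two embedding vectors: higher means more similar.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank hypothetical listing embeddings against a query embedding,
// most similar first - conceptually what the pgvector search does.
const query = [0.9, 0.1, 0.0];
const listings = [
  { id: "a", embedding: [0.8, 0.2, 0.1] },
  { id: "b", embedding: [0.0, 0.1, 0.9] },
];
const ranked = [...listings].sort(
  (x, y) =>
    cosineSimilarity(y.embedding, query) - cosineSimilarity(x.embedding, query)
);
```

In production the comparison runs inside PostgreSQL over OpenAI embeddings, so listings never have to be pulled into the client for scoring.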

All database commands are run from the root directory:

| Command | Description |
|---------|-------------|
| `npm run db:start` | Start local Supabase |
| `npm run db:stop` | Stop local Supabase |
| `npm run db:status` | Show Supabase status and credentials |
| `npm run db:reset` | Reset database and apply migrations |
| `npm run db:diff -- <name>` | Generate migration from schema changes |
| `npm run db:types` | Regenerate TypeScript types from database |
| `npm run db:studio` | Open Supabase Studio in browser |

## Database Migrations

Schema changes follow this workflow:

```bash
# 1. Edit the source of truth
vim supabase/schema.sql
### Design Principles

# 2. Reset local DB and apply schema
npm run db:reset
- **No conversation storage**: Chat messages are ephemeral (kept in React state only). This reduces complexity and storage costs.
- **Unified feedback**: Single `user_listings` table replaces separate shortlist/blacklist and like/dislike tables.
- **RLS everywhere**: All tables have Row Level Security - users can only access their own data.

# 3. Generate migration from diff
npm run db:diff -- my_migration_name

# 4. Review the generated migration
cat supabase/migrations/*_my_migration_name.sql

# 5. Regenerate TypeScript types
npm run db:types
```
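The "no conversation storage" principle means chat history can be modeled as a plain reducer over React state. A sketch of that idea (the `ChatMessage` shape and action names are illustrative, not the actual `useChatController` implementation):

```typescript
// Ephemeral chat state: nothing here is persisted to the database.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

type ChatAction =
  | { type: "append"; message: ChatMessage }
  | { type: "reset" }; // e.g. on sign-out or starting a fresh search

function chatReducer(state: ChatMessage[], action: ChatAction): ChatMessage[] {
  switch (action.type) {
    case "append":
      return [...state, action.message];
    case "reset":
      return []; // history is gone for good - by design
  }
}

// With React this would be: const [messages, dispatch] = useReducer(chatReducer, []);
let messages: ChatMessage[] = [];
messages = chatReducer(messages, {
  type: "append",
  message: { role: "user", content: "Looking for a room under £700" },
});
```

Refreshing the page therefore clears the conversation, while extracted preferences survive because they are written to the database separately.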

## Updating TypeScript Types

When you change the database schema, regenerate the TypeScript types:
## Database Commands

```bash
npm run db:types
npm run db:start # Start local Supabase
npm run db:stop # Stop local Supabase
npm run db:status # Show status and credentials
npm run db:reset # Reset and apply migrations
npm run db:diff -- <name> # Generate migration from schema changes
npm run db:types # Regenerate TypeScript types
npm run db:push # Push migrations to remote
```

This updates `frontend/src/types/supabase.ts` with types matching your database tables. The app types in `frontend/src/types/index.ts` extend these generated types.

## API Endpoints

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/chat` | POST | Send message, get AI response + extracted rules |
| `/api/find-matches` | POST | RAG pipeline: search + filter + rerank listings |
| `/health` | GET | Health check |

## Project Structure

```
├── backend/
│   ├── main.py                     # FastAPI app (typed)
│   └── clients/
│       ├── openai_client.py        # OpenAI embeddings, chat, vision
│       └── redis_client.py         # Vector search
├── frontend/
│   ├── src/
│   │   ├── App.tsx                 # Main app component
│   │   ├── auth/
│   │   │   └── AuthContext.tsx     # Supabase auth provider
│   │   ├── api/
│   │   │   ├── functions.ts        # Edge Function calls (chat, match-stream)
│   │   │   ├── browse.ts           # Browse mode streaming
│   │   │   └── journey.ts          # User journey state API
│   │   ├── components/
│   │   │   ├── ChatPanel.tsx
│   │   │   ├── ListingsPanel.tsx
│   │   │   └── RulesPanel.tsx
│   │   │   ├── ChatPanel.tsx       # Chat interface
│   │   │   ├── ResultsModal.tsx    # Search results
│   │   │   ├── BrowseMode.tsx      # Infinite scroll browser
│   │   │   ├── AlertsModal.tsx     # Email alerts config
│   │   │   └── LoginModal.tsx      # Auth UI
│   │   ├── hooks/
│   │   │   ├── useConversation.ts
│   │   │   ├── useRules.ts
│   │   │   └── useSavedListings.ts
│   │   ├── lib/
│   │   │   └── supabase.ts         # Typed Supabase client
│   │   ├── pages/
│   │   │   └── Login.tsx
│   │   └── types/
│   │       ├── index.ts            # App types
│   │       └── supabase.ts         # Generated DB types
│   │   │   ├── useChatController.ts  # Chat orchestration (stateless)
│   │   │   ├── useUserListings.ts    # Unified listing feedback
│   │   │   ├── useBrowse.ts          # Browse mode state
│   │   │   ├── useEmailAlerts.ts     # Alert configuration
│   │   │   └── useJourneyState.ts    # Onboarding tracking
│   │   ├── types/
│   │   │   ├── index.ts            # App types
│   │   │   └── supabase.ts         # Generated DB types
│   │   └── utils/
│   │       ├── sse.ts              # SSE stream parsing
│   │       └── rent.ts             # Rent normalization
│   └── package.json
├── supabase/
│   ├── config.toml                 # Supabase local config
│   ├── schema.sql                  # Source of truth for DB schema
│   └── migrations/                 # Generated migrations
└── package.json                    # Root scripts for DB management
```
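`utils/rent.ts` suggests rents are normalized to a common period before comparison. A minimal sketch of that idea (the function name and the per-week-to-per-month convention are assumptions, not the file's actual contents):

```typescript
// Normalize a rent figure to per-calendar-month (pcm).
// UK listings often quote per week (pw); 52 weeks / 12 months converts it.
function toMonthlyRent(amount: number, period: "pw" | "pcm"): number {
  if (period === "pcm") return amount;
  return Math.round((amount * 52) / 12);
}

// "£175 pw" and "£758 pcm" describe roughly the same rent.
const monthly = toMonthlyRent(175, "pw");
```

Normalizing like this lets a "room under £700" rule filter listings regardless of how each advert quotes its price.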

## Environment Variables

### Backend (.env)

```
OPENAI_API_KEY=sk-...
REDIS_HOST=...
REDIS_PORT=...
REDIS_PASSWORD=...
│   ├── functions/
│   │   ├── chat/                   # Chat endpoint
│   │   ├── match-stream/           # SSE streaming matches
│   │   └── send-email-alerts/      # Scheduled notifications
│   ├── schema.sql                  # Source of truth
│   └── migrations/
└── scraping/                       # Data ingestion scripts
```

### Frontend (.env.local)
## Edge Functions

```
VITE_SUPABASE_URL=http://127.0.0.1:54321
VITE_SUPABASE_ANON_KEY=eyJ...
VITE_BACKEND_URL=http://localhost:8000
VITE_APP_URL=http://localhost:5173
```
| Function | Description |
|----------|-------------|
| `chat` | Process messages, extract preferences, update journey state |
| `match-stream` | SSE streaming of scored property matches |
| `send-email-alerts` | Scheduled email notifications |
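`match-stream` responds with Server-Sent Events, which the client decodes (see `utils/sse.ts`). A simplified parser for the common case of newline-delimited `data:` frames carrying JSON (a sketch, not the project's actual parser; real SSE also allows `event:`, `id:`, `retry:` fields and comments):

```typescript
// Parse one SSE chunk into JSON payloads, handling only `data:` lines.
// The "[DONE]" sentinel (an OpenAI-style convention, assumed here) is skipped.
function parseSseChunk(chunk: string): unknown[] {
  const events: unknown[] = [];
  for (const line of chunk.split("\n")) {
    if (line.startsWith("data:")) {
      const payload = line.slice("data:".length).trim();
      if (payload && payload !== "[DONE]") events.push(JSON.parse(payload));
    }
  }
  return events;
}

// Two hypothetical scored matches streamed in one chunk:
const chunk =
  'data: {"id":"l1","score":0.91}\n\ndata: {"id":"l2","score":0.74}\n\n';
const matches = parseSseChunk(chunk);
```

Streaming lets the UI render each scored listing as it arrives instead of waiting for the full result set.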

## Usage

1. Start Supabase, backend, and frontend
1. Start Supabase and frontend
2. Open http://localhost:5173
3. Enter your email to receive a magic link
4. Check Inbucket (http://127.0.0.1:54324) and click the link
5. Chat with the assistant about your room preferences:
3. Enter email for magic link (check Inbucket at http://127.0.0.1:54324)
4. Chat about your preferences:
- "Looking for a room under £700"
- "I work at Bank, max 30 min commute"
- "Need a pet-friendly place"

The assistant extracts filters, searches the vector database, and ranks listings using GPT-4 vision.
5. Browse and rate properties
6. Set up email alerts for new matches

## Production Deployment

### 1. Backend (Google Cloud Run)
### Frontend (Vercel)

The backend is deployed to Cloud Run using the `deploy.sh` script. It uses Google Secret Manager for sensitive configuration.
Environment variables:

**One-time Setup:**
```bash
cd backend
./setup-secrets.sh
# Follow prompts to enter API keys (OpenAI, Supabase, Redis, etc.)
```
| Variable | Description |
|----------|-------------|
| `VITE_SUPABASE_URL` | Production Supabase URL |
| `VITE_SUPABASE_ANON_KEY` | Production Anon Key |
| `VITE_APP_URL` | Vercel production URL |

**Updating Secrets:**
If you need to update a secret (e.g., Supabase Anon Key), use the safe update script to avoid newline issues:
```bash
cd backend
./update_secret.sh
cd frontend && vercel --prod
```

**Deploy:**
### Edge Functions

```bash
cd backend
./deploy.sh
supabase functions deploy chat
supabase functions deploy match-stream
supabase functions deploy send-email-alerts
```
This will deploy the container and map the secrets automatically.

### 2. Frontend (Vercel)
### Auth Configuration

The frontend is deployed to Vercel.
Add your Vercel URL to Supabase Dashboard > Authentication > URL Configuration > Redirect URLs.

**Environment Variables:**
You must set the following in **Vercel Dashboard > Settings > Environment Variables**:
### Email Alerts Cron Job

| Variable | Description |
|----------|-------------|
| `VITE_SUPABASE_URL` | Your production Supabase Project URL |
| `VITE_SUPABASE_ANON_KEY` | Your production Supabase Anon Key |
| `VITE_BACKEND_URL` | The Cloud Run URL (from step 1) |
| `VITE_APP_URL` | Your Vercel production URL (e.g., `https://your-app.vercel.app`) |
Email alerts run hourly via pg_cron. After deploying, set up the job in the Supabase SQL Editor:

**Deploy:**
```bash
cd frontend
vercel --prod
```

```sql
SELECT cron.schedule(
  'send-email-alerts-hourly',
  '0 * * * *',
  $$
  SELECT net.http_post(
    url := 'https://YOUR_PROJECT_REF.supabase.co/functions/v1/send-email-alerts',
    headers := '{"Authorization": "Bearer YOUR_SERVICE_ROLE_KEY", "Content-Type": "application/json"}'::jsonb,
    body := '{}'::jsonb
  );
  $$
);
```

### 3. Supabase Auth Configuration

For Magic Links to work in production, you must whitelist your Vercel URL:
1. Go to **Supabase Dashboard > Authentication > URL Configuration**.
2. Add your Vercel URL to **Redirect URLs** (e.g., `https://your-app.vercel.app/**`).
Monitor job runs:
```sql
SELECT * FROM cron.job_run_details ORDER BY start_time DESC LIMIT 10;
```
17 changes: 0 additions & 17 deletions backend/.dockerignore

This file was deleted.
