Fred is a minimal AI assistant built with FastAPI, PostgreSQL, and local Ollama. It can explore codebases, store preferences, and remember high-level concepts.
- Python 3.11+
- PostgreSQL
- Ollama (local LLM inference)
```sh
pip install -e .
# or with uv:
uv sync
```

Copy `.env.example` to `.env` and set:
| Variable | Description | Default |
|---|---|---|
| `DATABASE_URL` | PostgreSQL connection string | `postgresql://localhost:5432/fred` |
| `OLLAMA_URL` | Ollama API base URL | `http://localhost:11434` |
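For example, a `.env` that simply restates the defaults from the table above (values are illustrative):

```
DATABASE_URL=postgresql://localhost:5432/fred
OLLAMA_URL=http://localhost:11434
```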
- Ensure Ollama is running:

  ```sh
  ollama serve
  ```

- Pull a model with tool support (e.g. Llama 3.1):

  ```sh
  ollama pull llama3.1
  ```

- Start Fred:

  ```sh
  uvicorn fred.main:app --reload
  ```

API endpoints:

- `GET /` — Chat web UI
- `GET /health` — Health check
- `POST /chat` — Send message, get response (sync)
- `POST /chat/stream` — Stream response as NDJSON (recommended for code exploration)
- `GET/POST/DELETE /memories` — Long-term memory CRUD
- `GET/PUT /preferences` — User preferences (e.g. `project_path`, `model`)
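`POST /chat/stream` returns newline-delimited JSON, one object per line. A minimal parsing sketch — the `content` field name is an assumption, since the chunk schema isn't documented here:

```python
import json


def read_ndjson(lines):
    """Parse an NDJSON stream (one JSON object per line) into dicts."""
    for raw in lines:
        line = raw.strip()
        if line:  # skip keep-alive blank lines
            yield json.loads(line)


# Simulated stream output; a real client would iterate over the HTTP
# response body (e.g. httpx's iter_lines) instead of a list.
sample = [b'{"content": "Hel"}', b'{"content": "lo"}']
chunks = [c["content"] for c in read_ndjson(sample)]
print("".join(chunks))  # prints "Hello"
```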
Code exploration requires the `project_path` preference to be set. Use `PUT /preferences/project_path` with `{"value": "/path/to/your/project"}`, or ask Fred to set it in chat.
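That PUT call can be made with just the standard library; a sketch, where the base URL `http://localhost:8000` is uvicorn's default and an assumption here, and `build_set_path_request` is a hypothetical helper name:

```python
import json
import urllib.request


def build_set_path_request(base_url: str, path: str) -> urllib.request.Request:
    """Build the PUT request that sets Fred's project_path preference."""
    body = json.dumps({"value": path}).encode()
    return urllib.request.Request(
        f"{base_url}/preferences/project_path",
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )


req = build_set_path_request("http://localhost:8000", "/path/to/your/project")
# urllib.request.urlopen(req) would send it once Fred is running
```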
```sh
# Install dev dependencies
uv sync --extra dev

# Run tests
pytest tests/ -v

# Lint and format
ruff check src/ tests/
ruff format src/ tests/

# Type check
mypy src/
```

See `AGENTS.md` for architecture, tools, and conventions (targeted at AI agents).