MindVault is a privacy-first, offline-capable knowledge management app with semantic search and retrieval-augmented generation (RAG). It lets you take notes, index them locally, and ask questions about them using a local LLM.
```mermaid
graph TD
    User((User)) --> Frontend[React Frontend]
    Frontend --> TipTap[TipTap Editor]
    Frontend --> API[FastAPI Backend]
    API --> SQLite[(SQLite Metadata)]
    API --> Chroma[(ChromaDB Vector Store)]
    API --> Ollama[Ollama LLM]
    API --> Filesystem[(Markdown Files)]
```
- Privacy-First: Everything runs locally. No cloud calls.
- Semantic Search: Find notes based on meaning, not just keywords.
- RAG Pipeline: Ask questions about your notes and get AI-generated answers using Ollama.
- Markdown Editor: TipTap-based editor with live preview and markdown support.
- Related Notes: Automatically see notes related to what you're currently writing.
- Import: Easily import your existing Obsidian vaults or markdown folders.
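Semantic search and Related Notes both boil down to ranking notes by embedding similarity. The sketch below illustrates the idea with plain-Python cosine similarity over toy vectors; in MindVault the embeddings are 384-dimensional vectors produced by sentence-transformers (all-MiniLM-L6-v2), and the ranking is delegated to ChromaDB rather than computed by hand.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_notes(query_embedding, note_embeddings):
    """Return (note_id, score) pairs sorted by similarity, best first."""
    scored = [
        (note_id, cosine_similarity(query_embedding, emb))
        for note_id, emb in note_embeddings.items()
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy 3-dimensional embeddings; real MiniLM embeddings are 384-dimensional.
notes = {
    "gardening.md": [0.9, 0.1, 0.0],
    "cooking.md":   [0.1, 0.9, 0.1],
}
query = [0.8, 0.2, 0.1]  # e.g. the embedding of "how do I prune roses?"
ranking = rank_notes(query, notes)  # gardening.md ranks first
```

This is why semantic search finds notes by meaning: two texts about the same topic get nearby embeddings even when they share no keywords.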
- Frontend: React, TypeScript, TailwindCSS, TipTap, Axios, Lucide
- Backend: FastAPI, SQLAlchemy, SQLite, ChromaDB, Ollama
- Embeddings: sentence-transformers (all-MiniLM-L6-v2), running on the backend
- LLM: Ollama (llama3)
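Before notes can be embedded and stored in ChromaDB, they have to be split into chunks small enough to embed individually. This README doesn't specify MindVault's chunking strategy, so the following is an illustrative paragraph-aligned splitter, not the app's actual implementation:

```python
def chunk_markdown(text, max_chars=500):
    """Split a note into paragraph-aligned chunks of at most max_chars,
    so each chunk fits comfortably into a single embedding.
    (Illustrative strategy; MindVault's real chunker may differ.)"""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks, current = [], ""
    for para in paragraphs:
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)   # flush the full chunk
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Chunking at paragraph boundaries keeps each vector semantically coherent, which tends to improve retrieval quality over fixed-size character windows.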
- Docker & Docker Compose
- Python 3.9+
- Node.js 18+
- Ollama installed and running locally
```bash
docker-compose up -d
```

This starts ChromaDB. Make sure Ollama is running on your host machine at http://localhost:11434.

```bash
cd backend
pip install -r requirements.txt
cp ../env.example .env
uvicorn main:app --reload
```

```bash
cd frontend
npm install
npm run dev
```

```bash
ollama pull llama3
```

- Create a Note: Click the "+" button in the sidebar.
- Write Content: Use the TipTap editor to write in Markdown.
- Semantic Search: Switch to the "AI Chat" view to ask questions about your notes.
- Get Answers: The AI retrieves relevant context from your notes and answers via Ollama.
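The "Get Answers" step is a classic RAG loop: retrieve the most relevant chunks, assemble them into a grounded prompt, and hand that to the LLM. Below is a hedged sketch of the prompt-assembly half; the template is hypothetical (the actual prompt used by MindVault is not shown in this README), and the Ollama call is indicated only in a comment:

```python
def build_rag_prompt(question, chunks):
    """Assemble a grounded prompt: retrieved note chunks first, then the
    question. (Illustrative template; the app's real prompt may differ.)"""
    context = "\n\n---\n\n".join(chunks)
    return (
        "Answer the question using only the notes below. "
        "If the notes don't contain the answer, say so.\n\n"
        f"Notes:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# The resulting string would then be sent to Ollama's generate endpoint,
# e.g. POST http://localhost:11434/api/generate
# with a JSON body like {"model": "llama3", "prompt": prompt}.
chunks = ["Roses are pruned in late winter.", "Tomatoes need full sun."]
prompt = build_rag_prompt("When should I prune roses?", chunks)
```

Instructing the model to answer "using only the notes below" is what keeps responses grounded in your vault instead of the model's general training data.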