# mcp-test

Containerized MCP demo stack by Vahid Tavakkoli (2026). This project demonstrates how to orchestrate a local AI workflow with a Node.js backend, a static frontend, Python tool servers, SearxNG web search, and Ollama model inference using Docker Compose.

mcp-test is a practical reference setup for experimenting with tool-enabled LLM chat in a local environment. The backend receives user prompts, calls Ollama, executes requested tools (matrix inversion, Tower of Hanoi, internet search, time, weather), and returns a final response to the browser UI.
## Features

- Node.js backend API (`/api/chat`) with tool-call orchestration.
- Static frontend (Nginx) for interactive chat.
- Python MCP-style tool services:
  - Matrix inversion service
  - Hanoi solver service
- SearxNG integration for search results.
- Ollama integration for local model inference.
- Docker Compose orchestration for one-command startup.
## Architecture

```
Browser UI (frontend:6180)
  -> Backend API (backend:6100, POST /api/chat)
     -> Ollama API (host.docker.internal:11434)
     -> MCP Matrix service (mcp-matrix:6101, POST /tool/matrix)
     -> MCP Hanoi service (mcp-hanoi:6102, POST /tool/hanoi)
     -> SearxNG (searxng:8080)
```
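At the center of this flow, the backend acts as a dispatcher: it forwards model-requested tool calls to the matching service. A minimal routing sketch, written in Python for brevity (the actual backend is Node.js, and the routing table below is an assumption based on the compose service names):

```python
def dispatch_tool(name: str, args: dict) -> dict:
    """Route a model-requested tool call to the matching service endpoint."""
    # Hypothetical routing table mirroring the compose services;
    # the real backend's wiring may differ.
    routes = {
        "matrix": "http://mcp-matrix:6101/tool/matrix",
        "hanoi": "http://mcp-hanoi:6102/tool/hanoi",
        "search": "http://searxng:8080/search",
    }
    url = routes.get(name)
    if url is None:
        return {"error": f"unknown tool: {name}"}
    # In the real stack this would be an HTTP POST to `url` with `args`;
    # here we only return the routing decision.
    return {"url": url, "args": args}
```

Unknown tool names are reported back rather than raised, so the model's reply can surface the failure to the user.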
## Project structure

```
mcp-test/
├── backend/             # Node.js API and tool execution logic
├── frontend/            # Static HTML/JS UI served by Nginx
├── mcp-hanoi/           # Python FastAPI Hanoi tool server
├── mcp-matrix/          # Python FastAPI matrix tool server
├── searxng/config/      # SearxNG runtime config
├── docker-compose.yml   # Multi-service orchestration
└── README.md
```
## Prerequisites

- Docker + Docker Compose
- Ollama running on the host machine
- A pulled Ollama model (default in compose: `ministral-3:3b`)

Notes:

- The backend expects Ollama at `http://host.docker.internal:11434`.
- On Linux, if `host.docker.internal` is not available by default, configure host-gateway support for Docker.
- Make sure your selected model is pulled before chat requests.
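The backend talks to Ollama over its HTTP chat API. As a sketch, the request body it would send to `POST /api/chat` on the Ollama host, using the compose default model (the helper name is hypothetical; the payload shape follows Ollama's documented chat endpoint):

```python
import json

def ollama_chat_payload(prompt: str, model: str = "ministral-3:3b") -> str:
    """Build the JSON body for Ollama's POST /api/chat endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete JSON reply, not a stream
    }
    return json.dumps(body)
```

If chat requests fail, comparing the `model` field here against `ollama list` on the host is a quick first check.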
- SearxNG is included as a containerized dependency in this stack.
- Backend searches route through `SEARXNG_URL=http://searxng:8080`.
- Public internet access from your Docker environment is required for meaningful search results.
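For the search tool, the backend only needs to issue a GET against SearxNG's `/search` endpoint. A sketch of how that query URL is formed (the helper is hypothetical; `format=json` assumes SearxNG's JSON output format is enabled in the bundled config under `searxng/config/`):

```python
from urllib.parse import urlencode

def searx_url(query: str, base: str = "http://searxng:8080") -> str:
    """Build a SearxNG search URL requesting JSON-formatted results."""
    return f"{base}/search?{urlencode({'q': query, 'format': 'json'})}"
```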
## Quick start

1. Clone:

   ```shell
   git clone https://github.com/vtavakkoli/mcp-test.git
   cd mcp-test
   ```

2. (Optional) Create a local env file:

   ```shell
   cp .env.example .env
   ```

3. Start the services:

   ```shell
   docker compose up --build
   ```

4. Open the frontend: http://localhost:6180
## Services and ports

| Service | URL / Endpoint | Host Port |
|---|---|---|
| Frontend | http://localhost:6180 | 6180 |
| Backend | http://localhost:6100/api/chat | 6100 |
| MCP Matrix | http://localhost:6101/tool/matrix | 6101 |
| MCP Hanoi | http://localhost:6102/tool/hanoi | 6102 |
| SearxNG | http://localhost:6103 | 6103 |
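The matrix tool's error conditions (non-square or singular input) fall directly out of the inversion algorithm. A self-contained sketch using Gauss-Jordan elimination; the actual mcp-matrix service may use NumPy or another method instead, with `ValueError` standing in for its 400 response:

```python
def invert(matrix):
    """Invert a square matrix via Gauss-Jordan elimination.

    Raises ValueError for non-square or singular input, the sketch's
    analogue of the tool's 400 response.
    """
    n = len(matrix)
    if n == 0 or any(len(row) != n for row in matrix):
        raise ValueError("matrix must be square")
    # Augment each row with the matching identity-matrix row.
    aug = [
        [float(x) for x in row] + [1.0 if i == j else 0.0 for j in range(n)]
        for i, row in enumerate(matrix)
    ]
    for col in range(n):
        # Partial pivoting: use the largest remaining entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[col])]
    # The right half of the augmented rows is now the inverse.
    return [row[n:] for row in aug]
```

For the example input `[[4,7],[2,6]]` (determinant 10), this yields `[[0.6, -0.7], [-0.2, 0.4]]`.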
## Example requests

Chat via the backend:

```shell
curl -X POST http://localhost:6100/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message":"Invert matrix [[4,7],[2,6]]"}'
```

Call the matrix tool directly:

```shell
curl -X POST http://localhost:6101/tool/matrix \
  -H "Content-Type: application/json" \
  -d '{"matrix":[[4,7],[2,6]]}'
```

Call the Hanoi tool directly:

```shell
curl -X POST http://localhost:6102/tool/hanoi \
  -H "Content-Type: application/json" \
  -d '{"n":3}'
```

## Troubleshooting

- Frontend loads but replies fail
  - Confirm the backend is running on port `6100` and CORS is enabled.
- Backend returns an Ollama error
  - Verify Ollama is running on host port `11434` and the configured model exists.
- Search tool returns empty/failed results
  - Check SearxNG container status and outbound network availability.
- Matrix tool returns 400
  - Ensure the input is square and invertible.
- Hanoi tool returns 400
  - Ensure `n >= 1`.
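The `n >= 1` rule mirrors the base case of the classic Tower of Hanoi recursion that the solver service presumably implements; a minimal sketch:

```python
def hanoi(n, source="A", target="C", spare="B"):
    """Return the optimal move list for n disks (2**n - 1 moves)."""
    if n < 1:
        raise ValueError("n must be >= 1")  # the sketch's analogue of the 400
    if n == 1:
        return [(source, target)]
    # Move n-1 disks out of the way, move the largest, then stack the rest.
    return (
        hanoi(n - 1, source, spare, target)
        + [(source, target)]
        + hanoi(n - 1, spare, target, source)
    )
```

For `n = 3`, matching the curl example above, this produces the optimal sequence of 7 moves.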
## License

Licensed under the MIT License. See LICENSE.

Written by Vahid Tavakkoli, 2026.
