This repository allows you to run an Ollama server with Docker Compose. It exposes the Ollama API endpoint so that other applications can integrate with it.
The official Ollama Docker image (ollama/ollama) is available on Docker Hub.
See the [official Ollama GitHub page](https://github.com/ollama/ollama) for more details.
Prerequisites:

- Docker and Docker Compose installed
- Docker Desktop app (for macOS and Windows users)
Run directly with `docker run`:

```sh
# CPU only
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Nvidia GPU (requires the NVIDIA Container Toolkit on the host)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

Run using Docker Compose:
```sh
docker-compose pull
docker-compose up -d
```
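The repository's docker-compose.yml is not reproduced here, but a minimal sketch might look like the following; the bind mount path, healthcheck probe, and the `ollama-network` name are assumptions inferred from the examples later in this README:

```yaml
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      # Bind mount matching the `rm -rf ./ollama` cleanup step below
      - ./ollama:/root/.ollama
    healthcheck:
      # Probe is an assumption: `ollama list` only succeeds once the server is up
      test: ["CMD", "ollama", "list"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - ollama-network

networks:
  ollama-network:
    driver: bridge
```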
Pull models from within the container:

```sh
docker exec -it ollama ollama pull phi3:mini-4k
```
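To verify the download, list the models stored in the container:

```sh
docker exec -it ollama ollama list
```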
Run models interactively:

```sh
docker exec -it ollama ollama run llama3
```
Access the container shell:

```sh
docker exec -it ollama bash
```

The Ollama API is available at http://localhost:11434. You can configure other applications to use this endpoint:
```
NEXT_PUBLIC_OLLAMA_ENDPOINT_URL=http://localhost:11434
OLLAMA_MODEL=phi3:mini-4k
```
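To confirm the endpoint is reachable from the host, you can send a test request directly (this assumes the phi3:mini-4k model has already been pulled as shown above):

```sh
curl http://localhost:11434/api/generate -d '{
  "model": "phi3:mini-4k",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```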
You can add a client application to your docker-compose.yml file to interact with Ollama. Here's an example client that demonstrates the API:
```yaml
# Example client application that uses Ollama
example-client:
  image: curlimages/curl:latest
  container_name: ollama-example-client
  depends_on:
    ollama:
      condition: service_healthy
  restart: "no"
  # A literal block scalar (|) preserves newlines; with a folded scalar (>)
  # the first shell comment would swallow the rest of the one-line script
  command: |
    sh -c "
    # Wait to ensure Ollama is fully ready
    sleep 5
    # List available models
    echo '--- Available Models ---'
    curl -s ollama:11434/api/tags | grep name
    # Run a basic prompt with the default model
    printf '\n--- Example Query ---\n'
    curl -s ollama:11434/api/generate -d '{
    \"model\": \"phi3:mini-4k\",
    \"prompt\": \"Explain how to use Docker in 3 steps\",
    \"stream\": false
    }' | grep response
    # The script ends here; with restart disabled, the container exits
    printf '\n--- Client finished ---\n'
    "
  networks:
    - ollama-network
```

This example demonstrates how to:
- Wait for Ollama to be healthy before making requests
- List available models
- Send prompts to the Ollama API
- Access Ollama by service name within the Docker network
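You can also try the same API calls without adding a service by attaching a one-off container to the compose network. The network name here is an assumption: Compose typically prefixes it with the project (directory) name, so replace `<project>` accordingly:

```sh
# Query the API from a throwaway container on the compose network
docker run --rm --network=<project>_ollama-network curlimages/curl:latest \
  -s http://ollama:11434/api/tags
```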
To build your own application that uses Ollama:
```yaml
services:
  # Your Ollama setup...

  your-app:
    image: your-image
    environment:
      - OLLAMA_API_URL=http://ollama:11434
    networks:
      - ollama-network
    depends_on:
      ollama:
        condition: service_healthy
```

This ensures your application only starts when Ollama is ready, and that it can access Ollama via the internal Docker network using the service name `ollama`.
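Inside `your-app`, the injected `OLLAMA_API_URL` can then be used directly. As a hypothetical illustration (this script is not part of the repository), a startup probe might look like:

```sh
#!/bin/sh
# Fail fast if the Ollama API is not reachable at the injected endpoint
if curl -sf "$OLLAMA_API_URL/api/tags" > /dev/null; then
  echo "Ollama is reachable at $OLLAMA_API_URL"
else
  echo "Cannot reach Ollama at $OLLAMA_API_URL" >&2
  exit 1
fi
```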
Stop the container:
```sh
docker-compose down
```

Remove all data (models and configuration):
```sh
docker-compose down
rm -rf ./ollama
```
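Note that the standalone `docker run` commands near the top store models in a named Docker volume (`ollama`) rather than in `./ollama`. To clean up after that setup instead:

```sh
docker rm -f ollama
docker volume rm ollama
```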