This guide explains how to deploy the deAPI MCP Server on a remote host and connect to it from Claude Desktop or other MCP clients.
The deAPI MCP Server (src/server_remote.py) uses HTTP/SSE transport and can run in two deployment scenarios:
- Local Deployment - Server runs on your local machine (localhost:8000)
- Remote Deployment - Server deployed to a remote host accessible over the network
This guide focuses on remote deployment for production environments. For local usage, see the main README.md.
- Clone the repository and install dependencies:

  ```bash
  git clone https://github.com/deapi-ai/mcp-server-deapi.git
  cd mcp-server-deapi
  uv pip install -e .
  # or
  pip install -e .
  ```

- Set environment variables (optional):

  ```bash
  export MCP_HOST=0.0.0.0  # Listen on all interfaces
  export MCP_PORT=8000     # Port to listen on
  ```

- Run the remote server:

  ```bash
  python -m src.server_remote
  ```

  The server will be available at http://your-server-ip:8000/mcp
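MCP_HOST and MCP_PORT are read from the environment at startup. A minimal sketch of how src/server_remote.py might consume them (the variable names come from this guide; the localhost default shown is an assumption, so check the actual source):

```python
import os

# Sketch: read bind settings from the environment, as described above.
# The 127.0.0.1 fallback is an assumption, not confirmed from the source.
host = os.environ.get("MCP_HOST", "127.0.0.1")
port = int(os.environ.get("MCP_PORT", "8000"))

print(f"Serving on http://{host}:{port}/mcp")
```

With MCP_HOST=0.0.0.0 the server listens on all interfaces, which is what makes remote access possible; leaving it unset keeps the server local-only.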
- For production, use a process manager like systemd. Create /etc/systemd/system/mcp-server-deapi.service:

  ```ini
  [Unit]
  Description=deAPI MCP Server
  After=network.target

  [Service]
  Type=simple
  User=your-user
  WorkingDirectory=/path/to/mcp-server-deapi
  Environment="MCP_HOST=0.0.0.0"
  Environment="MCP_PORT=8000"
  ExecStart=/usr/bin/python3 -m src.server_remote
  Restart=always

  [Install]
  WantedBy=multi-user.target
  ```

  Enable and start the service:

  ```bash
  sudo systemctl enable mcp-server-deapi
  sudo systemctl start mcp-server-deapi
  sudo systemctl status mcp-server-deapi
  ```

- Build the Docker image:
  ```bash
  docker build -t mcp-server-deapi .
  ```

- Run the container:

  ```bash
  docker run -d \
    -p 8000:8000 \
    --name mcp-server-deapi \
    --restart unless-stopped \
    mcp-server-deapi
  ```

- Check logs:

  ```bash
  docker logs -f mcp-server-deapi
  ```

Alternatively, create a docker-compose.yml:

```yaml
version: '3.8'
services:
  mcp-server-deapi:
    build: .
    ports:
      - "8000:8000"
    environment:
      - MCP_HOST=0.0.0.0
      - MCP_PORT=8000
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
```

Run with:

```bash
docker-compose up -d
```

- Push code to GitHub
- Connect Railway to your repository
- Set port to 8000
- Deploy automatically
For Fly.io:

```bash
fly launch
fly deploy
```

For Heroku:

```bash
heroku create mcp-server-deapi
git push heroku main
```

Edit your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
```json
{
  "mcpServers": {
    "deapi-remote": {
      "url": "http://your-server-ip:8000/mcp"
    }
  }
}
```

Or, if using HTTPS with a domain:
```json
{
  "mcpServers": {
    "deapi-remote": {
      "url": "https://mcp-server-deapi.yourdomain.com/mcp"
    }
  }
}
```

Use the MCP Python SDK to connect:
```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def connect_to_remote_server():
    async with sse_client("http://your-server-ip:8000/mcp") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available tools
            tools = await session.list_tools()

            # Call a tool (authentication is handled at the connection level)
            result = await session.call_tool(
                "get_available_models",
                arguments={},
            )

asyncio.run(connect_to_remote_server())
```

Place the MCP server behind a reverse proxy (nginx, Caddy, etc.) with SSL:
Nginx example:

```nginx
server {
    listen 443 ssl http2;
    server_name mcp-server-deapi.yourdomain.com;

    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;

    location / {
        proxy_pass http://localhost:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # SSE specific: disable buffering/caching, keep connections open
        proxy_buffering off;
        proxy_cache off;
        proxy_read_timeout 86400;
    }
}
```

Caddy example (simpler, with automatic HTTPS):

```
mcp-server-deapi.yourdomain.com {
    reverse_proxy localhost:8000
}
```
For production, you may want to add authentication to the MCP server itself. FastMCP supports this:

```python
from fastmcp.server.auth.providers import APIKeyProvider

# Add to server_remote.py
auth = APIKeyProvider(api_keys=["your-secret-key"])

mcp = FastMCP(
    name="deAPI AI API",
    auth=auth,
    # ... rest of config
)
```

Then clients need to include the API key:
```json
{
  "mcpServers": {
    "deapi-remote": {
      "url": "https://mcp-server-deapi.yourdomain.com/mcp",
      "headers": {
        "X-API-Key": "your-secret-key"
      }
    }
  }
}
```

Only expose port 8000 (or your chosen port) and limit access:
```bash
# UFW example (Ubuntu)
sudo ufw allow 8000/tcp
sudo ufw enable

# Or restrict to specific IPs
sudo ufw allow from YOUR_CLIENT_IP to any port 8000
```

Consider adding rate limiting in your reverse proxy or using a service like Cloudflare.
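If you want application-level rate limiting in addition to the proxy, the usual approach is a per-client token bucket. This is an illustrative sketch only, not part of the deAPI server:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Illustrative per-client token bucket: `rate` tokens/sec, burst of `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = defaultdict(lambda: capacity)   # start each client full
        self.last = defaultdict(time.monotonic)       # last refill timestamp

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last[client_ip]
        self.last[client_ip] = now
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens[client_ip] = min(self.capacity, self.tokens[client_ip] + elapsed * self.rate)
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True
        return False
```

In practice a proxy-level limiter (nginx `limit_req`, Cloudflare rules) is simpler to operate; a bucket like this only makes sense if you need per-API-key rather than per-IP limits.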
The server includes a built-in health check endpoint at /health:

```bash
curl http://localhost:8000/health
# {"status": "healthy", "service": "deapi-mcp"}
```

This endpoint is used by the Docker HEALTHCHECK and can be integrated with your monitoring system.
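The /health endpoint can also be polled from a monitoring script. A minimal probe using only the standard library, assuming the JSON payload shown above:

```python
import json
import urllib.request

def is_healthy(base_url: str = "http://localhost:8000") -> bool:
    """Return True if /health responds with {"status": "healthy", ...}."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
            payload = json.load(resp)
    except OSError:
        # Connection refused, DNS failure, timeout, or HTTP error
        return False
    return payload.get("status") == "healthy"
```

Run it from cron or your monitoring agent and alert when it returns False.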
The server logs to stdout. Capture the output with your logging system:

```bash
# For systemd
journalctl -u mcp-server-deapi -f

# For Docker
docker logs -f mcp-server-deapi

# Or redirect to a file
python -m src.server_remote 2>&1 | tee /var/log/mcp-server-deapi.log
```

If you have connection problems:

- Check firewall rules
- Verify the server is running: `netstat -tlnp | grep 8000`
- Check server logs
- Increase proxy timeouts (nginx: `proxy_read_timeout`)
- Check network stability
- Enable reconnection logic in the client
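The last item, client-side reconnection, can be handled with a small backoff wrapper around whatever connect coroutine your client uses. This is an illustrative sketch; `connect` is a placeholder for your own coroutine:

```python
import asyncio
import random

async def connect_with_retries(connect, max_attempts: int = 5, base_delay: float = 1.0):
    """Retry an async connect callable with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            return await connect()
        except OSError:  # covers ConnectionError, timeouts raised as OSError, etc.
            if attempt == max_attempts - 1:
                raise
            # 1s, 2s, 4s, ... capped at 30s, plus a little jitter
            delay = min(30.0, base_delay * 2 ** attempt) + random.random() * 0.1
            await asyncio.sleep(delay)
```

Exponential backoff with jitter avoids hammering the server the moment it comes back, which matters when many clients reconnect at once.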
For lower-traffic scenarios:
- Use serverless platforms (Railway, Fly.io free tier)
- Scale down during off-hours
- Use spot instances on AWS/GCP
Test that the remote server is working:

```bash
# Test the SSE endpoint
curl -N http://your-server-ip:8000/mcp
# Should return an SSE stream
```

From Python:

```python
import httpx

# Use a streaming request so the long-lived SSE body doesn't block the call
with httpx.stream("GET", "http://your-server-ip:8000/mcp", timeout=10) as response:
    print(response.status_code)  # Should be 200
```