This guide outlines the steps to check your system and environment configuration to ensure the MCP CLI operates correctly.
The MCP CLI requires Python 3.11 or newer.
How to check: Open your terminal and run:
```bash
python3 --version
```
Ensure the output shows Python 3.11.x or higher. If not, you may need to install or update your Python version.
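If you want a check that a setup script can rely on, the version comparison can be done from inside Python itself. This is a minimal sketch using only the standard library, with the 3.11 floor taken from the requirement above:

```bash
# Exits 0 when the interpreter is 3.11+, non-zero otherwise, so it can gate a setup script.
python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 11) else 1)' \
  && echo "Python version OK" \
  || echo "Python is older than 3.11 -- install or update first"
```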
It is highly recommended to use a virtual environment to manage project dependencies. This project uses `uv` for dependency management.
How to check/set up:
- Install `uv` (if not already installed):
  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```
  Or, if you have `pipx`:
  ```bash
  pipx install uv
  ```
- Create and activate a virtual environment. Navigate to the project root directory (`/Users/fanzhang/Documents/github/mcp-cli`) and run:
  ```bash
  uv venv
  source .venv/bin/activate
  ```
- Install dependencies:
  ```bash
  uv sync
  ```
  This will install all necessary packages listed in `pyproject.toml` and `uv.lock`; a quick verification sketch follows this list.
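To confirm the environment is actually active after these steps, a quick sanity check (assuming the default `.venv` location created by `uv venv`) might look like:

```bash
uv --version                                 # uv is installed and on PATH
which python                                 # should resolve inside .venv/bin/
python -c 'import sys; print(sys.prefix)'    # should print the .venv path
```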
The MCP CLI relies on several configuration files.
- `server_config.json`: Located in the project root, this file configures the MCP server connection. How to check: Verify its presence and content. A basic `server_config.json` might look like:
  ```json
  {
    "mcp_server_url": "http://localhost:8000"
  }
  ```
  Adjust `mcp_server_url` if your server is hosted elsewhere.
- `~/.chuk_llm/providers.yaml`: This file configures your LLM providers (e.g., OpenAI, Anthropic). How to check:
  ```bash
  cat ~/.chuk_llm/providers.yaml
  ```
  Ensure your desired providers are configured correctly with their respective API keys (if applicable). Example for OpenAI:
  ```yaml
  openai:
    api_key: "sk-..."
    model: "gpt-4o"
  ```
- `~/.chuk_llm/.env`: This file can store environment variables, including sensitive API keys. How to check:
  ```bash
  cat ~/.chuk_llm/.env
  ```
  Ensure any API keys or other sensitive environment variables are correctly set here.
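As a quick way to verify all three files at once, the following sketch (run from the project root) checks for their presence and confirms that `server_config.json` parses as JSON. The paths are the ones described above; adjust if your layout differs:

```bash
# Presence check for the expected configuration files.
for f in server_config.json "$HOME/.chuk_llm/providers.yaml" "$HOME/.chuk_llm/.env"; do
  [ -f "$f" ] && echo "found:   $f" || echo "missing: $f"
done

# Validate that server_config.json is well-formed JSON (stdlib only).
python3 -m json.tool server_config.json > /dev/null \
  && echo "server_config.json: valid JSON" \
  || echo "server_config.json: invalid or missing"
```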
Some configurations, especially API keys, might be set via environment variables.
How to check:
You can check specific environment variables using `echo`:
```bash
echo $OPENAI_API_KEY
echo $ANTHROPIC_API_KEY
# etc.
```
Ensure these variables are set if your `providers.yaml` or other configurations rely on them.
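To check several variables in one pass without printing their values (useful if you are sharing terminal output), a small bash loop works. The variable names here are just the ones mentioned above:

```bash
# Uses bash indirect expansion (${!v}); reports set/unset without echoing secrets.
for v in OPENAI_API_KEY ANTHROPIC_API_KEY; do
  if [ -n "${!v}" ]; then
    echo "$v is set"
  else
    echo "$v is NOT set"
  fi
done
```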
The MCP CLI communicates with an MCP server and potentially external LLM provider APIs.
How to check:
- MCP Server Connectivity: If your `mcp_server_url` is `http://localhost:8000`, ensure your MCP server is running. You can `ping` the host (note that `ping` checks host reachability only, not the HTTP service itself):
  ```bash
  ping localhost
  ```
  A more direct check is to `curl` the server's health endpoint if one is available, or simply to run an MCP CLI command that connects to the server (see the sketch after this list).
- LLM Provider API Connectivity: Ensure you have internet access and that no firewall rules block access to LLM provider endpoints (e.g., `api.openai.com`, `api.anthropic.com`).
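For an HTTP-level check that goes beyond `ping`, `curl` can confirm that something is answering at the configured URL and that provider endpoints are reachable. This is a sketch: whether your MCP server exposes a health path depends on the server, and a 401 from a provider endpoint (no API key sent) still proves network reachability:

```bash
# HTTP status from the local MCP server (any response beats "connection refused").
curl -s -o /dev/null -w "MCP server -> HTTP %{http_code}\n" http://localhost:8000/

# Reachability of an LLM provider endpoint; a 401 without an API key is expected.
curl -s -o /dev/null -w "api.openai.com -> HTTP %{http_code}\n" https://api.openai.com/v1/models
```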
The project includes diagnostic scripts that can help verify the setup.
How to run:
Navigate to the `diagnostics/` directory and run the relevant scripts. For example:
```bash
cd diagnostics/
python3 provider_list_diagnostic.py
python3 mcp_server_diagnostic.py
```
These scripts can provide insights into your LLM provider and MCP server configurations.