Commit 229f6a5

Add docs/client-quickstart.md tutorial with snippet sync
Convert the "Building MCP clients" tutorial from modelcontextprotocol.io (Mintlify MDX) into a Diataxis-style tutorial for the Python SDK docs. Code blocks are synced from `examples/clients/quickstart-client/client.py` via `<!-- snippet-source -->` markers and `# region` tags, so the tutorial stays in sync with the actual example code that gets type-checked.

Changes to the tutorial content (vs the original Mintlify source):

- Strip all Mintlify JSX (`<Tabs>`, `<CodeGroup>`, `<Warning>`, etc.) and convert to MkDocs Material syntax (`===` tabs, `!!!` admonitions)
- Remove non-tutorial sections per Diataxis: "Key Components Explained", "Common Customization Points", "Best Practices"
- Merge "System Requirements" and API key setup into a single "Prerequisites" section; show the API key inline in run commands
- Use sentence case for headings throughout
- Tailor troubleshooting errors to real Python exceptions

Changes to the example at `examples/clients/quickstart-client/`:

- Add `# region` / `# endregion` markers for 5 snippet regions
- Remove the `python-dotenv` dependency, `load_dotenv()`, and `.env.example` (the API key is now passed inline via environment variable)
- Update the `README.md` link to point to the new tutorial
1 parent 53461ea commit 229f6a5

File tree

7 files changed: +375 −8 lines


docs/client-quickstart.md

Lines changed: 359 additions & 0 deletions
# Quickstart: Build an LLM-powered chatbot

In this tutorial, we'll build an LLM-powered chatbot that connects to an MCP server, discovers its tools, and uses Claude to call them.

Before you begin, it helps to have gone through the [server quickstart](https://modelcontextprotocol.io/quickstart/server) so you understand how clients and servers communicate.

You can find the [complete code for this tutorial](https://github.com/modelcontextprotocol/python-sdk/tree/main/examples/clients/quickstart-client/) in the SDK repository.

## Prerequisites

This quickstart assumes familiarity with:

- Python
- LLMs like Claude

Before starting, ensure your system meets these requirements:

- Python 3.10 or later
- The latest version of `uv`
- An Anthropic API key from the [Anthropic Console](https://console.anthropic.com/settings/keys)

## Set up your environment

First, create a new Python project with `uv`:
=== "macOS/Linux"
27+
28+
```bash
29+
# Create project directory
30+
uv init mcp-client
31+
cd mcp-client
32+
33+
# Install required packages
34+
uv add mcp anthropic
35+
36+
# Remove boilerplate files
37+
rm main.py
38+
39+
# Create our main file
40+
touch client.py
41+
```
42+
43+
=== "Windows"
44+
45+
```powershell
46+
# Create project directory
47+
uv init mcp-client
48+
cd mcp-client
49+
50+
# Install required packages
51+
uv add mcp anthropic
52+
53+
# Remove boilerplate files
54+
del main.py
55+
56+
# Create our main file
57+
new-item client.py
58+
```
59+
60+
## Creating the client

### Basic client structure

First, let's set up our imports and create the basic client class in `client.py`:

<!-- snippet-source examples/clients/quickstart-client/client.py#MCPClient_init -->
```python
import asyncio
import os
import sys
from contextlib import AsyncExitStack
from pathlib import Path

from anthropic import Anthropic
from anthropic.types import MessageParam, TextBlock, TextBlockParam, ToolParam, ToolResultBlockParam, ToolUseBlock
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from mcp.types import TextContent

# Claude model constant
ANTHROPIC_MODEL = "claude-sonnet-4-5"


class MCPClient:
    def __init__(self) -> None:
        # Initialize session and client objects
        self.session: ClientSession | None = None
        self.exit_stack = AsyncExitStack()
        self._anthropic: Anthropic | None = None

    @property
    def anthropic(self) -> Anthropic:
        """Lazily initialize the Anthropic client on first use"""
        if self._anthropic is None:
            self._anthropic = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))
        return self._anthropic
```
<!-- /snippet-source -->
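The `anthropic` property above is a lazy-initialization pattern: the Anthropic client (and therefore the API key) is only needed once a query is actually processed, so you can connect to a server and list tools without a key. In isolation, the pattern looks like this (a generic sketch with made-up names, not part of the tutorial code):

```python
class LazyExample:
    """Demonstrates the lazy-property pattern used by MCPClient.anthropic."""

    def __init__(self) -> None:
        self._expensive: list[int] | None = None
        self.init_count = 0  # track how many times the resource is built

    @property
    def expensive(self) -> list[int]:
        # Created on first access, then reused on every later access
        if self._expensive is None:
            self.init_count += 1
            self._expensive = [1, 2, 3]
        return self._expensive
```

Constructing the object is cheap; the resource is built only when `expensive` is first read, and subsequent reads return the same object.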
### Server connection management

Next, we'll implement the method to connect to an MCP server:

<!-- snippet-source examples/clients/quickstart-client/client.py#MCPClient_connect_to_server -->
```python
async def connect_to_server(self, server_script_path: str) -> None:
    """Connect to an MCP server

    Args:
        server_script_path: Path to the server script (.py or .js)
    """
    is_python = server_script_path.endswith(".py")
    is_js = server_script_path.endswith(".js")
    if not (is_python or is_js):
        raise ValueError("Server script must be a .py or .js file")

    if is_python:
        path = Path(server_script_path).resolve()
        server_params = StdioServerParameters(
            command="uv",
            args=["--directory", str(path.parent), "run", path.name],
            env=None,
        )
    else:
        server_params = StdioServerParameters(command="node", args=[server_script_path], env=None)

    stdio_transport = await self.exit_stack.enter_async_context(stdio_client(server_params))
    self.stdio, self.write = stdio_transport
    self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))

    await self.session.initialize()

    # List available tools
    response = await self.session.list_tools()
    tools = response.tools
    print("\nConnected to server with tools:", [tool.name for tool in tools])
```
<!-- /snippet-source -->
### Query processing logic

Now let's add the core functionality for processing queries and handling tool calls:

<!-- snippet-source examples/clients/quickstart-client/client.py#MCPClient_process_query -->
```python
async def process_query(self, query: str) -> str:
    """Process a query using Claude and available tools"""
    assert self.session is not None
    messages: list[MessageParam] = [{"role": "user", "content": query}]

    response = await self.session.list_tools()
    available_tools: list[ToolParam] = [
        {"name": tool.name, "description": tool.description or "", "input_schema": tool.inputSchema}
        for tool in response.tools
    ]

    # Initial Claude API call
    response = self.anthropic.messages.create(
        model=ANTHROPIC_MODEL, max_tokens=1000, messages=messages, tools=available_tools
    )

    # Process response and handle tool calls
    final_text: list[str] = []

    for content in response.content:
        if isinstance(content, TextBlock):
            final_text.append(content.text)
        elif isinstance(content, ToolUseBlock):
            tool_name = content.name
            tool_args = content.input

            # Execute tool call
            assert self.session is not None
            result = await self.session.call_tool(tool_name, tool_args)
            final_text.append(f"[Calling tool {tool_name} with args {tool_args}]")

            # Continue conversation with tool results
            messages.append({"role": "assistant", "content": response.content})
            tool_result_content: list[TextBlockParam] = [
                {"type": "text", "text": block.text} for block in result.content if isinstance(block, TextContent)
            ]
            tool_result: ToolResultBlockParam = {
                "type": "tool_result",
                "tool_use_id": content.id,
                "content": tool_result_content,
            }
            messages.append({"role": "user", "content": [tool_result]})

            # Get next response from Claude
            response = self.anthropic.messages.create(
                model=ANTHROPIC_MODEL,
                max_tokens=1000,
                messages=messages,
            )

            response_text = response.content[0]
            if isinstance(response_text, TextBlock):
                final_text.append(response_text.text)

    return "\n".join(final_text)
```
<!-- /snippet-source -->
### Interactive chat interface

Now we'll add the chat loop and cleanup functionality:

<!-- snippet-source examples/clients/quickstart-client/client.py#MCPClient_chat_loop -->
```python
async def chat_loop(self) -> None:
    """Run an interactive chat loop"""
    print("\nMCP Client Started!")
    print("Type your queries or 'quit' to exit.")

    while True:
        try:
            query = input("\nQuery: ").strip()

            if query.lower() == "quit":
                break

            response = await self.process_query(query)
            print("\n" + response)

        except Exception as e:
            print(f"\nError: {str(e)}")

async def cleanup(self) -> None:
    """Clean up resources"""
    await self.exit_stack.aclose()
```
<!-- /snippet-source -->
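The single `aclose()` call in `cleanup` works because both async contexts (the stdio transport and the session) were entered on the same `AsyncExitStack`, which unwinds them in reverse order. A standalone sketch of that behavior, using illustrative resource names rather than real MCP objects:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager
from typing import AsyncIterator

log: list[str] = []


@asynccontextmanager
async def resource(name: str) -> AsyncIterator[str]:
    # Stand-in for stdio_client(...) or ClientSession(...)
    log.append(f"open {name}")
    try:
        yield name
    finally:
        log.append(f"close {name}")


async def demo() -> None:
    stack = AsyncExitStack()
    await stack.enter_async_context(resource("transport"))
    await stack.enter_async_context(resource("session"))
    # One call unwinds both contexts, last-entered first
    await stack.aclose()


asyncio.run(demo())
```

This is why the tutorial's `main` can guarantee teardown with a single `finally: await client.cleanup()` regardless of how many contexts were opened.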
### Main entry point

Finally, we'll add the main execution logic:

<!-- snippet-source examples/clients/quickstart-client/client.py#main_entrypoint -->
```python
async def main() -> None:
    if len(sys.argv) < 2:
        print("Usage: python client.py <path_to_server_script>")
        sys.exit(1)

    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])

        # Check if we have a valid API key to continue
        api_key = os.getenv("ANTHROPIC_API_KEY")
        if not api_key:
            print("\nNo ANTHROPIC_API_KEY found. To query these tools with Claude, set your API key:")
            print("  export ANTHROPIC_API_KEY=your-api-key-here")
            return

        await client.chat_loop()
    finally:
        await client.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```
<!-- /snippet-source -->
## Running the client

To run your client with any MCP server:

=== "macOS/Linux"

    ```bash
    ANTHROPIC_API_KEY=your-key-here uv run client.py path/to/server.py

    # Example: connect to the weather server from the server quickstart
    ANTHROPIC_API_KEY=your-key-here uv run client.py /absolute/path/to/weather/weather.py
    ```

=== "Windows"

    ```powershell
    $env:ANTHROPIC_API_KEY="your-key-here"; uv run client.py path\to\server.py
    ```

The client will:

1. Connect to the specified server
2. List available tools
3. Start an interactive chat session where you can:
    - Enter queries
    - See tool executions
    - Get responses from Claude
## What's happening under the hood

When you submit a query:

1. Your query is sent to Claude along with the tool descriptions discovered during connection
2. Claude decides which tools (if any) to use
3. The client executes any requested tool calls through the server
4. Results are sent back to Claude
5. Claude provides a natural language response
6. The response is displayed to you
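Concretely, after one tool round-trip the `messages` list sent back to Claude has three entries. This is a hand-written sketch of their shape; the tool name, id, and values are hypothetical, not output from the client:

```python
# Shape of the conversation after one tool call round-trip.
# All ids and values below are illustrative only.
messages = [
    # 1. The user's original query
    {"role": "user", "content": "What's the weather in San Francisco?"},
    # 2. Claude's reply, containing a tool_use block it wants executed
    {
        "role": "assistant",
        "content": [
            {
                "type": "tool_use",
                "id": "toolu_abc123",
                "name": "get_forecast",
                "input": {"latitude": 37.77, "longitude": -122.42},
            }
        ],
    },
    # 3. The tool result, sent as a *user* message that references the tool_use id
    {
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                "tool_use_id": "toolu_abc123",
                "content": [{"type": "text", "text": "Sunny, 18°C"}],
            }
        ],
    },
]
```

The `tool_use_id` in the result must match the `id` of the assistant's `tool_use` block; that pairing is how Claude connects each result back to the call it requested.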
## Troubleshooting

### Server path issues

- Double-check that the path to your server script is correct
- Use the absolute path if the relative path isn't working
- On Windows, use forward slashes (`/`) or escaped backslashes (`\\`) in the path
- Verify the server file has the correct extension (`.py` for Python or `.js` for Node.js)

Example of correct path usage:

=== "macOS/Linux"

    ```bash
    # Relative path
    uv run client.py ./server/weather.py

    # Absolute path
    uv run client.py /Users/username/projects/mcp-server/weather.py
    ```

=== "Windows"

    ```powershell
    # Relative path
    uv run client.py .\server\weather.py

    # Absolute path (either format works)
    uv run client.py C:\projects\mcp-server\weather.py
    uv run client.py C:/projects/mcp-server/weather.py
    ```
### Response timing

- The first response might take up to 30 seconds to return
- This is normal and happens while:
    - The server initializes
    - Claude processes the query
    - Tools are executed
- Subsequent responses are typically faster
- Don't interrupt the process during this initial waiting period

### Common error messages

If you see:

- `FileNotFoundError`: Check your server script path
- `ModuleNotFoundError: No module named 'mcp'`: Make sure you ran `uv add mcp anthropic` in your project
- `ValueError: Server script must be a .py or .js file`: The client only supports Python and Node.js servers
- `anthropic.AuthenticationError`: Check that your `ANTHROPIC_API_KEY` is valid
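The first and third errors can be surfaced before the client even tries to connect. A minimal sketch of an upfront check (a hypothetical `validate_server_script` helper, not part of the tutorial code) that you could call at the top of `connect_to_server`:

```python
from pathlib import Path


def validate_server_script(path_str: str) -> Path:
    """Fail fast with a clear error before attempting a connection.

    Hypothetical helper for illustration; the tutorial client performs
    only the extension check inline.
    """
    path = Path(path_str).resolve()
    if path.suffix not in (".py", ".js"):
        raise ValueError("Server script must be a .py or .js file")
    if not path.is_file():
        raise FileNotFoundError(f"Server script not found: {path}")
    return path
```

Checking the path eagerly gives a one-line error message instead of a stack trace from deep inside the stdio transport.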
## Next steps

- **[Example servers](https://modelcontextprotocol.io/examples)** — Browse official MCP servers and implementations
- **[Example clients](https://modelcontextprotocol.io/clients)** — View clients that support MCP integrations

examples/clients/quickstart-client/.env.example

Lines changed: 0 additions & 1 deletion
This file was deleted.

examples/clients/quickstart-client/README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
 # An LLM-Powered Chatbot MCP Client written in Python

-See the [Building MCP clients](https://modelcontextprotocol.io/tutorials/building-a-client) tutorial for more information.
+See the [client quickstart](../../../docs/client-quickstart.md) tutorial for more information.
```
