Tools wrap external capabilities as callable functions. Three adapter types are supported: CLI, MCP, and HTTP.

When to add tools: add them when an external capability lacks a good Python library. Don't wrap what `requests`, `subprocess`, or the stdlib already handle well.
Define command-line tools with YAML schema + recipes.
py-code-mode supports a host-side middleware chain around tool execution. This is intended for:
- Audit logging and metrics
- Allow/deny decisions and interactive approvals
- Argument rewriting, retries, caching, etc.

Notes:
- Middleware runs where tools execute (host-side ToolAdapters).
- Enforcement guarantees are strongest with `DenoSandboxExecutor`, because sandboxed Python can only access tools via RPC back to the host.

API surface:
- `ToolMiddleware`: `async def __call__(ctx: ToolCallContext, call_next) -> Any`
- `ToolCallContext`: includes `tool_name`, `callable_name`, `args`, and metadata like `executor_type`, `origin`, `request_id`

To enable for `DenoSandboxExecutor`, pass `tool_middlewares` in `DenoSandboxConfig`.
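A minimal audit-logging middleware might look like the sketch below. The `ToolCallContext` dataclass here is a stand-in built from the fields listed above, not the library's actual class, and the inner handler is a placeholder for real tool execution:

```python
import asyncio
from dataclasses import dataclass, field
from typing import Any

# Stand-in for the library's ToolCallContext (fields assumed from the list above).
@dataclass
class ToolCallContext:
    tool_name: str
    callable_name: str
    args: dict = field(default_factory=dict)

async def audit_middleware(ctx: ToolCallContext, call_next) -> Any:
    # Log the call before execution, then delegate down the chain.
    print(f"[audit] {ctx.tool_name}.{ctx.callable_name} args={ctx.args}")
    return await call_next(ctx)

# A trivial inner handler standing in for real tool execution.
async def execute_tool(ctx: ToolCallContext) -> dict:
    return {"status": "ok"}

ctx = ToolCallContext("curl", "get", {"url": "https://example.com"})
result = asyncio.run(audit_middleware(ctx, execute_tool))
```

Because middleware receives `call_next`, the same shape supports deny decisions (raise instead of delegating), argument rewriting (mutate `ctx.args` first), or caching (return early without calling down the chain).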
```yaml
# tools/curl.yaml
name: curl
description: Make HTTP requests
command: curl
timeout: 60
tags: [http, network]
schema:
  options:
    silent:
      type: boolean
      short: s
      description: Silent mode
    location:
      type: boolean
      short: L
      description: Follow redirects
    header:
      type: array
      short: H
      description: HTTP headers
    data:
      type: string
      short: d
      description: POST data
  positional:
    - name: url
      type: string
      required: true
      description: URL to request
```

Schema fields:
- `options` - Named flags/options with types: `boolean`, `string`, `array`, `integer`
- `positional` - Positional arguments in order
- `short` - Single-character flag alias (e.g., `-s` for `silent`)
- `tags` - Keywords for discovery (optional)
Recipes are pre-configured tool invocations for common patterns:

```yaml
recipes:
  get:
    description: Simple GET request
    preset:
      silent: true
      location: true
    params:
      url: {}
  post:
    description: POST request with data
    preset:
      silent: true
      location: true
    params:
      url: {}
      data: {}
  json:
    description: GET with JSON Accept header
    preset:
      silent: true
      location: true
      header: ["Accept: application/json"]
    params:
      url: {}
```

Recipe fields:
- `preset` - Default values baked into the recipe
- `params` - Parameters exposed to the agent (can have a `default`)
- `description` - What this recipe does
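To make the preset/params split concrete, here is an illustrative sketch (an assumption about behavior, not the library's actual code) of how a recipe call like `tools.curl.get(url=...)` could expand into a `curl` command line using the schema above:

```python
# Short flag aliases taken from the curl schema above.
SHORT_FLAGS = {"silent": "-s", "location": "-L", "header": "-H", "data": "-d"}

def build_argv(preset: dict, call_args: dict) -> list:
    # Call-time arguments override recipe presets.
    merged = {**preset, **call_args}
    url = merged.pop("url")  # positional argument goes last
    argv = ["curl"]
    for name, value in merged.items():
        flag = SHORT_FLAGS[name]
        if value is True:              # boolean option: bare flag
            argv.append(flag)
        elif isinstance(value, list):  # array option: repeat the flag
            for item in value:
                argv += [flag, item]
        else:                          # string option: flag plus value
            argv += [flag, str(value)]
    argv.append(url)
    return argv

# tools.curl.get(url=...) with preset {silent: true, location: true}:
print(build_argv({"silent": True, "location": True}, {"url": "https://example.com"}))
# ['curl', '-s', '-L', 'https://example.com']
```

The key point is that the agent supplies only the `params` (here, `url`); the `preset` flags are baked in by the recipe.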
Tool calls inside `Session.run()` are synchronous in the default executors (Subprocess/Container/InProcess).

Notes:
- If you need async tool calls in Python code, use `call_async(...)` explicitly.
- In `DenoSandboxExecutor`, tool calls are async-first and you must use `await tools.*`.
- In `DenoSandboxExecutor`, tool calls execute outside the sandbox: `await tools.*` is an RPC back to the host Python process, and the tool runs with host permissions (or container permissions if the tool adapter/executor is containerized).
```python
# Recipe invocation (recommended)
tools.curl.get(url="https://api.github.com/repos/owner/repo")
tools.curl.post(url="https://api.example.com/data", data='{"key": "value"}')

# Escape hatch - raw tool invocation (full control)
tools.curl(
    url="https://example.com",
    silent=True,
    location=True,
    header=["Accept: application/json", "User-Agent: MyAgent"]
)

# Discovery
tools.list()          # All tools
tools.search("http")  # Search by name/description/tags
tools.curl.list()     # Recipes for a specific tool

# Explicit async (if you need it)
await tools.curl.call_async(url="https://example.com")
await tools.curl.get.call_async(url="https://example.com")
```

Connect to Model Context Protocol servers:
```yaml
# tools/fetch.yaml
name: fetch
type: mcp
transport: stdio
command: uvx
args: ["mcp-server-fetch"]
description: Fetch web pages with full content extraction
```

Configuration:
- `type: mcp` - Identifies this as an MCP adapter
- `transport` - One of: `stdio`, `sse`, `streamable_http`
- `command` - Command to launch the MCP server
- `args` - Arguments passed to the command
```yaml
name: weather
type: mcp
transport: sse
url: http://localhost:8080/sse
```

```yaml
name: mythic
type: mcp
transport: streamable_http
url: http://localhost:3333/mcp
timeout: 10  # optional, 30 seconds if not specified
```

MCP tools are namespaced by their YAML `name` field:
```python
# Use tools as defined by the MCP server
content = tools.web.fetch(url="https://example.com")
```

Namespace naming: Choose names that describe the capability domain, not the tool name. For example, use `web` instead of `fetch` (avoids `tools.fetch.fetch()`), or `datetime` instead of `time`.
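Applying that guidance to the stdio fetch server from earlier, the same configuration can be registered under a domain-style namespace by changing only the `name` field:

```yaml
# tools/web.yaml - same server, namespaced by capability domain
name: web
type: mcp
transport: stdio
command: uvx
args: ["mcp-server-fetch"]
description: Fetch web pages with full content extraction
```

Agent code then reads `tools.web.fetch(...)` rather than `tools.fetch.fetch(...)`.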
Wrap REST APIs (defined in Python):

```python
from py_code_mode.tools.adapters import HTTPAdapter, Endpoint
from py_code_mode.tools import ToolRegistry

# Create adapter
adapter = HTTPAdapter(base_url="https://api.github.com")

# Add endpoints
adapter.add_endpoint(Endpoint(
    name="get_repo",
    method="GET",
    path="/repos/{owner}/{repo}",
    description="Get repository metadata"
))
adapter.add_endpoint(Endpoint(
    name="list_issues",
    method="GET",
    path="/repos/{owner}/{repo}/issues",
    description="List repository issues"
))

# Create registry and add adapter
registry = ToolRegistry()
registry.add_adapter(adapter)
# Registry is passed to executor (typically via custom integration)
```

```python
# Path parameters are function arguments
repo = tools.github.get_repo(owner="anthropics", repo="anthropic-sdk-python")

# Query parameters passed as dict
issues = tools.github.list_issues(
    owner="anthropics",
    repo="anthropic-sdk-python",
    query_params={"state": "open", "labels": "bug"}
)
```

Agents can discover and search tools:
```python
# List all available tools
all_tools = tools.list()
# Returns: [Tool(name="curl", description="...", callables=[...]), ...]

# Search by keyword
http_tools = tools.search("http")
# Searches tool names, descriptions, and tags

# List recipes for a tool
curl_recipes = tools.curl.list()
# Returns: [{"name": "get", "description": "...", "params": {...}}, ...]
```

Tools are loaded from the `tools_path` configured on the executor:
```python
from pathlib import Path
from py_code_mode import Session, FileStorage
from py_code_mode.execution import InProcessConfig, InProcessExecutor

# Storage for workflows and artifacts
storage = FileStorage(base_path=Path("./storage"))

# Executor loads tools from tools_path
config = InProcessConfig(tools_path=Path("./tools"))
executor = InProcessExecutor(config=config)

async with Session(storage=storage, executor=executor) as session:
    # Tools are available from config.tools_path
    result = await session.run('tools.curl.get(url="...")')
```

Place YAML tool definitions in your tools directory:
```
./tools/
  curl.yaml
  jq.yaml
  nmap.yaml
```

Each YAML file defines one tool with its schema and recipes.
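The one-file-per-tool convention can be sketched as follows (an assumption about the loader's behavior, not the library's actual code): the tool name comes from the YAML `name` field, with one definition per `*.yaml` file in the directory.

```python
import pathlib
import tempfile

# Illustrative sketch (assumption, not the library's loader): enumerate one
# tool definition per *.yaml file, using the file stem as the tool name.
def discover_tools(tools_path: pathlib.Path) -> list:
    return sorted(p.stem for p in tools_path.glob("*.yaml"))

# Demo against a throwaway directory mirroring the layout above.
tools_dir = pathlib.Path(tempfile.mkdtemp())
for name in ("curl", "jq", "nmap"):
    (tools_dir / f"{name}.yaml").write_text(f"name: {name}\n")

print(discover_tools(tools_dir))  # ['curl', 'jq', 'nmap']
```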
Do add tools for:
- External commands with no good Python library (nmap, jq)
- MCP servers providing specialized capabilities
- Internal REST APIs your agents need to call
Don't add tools for:
- Operations well-handled by Python stdlib (file I/O, JSON parsing)
- Operations well-handled by popular libraries (HTTP via requests)
- One-off operations that won't be reused
Recipe design:
- Create recipes for common workflows, not every possible flag combination
- Use descriptive names: `get`, `post`, `json_post`, not `recipe1`, `recipe2`
- Preset sensible defaults (silent mode, follow redirects)
- Expose only the parameters that vary between invocations

See `examples/` for complete tool configurations in working agent applications.