Commit f48745d

Add mcp details and cookbook (#252)
Adds some missing details to the MCP docs. Adds a simple example of MCP use with any-agent/any-llm. Fixes links to github.io docs. Closes #35
1 parent 41b52d0 commit f48745d

5 files changed

Lines changed: 118 additions & 10 deletions


README.md

Lines changed: 7 additions & 7 deletions
````diff
@@ -78,7 +78,7 @@ Download the encoderfile CLI tool to build your own model binaries:
 curl -fsSL https://raw.githubusercontent.com/mozilla-ai/encoderfile/main/install.sh | sh
 ```
 
-> **Note for Windows users:** Pre-built binaries are not available for Windows. Please see our guide on [building from source](https://mozilla-ai.github.io/encoderfile/latest/reference/building/) for instructions on building from source.
+> **Note for Windows users:** Pre-built binaries are not available for Windows. Please see our guide on [building from source](https://mozilla-ai.github.io/encoderfile/reference/building/) for instructions on building from source.
 
 Move the binary to a location in your PATH:
 ```bash
````
````diff
@@ -92,7 +92,7 @@ mv encoderfile ~/.local/bin/
 
 ### Option 2: Build CLI Tool from Source
 
-See our guide on [building from source](https://mozilla-ai.github.io/encoderfile/latest/reference/building/) for detailed instructions on building the CLI tool from source.
+See our guide on [building from source](https://mozilla-ai.github.io/encoderfile/reference/building/) for detailed instructions on building the CLI tool from source.
 
 Quick build:
 ```bash
````
````diff
@@ -310,16 +310,16 @@ Run as a Model Context Protocol server:
 
 ## 📚 Documentation
 
-- **[Getting Started Guide](https://mozilla-ai.github.io/encoderfile/latest/getting-started/)** - Step-by-step tutorial
-- **[Building Guide](https://mozilla-ai.github.io/encoderfile/latest/reference/building/)** - Build encoderfiles from ONNX models
-- **[CLI Reference](https://mozilla-ai.github.io/encoderfile/latest/reference/cli/)** - Complete command-line documentation
-- **[API Reference](https://mozilla-ai.github.io/encoderfile/latest/reference/api-reference/)** - REST, gRPC, and MCP API docs
+- **[Getting Started Guide](https://mozilla-ai.github.io/encoderfile/getting-started/)** - Step-by-step tutorial
+- **[Building Guide](https://mozilla-ai.github.io/encoderfile/reference/building/)** - Build encoderfiles from ONNX models
+- **[CLI Reference](https://mozilla-ai.github.io/encoderfile/reference/cli/)** - Complete command-line documentation
+- **[API Reference](https://mozilla-ai.github.io/encoderfile/reference/api-reference/)** - REST, gRPC, and MCP API docs
 
 ## 🛠️ Building Custom Encoderfiles
 
 Once you have the `encoderfile` CLI tool installed, you can build binaries from any compatible HuggingFace model.
 
-See our guide on [building from source](https://mozilla-ai.github.io/encoderfile/latest/reference/building/) for detailed instructions including:
+See our guide on [building from source](https://mozilla-ai.github.io/encoderfile/reference/building/) for detailed instructions including:
 
 - How to export models to ONNX format
 - Configuration file options
````

docs/cookbooks/mcp-integration.md

Lines changed: 42 additions & 0 deletions
New file:

# Using `encoderfile` from agents

The [Model Context Protocol](https://www.anthropic.com/news/model-context-protocol) (MCP) introduced by Anthropic has proven to be a popular way to give an AI agent access to a variety of tools. [This Hugging Face blog post](https://huggingface.co/blog/Kseniase/mcp) has a nice explanation of MCP.

In the following example we will use Mozilla's own [`any-agent`](https://github.com/mozilla-ai/any-agent) and [`any-llm`](https://github.com/mozilla-ai/any-llm) packages to build a small agent that leverages the capabilities provided by a test encoderfile.

## Build the custom encoderfile and start the server

We will use the existing test config to build an encoderfile from one of Mozilla.ai's test models. The model detects Personally Identifiable Information (PII) and tags it accordingly, using tags like `B-SURNAME` for surnames and `O` for non-PII tokens. As we will see, even though the output consists of logits and tags, the underlying LLM is usually robust enough to focus only on the tags and act appropriately.
```sh
curl -fsSL https://raw.githubusercontent.com/mozilla-ai/encoderfile/main/install.sh | sh
encoderfile build -f test_config.yml
```

After building it, we only need to start it in MCP mode so it listens for requests. By default it binds to all interfaces on port 9100.

```sh
my-model-2.encoderfile mcp
```
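With the server listening, you can sanity-check the endpoint before involving an agent. The sketch below (an illustration, not part of the original cookbook) builds the JSON-RPC 2.0 `tools/list` message that an MCP client sends over Streamable HTTP; actually POSTing it is left as a comment, since the exact response shape depends on the running server:

```python
import json

# Default bind address and port of `my-model-2.encoderfile mcp` (see above).
MCP_URL = "http://localhost:9100/mcp"

# MCP is JSON-RPC 2.0 under the hood; listing the server's tools is a
# single "tools/list" call.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
payload = json.dumps(request)
print(payload)

# To send it for real, POST `payload` to MCP_URL with headers
# Content-Type: application/json and
# Accept: application/json, text/event-stream (a Streamable HTTP requirement).
```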
## Install Dependencies

For this test, we will need the `any-agent` and `any-llm` Python packages:

```sh
pip install any-agent
pip install "any-llm-sdk[mistral]"
```
## Write the agent

Now we will write an agent with the appropriate prompt. We instruct the agent to use the provided tool, since the tool's current description is fairly generic, and not to rely on metadata that it might consider useful but that is not documented anywhere in the tool itself. We also instruct it to replace only surnames, to showcase that the tags can be extracted appropriately:

```python
--8<-- "examples/agent_mcp_integration/agent_test.py"
```

After some struggle with the call conventions, the LLM finally obtains the information from the encoderfile and acts accordingly:

> `My name is Javier [REDACTED]`
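For illustration, the post-processing the LLM effectively performs can be sketched in plain Python. The tag names follow the `B-SURNAME`/`O` scheme described above; the token/tag pairs are hypothetical, stand-ins for what a token-classification encoderfile might return:

```python
# Hypothetical token/tag pairs for "My name is Javier Torres"
# (BIO scheme; only surnames are tagged here for simplicity).
tagged = [
    ("My", "O"),
    ("name", "O"),
    ("is", "O"),
    ("Javier", "O"),
    ("Torres", "B-SURNAME"),
]

def redact_surnames(tokens: list[tuple[str, str]]) -> str:
    """Replace each B-SURNAME/I-SURNAME span with a single [REDACTED]."""
    out: list[str] = []
    for token, tag in tokens:
        if tag == "B-SURNAME":
            out.append("[REDACTED]")
        elif tag == "I-SURNAME":
            continue  # continuation of an already-redacted surname
        else:
            out.append(token)
    return " ".join(out)

print(redact_surnames(tagged))  # → My name is Javier [REDACTED]
```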

docs/index.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -28,7 +28,7 @@ While **Llamafile** focuses on generative models, **Encoderfile** is purpose-bui
 * **Protocol Agnostic:** Runs as a REST API, gRPC microservice, CLI tool, or MCP Server out of the box.
 * **Compliance-Friendly:** Deterministic and offline-safe, making it ideal for strict security boundaries.
 
-> **Note for Windows users:** Pre-built binaries are not available for Windows. Please see our guide on [building from source](https://mozilla-ai.github.io/encoderfile/latest/reference/building/) for instructions on building from source.
+> **Note for Windows users:** Pre-built binaries are not available for Windows. Please see our guide on [building from source](https://mozilla-ai.github.io/encoderfile/reference/building/) for instructions on building from source.
 
 ## Use Cases
@@ -48,7 +48,7 @@ Encoderfile supports encoder-only transformers for:
 - **Token Classification** - Named Entity Recognition, PII detection
 - **Sentence Embeddings** - Semantic search, clustering
 
-See our guide on [building from source](https://mozilla-ai.github.io/encoderfile/latest/reference/building/) for detailed instructions on building the CLI tool from source.
+See our guide on [building from source](https://mozilla-ai.github.io/encoderfile/reference/building/) for detailed instructions on building the CLI tool from source.
 
 Generation models (GPT, T5) are not supported. See [CLI Reference](reference/cli.md) for complete model type details.
````
docs/reference/api-reference.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -652,7 +652,7 @@ Encoderfile supports Model Context Protocol, allowing integration with MCP-compa
 ### Connection Details
 
 - **Endpoint:** `/mcp`
-- **Transport:** HTTP-based MCP protocol
+- **Transport:** HTTP-based MCP protocol (Streamable HTTP only)
 - **Port:** Same as HTTP server (default: 8080)
 
 ### MCP Tools
````
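To illustrate the Streamable HTTP transport this change documents, here is a minimal sketch of the MCP `initialize` handshake message. This is an illustration, not part of the change: the client name, version, and protocol revision string are assumptions, and the port follows the "same as HTTP server" default noted above.

```python
import json

MCP_ENDPOINT = "http://localhost:8080/mcp"  # default: same port as the HTTP server

# Streamable HTTP clients POST JSON-RPC messages and must accept both
# plain JSON and SSE responses.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

# First message of an MCP session (field values are illustrative).
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

print(json.dumps(initialize))
```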
examples/agent_mcp_integration/agent_test.py

Lines changed: 66 additions & 0 deletions

New file:

```python
import os
from getpass import getpass

from any_agent import AgentConfig, AnyAgent
from any_agent.config import MCPStreamableHttp

if "MISTRAL_API_KEY" not in os.environ:
    print("MISTRAL_API_KEY not found in environment!")
    api_key = getpass("Please enter your MISTRAL_API_KEY: ")
    os.environ["MISTRAL_API_KEY"] = api_key
    print("MISTRAL_API_KEY set for this session!")
else:
    print("MISTRAL_API_KEY found in environment.")


async def main():
    print("Start creating agent")
    # The encoderfile MCP server listens on port 9100 by default.
    eftool = MCPStreamableHttp(url="http://localhost:9100/mcp")
    try:
        agent = await AnyAgent.create_async(
            "tinyagent",  # See all options in https://mozilla-ai.github.io/any-agent/
            AgentConfig(model_id="mistral:mistral-large-latest", tools=[eftool]),
        )
    except Exception as e:
        print(f"❌ Failed to create agent: {e}")
        raise
    print("Done creating agent")

    prompt = """
    Use the eftool tool to remove the personal information from this line: "My name is Javier Torres".
    Do not use any metadata. The "inputs" param must be a sequence with one string.
    Replace each surname, but not given names, with [REDACTED].
    """

    agent_trace = await agent.run_async(prompt)
    print(agent_trace.final_output)
    await agent.cleanup_async()


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
```
