Turn any public documentation link or local Markdown file into a powerful knowledge base for your local AI tool. Fast, fun, and effortless—just point and connect.
- Web Scraping: Ingest content from public documentation websites.
- Local File Ingestion: Directly add content from local Markdown files.
- Vector Embeddings: Automatically generate and store vector embeddings for efficient semantic search.
- Context Management: Organize and search documents within specific contexts (e.g., by API, project, or topic).
- Model Context Protocol (MCP) Server: Expose your knowledge base as an MCP server for seamless integration with AI tools.
- SQLite Backend: Reliable and portable local data storage.
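Semantic search over vector embeddings typically boils down to ranking stored vectors by cosine similarity against the query's embedding. A minimal sketch of that ranking step in Go — the function names here are illustrative, not Pons's actual API:

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

type scored struct {
	URL   string
	Score float64
}

// rank orders documents by similarity to the query embedding,
// highest first.
func rank(query []float64, docs map[string][]float64) []scored {
	out := make([]scored, 0, len(docs))
	for url, emb := range docs {
		out = append(out, scored{url, cosine(query, emb)})
	}
	sort.Slice(out, func(i, j int) bool { return out[i].Score > out[j].Score })
	return out
}

func main() {
	// Toy 3-dimensional "embeddings"; real ones have hundreds of dimensions.
	docs := map[string][]float64{
		"a.md": {1, 0, 0},
		"b.md": {0.9, 0.1, 0},
		"c.md": {0, 1, 0},
	}
	for _, s := range rank([]float64{1, 0, 0}, docs) {
		fmt.Printf("%s %.2f\n", s.URL, s.Score)
	}
}
```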
Via Homebrew:

```bash
brew tap tesh254/pons
brew install pons
```

Via the install script:

```bash
curl -sL https://raw.githubusercontent.com/tesh254/pons/main/install.sh | bash
```

From source (requires Go installed):

```bash
git clone https://github.com/tesh254/pons.git
cd pons
go build -o pons .
```

Pons provides a command-line interface for managing your knowledge base.
Ingest content from a URL or a local file. This command scrapes web pages or reads local files, generates embeddings, and stores the content in your knowledge base.
```bash
# Add content from a URL (web scraping)
pons add https://www.example.com --context my-web-docs

# Add content from a local Markdown file
pons add /path/to/your/document.md --context my-local-notes

# Use --verbose for detailed output
pons add https://wchr.xyz --context wchr-context --verbose
```

Arguments:
- `[url_or_file_path]`: The URL of the website to scrape or the absolute path to the local Markdown file.
Flags:
- `--context` (`-c`): A string to categorize the ingested documents (e.g., `shopify-admin`, `my-project-docs`). Defaults to `default`.
- `--verbose` (`-v`): Enable verbose output for detailed progress and information.
Documents are stored with a `source_type` indicating their origin (`web_scrape` or `file_read`).
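The `source_type` follows directly from how the `add` argument parses: URLs are scraped, everything else is treated as a local file path. A minimal sketch of that classification in Go (illustrative only, not Pons's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// sourceType classifies an `add` argument the way the docs describe:
// HTTP(S) URLs are scraped ("web_scrape"); anything else is read from
// disk ("file_read"). Hypothetical helper, not part of Pons's API.
func sourceType(arg string) string {
	if strings.HasPrefix(arg, "http://") || strings.HasPrefix(arg, "https://") {
		return "web_scrape"
	}
	return "file_read"
}

func main() {
	fmt.Println(sourceType("https://www.example.com"))   // web_scrape
	fmt.Println(sourceType("/path/to/your/document.md")) // file_read
}
```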
Search your knowledge base for relevant documents using a natural language query.
```bash
# Search across all contexts
pons search "How do I update user profiles?"

# Search within a specific context
pons search "What is the main function?" --context my-project-docs

# Get more results
pons search "Pons features" --num-results 10
```

Arguments:
- `[query]`: The natural language query to search for.
Flags:
- `--context` (`-c`): (Optional) The context to search within. If omitted, searches across all contexts.
- `--num-results` (`-n`): The maximum number of search results to return. Defaults to `5`.
- `--verbose` (`-v`): Enable verbose output.
List all documents currently stored in your knowledge base.
```bash
pons list
```

This command displays the URL, source type, checksum, content length, and embeddings length for each document.
List all unique contexts currently stored in your knowledge base.
```bash
pons contexts
```

This command displays a list of all distinct context names that have been used when adding documents.
The Pons MCP server allows your local AI tools to connect and utilize its capabilities as a knowledge base.
To start the server, use the pons start command:
```bash
pons start
```

By default, the server listens on http://localhost:9014. You can specify a different address and port using the `--http-address` flag:

```bash
pons start --http-address "0.0.0.0:8081"
```

To connect your AI tool to the Pons MCP server, configure your tool to use the server's address. For example, if your AI tool supports connecting to an MCP server, you would typically provide http://localhost:9014 (or your custom address) as the server endpoint.
Refer to your AI tool's documentation for specific instructions on how to configure an MCP server connection.
To connect Gemini to your local Pons MCP server, start Pons with the desired HTTP address:
```bash
pons start --http-address localhost:9014
```

Then, create a folder named `.gemini` in your project's root directory and add a `settings.json` file inside it with the following content:

```json
{
  "mcpServers": {
    "pons": {
      "httpUrl": "http://localhost:9014"
    }
  }
}
```

To connect Cursor Editor to your local Pons MCP server, start Pons with the desired HTTP address:
```bash
pons start --http-address localhost:9999  # Or any other available port
```

Then, create a `.cursor` folder in your project's root directory and add an `mcp.json` file inside it with the following content:

```json
{
  "mcpServers": {
    "pons": {
      "type": "streamable-http",
      "url": "http://localhost:9999",
      "note": "For Streamable HTTP connections, add this URL directly in your MCP Client"
    }
  }
}
```

Pons exposes the following MCP tools for AI tool interaction:
🚨 MANDATORY FIRST STEP: This tool MUST be called before any other Pons tools; every subsequent call depends on the context returned from this tool.

This tool generates a context that is REQUIRED for all subsequent tool calls. After calling this tool, you MUST extract the context from the response and pass it to every other Pons tool call.
🔄 MULTIPLE CONTEXT SUPPORT: You MUST call this tool multiple times in the same conversation when you need to learn about different documentation contexts. THIS IS NOT OPTIONAL. Just pass the existing context to maintain conversation continuity while loading the new context.
For example, a user might ask a question about the admin context, then switch to the functions context, then ask a question about polaris UI components. In this case, you would call learn_api three times with the following arguments:
- `learn_api(api: "admin")` → context: `"admin"`
- `learn_api(api: "functions", context: "admin")` → context: `"functions"`
- `learn_api(api: "polaris", context: "functions")` → context: `"polaris"`
Passing the existing context each time maintains conversation continuity while the new context is loaded.
🚨 Valid arguments for api are:
- Any string representing a documentation context (e.g., `shopify-admin`, `my-project-docs`, `general-knowledge`). This string will be used as the context for subsequent tool calls.
🔄 WORKFLOW:
- Call `learn_api` first with the initial API (context)
- Extract the `context` from the response
- Pass that same `context` to ALL other Pons tools
- If you need to know more about a different context at any point in the conversation, call `learn_api` again with the new API (context) and the same `context`
DON'T SEARCH THE WEB WHEN REFERENCING INFORMATION FROM THIS KNOWLEDGE BASE. IT WILL NOT BE ACCURATE.
PREFER THE USE OF THE `search_doc_chunks` TOOL TO RETRIEVE INFORMATION FROM THE KNOWLEDGE BASE.
Searches the knowledge base for relevant documentation and code examples based on a query string. This tool uses vector embeddings for semantic search.
Adds or updates a document in the knowledge base, automatically generating embeddings. This tool is used internally by the `pons add` CLI command.
Deletes documents from the knowledge base by URL prefix.
Lists stored documents in the knowledge base with pagination, optionally filtered by context.
Retrieves a specific document from the knowledge base by URL.
Pons uses SQLite (github.com/mattn/go-sqlite3) for local data storage. While efforts were made to integrate libsql for its native vector capabilities, challenges with its Go driver's compatibility led to reverting to the stable SQLite implementation. Future enhancements may explore more robust vector database integrations.
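Without native vector support in SQLite, a common pattern is to serialize each embedding into a BLOB column and do the similarity math in application code. A sketch of that round-trip — an assumed approach, not necessarily how Pons encodes its vectors:

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

// encodeVec packs a float32 embedding into little-endian bytes,
// suitable for storage in a SQLite BLOB column.
func encodeVec(v []float32) []byte {
	buf := new(bytes.Buffer)
	binary.Write(buf, binary.LittleEndian, v) // error ignored: Buffer writes cannot fail
	return buf.Bytes()
}

// decodeVec reverses encodeVec (4 bytes per float32 component).
func decodeVec(b []byte) []float32 {
	v := make([]float32, len(b)/4)
	binary.Read(bytes.NewReader(b), binary.LittleEndian, v)
	return v
}

func main() {
	orig := []float32{0.1, -0.5, 2.25}
	back := decodeVec(encodeVec(orig))
	fmt.Println(back) // float32 values round-trip exactly
}
```

The encoded bytes would be bound to an `INSERT` parameter and scanned back with the driver's standard `[]byte` handling; brute-force cosine comparison over decoded vectors is slower than a true vector index, but simple and portable, which matches the trade-off described above.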