Website · Docs · Discord · GitHub · Quick Start
agentregistry is an open-source platform that gives you one place to find, manage, and run MCP servers, AI agents, and skills.
Right now, the MCP servers and AI tools your team needs are spread across npm, PyPI, Docker Hub, GitHub repos, and random URLs. Nobody knows which ones are trustworthy, which versions work, or how to get them running. Every developer is doing their own manual Docker setup and IDE configuration.
agentregistry puts all of that into a single registry with a CLI and a web UI. You import or publish artifacts once, and then anyone on your team can discover them, deploy them with one command, and have their IDE automatically configured to use them.
- One trusted source for AI building blocks — a curated catalog instead of scattered repos, scripts, and one-off MCP setup
- Faster developer onboarding — discover approved artifacts quickly with less manual configuration
- Consistent path from laptop to cluster — same discovery and delivery workflow across local dev and Kubernetes
- Governance without slowing teams down — centralize curation and publishing without forcing each team to rebuild the process
- Curate & Deploy: package, collect, and enrich AI artifacts from any source in a single centralized registry.
- Build & Publish: build, test, publish, and deploy AI artifacts with minimal dependencies.
Prerequisites: Docker Desktop with Docker Compose v2+
# 1. Install the CLI
curl -fsSL https://raw.githubusercontent.com/agentregistry-dev/agentregistry/main/scripts/get-arctl | bash
# 2. Start the agentregistry daemon by running any arctl command, such as arctl version.
arctl version
# 3. Open the agentregistry UI in your browser: http://localhost:12121

The UI is automatically exposed on port 12121 on your local machine when the agentregistry daemon starts. That's it. Once a server is deployed, your IDE has access to it through agentgateway.
Create, scaffold, and publish the building blocks of your agentic infrastructure.
- MCP servers — Register servers from npm (`npx`), PyPI (`uvx`), OCI/Docker images, or remote HTTP/SSE endpoints. Each entry supports versioning, environment variables, package references, and automated quality scores.
- Skills — Build structured knowledge packages that extend what an agent knows. A skill is a `SKILL.md` bundled with code examples, docs, PDFs, and reference URLs. Scaffold with `arctl skill init`, publish with `arctl skill publish` to Docker Hub, any OCI registry, or a GitHub repository.
- Agents — Define agents that bundle an identity with dependencies: which MCP servers it needs, which skills it uses, and how it should be configured. Scaffold with `arctl agent init`, then package everything into a versioned blueprint for one-step deployment.
- Prompts — Create reusable instruction templates that define how an agent should behave in specific contexts. Version and store them alongside agents, skills, and servers so they're discoverable and shareable across your team.
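As a sketch of the skill workflow above — the skill name used here is a placeholder, and the exact flags `arctl skill publish` accepts should be confirmed against the CLI help:

```shell
# Scaffold a new skill; "postgres-tips" is a placeholder name.
arctl skill init postgres-tips

# Edit the generated SKILL.md and bundle in code examples, docs,
# PDFs, and reference URLs.

# Publish the bundle to Docker Hub, any OCI registry, or a GitHub
# repository. Run with --help first to see the supported targets.
arctl skill publish
```

The same init/publish shape applies to agents via `arctl agent init`, which packages the agent and its dependencies into a versioned blueprint.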
A browser-based admin interface at `localhost:12121`. Browse the artifact catalog, add MCP servers, skills, and agents, review enrichment scores and metadata, manage deployments, and configure the registry — all without touching the CLI.
Curate a shared catalog of MCP servers, agents, skills, and prompts your teams can trust and reuse.
- Publish artifacts to a central registry from npm, PyPI, Docker, OCI, or remote endpoints
- Discover approved artifacts through the CLI, REST API, or web UI at `localhost:12121`
- Give teams a consistent source of truth across environments
- Search by description ("query Postgres", "send Slack messages") instead of exact names — powered by pgvector
Turn a broad set of available AI artifacts into a collection your organization is willing to support.
- Organize what developers can discover and deploy
- Review enrichment scores, versioning, and environment variable requirements
- Standardize how artifacts are shared across teams
- Keep control of what gets published and promoted
Move from discovery to usage without reinventing the same delivery path for every team.
- Run workflows locally with `arctl`
- Deploy Agent Registry into Kubernetes with Helm
- Support local environments and shared platform environments from the same registry
- Build and push agents — blueprints bundle an agent with its MCP servers and skills into a single deployable unit
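A minimal sketch of the cluster path, assuming a published Helm chart; the repo URL and chart name below are placeholders, so substitute the values from the project's Helm documentation:

```shell
# Placeholder repo URL and chart name -- check the Agent Registry
# Helm docs for the real values.
helm repo add agentregistry https://charts.example.com/agentregistry
helm repo update

# Install the registry into its own namespace.
helm install agentregistry agentregistry/agentregistry \
  --namespace agentregistry \
  --create-namespace
```

Local development keeps using `arctl` against the same registry, so laptop and cluster share one discovery and delivery workflow.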
Make approved artifacts easier to consume from the tools developers already use.
- Generate configuration for Claude Desktop, Cursor, and VS Code
- Pair with agentgateway for a consistent access layer to deployed MCP infrastructure
- Reduce manual setup for AI clients and shared environments
- Platform teams curate and publish approved MCP servers, agents, and skills in Agent Registry
- Developers discover those artifacts through the web UI or `arctl`
- Teams pull and deploy what they need in local environments or Kubernetes
- AI clients and shared gateway infrastructure connect to approved artifacts through a consistent workflow
agentregistry pairs with agentgateway to give you a single, secure entry point to all your deployed MCP servers and agents.
Instead of exposing every MCP server individually, agentgateway acts as an AI-native reverse proxy that sits in front of your entire agentic infrastructure:
- Single endpoint — AI clients (Claude Desktop, Cursor, VS Code) connect to one URL. The gateway routes each tool call to the correct backend MCP server.
- Authentication & authorization — Enforce identity and access policies before requests reach your MCP servers. Control who can call which tools. Supports JWT validation and on-behalf-of auth flows.
- Centralized observability — Log and monitor all agent-to-tool traffic in one place instead of instrumenting each server separately. Supports OTEL endpoints for traces, metrics, and logs.
- Dynamic discovery — Deploy a new MCP server through agentregistry and every connected client picks it up automatically — no reconfiguration needed.
- LLM gateway — agentgateway also acts as a unified gateway for LLM providers, giving you a single endpoint to route, manage, and secure access to multiple language models.
- Transport flexibility — Proxy across stdio, SSE, and streamable HTTP transports seamlessly.
When you run `arctl deploy`, agentregistry automatically configures the gateway routing so your MCP servers are reachable through the secured proxy. Run `arctl configure cursor` to point your IDE at the gateway endpoint.
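Put together, a minimal end-to-end flow might look like this; `arctl deploy` and `arctl configure cursor` are the commands named above, while the server name is a placeholder for an artifact already in your registry:

```shell
# Deploy a registered MCP server; "my-mcp-server" is a placeholder
# name for an artifact in your catalog. agentregistry wires up the
# gateway routing as part of the deploy.
arctl deploy my-mcp-server

# Point Cursor at the agentgateway endpoint so tool calls reach the
# server through the secured proxy.
arctl configure cursor
```

Because clients connect to the gateway rather than to individual servers, the next `arctl deploy` is picked up by connected clients without any reconfiguration.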
| Project | Role |
|---|---|
| agentgateway | AI-native reverse proxy for MCP traffic |
| kagent | Kubernetes-native AI agent platform |
| kgateway | Cloud-native API gateway (Envoy + Gateway API) |
| MCP Go SDK | Go SDK for building MCP servers |
| Model Context Protocol | The open standard for AI-to-tool communication |
If you're interested in participating in the agentregistry community, come talk to us!
- We are available on Discord
- To report security issues, please follow our vulnerability disclosure best practices
- Find more information on the agentregistry website
We do not yet hold community meetings; establishing them is on our roadmap. Please help us get there by commenting on the issue or volunteering to organize the meetings.
See CONTRIBUTING.md for guidelines and DEVELOPMENT.md for architecture and local development setup.
Report a bug · Suggest a feature · Join Discord
Apache 2.0 — see LICENSE.


