
SOAT — Infrastructure for AI Apps



SOAT is open-source infrastructure for building AI applications. It provides the essential backend services — identity and access management, document and file storage with vector search, conversational memory, agent orchestration, and a full MCP server — so you can focus on your product instead of reinventing the plumbing.

Why SOAT?

Building AI applications requires a surprising amount of backend infrastructure: user management, API keys, persistent storage, semantic search, conversation history, agent execution, and tool integration. SOAT provides all of this out of the box.

  • Complete IAM: Users, projects, API keys, and fine-grained permissions.
  • Documents & Files: Ingestion, vector embeddings, and semantic search powered by pgvector.
  • Conversations & Actors: Multi-party dialogue management with persistent message history.
  • Agents & AI Providers: Configure LLM providers, define agents with tools, and run AI-powered chat completions.
  • MCP Native: First-class support for the Model Context Protocol, enabling seamless integration with agent runtimes.
  • REST API: Standard HTTP endpoints, so any language or HTTP client can integrate.

Documentation

Read the Full Documentation — Architecture, API reference, and deployment guides.

Getting Started

The quickest way to get started is using Docker Compose.

  1. Clone the repository:

     ```shell
     git clone https://github.com/ttoss/soat.git
     cd soat
     ```

  2. Follow the Getting Started Guide to spin up the server and database using Docker Compose.
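
To give a rough idea of what the Compose setup involves, here is a minimal sketch of the two services the guide stands up (a PostgreSQL database with pgvector, plus the SOAT server). All service names, image tags, ports, and environment variable names below are illustrative assumptions; the repository's own `docker-compose.yml` and the Getting Started Guide are authoritative.

```yaml
# Illustrative sketch only; see the repository's docker-compose.yml for the real setup.
services:
  db:
    image: pgvector/pgvector:pg16        # PostgreSQL with the pgvector extension preinstalled
    environment:
      POSTGRES_USER: soat                # example credentials, not project defaults
      POSTGRES_PASSWORD: soat
      POSTGRES_DB: soat
    volumes:
      - pgdata:/var/lib/postgresql/data  # persist vector data across restarts

  server:
    build: .                             # build the SOAT server from the cloned repository
    depends_on:
      - db
    environment:
      # Hypothetical variable name; check the project's configuration docs.
      DATABASE_URL: postgres://soat:soat@db:5432/soat
    ports:
      - "3000:3000"                      # port is an assumption

volumes:
  pgdata:
```

With a file along these lines, `docker compose up` starts both containers and the server connects to the database over the Compose network by its service name (`db`).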

Contributing

We welcome contributions! Please feel free to submit issues and pull requests.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • ttoss - For the HTTP server and MCP packages
  • pgvector - PostgreSQL vector similarity search
  • Ollama - Local LLM and embedding models