Omnia


The Kubernetes Platform for AI Agent Deployment

Omnia is a Kubernetes operator that makes deploying, scaling, and managing AI agents simple. Deploy intelligent assistants that can safely access private, proprietary information — all within your existing infrastructure.

Features

  • Kubernetes-Native: Deploy AI agents as custom resources with full GitOps support
  • Multiple LLM Providers: Support for Claude, OpenAI, and Gemini with easy provider switching
  • Autoscaling: Scale-to-zero with KEDA or standard HPA based on active connections
  • Tool Integration: HTTP, gRPC, and MCP tool adapters for extending agent capabilities
  • Session Management: Redis or in-memory session stores for conversation persistence
  • Observability: Integrated Prometheus metrics, Grafana dashboards, Loki logs, and Tempo traces
  • Production-Ready: Health checks, graceful shutdown, and comprehensive RBAC

Quick Start

Prerequisites

  • Kubernetes 1.28+
  • Helm 3.x
  • kubectl configured for your cluster

Install the Operator

helm install omnia oci://ghcr.io/altairalabs/charts/omnia \
  --namespace omnia-system \
  --create-namespace

Deploy Your First Agent

  1. Create a PromptPack with compiled prompts:
apiVersion: v1
kind: ConfigMap
metadata:
  name: my-prompts
data:
  # Compiled PromptPack JSON (use `packc` to compile from YAML source)
  pack.json: |
    {
      "$schema": "https://promptpack.org/schema/latest/promptpack.schema.json",
      "id": "my-assistant",
      "name": "My Assistant",
      "version": "1.0.0",
      "template_engine": {"version": "v1", "syntax": "{{variable}}"},
      "prompts": {
        "main": {
          "id": "main",
          "name": "Main Assistant",
          "version": "1.0.0",
          "system_template": "You are a helpful AI assistant. Be concise and accurate.",
          "parameters": {"temperature": 0.7, "max_tokens": 4096}
        }
      }
    }
---
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: PromptPack
metadata:
  name: my-pack
spec:
  version: "1.0.0"
  source:
    type: configmap
    configMapRef:
      name: my-prompts

Tip: Use packc to compile PromptPacks from YAML source files with validation.
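
Because the compiled pack is plain JSON, it can be sanity-checked programmatically before being mounted into the ConfigMap. A minimal Go sketch — the structs below mirror only the keys from the example above and are an illustrative subset, not an official SDK type:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// Pack mirrors just the top-level keys of the compiled pack.json shown
// above; it is an illustrative subset, not the full PromptPack schema.
type Pack struct {
	ID      string            `json:"id"`
	Name    string            `json:"name"`
	Version string            `json:"version"`
	Prompts map[string]Prompt `json:"prompts"`
}

type Prompt struct {
	ID             string `json:"id"`
	SystemTemplate string `json:"system_template"`
}

// parsePack decodes compiled pack JSON into the subset above.
func parsePack(raw []byte) (Pack, error) {
	var p Pack
	err := json.Unmarshal(raw, &p)
	return p, err
}

func main() {
	raw := []byte(`{
	  "id": "my-assistant",
	  "name": "My Assistant",
	  "version": "1.0.0",
	  "prompts": {"main": {"id": "main", "system_template": "You are a helpful AI assistant."}}
	}`)

	p, err := parsePack(raw)
	if err != nil {
		log.Fatalf("invalid pack.json: %v", err)
	}
	fmt.Printf("%s v%s: %d prompt(s)\n", p.ID, p.Version, len(p.Prompts))
	// prints: my-assistant v1.0.0: 1 prompt(s)
}
```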

  2. Create a Provider for LLM credentials:
apiVersion: v1
kind: Secret
metadata:
  name: llm-credentials
stringData:
  ANTHROPIC_API_KEY: "sk-ant-..."
---
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: Provider
metadata:
  name: claude-provider
spec:
  type: claude
  model: claude-sonnet-4-20250514
  secretRef:
    name: llm-credentials
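
Switching providers means pointing the same Provider shape at a different backend. A hedged sketch for OpenAI — the `type` value, model name, and secret key name below are assumptions extrapolated from the Claude example; check the Provider CRD reference for the exact values:

```yaml
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: Provider
metadata:
  name: openai-provider
spec:
  type: openai             # assumed provider type string
  model: gpt-4o            # any model your OpenAI account can access
  secretRef:
    name: llm-credentials  # Secret carrying OPENAI_API_KEY (key name assumed)
```
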

  3. Deploy an AgentRuntime:
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: AgentRuntime
metadata:
  name: my-agent
spec:
  promptPackRef:
    name: my-pack
  providerRef:
    name: claude-provider
  facade:
    type: websocket
    port: 8080

  4. Connect to your agent:
kubectl port-forward svc/my-agent 8080:8080
websocat ws://localhost:8080/ws
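
Any WebSocket client library can talk to the facade; websocat is simply the quickest check. For the curious, this is roughly what a client-to-server text frame looks like on that socket per RFC 6455 — a dependency-free sketch, with the payload format left open since the agent's message schema isn't shown here:

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// encodeTextFrame builds a masked client-to-server WebSocket text frame
// (RFC 6455) for payloads up to 64 KiB. Real clients must use a fresh
// random mask key per frame; a zero key is used in main() only so the
// payload bytes stay readable in the printed output.
func encodeTextFrame(payload []byte, maskKey [4]byte) []byte {
	frame := []byte{0x81} // FIN=1, opcode=0x1 (text)
	n := len(payload)
	if n < 126 {
		frame = append(frame, 0x80|byte(n)) // MASK bit + 7-bit length
	} else {
		frame = append(frame, 0x80|126) // MASK bit + 16-bit extended length
		frame = binary.BigEndian.AppendUint16(frame, uint16(n))
	}
	frame = append(frame, maskKey[:]...)
	for i, b := range payload {
		frame = append(frame, b^maskKey[i%4]) // XOR-mask the payload
	}
	return frame
}

func main() {
	f := encodeTextFrame([]byte("hello"), [4]byte{0, 0, 0, 0})
	fmt.Printf("% x\n", f) // prints: 81 85 00 00 00 00 68 65 6c 6c 6f
}
```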

Custom Resources

CRD           Description
------------  -----------------------------------------------------------------------
AgentRuntime  Deploys and manages an AI agent with its facade, sessions, and scaling
PromptPack    Defines agent prompts and system instructions
ToolRegistry  Configures tools available to agents (HTTP, gRPC, MCP)
Provider      Reusable LLM provider configuration with credentials
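
Of the four CRDs, ToolRegistry is the only one without an example in this README. A sketch of what an HTTP tool registration might look like — every field name under `spec` here is a hypothetical illustration following the style of the other manifests, not the confirmed schema; consult the CRD reference before use:

```yaml
apiVersion: omnia.altairalabs.ai/v1alpha1
kind: ToolRegistry
metadata:
  name: my-tools
spec:
  tools:                  # hypothetical field layout
    - name: weather
      type: http          # adapters: http, grpc, mcp (per the table above)
      http:
        url: https://api.example.com/weather
```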

Architecture

┌──────────────────────────────────────────────────────────────┐
│                     Omnia Operator                            │
│                                                               │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐           │
│  │ AgentRuntime│  │ PromptPack  │  │ToolRegistry │           │
│  │ Controller  │  │ Controller  │  │ Controller  │           │
│  └──────┬──────┘  └──────┬──────┘  └──────┬──────┘           │
└─────────┼────────────────┼────────────────┼──────────────────┘
          │                │                │
          ▼                ▼                ▼
┌──────────────────────────────────────────────────────────────┐
│                      Agent Pod                                │
│  ┌────────────┐  ┌────────────┐  ┌────────────┐              │
│  │  Facade    │  │  Runtime   │  │   Tools    │              │
│  │ (WebSocket)│◄─┤  (gRPC)    │──┤  Adapter   │              │
│  └────────────┘  └────────────┘  └────────────┘              │
└──────────────────────────────────────────────────────────────┘

Documentation

Full documentation is available at omnia.altairalabs.ai.

Ecosystem

Omnia is part of the AltairaLabs open-source ecosystem:

Project     Description
----------  ------------------------------------------------------------
PromptKit   Go SDK for building AI agents with tool use and streaming
PromptPack  Specification for portable, testable AI agent definitions
Omnia       Kubernetes platform for deploying PromptKit agents at scale

Contributing

Contributions are welcome! Please read our Contributing Guide for details.

License

Apache 2.0 - see LICENSE for details.


Built with care by AltairaLabs
