
Rootly MCP Server


An MCP server for the Rootly API that integrates seamlessly with MCP-compatible editors like Cursor, Windsurf, and Claude. Resolve production incidents in under a minute without leaving your IDE.

Demo GIF

Prerequisites

  • Python 3.12 or higher
  • uv package manager
    curl -LsSf https://astral.sh/uv/install.sh | sh
  • Rootly API token with appropriate permissions (see below)

API Token Permissions

The MCP server requires a Rootly API token. Choose the appropriate token type based on your needs:

  • Global API Key (Recommended): Full access to all entities across your Rootly instance. Required for organization-wide visibility across teams, schedules, and incidents.
  • Team API Key: Team Admin permissions with full read/edit access to entities owned by that team. Suitable for team-specific workflows.
  • Personal API Key: Inherits the permissions of the user who created it. Works for individual use cases but may have limited visibility.

For full functionality of tools like get_oncall_handoff_summary, get_oncall_shift_metrics, and organization-wide incident search, a Global API Key is recommended.

Installation

Configure your MCP-compatible editor (tested with Cursor) with one of the configurations below. The package will be automatically downloaded and installed when you first open your editor.

With uv

{
  "mcpServers": {
    "rootly": {
      "command": "uv",
      "args": [
        "tool",
        "run",
        "--from",
        "rootly-mcp-server",
        "rootly-mcp-server"
      ],      
      "env": {
        "ROOTLY_API_TOKEN": "<YOUR_ROOTLY_API_TOKEN>"
      }
    }
  }
}

With uvx

{
  "mcpServers": {
    "rootly": {
      "command": "uvx",
      "args": [
        "--from",
        "rootly-mcp-server",
        "rootly-mcp-server"
      ],      
      "env": {
        "ROOTLY_API_TOKEN": "<YOUR_ROOTLY_API_TOKEN>"
      }
    }
  }
}

Connect to Hosted MCP Server

Alternatively, connect directly to our hosted MCP server:

{
  "mcpServers": {
    "rootly": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://mcp.rootly.com/sse",
        "--header",
        "Authorization:${ROOTLY_AUTH_HEADER}"
      ],
      "env": {
        "ROOTLY_AUTH_HEADER": "Bearer <YOUR_ROOTLY_API_TOKEN>"
      }
    }
  }
}

Features

  • Dynamic Tool Generation: Automatically creates MCP tools from Rootly's OpenAPI (Swagger) specification
  • Smart Pagination: Defaults to 10 items per request for incident endpoints to prevent context window overflow
  • API Filtering: Limits exposed API endpoints for security and performance
  • Intelligent Incident Analysis: Smart tools that analyze historical incident data (see the example after this list)
    • find_related_incidents: Uses TF-IDF similarity analysis to find historically similar incidents
    • suggest_solutions: Mines past incident resolutions to recommend actionable solutions
  • MCP Resources: Exposes incident and team data as structured resources for easy AI reference
  • Intelligent Pattern Recognition: Automatically identifies services, error types, and resolution patterns
  • On-Call Health Integration: Detects workload health risk in scheduled responders
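
For a sense of how the incident analysis tools are invoked, here is a minimal sketch in the same call style used throughout this README. The parameter names (incident_id, limit) are assumptions for illustration, not the definitive signatures:

# Find historically similar incidents (parameter names are illustrative)
find_related_incidents(
    incident_id="12345",
    limit=5  # hypothetical cap on the number of matches returned
)

# Recommend solutions mined from past resolutions of similar incidents
suggest_solutions(
    incident_id="12345"
)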

On-Call Health Integration

Rootly MCP integrates with On-Call Health to detect workload health risk in scheduled responders.

Setup

Set the ONCALLHEALTH_API_KEY environment variable:

{
  "mcpServers": {
    "rootly": {
      "command": "uvx",
      "args": ["rootly-mcp-server"],
      "env": {
        "ROOTLY_API_TOKEN": "your_rootly_token",
        "ONCALLHEALTH_API_KEY": "och_live_your_key"
      }
    }
  }
}

Usage

check_oncall_health_risk(
    start_date="2026-02-09",
    end_date="2026-02-15"
)

Returns at-risk users who are scheduled, recommended safe replacements, and action summaries.
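
The exact response schema isn't shown here, but a response might be shaped roughly like the sketch below; every field name is an assumption based on the description above:

# Illustrative response shape only -- field names are assumptions
{
    "at_risk_users": [
        {"user": "alice@example.com", "risk_level": "high", "reason": "4 consecutive night shifts"}
    ],
    "recommended_replacements": [
        {"replace": "alice@example.com", "with": "bob@example.com"}
    ],
    "action_summary": "1 at-risk responder is scheduled; consider swapping the Feb 12 shift."
}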

Example Skills

Want to get started quickly? We provide pre-built Claude Code skills that showcase the full power of the Rootly MCP server:

The Rootly Incident Responder skill is an AI-powered incident response specialist that:

  • Analyzes production incidents with full context
  • Finds similar historical incidents using ML-based similarity matching
  • Suggests solutions based on past successful resolutions
  • Coordinates with on-call teams across timezones
  • Correlates incidents with recent code changes and deployments
  • Creates action items and remediation plans
  • Provides confidence scores and time estimates

Quick Start:

# Copy the skill to your project
mkdir -p .claude/skills
cp examples/skills/rootly-incident-responder.md .claude/skills/

# Then in Claude Code, invoke it:
# @rootly-incident-responder analyze incident #12345

This skill demonstrates a complete incident response workflow using Rootly's intelligent tools combined with GitHub integration for code correlation.

On-Call Shift Metrics

Get on-call shift metrics for any time period, grouped by user, team, or schedule. Includes primary/secondary role tracking, shift counts, hours, and days on-call.

get_oncall_shift_metrics(
    start_date="2025-10-01",
    end_date="2025-10-31",
    group_by="user"
)
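
The same call also accepts group_by="team" or group_by="schedule". As a rough illustration, a single per-user entry might look like the sketch below; the field names are assumptions inferred from the metrics listed above:

# Illustrative entry shape -- field names are assumptions
{
    "user": "alice@example.com",
    "primary_shifts": 12,
    "secondary_shifts": 4,
    "total_hours": 168.0,
    "days_on_call": 7
}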

On-Call Handoff Summary

Provides a complete handoff summary: the current and next on-call responders for each schedule, plus incidents that occurred during their shifts.

# All on-call (any timezone)
get_oncall_handoff_summary(
    team_ids="team-1,team-2",
    timezone="America/Los_Angeles"
)

# Regional filter - only show APAC on-call during APAC business hours
get_oncall_handoff_summary(
    timezone="Asia/Tokyo",
    filter_by_region=True
)

Regional filtering shows only people on-call during business hours (9am-5pm) in the specified timezone.

Returns: schedules with current_oncall, next_oncall, and shift_incidents
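
As a hedged sketch of that structure (the top-level field names come from the line above; the entry contents are assumptions):

# Illustrative response shape -- entry contents are assumptions
{
    "schedules": [
        {
            "schedule": "Platform On-Call",
            "current_oncall": {"user": "alice@example.com", "shift_ends": "2025-10-20T17:00:00Z"},
            "next_oncall": {"user": "bob@example.com", "shift_starts": "2025-10-20T17:00:00Z"},
            "shift_incidents": [
                {"id": "12345", "title": "API latency spike", "severity": "high"}
            ]
        }
    ]
}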

Shift Incidents

Lists incidents that occurred during a time period, with optional filtering by severity, status, or tags.

get_shift_incidents(
    start_time="2025-10-20T09:00:00Z",
    end_time="2025-10-20T17:00:00Z",
    severity="critical",  # optional
    status="resolved",    # optional
    tags="database,api"   # optional
)

Returns: incidents list + summary (counts, avg resolution time, grouping)
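
Again as an illustration only (the summary fields are named in the line above; the exact keys are assumptions):

# Illustrative response shape -- exact keys are assumptions
{
    "incidents": [
        {"id": "12345", "title": "Database connection pool exhausted", "severity": "critical", "status": "resolved"}
    ],
    "summary": {
        "total_count": 3,
        "avg_resolution_time_minutes": 42,
        "by_severity": {"critical": 1, "high": 2}
    }
}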

Contributing

See CONTRIBUTING.md for developer setup and guidelines.

Play with it on Postman

Run In Postman

About Rootly AI Labs

This project was developed by Rootly AI Labs, where we're building the future of system reliability and operational excellence. As an open-source incubator, we share ideas, experiment, and rapidly prototype solutions that benefit the entire community.
