AutoMock πŸ§ͺ⚑


AutoMock is a cloud-native CLI tool that generates and deploys production-ready mock API servers from natural language descriptions, API collections, or an interactive builder. Spin up fully managed mock servers on AWS (ECS Fargate) with auto-scaling, monitoring, and built-in load testing support.

Cloud provider support is currently AWS-only. GCP and Azure are planned but not yet available.


🌟 Highlights

  • πŸ€– AI-Generated Mocks β€” Describe your API in natural language, get complete MockServer configurations
  • ☁️ AWS Deployment β€” ECS Fargate + ALB β€” fully managed, auto-scaling
  • πŸ“¦ Multi-Format Import β€” Postman, Bruno, Insomnia, OpenCollection β†’ MockServer expectations and Locust bundles
  • πŸ”§ Interactive Builder β€” 7-step guided builder for precise control
  • ⚑ Auto-Scaling β€” CPU/Memory/Request-based (10–200 tasks by default)
  • πŸ’Ύ S3 Storage β€” Versioned, team-accessible expectations
  • 🎭 Advanced Features β€” Progressive delays, GraphQL, response templates, response limits
  • πŸ§ͺ Load Testing β€” Locust bundle generation and managed cloud deployment (AWS ECS)
  • πŸ” Production-Ready β€” Health checks, monitoring, IAM best practices

πŸ—ΊοΈ Installation

Option A β€” Homebrew (macOS / Linux)

brew tap hemantobora/tap
brew install automock
automock --version

Or directly: brew install hemantobora/tap/automock

Option B β€” Scoop (Windows)

scoop bucket add hemantobora https://github.com/hemantobora/scoop-bucket
scoop install automock
automock --version

Option C β€” Download release binary

  1. Go to the Releases page
  2. Download the archive for your OS/arch (e.g., automock_darwin_arm64.tar.gz)
  3. Extract and place the binary on your PATH:
tar -xzf automock_*.tar.gz
sudo mv automock /usr/local/bin/
automock --version

Option D β€” Build from source

Requires Go 1.22+:

git clone https://github.com/hemantobora/auto-mock.git
cd auto-mock
go build -o automock ./cmd/auto-mock

Upgrading

  • Homebrew: brew upgrade automock
  • Scoop: scoop update automock
  • Binary: download the newer version and replace the existing binary

πŸš€ Quick Start

# Set your AI provider key (choose one)
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."

# Configure AWS credentials (or use existing profile)
aws configure

# Generate and initialise a project
automock init --project user-api --provider anthropic

# Deploy to AWS
automock deploy --project user-api

Your mock API is now live at the endpoint shown after deploy.


πŸ“– Documentation

| Document | Description |
| --- | --- |
| GETTING_STARTED.md | Complete setup guide, tutorials, examples |
| automock help | CLI reference |

✨ Key Features

πŸ€– AI-Powered Generation

Generate complete MockServer configurations from natural language:

automock init --project my-api --provider anthropic

Prompt: "User management API with registration, login, profile CRUD, password reset, and admin functions"

AI generates:

  • All CRUD endpoints (GET /users, POST /users, PUT /users/{id}, etc.)
  • Authentication flows (login, logout, token refresh)
  • Admin-only endpoints with authorization
  • Error responses (400, 401, 403, 404, 500)
  • Realistic test data with proper types
  • Request validation rules
  • Multiple scenarios per endpoint

Supported AI Providers:

  • Anthropic (Claude Sonnet 4.5)
  • OpenAI (GPT-4)
  • Template (no AI, fallback mode)

πŸ“¦ Collection Import

Import existing API definitions from popular tools β€” for both mock generation and load test bundles:

# Postman
automock init \
  --project api-mock \
  --collection-file api.postman_collection.json \
  --collection-type postman

# Bruno OpenCollection v3 (bundled YAML)
automock init \
  --project api-mock \
  --collection-file open.yml \
  --collection-type opencollection

Supported Formats:

| Format | Flag | Notes |
| --- | --- | --- |
| Postman Collection v2.1 | --collection-type postman | .json export |
| Bruno JSON export | --collection-type bruno | .json export |
| Bruno OpenCollection v3 | --collection-type opencollection | bundled .yml |
| Insomnia Workspace | --collection-type insomnia | .json export |

All four formats are supported in both automock init (mock generation) and automock load (Locust bundle generation).

Features:

  • Sequential API execution with variable resolution
  • Interactive matching configuration (guided; no automatic scenario inference)
  • Auto-incremented priorities to avoid collisions
  • Pre/post-script processing β€” Postman (pm.environment.set, pm.response.json()) and Bruno (bru.setEnvVar, res.body) script APIs via embedded JS engine
  • Auth mapping to headers when provided in the collection
  • Template variable detection ({{variable}}) — warns about unresolved placeholders and skips the exact-match offer for body/query params that contain them

Example: Multi-Scenario Configuration

GET /api/users/123:
  Priority 100: Anonymous β†’ 401 Unauthorized
  Priority 200: Authenticated β†’ 200 OK (user data)
  Priority 300: Admin β†’ 200 OK (admin view)
  Priority 400: Rate limited β†’ 429 Too Many Requests
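Conceptually, MockServer serves the highest-priority expectation whose matcher passes. A minimal sketch of that resolution, where the header-based matchers and canned responses are hypothetical stand-ins for MockServer's real request matchers:

```python
# Expectations as (priority, matcher, response) triples; the matchers here
# are illustrative stand-ins, not MockServer's matcher DSL.
expectations = [
    (100, lambda h: True,                        (401, "Unauthorized")),
    (200, lambda h: "Authorization" in h,        (200, "user data")),
    (300, lambda h: h.get("X-Role") == "admin",  (200, "admin view")),
]

def respond(headers: dict):
    # Try expectations from highest to lowest priority; first match wins.
    for priority, matches, response in sorted(expectations, key=lambda e: -e[0]):
        if matches(headers):
            return response
    return (404, "no expectation matched")
```

An anonymous request falls through to the priority-100 catch-all, while an admin request is caught by the priority-300 matcher first.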

πŸ”§ Interactive Builder

Precision-controlled, step-by-step expectation creation:

automock init --project my-api
# Select: interactive

7-Step Process:

  1. Basic Info β€” Description, priority, tags
  2. Request Matching β€” Method, path, query params, headers
  3. Response Configuration β€” Status code, headers, body templates
  4. Advanced Features β€” Delays, caching, compression
  5. Connection Options β€” Socket config, keep-alive
  6. Response Limits β€” Serve unlimited times or expire after N requests
  7. Review & Confirm β€” Validate before saving

Advanced Request Matching:

  • Path parameters: /users/{id}/orders/{orderId}
  • Regex paths: /api/.*/status
  • Query string matching: ?status=active&limit=10
  • Header validation: Authorization: Bearer *
  • Body matching: exact (STRING), partial (JSON ONLY_MATCHING_FIELDS), JSON Schema, regex, parameters

Response Features:

  • Template variables: $!uuid, $!now_epoch, $!request.headers['X-Request-ID'][0]
  • Progressive delays: 100ms β†’ 150ms β†’ 200ms…
  • Multiple response bodies per expectation

☁️ Cloud Deployment

Deploy production-ready infrastructure with one command:

automock deploy --project my-api

AWS β€” ECS Fargate + ALB

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Application Load Balancer (Public)     β”‚
β”‚  https://automock-{project}-{id}.elb…  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
              β”‚
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚  Target Groups      β”‚
    β”‚  β€’ API (/)          β”‚
    β”‚  β€’ Dashboard        β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
              β”‚
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  ECS Fargate Cluster                     β”‚
β”‚  β€’ MockServer (port 1080)                β”‚
β”‚  β€’ Config Loader (sidecar)               β”‚
β”‚  β€’ Auto-scaling: configurable            β”‚
β”‚    (defaults: 10–200 tasks)              β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
              β”‚
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚  S3 Bucket          β”‚
    β”‚  expectations.json  β”‚
    β”‚  (versioned)        β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Infrastructure features:

  • Auto-Scaling β€” CPU/Memory/Request-based (10–200 tasks by default)
  • Monitoring β€” CloudWatch metrics, logs, alarms
  • Health Checks β€” ALB target health, /mockserver/status
  • Security β€” IAM roles, security groups, private subnets
  • Custom Domain β€” ACM certificate + Route53 (optional, prompted at deploy time)
  • Private ALB β€” Optional internal ALB for VPC-internal clients (e.g., Locust)
  • BYO Networking β€” Use existing VPC, subnets, IGW, NAT, IAM roles, and security groups

Authenticating:

aws configure                  # or use an existing named profile

πŸ“Š Project Management

Manage expectations throughout their lifecycle:

# View / add / edit / remove / replace / download / delete
automock init --project my-api
# β†’ Select the desired action from the menu

| Action | Description |
| --- | --- |
| view | List all expectations |
| add | Add new expectations (any generation mode) |
| edit | Edit specific expectations |
| remove | Remove selected expectations |
| replace | Replace all expectations |
| download | Save to {project}-expectations.json |
| delete | Tear down project and all infrastructure |

πŸ§ͺ Load Testing β€” Local

Generate a ready-to-run Locust bundle from any supported collection format:

automock load \
  --collection-file api.json \
  --collection-type postman \   # postman | insomnia | bruno | opencollection
  --dir ./load-tests \
  --distributed

cd load-tests
./run_locust_ui.sh
# Browser opens at http://localhost:8089

Generated files:

  • locustfile.py β€” Test scenarios (run via Locust, not Python directly)
  • locust_endpoints.json β€” Endpoint config β€” edit to tune behaviour
  • user_data.yaml β€” Per-user test data rows
  • requirements.txt β€” Python dependencies
  • run_locust_ui.sh / .ps1 β€” Start with web UI
  • run_locust_headless.sh / .ps1 β€” Run without UI
  • run_locust_master.sh / .ps1 β€” Distributed master
  • run_locust_worker.sh / .ps1 β€” Distributed worker

Running the load test:

cd load-tests

# UI mode β€” opens http://localhost:8089 to set users/rate interactively
export AM_HOST="http://your-target-host"
./run_locust_ui.sh

# Headless mode β€” set params via env vars
export AM_HOST="http://your-target-host"
export AM_USERS=20
export AM_SPAWN_RATE=5
export AM_DURATION=5m
./run_locust_headless.sh

Note: Run via the shell scripts or locust -f locustfile.py. Running python3 locustfile.py directly only loads configuration and prints the data row count β€” no test starts.

Variable substitution:

  • ${env.VAR} β€” Expanded at load-time from environment / .env file
  • ${data.<field>} and ${user.id|index} β€” Expanded at runtime in path, headers, params, and body
  • In auth.mode: shared, only ${env.*} expands; in auth.mode: per_user, ${data.*} and ${user.*} also expand in the login path/headers/body
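A simplified sketch of the two-phase expansion described above. The function names and regexes are assumptions, not AutoMock's implementation; ${user.*} handling and .env loading are omitted for brevity:

```python
import os
import re

# Illustrative two-phase expansion: ${env.*} at load time, ${data.*} at runtime.
ENV_VAR = re.compile(r"\$\{env\.(\w+)\}")
DATA_VAR = re.compile(r"\$\{data\.(\w+)\}")

def expand_env(text: str) -> str:
    """Load-time pass: resolve ${env.VAR} from the environment."""
    return ENV_VAR.sub(lambda m: os.environ.get(m.group(1), m.group(0)), text)

def expand_data(text: str, row: dict) -> str:
    """Runtime pass: resolve ${data.<field>} from the current user's data row."""
    return DATA_VAR.sub(lambda m: str(row.get(m.group(1), m.group(0))), text)

os.environ["AM_HOST"] = "http://localhost:1080"
path = expand_data(expand_env("${env.AM_HOST}/users/${data.user_id}"),
                   {"user_id": 42})
```

Unresolved placeholders are left intact rather than replaced with empty strings, so a missing variable is visible in the request log.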

☁️ Managed Locust β€” AWS

Deploy a Locust cluster to AWS via the same deploy command. AutoMock provisions an ECS Fargate cluster (master + workers), an ALB for the Locust UI, and Cloud Map service discovery.

# 1. Upload your load test bundle
automock load --project my-api --upload --dir ./load-tests

# 2. Deploy infrastructure (prompts for sizing)
automock deploy --project my-api

# 3. Scale workers
automock deploy --project my-api
# β†’ prompts for new worker count when already deployed

# 4. Tear down
automock destroy --project my-api
# β†’ select: mocks, loadtest, or both

What you get:

  • Public ALB with HTTP/HTTPS access to the Locust master UI
  • ECS task definitions for master and workers (configurable CPU/memory)
  • Cloud Map private namespace for master–worker discovery
  • CloudWatch log groups

🎯 Use Cases

Frontend Development

Mock backend APIs before they exist:

automock init --project frontend-mock --provider anthropic
# Describe: "REST API for blog app: posts, comments, users"
automock deploy --project frontend-mock

Integration Testing

Consistent, controlled test environments in CI/CD:

automock deploy --project test-api --skip-confirmation
npm run test:integration -- --api-url https://automock-test-api-123.elb.amazonaws.com
automock destroy --project test-api --force

Third-Party API Simulation

Test against external APIs without rate limits or costs:

automock init \
  --project stripe-mock \
  --collection-file stripe-api.postman_collection.json \
  --collection-type postman

Performance Testing

Generate and deploy load tests against your mock:

automock load \
  --collection-file prod-api.json \
  --collection-type postman \
  --dir ./load-tests
automock load --project prod-api --upload --dir ./load-tests
automock deploy --project prod-api

πŸ’° Cost Estimates

AWS (default: 10 tasks, 24/7)

min_tasks and max_tasks are configurable at deploy time. Using BYO networking (existing VPC, subnets, NAT) eliminates the NAT Gateway cost.

| Component | Monthly Cost |
| --- | --- |
| ECS Fargate (0.25 vCPU, 0.5 GB) | ~$35 |
| Application Load Balancer | ~$16 |
| NAT Gateway | ~$32 |
| Data Transfer | ~$9 |
| CloudWatch Logs | ~$0.50 |
| S3 Storage | ~$0.30 |
| Total | ~$93 |

Rough US East estimates; actual costs vary by region and traffic. Validate with the AWS Pricing Calculator.

Hourly rate (10 tasks): ~$0.13/hour
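The totals follow from simple arithmetic over the table (illustrative only; the component figures are the rough estimates above, and 730 is the conventional hours-per-month approximation):

```python
# Rough monthly component estimates from the table above (US East).
components = {
    "ECS Fargate (10 tasks, 0.25 vCPU / 0.5 GB)": 35.00,
    "Application Load Balancer": 16.00,
    "NAT Gateway": 32.00,
    "Data Transfer": 9.00,
    "CloudWatch Logs": 0.50,
    "S3 Storage": 0.30,
}
monthly = sum(components.values())  # ~$93/month
hourly = monthly / 730              # ~730 hours/month -> ~$0.13/hour
```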

AI Generation Costs

| Provider | Cost per Generation |
| --- | --- |
| Claude Sonnet 4.5 | $0.05 – $0.20 |
| GPT-4 | $0.10 – $0.30 |

Cost tips: Destroy when not in use (automock destroy --project <name>), reduce task count for smaller APIs, or use BYO networking to share existing NAT Gateways.


πŸ—οΈ Infrastructure Details

Auto-Scaling

Scale Up (Aggressive):

  • CPU 70–80% β†’ +50% tasks
  • CPU 80–90% β†’ +100% tasks
  • CPU 90%+ β†’ +200% tasks
  • Memory thresholds follow the same pattern
  • Requests/min: 500–1000 β†’ +50%, 1000+ β†’ +100%

Scale Down (Conservative):

  • CPU < 40% for 5 minutes β†’ βˆ’25% tasks
  • Cooldown: 5 minutes between scale events

Limits: minimum 10 tasks, maximum 200 tasks (both configurable)
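The CPU thresholds above amount to a step function from utilisation to a target task count, clamped to the configured limits. A sketch of that mapping (illustrative pseudocode, not the deployed CloudWatch step-scaling policy):

```python
def cpu_scale_target(cpu_pct: float, current: int,
                     min_tasks: int = 10, max_tasks: int = 200) -> int:
    """Map CPU utilisation to a new task count per the step thresholds above."""
    if cpu_pct >= 90:
        target = current * 3             # +200%
    elif cpu_pct >= 80:
        target = current * 2             # +100%
    elif cpu_pct >= 70:
        target = current + current // 2  # +50%
    elif cpu_pct < 40:
        target = current - current // 4  # -25% (after 5 min sustained)
    else:
        target = current
    # Clamp to the configured task limits.
    return max(min_tasks, min(target, max_tasks))
```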

Monitoring & Alerts

CloudWatch Metrics: ECS CPU/memory/task count, ALB request count/response time/4xx/5xx errors

Alarms: unhealthy host count > 0, 5XX errors > 10/min, CPU > 70% for 10 min, Memory > 80% for 10 min

Security

  • IAM β€” Least-privilege, separate task execution and task roles, no hardcoded credentials; optional permissions boundary and custom role path
  • Networking β€” ECS tasks in private subnets behind ALB
  • Data β€” S3 server-side encryption (AES-256) with versioning enabled

πŸ”§ Advanced Features

Progressive Response Delays

Simulate degrading performance:

{
  "progressive": {
    "base": 100,
    "step": 50,
    "cap": 500
  }
}

Request 1: 100ms β†’ Request 2: 150ms β†’ … β†’ capped at 500ms
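The delay for the n-th request follows directly from base, step, and cap (a one-line sketch of the formula, assuming millisecond units):

```python
def progressive_delay_ms(n: int, base: int = 100, step: int = 50,
                         cap: int = 500) -> int:
    """Delay for the n-th request: start at base, add step each time, cap it."""
    return min(base + step * (n - 1), cap)

delays = [progressive_delay_ms(n) for n in range(1, 6)]  # 100, 150, 200, 250, 300
```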

Response Templates

Dynamic values in responses:

{
  "id": "$!uuid",
  "timestamp": "$!now_epoch",
  "requestId": "$!request.headers['x-request-id'][0]",
  "userId": "$!request.pathParameters['userId'][0]",
  "randomScore": "$!rand_int_100"
}

Available variables: $!uuid, $!now_epoch, $!rand_int_100, $!rand_bytes_64, $!request.path, $!request.method, $!request.headers['X-Header'][0], $!request.pathParameters['param'][0], $!request.queryStringParameters['query'][0]

GraphQL Support

Basic GraphQL request matching (no schema validation):

{
  "httpRequest": {
    "method": "POST",
    "path": "/graphql",
    "body": {
      "query": {"contains": "query GetUser"},
      "variables": {"userId": "123"}
    }
  },
  "httpResponse": {
    "body": {"data": {"user": {"id": "123", "name": "John"}}}
  }
}

Supported: query string contains, operation name matching, optional variables matching (exact).


πŸ“‚ Project Structure

auto-mock/
β”œβ”€β”€ cmd/auto-mock/           # CLI entrypoint
β”œβ”€β”€ internal/
β”‚   β”œβ”€β”€ cloud/               # Cloud provider abstraction
β”‚   β”‚   β”œβ”€β”€ aws/             # AWS implementation (S3, ECS, IAM)
β”‚   β”‚   β”œβ”€β”€ factory.go       # Provider detection & initialization
β”‚   β”‚   └── manager.go       # Orchestration
β”‚   β”œβ”€β”€ mcp/                 # AI provider integration (Anthropic, OpenAI)
β”‚   β”œβ”€β”€ builders/            # Interactive expectation builders
β”‚   β”œβ”€β”€ collections/         # Collection parsers (Postman, Bruno, Insomnia, OpenCollection)
β”‚   β”œβ”€β”€ expectations/        # Expectation CRUD operations
β”‚   β”œβ”€β”€ repl/                # Interactive CLI flows
β”‚   β”œβ”€β”€ terraform/           # Embedded infrastructure modules
β”‚   β”‚   └── infra/
β”‚   β”‚       β”œβ”€β”€ mock/aws/    # ECS Fargate MockServer
β”‚   β”‚       └── loadtest/aws/   # ECS Fargate Locust
β”‚   └── models/              # Data structures
β”œβ”€β”€ go.mod
β”œβ”€β”€ build.sh
β”œβ”€β”€ README.md
└── LICENSE

πŸ“Š Roadmap

  • AWS support (S3, ECS, ALB)
  • AI-powered mock generation (Claude, GPT-4)
  • Collection import (Postman, Bruno, Insomnia, OpenCollection)
  • OpenCollection support in load test bundle generation
  • Interactive builder
  • Auto-scaling infrastructure
  • CloudWatch monitoring
  • Locust load testing (local + managed AWS)
  • Custom domain support (ACM + Route53)
  • Private ALB for VPC-internal clients
  • BYO networking (VPC, subnets, IAM, security groups)
  • Azure provider support
  • GCP provider support
  • Swagger/OpenAPI import
  • Bruno directory-based (.bru files) format
  • Web UI for expectation management
  • Prometheus metrics export
  • Multiple region deployment
  • Docker Compose local deployment

πŸ› οΈ Development

git clone https://github.com/hemantobora/auto-mock.git
cd auto-mock
go mod download
go build -o automock ./cmd/auto-mock

# Run tests
go test ./...

🀝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Areas we'd love help with: Azure/GCP provider support, Swagger/OpenAPI import, Bruno directory-based (.bru) format, Web UI for expectation management, Prometheus metrics export.


πŸ“„ License

This project is licensed under the MIT License β€” see the LICENSE file for details.


πŸ™ Acknowledgments

  • MockServer β€” Powerful HTTP mocking server
  • Anthropic β€” Claude AI for intelligent mock generation
  • OpenAI β€” GPT-4 for intelligent mock generation
  • AWS β€” Cloud infrastructure platform
  • Terraform β€” Infrastructure as Code
  • Locust β€” Load testing framework

πŸ“ž Support


Built with ❀️ by Hemanto Bora

GitHub β€’ Issues
