AutoMock is a cloud-native CLI tool that generates and deploys production-ready mock API servers from natural language descriptions, API collections, or an interactive builder. Spin up fully managed mock servers on AWS (ECS Fargate) with auto-scaling, monitoring, and built-in load testing support.
Cloud provider support is currently AWS-only. GCP and Azure are planned but not yet available.
- AI-Generated Mocks – Describe your API in natural language, get complete MockServer configurations
- AWS Deployment – ECS Fargate + ALB: fully managed, auto-scaling
- Multi-Format Import – Postman, Bruno, Insomnia, OpenCollection → MockServer expectations and Locust bundles
- Interactive Builder – 7-step guided builder for precise control
- Auto-Scaling – CPU/Memory/Request-based (10–200 tasks by default)
- S3 Storage – Versioned, team-accessible expectations
- Advanced Features – Progressive delays, GraphQL, response templates, response limits
- Load Testing – Locust bundle generation and managed cloud deployment (AWS ECS)
- Production-Ready – Health checks, monitoring, IAM best practices
```sh
brew tap hemantobora/tap
brew install automock
automock --version
```

Or directly: `brew install hemantobora/tap/automock`
```sh
scoop bucket add hemantobora https://github.com/hemantobora/scoop-bucket
scoop install automock
automock --version
```

- Go to the Releases page
- Download the archive for your OS/arch (e.g., `automock_darwin_arm64.tar.gz`)
- Extract and place the binary on your PATH:

```sh
tar -xzf automock_*.tar.gz
sudo mv automock /usr/local/bin/
automock --version
```

Requires Go 1.22+:
```sh
git clone https://github.com/hemantobora/auto-mock.git
cd auto-mock
go build -o automock ./cmd/auto-mock
```

To update:

- Homebrew: `brew upgrade automock`
- Scoop: `scoop update automock`
- Binary: download the newer version and replace the existing binary
```sh
# Set your AI provider key (choose one)
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."

# Configure AWS credentials (or use an existing profile)
aws configure

# Generate and initialise a project
automock init --project user-api --provider anthropic

# Deploy to AWS
automock deploy --project user-api
```

Your mock API is now live at the endpoint shown after deploy.
| Document | Description |
|---|---|
| GETTING_STARTED.md | Complete setup guide, tutorials, examples |
| `automock help` | CLI reference |
Generate complete MockServer configurations from natural language:
```sh
automock init --project my-api --provider anthropic
```

Prompt: "User management API with registration, login, profile CRUD, password reset, and admin functions"
AI generates:
- All CRUD endpoints (`GET /users`, `POST /users`, `PUT /users/{id}`, etc.)
- Authentication flows (login, logout, token refresh)
- Admin-only endpoints with authorization
- Error responses (400, 401, 403, 404, 500)
- Realistic test data with proper types
- Request validation rules
- Multiple scenarios per endpoint
Supported AI Providers:
- Anthropic (Claude Sonnet 4.5)
- OpenAI (GPT-4)
- Template (no AI, fallback mode)
Import existing API definitions from popular tools – for both mock generation and load test bundles:
```sh
# Postman
automock init \
  --project api-mock \
  --collection-file api.postman_collection.json \
  --collection-type postman

# Bruno OpenCollection v3 (bundled YAML)
automock init \
  --project api-mock \
  --collection-file open.yml \
  --collection-type opencollection
```

Supported Formats:
| Format | Flag | Notes |
|---|---|---|
| Postman Collection v2.1 | `--collection-type postman` | `.json` export |
| Bruno JSON export | `--collection-type bruno` | `.json` export |
| Bruno OpenCollection v3 | `--collection-type opencollection` | bundled `.yml` |
| Insomnia Workspace | `--collection-type insomnia` | `.json` export |
All four formats are supported in both `automock init` (mock generation) and `automock load` (Locust bundle generation).
Features:
- Sequential API execution with variable resolution
- Interactive matching configuration (guided; no automatic scenario inference)
- Auto-incremented priorities to avoid collisions
- Pre/post-script processing – Postman (`pm.environment.set`, `pm.response.json()`) and Bruno (`bru.setEnvVar`, `res.body`) script APIs via embedded JS engine
- Auth mapping to headers when provided in the collection
- Template variable detection (`{{variable}}`) – warns and skips exact-match offer for body/query params containing placeholders
Example: Multi-Scenario Configuration
```
GET /api/users/123:
  Priority 100: Anonymous      → 401 Unauthorized
  Priority 200: Authenticated  → 200 OK (user data)
  Priority 300: Admin          → 200 OK (admin view)
  Priority 400: Rate limited   → 429 Too Many Requests
```
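As a sketch (the header pattern and response payload are illustrative, not generated output), the authenticated scenario could be expressed as a MockServer expectation with an explicit priority:

```json
{
  "priority": 200,
  "httpRequest": {
    "method": "GET",
    "path": "/api/users/123",
    "headers": {
      "Authorization": ["Bearer .*"]
    }
  },
  "httpResponse": {
    "statusCode": 200,
    "body": {"id": "123", "name": "Jane Doe"}
  }
}
```

MockServer evaluates higher `priority` values first, so more specific scenarios should carry larger numbers.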
Precision-controlled, step-by-step expectation creation:
```sh
automock init --project my-api
# Select: interactive
```

7-Step Process:
1. Basic Info – Description, priority, tags
2. Request Matching – Method, path, query params, headers
3. Response Configuration – Status code, headers, body templates
4. Advanced Features – Delays, caching, compression
5. Connection Options – Socket config, keep-alive
6. Response Limits – Serve unlimited times or expire after N requests
7. Review & Confirm – Validate before saving
Advanced Request Matching:
- Path parameters: `/users/{id}/orders/{orderId}`
- Regex paths: `/api/.*/status`
- Query string matching: `?status=active&limit=10`
- Header validation: `Authorization: Bearer *`
- Body matching: exact (STRING), partial (JSON ONLY_MATCHING_FIELDS), JSON Schema, regex, parameters
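For instance, a partial JSON body match (illustrative endpoint and fields) looks like this in MockServer's expectation format:

```json
{
  "httpRequest": {
    "method": "POST",
    "path": "/users",
    "body": {
      "type": "JSON",
      "json": {"role": "admin"},
      "matchType": "ONLY_MATCHING_FIELDS"
    }
  },
  "httpResponse": {
    "statusCode": 201
  }
}
```

With `ONLY_MATCHING_FIELDS`, any request body containing `"role": "admin"` matches, regardless of extra fields.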
Response Features:
- Template variables: `$!uuid`, `$!now_epoch`, `$!request.headers['X-Request-ID'][0]`
- Progressive delays: 100ms → 150ms → 200ms…
- Multiple response bodies per expectation
Deploy production-ready infrastructure with one command:
```sh
automock deploy --project my-api
```

```
┌───────────────────────────────────────────┐
│  Application Load Balancer (Public)       │
│  https://automock-{project}-{id}.elb…     │
└─────────────────┬─────────────────────────┘
                  │
        ┌─────────┴──────────┐
        │   Target Groups    │
        │   • API (/)        │
        │   • Dashboard      │
        └─────────┬──────────┘
                  │
┌─────────────────┴─────────────────────────┐
│  ECS Fargate Cluster                      │
│  • MockServer (port 1080)                 │
│  • Config Loader (sidecar)                │
│  • Auto-scaling: configurable             │
│    (defaults: 10–200 tasks)               │
└─────────────────┬─────────────────────────┘
                  │
        ┌─────────┴──────────┐
        │     S3 Bucket      │
        │  expectations.json │
        │    (versioned)     │
        └────────────────────┘
```
Infrastructure features:
- Auto-Scaling – CPU/Memory/Request-based (10–200 tasks by default)
- Monitoring – CloudWatch metrics, logs, alarms
- Health Checks – ALB target health, `/mockserver/status`
- Security – IAM roles, security groups, private subnets
- Custom Domain – ACM certificate + Route53 (optional, prompted at deploy time)
- Private ALB – Optional internal ALB for VPC-internal clients (e.g., Locust)
- BYO Networking – Use existing VPC, subnets, IGW, NAT, IAM roles, and security groups
Authenticating:
```sh
aws configure  # or use an existing named profile
```

Manage expectations throughout their lifecycle:

```sh
# View / add / edit / remove / replace / download / delete
automock init --project my-api
# → Select the desired action from the menu
```

| Action | Description |
|---|---|
| `view` | List all expectations |
| `add` | Add new expectations (any generation mode) |
| `edit` | Edit specific expectations |
| `remove` | Remove selected expectations |
| `replace` | Replace all expectations |
| `download` | Save to `{project}-expectations.json` |
| `delete` | Tear down project and all infrastructure |
Generate a ready-to-run Locust bundle from any supported collection format:
```sh
# --collection-type: postman | insomnia | bruno | opencollection
automock load \
  --collection-file api.json \
  --collection-type postman \
  --dir ./load-tests \
  --distributed

cd load-tests
./run_locust_ui.sh
# Browser opens at http://localhost:8089
```

Generated files:

- `locustfile.py` – Test scenarios (run via Locust, not Python directly)
- `locust_endpoints.json` – Endpoint config; edit to tune behaviour
- `user_data.yaml` – Per-user test data rows
- `requirements.txt` – Python dependencies
- `run_locust_ui.sh` / `.ps1` – Start with web UI
- `run_locust_headless.sh` / `.ps1` – Run without UI
- `run_locust_master.sh` / `.ps1` – Distributed master
- `run_locust_worker.sh` / `.ps1` – Distributed worker
Running the load test:
```sh
cd load-tests

# UI mode – opens http://localhost:8089 to set users/rate interactively
export AM_HOST="http://your-target-host"
./run_locust_ui.sh

# Headless mode – set params via env vars
export AM_HOST="http://your-target-host"
export AM_USERS=20
export AM_SPAWN_RATE=5
export AM_DURATION=5m
./run_locust_headless.sh
```

Note: Run via the shell scripts or `locust -f locustfile.py`. Running `python3 locustfile.py` directly only loads configuration and prints the data row count – no test starts.
Variable substitution:
- `${env.VAR}` – Expanded at load-time from environment / `.env` file
- `${data.<field>}` and `${user.id|index}` – Expanded at runtime in path, headers, params, and body
- In `auth.mode: shared`, only `${env.*}` expands; in `auth.mode: per_user`, `${data.*}` and `${user.*}` also expand in the login path/headers/body
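The tool's actual substitution code isn't shown here, but load-time `${env.VAR}` expansion can be sketched in Python like this (the pattern and helper name are illustrative):

```python
import os
import re

# Matches ${env.SOME_VAR}; runtime forms such as ${data.<field>}
# are deliberately left untouched at load time.
ENV_PATTERN = re.compile(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(text: str) -> str:
    """Replace each ${env.VAR} with VAR's value from the environment,
    leaving unknown variables and non-env placeholders as-is."""
    return ENV_PATTERN.sub(lambda m: os.environ.get(m.group(1), m.group(0)), text)

os.environ["API_TOKEN"] = "secret-123"
print(expand_env("Bearer ${env.API_TOKEN} for user ${data.username}"))
# Bearer secret-123 for user ${data.username}
```

Leaving `${data.*}` placeholders intact lets Locust resolve them per request at runtime.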
Deploy a Locust cluster to AWS via the same deploy command. AutoMock provisions an ECS Fargate cluster (master + workers), an ALB for the Locust UI, and Cloud Map service discovery.
```sh
# 1. Upload your load test bundle
automock load --project my-api --upload --dir ./load-tests

# 2. Deploy infrastructure (prompts for sizing)
automock deploy --project my-api

# 3. Scale workers
automock deploy --project my-api
# → prompts for new worker count when already deployed

# 4. Tear down
automock destroy --project my-api
# → select: mocks, loadtest, or both
```

What you get:
- Public ALB with HTTP/HTTPS access to the Locust master UI
- ECS task definitions for master and workers (configurable CPU/memory)
- Cloud Map private namespace for master–worker discovery
- CloudWatch log groups
Mock backend APIs before they exist:
```sh
automock init --project frontend-mock --provider anthropic
# Describe: "REST API for blog app: posts, comments, users"
automock deploy --project frontend-mock
```

Consistent, controlled test environments in CI/CD:

```sh
automock deploy --project test-api --skip-confirmation
npm run test:integration -- --api-url https://automock-test-api-123.elb.amazonaws.com
automock destroy --project test-api --force
```

Test against external APIs without rate limits or costs:
```sh
automock init \
  --project stripe-mock \
  --collection-file stripe-api.postman_collection.json \
  --collection-type postman
```

Generate and deploy load tests against your mock:

```sh
automock load \
  --collection-file prod-api.json \
  --collection-type postman \
  --dir ./load-tests

automock load --project prod-api --upload --dir ./load-tests
automock deploy --project prod-api
```

`min_tasks` and `max_tasks` are configurable at deploy time. Using BYO networking (existing VPC, subnets, NAT) eliminates the NAT Gateway cost.
| Component | Monthly Cost |
|---|---|
| ECS Fargate (0.25 vCPU, 0.5 GB) | ~$35 |
| Application Load Balancer | ~$16 |
| NAT Gateway | ~$32 |
| Data Transfer | ~$9 |
| CloudWatch Logs | ~$0.50 |
| S3 Storage | ~$0.30 |
| Total | ~$93 |
Rough US East (N. Virginia) estimates; costs vary by region and traffic. Validate with the AWS Pricing Calculator.
Hourly rate (10 tasks): ~$0.13/hour
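As a back-of-envelope check (assuming an average of ~730 hours per month), the hourly figure follows from the monthly total:

```python
# Rough conversion of the estimated monthly total to an hourly rate.
monthly_total_usd = 93.0   # total from the cost table above
hours_per_month = 730      # average: 24 * 365 / 12

hourly = round(monthly_total_usd / hours_per_month, 2)
print(hourly)  # 0.13
```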
| Provider | Cost per Generation |
|---|---|
| Claude Sonnet 4.5 | $0.05–$0.20 |
| GPT-4 | $0.10–$0.30 |
Cost tips: Destroy when not in use (automock destroy --project <name>), reduce task count for smaller APIs, or use BYO networking to share existing NAT Gateways.
Scale Up (Aggressive):
- CPU 70–80% → +50% tasks
- CPU 80–90% → +100% tasks
- CPU 90%+ → +200% tasks
- Memory thresholds follow the same pattern
- Requests/min: 500–1000 → +50%, 1000+ → +100%
Scale Down (Conservative):
- CPU < 40% for 5 minutes → −25% tasks
- Cooldown: 5 minutes between scale events
Limits: minimum 10 tasks, maximum 200 tasks (both configurable)
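The actual policy lives in the embedded Terraform modules; purely as an illustration, the scale-up thresholds and limits above map to task counts roughly like this:

```python
import math

def scale_up_factor(cpu_pct: float) -> float:
    """Map CPU utilisation to the fractional task increase from the
    aggressive scale-up thresholds described above."""
    if cpu_pct >= 90:
        return 2.0   # +200% tasks
    if cpu_pct >= 80:
        return 1.0   # +100% tasks
    if cpu_pct >= 70:
        return 0.5   # +50% tasks
    return 0.0       # below threshold: no scale-up

def next_task_count(current: int, cpu_pct: float,
                    min_tasks: int = 10, max_tasks: int = 200) -> int:
    """Apply the increase, then clamp to the configured limits."""
    target = math.ceil(current * (1 + scale_up_factor(cpu_pct)))
    return max(min_tasks, min(max_tasks, target))

print(next_task_count(40, 85))   # 80
print(next_task_count(150, 95))  # 200 (capped at max_tasks)
```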
CloudWatch Metrics: ECS CPU/memory/task count, ALB request count/response time/4xx/5xx errors
Alarms: unhealthy host count > 0, 5XX errors > 10/min, CPU > 70% for 10 min, Memory > 80% for 10 min
- IAM – Least-privilege, separate task execution and task roles, no hardcoded credentials; optional permissions boundary and custom role path
- Networking – ECS tasks in private subnets behind ALB
- Data – S3 server-side encryption (AES-256) with versioning enabled
Simulate degrading performance:
```json
{
  "progressive": {
    "base": 100,
    "step": 50,
    "cap": 500
  }
}
```

Request 1: 100ms → Request 2: 150ms → … → capped at 500ms
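Assuming the delay for request *n* is `base + step * (n - 1)` capped at `cap` (consistent with the sequence above), a quick sketch:

```python
def progressive_delay(request_number: int, base: int = 100,
                      step: int = 50, cap: int = 500) -> int:
    """Delay in ms for the nth request: base plus one step per
    extra request, never exceeding the cap."""
    return min(base + step * (request_number - 1), cap)

print([progressive_delay(n) for n in (1, 2, 3, 9, 10)])
# [100, 150, 200, 500, 500]
```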
Dynamic values in responses:
```json
{
  "id": "$!uuid",
  "timestamp": "$!now_epoch",
  "requestId": "$!request.headers['x-request-id'][0]",
  "userId": "$!request.pathParameters['userId'][0]",
  "randomScore": "$!rand_int_100"
}
```

Available variables: `$!uuid`, `$!now_epoch`, `$!rand_int_100`, `$!rand_bytes_64`, `$!request.path`, `$!request.method`, `$!request.headers['X-Header'][0]`, `$!request.pathParameters['param'][0]`, `$!request.queryStringParameters['query'][0]`
Basic GraphQL request matching (no schema validation):
```json
{
  "httpRequest": {
    "method": "POST",
    "path": "/graphql",
    "body": {
      "query": {"contains": "query GetUser"},
      "variables": {"userId": "123"}
    }
  },
  "httpResponse": {
    "body": {"data": {"user": {"id": "123", "name": "John"}}}
  }
}
```

Supported: query string contains, operation name matching, optional variables matching (exact).
```
auto-mock/
├── cmd/auto-mock/       # CLI entrypoint
├── internal/
│   ├── cloud/           # Cloud provider abstraction
│   │   ├── aws/         # AWS implementation (S3, ECS, IAM)
│   │   ├── factory.go   # Provider detection & initialization
│   │   └── manager.go   # Orchestration
│   ├── mcp/             # AI provider integration (Anthropic, OpenAI)
│   ├── builders/        # Interactive expectation builders
│   ├── collections/     # Collection parsers (Postman, Bruno, Insomnia, OpenCollection)
│   ├── expectations/    # Expectation CRUD operations
│   ├── repl/            # Interactive CLI flows
│   ├── terraform/       # Embedded infrastructure modules
│   │   └── infra/
│   │       ├── mock/aws/       # ECS Fargate MockServer
│   │       └── loadtest/aws/   # ECS Fargate Locust
│   └── models/          # Data structures
├── go.mod
├── build.sh
├── README.md
└── LICENSE
```
- AWS support (S3, ECS, ALB)
- AI-powered mock generation (Claude, GPT-4)
- Collection import (Postman, Bruno, Insomnia, OpenCollection)
- OpenCollection support in load test bundle generation
- Interactive builder
- Auto-scaling infrastructure
- CloudWatch monitoring
- Locust load testing (local + managed AWS)
- Custom domain support (ACM + Route53)
- Private ALB for VPC-internal clients
- BYO networking (VPC, subnets, IAM, security groups)
- Azure provider support
- GCP provider support
- Swagger/OpenAPI import
- Bruno directory-based (.bru files) format
- Web UI for expectation management
- Prometheus metrics export
- Multiple region deployment
- Docker Compose local deployment
```sh
git clone https://github.com/hemantobora/auto-mock.git
cd auto-mock
go mod download
go build -o automock ./cmd/auto-mock

# Run tests
go test ./...
```

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Areas we'd love help with: Azure/GCP provider support, Swagger/OpenAPI import, Bruno directory-based (.bru) format, Web UI for expectation management, Prometheus metrics export.
This project is licensed under the MIT License β see the LICENSE file for details.
- MockServer – Powerful HTTP mocking server
- Anthropic – Claude AI for intelligent mock generation
- OpenAI – GPT-4 for intelligent mock generation
- AWS – Cloud infrastructure platform
- Terraform – Infrastructure as Code
- Locust – Load testing framework
- Documentation: GETTING_STARTED.md, `automock help`
- GitHub Issues: Create an issue
- Email: hemantobora@gmail.com