This document provides standard operating procedures (SOPs) for AI agents working on this codebase.
Before making any code changes, ensure the development environment is properly set up:
- Install tools: Run `mise install` (or `mise i`) to install all required development tools
  - This installs Go, mockery, oapi-codegen, sqlc, and other tools defined in `mise.toml`
  - See TOOLS.md for detailed information about tool management
- Verify installation: Ensure all tools are available and at the correct versions
This project uses Go's built-in code generation system. Always use the standard generation workflow:
- Make changes to source files (OpenAPI specs, SQL queries, interface definitions, etc.)
- Run code generation: `go generate .`
- Verify generated files are updated correctly
- Commit both source changes and generated code
- API code (from `openapi.yaml`): Generated using oapi-codegen → `api/api.gen.go`
- Database code (from `db/query.sql`): Generated using sqlc → `internal/db/*.go`
- Mocks (from interface definitions): Generated using mockery → `*/mocks/*_mocks.go`
- Never manually edit generated files (they have headers indicating they're generated)
- The `generate.go` file contains all `//go:generate` directives
- Generated code should be committed to the repository
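As a sketch, such a `generate.go` file might look like the following; the tool flags, config file names, and package name here are assumptions for illustration, not this repository's actual directives:

```
// generate.go — holds the project's //go:generate directives in one place.
// The flags and config file names below are illustrative only.
package myapp

//go:generate oapi-codegen -config oapi-codegen.yaml openapi.yaml
//go:generate sqlc generate
//go:generate mockery
```

Running `go generate .` scans this file and executes each directive in order.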
All tests should follow table-driven test patterns to ensure comprehensive behavior coverage:
```go
func TestFeature_Behavior(t *testing.T) {
	t.Parallel()
	tests := []struct {
		name      string
		input     InputType
		mockSetup func(t *testing.T, m *MockType)
		want      ExpectedType
		wantErr   error
	}{
		{
			name: "success case",
			// ... test case fields
		},
		{
			name: "error case",
			// ... test case fields
		},
	}
	for _, tt := range tests {
		tt := tt // capture range variable (unnecessary as of Go 1.22)
		t.Run(tt.name, func(t *testing.T) {
			t.Parallel()
			// Test implementation
		})
	}
}
```

- Use table tests: All test functions should use table-driven patterns for testing multiple behaviors
- Test behaviors, not implementation: Focus on what the code does, not how it does it
- Parallel execution: Mark tests as parallel with `t.Parallel()` for faster test runs
- Clear test names: Use descriptive names that explain the scenario being tested
- Prefer mockery-generated mocks: Use mockery to generate mocks from interfaces
- Do not write mocks manually unless absolutely necessary
- Mock locations:
  - Interface-specific mocks: `internal/<package>/mocks/<interface>_mocks.go`
  - Shared mocks: `mocks/<interface>_mocks.go`
- Regenerate mocks: When interfaces change, run `go generate .` to update mocks
- Keep tests focused and isolated
- Use `github.com/google/go-cmp/cmp` for deep comparisons
- Use `github.com/stretchr/testify/mock` for mock assertions
- Test error cases as thoroughly as success cases
- Avoid test interdependencies
This project follows Domain-Driven Design (DDD) principles:
- `internal/domain`: Pure domain entities (no external dependencies)
- `internal/app`: Application services orchestrating domain logic
- `internal/httpapi`: HTTP handlers and routing
- `internal/repository`: Data access layer
- `internal/activity`: Temporal activity implementations
- `internal/workflow`: Temporal workflow definitions
- `cmd/server`: HTTP API server entrypoint
- `cmd/worker`: Temporal worker entrypoint
When adding new features, respect these boundaries and keep dependencies flowing inward (toward domain).
When modifying any Temporal workflow or activity, you must update WORKFLOWS.md to reflect the changes. This includes:
- Adding or removing workflows or activities
- Changing workflow inputs, outputs, or step sequences
- Modifying the call graph (new child workflows, removed steps)
- Changing retry policies or timeout configuration
- Altering batching patterns or concurrency settings
The workflow documentation is the primary reference for understanding how the system's background processing works. Keeping it accurate prevents confusion for developers and operators.
| Change | Docs to update |
|---|---|
| New/modified workflow or activity | WORKFLOWS.md |
| New/modified OpenAPI endpoint | openapi.yaml (source of truth) |
| New/modified SQL query | db/query.sql (source of truth) |
| Tool version or setup changes | TOOLS.md |
This project uses Conventional Commits for automated releases.
```
<type>[optional scope]: <description>

[optional body]

[optional footer(s)]
```
- feat: New feature (bumps MINOR version)
- fix: Bug fix (bumps PATCH version)
- perf: Performance improvement (bumps PATCH version)
- docs: Documentation only
- refactor: Code refactoring
- test: Adding or updating tests
- chore: Maintenance tasks
Add ! after type or include BREAKING CHANGE: in footer to bump MAJOR version:
```
feat!: remove deprecated API endpoint
```
```
feat: add artifact search endpoint
fix: correct version comparison logic
docs: update API documentation
test: add table tests for group service
```
See RELEASES.md for complete release process documentation.
- `mise install` → Set up development environment
- Make changes to source files
- `go generate .` → Regenerate code
- `go test ./...` → Run tests
- Update documentation if workflows, activities, or APIs changed
- Commit changes with conventional commit message (including generated code)
- Push to main → Release-please will create/update release PR automatically