This document outlines the architectural patterns and best practices for building microservices in this CDK-based monorepo.
Services are composed in the CDK application through the `ApplicationStage` class. Import and instantiate the service stack in `ApplicationStage` inside `applications/core/bin/main.ts`:
```typescript
import { YourServiceStack } from '@services/your-service-name';

// Application setup here...
class ApplicationStage extends Stage {
  constructor(scope: Construct, id: string, props?: StageProps) {
    super(scope, id, props);
    Tags.of(this).add('STAGE', id);

    // Instantiate service stacks here as required, scoped to the stage (`this`)
    new YourServiceStack(this, 'your-service-name', {
      ...props,
      description: 'Your service description',
    });
  }
}
```

All services must follow this standardized constructor signature:
```typescript
export class YourServiceStack extends Stack {
  constructor(
    scope: Construct,
    id: typeof SERVICE_NAME | (string & {}),
    props: YourServiceStackProps
  ) {
    super(scope, id, props);
    // Implementation
  }
}
```

The `typeof SERVICE_NAME | (string & {})` union keeps the service-name literal visible in editor autocompletion while still accepting any string id.

Maintain strict separation between infrastructure and runtime code:
```
services/[service-name]/
├── src/
│   ├── index.ts            # Main stack definition
│   ├── service-name.ts     # Service name constant
│   ├── infra/              # Infrastructure-only code
│   │   ├── functions/      # Lambda construct definitions
│   │   ├── buckets/        # S3 bucket constructs
│   │   ├── step-functions/ # Step Function definitions (YAML)
│   │   └── environment/    # Environment parameters
│   └── runtime/            # Runtime-only code
│       ├── handlers/       # Lambda handler implementations
│       └── lib/            # Shared runtime utilities
└── tests/                  # Test files
```
- Infra code: CDK constructs, resource definitions, configuration
- Runtime code: Lambda handlers, business logic, utilities
- No mixing: Runtime code cannot import from infra, and vice versa
- Asset resolution: use `resolveAssetPath()` to reference runtime assets from infra code
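One way to keep the no-mixing rule honest is lint tooling. A hypothetical sketch using ESLint's `no-restricted-imports` rule; the file globs and config file shown here are illustrative assumptions, not part of this repo:

```js
// eslint.config.js (illustrative sketch; adjust globs to the repo layout)
export default [
  {
    // Runtime code must not reach into infra
    files: ['services/*/src/runtime/**/*.ts'],
    rules: {
      'no-restricted-imports': [
        'error',
        { patterns: [{ group: ['**/infra/**'], message: 'Runtime code cannot import infra code.' }] },
      ],
    },
  },
  {
    // Infra code references runtime assets via resolveAssetPath(), never via imports
    files: ['services/*/src/infra/**/*.ts'],
    rules: {
      'no-restricted-imports': [
        'error',
        { patterns: [{ group: ['**/runtime/**'], message: 'Infra code cannot import runtime code.' }] },
      ],
    },
  },
];
```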
Use `StepFunctionFromFile` for YAML-based definitions:

```typescript
import { StepFunctionFromFile } from '@libs/cdk-utils/infra';

const stepFunction = new StepFunctionFromFile(this, 'ProcessWorkflow', {
  filepath: resolveAssetPath('infra/step-functions/process-workflow.asl.yaml'),
  lambdaFunctions: [functionA, functionB, functionC], // Automatic ARN resolution
});
```

Use specialized bucket constructs:
```typescript
import { TemporaryDataBucket, ConfigBucket } from '@libs/cdk-utils/infra';

// For short-lived data with automatic lifecycle policies
const dataBucket = new TemporaryDataBucket(this, 'DataBucket');

// For configuration and long-term storage
const configBucket = new ConfigBucket(this, 'ConfigBucket');
```

Use parameter groups for organized credential management:
```typescript
import { SsmParameterGroup } from '@libs/cdk-utils/infra';

class MyServiceParameters extends SsmParameterGroup {
  public readonly parameters = {
    API_KEY: StringParameter.fromStringParameterName(this, 'ApiKey', '/company/my-service/api-key'),
  } as const;
}

// Grant permissions to functions
const parameters = new MyServiceParameters(this);
parameters.grantToFunction(myFunction, 'read');
```

Development parameter values for an application should be stored in `parameters/.env.csv`.
The `parameters` Nx target can be used to import and export them to and from AWS SSM Parameter Store.
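As an illustration, `parameters/.env.csv` might look like the following; the actual column layout is defined by the `parameters` target, so treat the headers and values here as assumptions:

```csv
name,value
/company/my-service/api-key,dev-api-key
/company/my-service/api-endpoint,https://dev.example.com
```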
Each service must have a `service-name.ts` file:
```typescript
/**
 * Service name constant for your-service
 *
 * This constant is used throughout the service for:
 * - Stack identification and naming
 * - Resource tagging
 * - Logging context
 * - Service discovery
 */
export const SERVICE_NAME = 'your-service' as const;
```

Use the standardized `resolveAssetPath` function:
```typescript
import path from 'node:path';

// Note: import.meta.dirname requires Node 20.11+ (ESM)
export function resolveAssetPath(assetPath: `${'runtime/' | 'infra/'}${string}`) {
  return path.resolve(import.meta.dirname, assetPath);
}
```

Usage examples:
- Lambda handlers: `resolveAssetPath('runtime/handlers/my-handler.ts')`
- Step Function definitions: `resolveAssetPath('infra/step-functions/workflow.asl.yaml')`
- Configuration files: `resolveAssetPath('infra/config/mapping.json')`
Wrap Lambda functions that share a purpose in a factory function:
```typescript
// services/your-service/src/infra/functions/lambda-functions.ts
export function lambdaFunctions(
  scope: Construct,
  props: { commonEnvironment: Record<string, string> }
) {
  const functionA = new NodejsFunction(scope, 'FunctionA', {
    entry: resolveAssetPath('runtime/handlers/function-a.ts'),
    environment: props.commonEnvironment,
  });
  const functionB = new NodejsFunction(scope, 'FunctionB', {
    entry: resolveAssetPath('runtime/handlers/function-b.ts'),
    environment: props.commonEnvironment,
  });
  return { functionA, functionB };
}

// services/your-service/src/index.ts
class Service extends Stack {
  constructor(scope: Construct, id: string, props: StackProps) {
    super(scope, id, props);
    const { functionA, functionB } = lambdaFunctions(this, {
      commonEnvironment: {
        DATA_BUCKET: 'data-bucket',
      },
    });
  }
}
```

Functions can also be wrapped in a construct, but note that this will include the construct's id in the generated Lambda function names:
```typescript
// services/your-service/src/infra/functions/lambda-functions.ts
export class LambdaFunctions extends Construct {
  public readonly functionA: NodejsFunction;
  public readonly functionB: NodejsFunction;

  constructor(
    scope: Construct,
    id: string,
    props?: { commonEnvironment?: Record<string, string> }
  ) {
    super(scope, id);
    this.functionA = new NodejsFunction(this, 'FunctionA', {
      entry: resolveAssetPath('runtime/handlers/function-a.ts'),
      environment: props?.commonEnvironment,
    });
    this.functionB = new NodejsFunction(this, 'FunctionB', {
      entry: resolveAssetPath('runtime/handlers/function-b.ts'),
      environment: props?.commonEnvironment,
    });
  }
}
```

- 6+ functions: use the `commonEnvironment` pattern
- Fewer than 6 functions: assign environment variables individually for clarity
```typescript
// EventBridge rule triggering a Step Function
new Rule(this, 'ProcessSchedule', {
  schedule: Schedule.cron({ hour: '*/6', minute: '0' }),
  targets: [new SfnStateMachine(processWorkflow)],
});
```

```typescript
// API Gateway + SQS + Step Function pattern
const api = new HttpApi(this, 'Api', {
  corsPreflight: {
    allowMethods: [CorsHttpMethod.POST],
    allowOrigins: ['*'],
  },
});

const queue = new Queue(this, 'ProcessingQueue', {
  queueName: `${SERVICE_NAME}-processing`,
});

api.addRoutes({
  path: '/process',
  methods: [HttpMethod.POST],
  integration: new HttpLambdaIntegration('ProcessIntegration', processFunction),
});
```

```typescript
// S3 event to SQS to Lambda pattern
bucket.addEventNotification(EventType.OBJECT_CREATED, new SqsDestination(processQueue));
processFunction.addEventSource(new SqsEventSource(processQueue));
```

- Use CDK's automatic naming instead of explicit names
- Tag resources with the service name: `Tags.of(this).add('SERVICE', SERVICE_NAME)`
- Use consistent construct IDs across services
- Development: Optimized for debugging and fast iteration
- Staging: Production-like with enhanced logging
- Production: Optimized for performance and cost
- Use dead letter queues for SQS processing
- Implement retry logic in Step Functions
- Use SNS topics for error notifications
- Grant minimal required permissions
- Use SSM parameters for sensitive data
- Never commit secrets to the repository
- Use the `MicroserviceChecks` aspect for CDK validation
- Test environment variable utilities
- Use CDK Template assertions for infrastructure tests
```typescript
// Common schedule patterns
Schedule.cron({ minute: '0' });              // Every hour
Schedule.cron({ hour: '*/6', minute: '0' }); // Every 6 hours
Schedule.cron({ minute: '*/5' });            // Every 5 minutes
Schedule.cron({ hour: '0', minute: '0' });   // Daily at midnight
```

```typescript
import { pickFromProcessEnv, type Keys } from '@libs/cdk-utils/runtime';
// SSM Parameter Group construct - important to only import the *type*
import type { MySsmParameters } from './my-ssm-parameters';

// Union of keys
type ServiceKeys = 'STAGE' | 'DATA_BUCKET';

// Environment object
interface ServiceEnvironment {
  DATA_BUCKET: string;
  API_ENDPOINT: string;
}

// Export the generic function with allowed keys specific to this service
export const getServiceEnv = pickFromProcessEnv<
  | ServiceKeys
  | keyof ServiceEnvironment
  | Keys<MySsmParameters>
>;
```

CDK stack testing:
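For context, a hypothetical sketch of what a `pickFromProcessEnv`-style helper could look like; the real implementation lives in `@libs/cdk-utils/runtime` and may differ:

```typescript
// Hypothetical sketch of a pickFromProcessEnv-style helper; the actual
// implementation in @libs/cdk-utils/runtime may behave differently.
function pickFromProcessEnv<K extends string>(...keys: K[]): Record<K, string> {
  const picked = {} as Record<K, string>;
  for (const key of keys) {
    const value = process.env[key];
    if (value === undefined) {
      // Fail fast so misconfigured Lambdas error at cold start, not mid-request
      throw new Error(`Missing required environment variable: ${key}`);
    }
    picked[key] = value;
  }
  return picked;
}

// Usage: keys are narrowed to the service's allowed union
process.env.DATA_BUCKET = 'data-bucket';
const env = pickFromProcessEnv<'STAGE' | 'DATA_BUCKET'>('DATA_BUCKET');
console.log(env.DATA_BUCKET); // 'data-bucket'
```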
```typescript
import { MicroserviceChecks } from '@libs/cdk-utils/infra';

let stack: Stack;
let template: Template;

beforeEach(() => {
  const app = new App();
  stack = new YourServiceStack(app, 'TestStack', { description: 'Test' });
  Aspects.of(stack).add(new MicroserviceChecks());
  template = Template.fromStack(stack);
});
```

This architecture ensures consistency, maintainability, and scalability across all microservices in the monorepo while providing clear guidelines for developers.