A demonstration of Express running on AWS Lambda using API Gateway V1 Response Streaming and Server-Sent Events (SSE). This project mixes streaming and regular HTTP requests in one service, demonstrating that both work as expected within Lambda.
- Response Streaming: Lambda sends response data incrementally through API Gateway V1, with chunks arriving at the client in real-time
- Server-Sent Events (SSE): Standard SSE format (`data: {...}\n\n`) streaming from Lambda
- Mixed Endpoints: Both streaming and traditional request/response endpoints working side-by-side
- No Lambda Web Adapter: Direct integration using `awslambda.streamifyResponse()` and `HttpResponseStream.from()`
- Node.js 22.x
- AWS CLI configured with appropriate credentials
- Serverless Framework v4 (`npm install -g serverless`)
- An AWS account with permissions to create Lambda functions and API Gateway
- Clone the repository:

  ```bash
  git clone git@github.com:garethmcc/serverless-streamable-express-sse.git
  cd serverless-streamable-express-sse
  ```

- Install dependencies:

  ```bash
  npm install
  ```
Deploy to AWS using Serverless Framework:

```bash
serverless deploy
```

This will create:

- Two Lambda functions (`stream` and `hello`)
- An API Gateway V1 REST API with response streaming enabled on the `/stream` endpoint
After deployment, you'll receive endpoint URLs. Test them with:
```bash
curl -N https://<api-id>.execute-api.<region>.amazonaws.com/dev/stream
```

You should see 10 chunks arriving one per second:

```
data: {"message":"Chunk #1","timestamp":"2026-01-22T11:52:40.235Z"}

data: {"message":"Chunk #2","timestamp":"2026-01-22T11:52:41.236Z"}

...

event: end
data: Stream finished
```
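The SSE frames above can also be consumed programmatically. Below is a minimal sketch of a client-side parser; the `parseSSE` helper is illustrative and not part of this repo:

```javascript
// Parse a raw SSE buffer into an array of event objects.
// Each frame is a block of "field: value" lines terminated by a blank line.
function parseSSE(raw) {
  return raw
    .split('\n\n')
    .filter(Boolean)
    .map((block) => {
      const event = {};
      for (const line of block.split('\n')) {
        const idx = line.indexOf(': ');
        if (idx > -1) event[line.slice(0, idx)] = line.slice(idx + 2);
      }
      return event;
    });
}
```

Paired with `curl -N` output or a streaming `fetch` response body, each parsed object's `data` field holds the JSON payload written by the Lambda handler.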
```bash
curl https://<api-id>.execute-api.<region>.amazonaws.com/dev/hello
```

Returns a standard JSON response:

```json
{"message":"Hello from Express on Lambda!","timestamp":"2026-01-22T11:52:58.462Z","streaming_enabled":true}
```

The streaming endpoint uses `response.transferMode: STREAM`:
```yaml
functions:
  stream:
    handler: handler.streamHandler
    timeout: 30
    events:
      - http:
          path: /stream
          method: get
          response:
            transferMode: STREAM
```

Uses `awslambda.streamifyResponse` with `HttpResponseStream.from` to properly format metadata:
```javascript
exports.streamHandler = awslambda.streamifyResponse(async (event, responseStream, context) => {
  responseStream = awslambda.HttpResponseStream.from(responseStream, {
    statusCode: 200,
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Connection': 'keep-alive',
    },
  });

  // Write SSE data chunks (example payload; the full handler emits 10 of these)
  const data = { message: 'Chunk #1', timestamp: new Date().toISOString() };
  responseStream.write(`data: ${JSON.stringify(data)}\n\n`);
  responseStream.end();
});
```

- API Gateway Configuration: The `transferMode: STREAM` setting tells API Gateway to use the `/response-streaming-invocations` Lambda endpoint instead of the standard invocation endpoint
- Lambda Handler: The `awslambda.streamifyResponse()` decorator provides a `responseStream` writable stream
- Metadata: `HttpResponseStream.from()` wraps the stream with HTTP metadata (status code, headers) that API Gateway interprets correctly
- Streaming: Data written to `responseStream.write()` is sent progressively to the client through API Gateway
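The one-chunk-per-second behavior seen in the curl output could be produced by a loop like the following sketch; the `writeChunks` helper and its `delayMs` parameter are illustrative, not the repo's exact code:

```javascript
// Sketch: emit 10 SSE data frames, one per interval, then an end event.
// Assumes responseStream has already been wrapped by HttpResponseStream.from().
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function writeChunks(responseStream, delayMs = 1000) {
  for (let i = 1; i <= 10; i++) {
    const data = { message: `Chunk #${i}`, timestamp: new Date().toISOString() };
    responseStream.write(`data: ${JSON.stringify(data)}\n\n`);
    await sleep(delayMs);
  }
  responseStream.write('event: end\ndata: Stream finished\n\n');
  responseStream.end();
}
```

Because each `write()` is flushed through the streaming invocation endpoint as it happens, the client sees frames arrive in real time rather than after the handler returns.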
- Response streaming with API Gateway V1 has a 5-minute idle timeout for regional endpoints
- First 10MB of payload is unrestricted; beyond that, throughput is limited to 2MB/s
- Endpoint caching, content encoding, and VTL response transformation are not supported with streaming enabled
To remove all deployed resources:
```bash
serverless remove
```

ISC