## Summary

Add support for scheduled pipelines and pipeline-to-pipeline dependencies with this boundary:

- `pipeline-core` defines pipeline-level `depends`, `outputs`, and `schedule` metadata
- `pipeline-executor` validates and runs multi-pipeline DAGs in dependency order
- `pipeline-server` handles cron scheduling, persistence of pipeline `outputs`, and launching orchestration runs
Default v1 behavior:

- pipeline-level `depends` references upstream pipelines plus explicit upstream `outputs`
- downstream pipelines trigger only after upstream success
- downstream runs use the same versions as the upstream run
- cross-pipeline data flows through explicit pipeline `outputs`, not raw execution payloads
## Proposed API shape

Extend `PipelineDefinition` in `pipeline-core` with orchestration metadata:
`depends?: readonly PipelineDependency[]`

- pipeline-level dependencies, distinct from route-level `depends`
- each dependency names:
  - upstream pipeline id
  - required upstream `outputs`
  - optional trigger mode, default `onSuccess`
`outputs?: Record<string, PipelineOutputDefinition>`

- named persisted pipeline results
- schema-validated
- version-scoped by default
- derived from completed pipeline results
`schedule?: string | string[]`

- metadata only in core; execution handled by server
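The metadata above can be sketched as a TypeScript extension of `PipelineDefinition`. The field names inside `PipelineDependency` and `PipelineOutputDefinition` are assumptions drawn from the bullets, not the final shape:

```typescript
// Sketch only: field details beyond the bullets above are assumptions.
type TriggerMode = "onSuccess"; // v1 default; other modes are future work

interface PipelineDependency {
  pipelineId: string;         // upstream pipeline id
  outputs: readonly string[]; // required upstream output ids
  trigger?: TriggerMode;      // defaults to "onSuccess"
}

interface PipelineOutputDefinition {
  schema: unknown;         // validates the persisted result
  versionScoped?: boolean; // defaults to true
}

interface PipelineDefinition {
  id: string;
  depends?: readonly PipelineDependency[];
  outputs?: Record<string, PipelineOutputDefinition>;
  schedule?: string | string[]; // cron metadata only; the server executes it
}

// Example: a reporting pipeline consuming an ingest pipeline's output.
const reporting: PipelineDefinition = {
  id: "reporting",
  depends: [{ pipelineId: "ingest", outputs: ["dailyTotals"] }],
  outputs: { summary: { schema: {} } },
  schedule: "0 6 * * *",
};
```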
Semantic split:

- route `depends`: intra-pipeline route DAG
- pipeline `depends`: inter-pipeline orchestration DAG
- route `out`: route output/materialization hints
- pipeline `outputs`: persisted orchestration outputs
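The split can be illustrated with two minimal definitions. The route shape (`id`, `depends`, `out`) is a simplified assumption here, not the real route type from `pipeline-core`:

```typescript
// Illustrative contrast between the two DAG levels; types simplified.
interface RouteSketch {
  id: string;
  depends?: string[]; // intra-pipeline route DAG
  out?: string;       // route-level output/materialization hint
}

const ingest = {
  id: "ingest",
  routes: [
    { id: "fetch", out: "raw" },
    { id: "aggregate", depends: ["fetch"], out: "dailyTotals" },
  ] as RouteSketch[],
  // pipeline-level outputs: persisted for downstream orchestration
  outputs: { dailyTotals: { schema: {} } },
};

const reporting = {
  id: "reporting",
  // pipeline-level depends: inter-pipeline orchestration DAG
  depends: [{ pipelineId: "ingest", outputs: ["dailyTotals"] }],
};
```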
## Responsibilities

### `pipeline-executor`
- validate pipeline-level dependency graphs across a set of pipelines
- detect missing upstream pipelines, missing upstream outputs, and pipeline dependency cycles
- build a pipeline DAG and execute pipelines in topological order
- pass upstream pipeline `outputs` into downstream pipelines during the same orchestration run
- expose pipeline-level dependency inputs through a dedicated execution context API
The executor should not own durable persistence. It should operate on pipeline definitions, injected upstream output state, and orchestration-run-local produced outputs.
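The validation and ordering responsibilities can be sketched as follows. This is a minimal stand-in using Kahn's algorithm, with simplified types; it is not the executor's actual API:

```typescript
// Validates a pipeline set (missing upstreams, missing upstream outputs,
// cycles) and returns a dependency-ordered execution plan.
interface PipelineDef {
  id: string;
  depends?: { pipelineId: string; outputs: string[] }[];
  outputs?: Record<string, unknown>;
}

function planOrchestration(pipelines: PipelineDef[]): string[] {
  const byId = new Map(pipelines.map((p) => [p.id, p] as [string, PipelineDef]));
  for (const p of pipelines) {
    for (const dep of p.depends ?? []) {
      const up = byId.get(dep.pipelineId);
      if (!up) throw new Error(`${p.id}: missing upstream ${dep.pipelineId}`);
      for (const out of dep.outputs) {
        if (!(out in (up.outputs ?? {}))) {
          throw new Error(`${p.id}: upstream ${up.id} has no output ${out}`);
        }
      }
    }
  }
  // Kahn's algorithm; leftover nodes indicate a dependency cycle.
  const indegree = new Map(
    pipelines.map((p) => [p.id, (p.depends ?? []).length] as [string, number]),
  );
  const downstream = new Map<string, string[]>();
  for (const p of pipelines) {
    for (const dep of p.depends ?? []) {
      downstream.set(dep.pipelineId, [...(downstream.get(dep.pipelineId) ?? []), p.id]);
    }
  }
  const queue = pipelines.filter((p) => indegree.get(p.id) === 0).map((p) => p.id);
  const order: string[] = [];
  while (queue.length) {
    const id = queue.shift()!;
    order.push(id);
    for (const next of downstream.get(id) ?? []) {
      const d = indegree.get(next)! - 1;
      indegree.set(next, d);
      if (d === 0) queue.push(next);
    }
  }
  if (order.length !== pipelines.length) throw new Error("pipeline dependency cycle");
  return order;
}
```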
### `pipeline-server`
- persist successful pipeline `outputs`
- store execution trigger metadata and lineage
- scan schedules and create orchestration runs
- hydrate executor input with previously persisted upstream `outputs` for runs triggered later
- launch orchestration runs from cron and future upstream-completion triggers
Recommended persisted metadata:
- `trigger: "manual" | "schedule" | "pipeline"`
- `triggerSourceExecutionId?: string`
- `triggerSourcePipelineId?: string`
- `scheduledAt?: Date`
Persisted pipeline outputs should be keyed by:
- workspace
- pipeline id
- execution id
- version
- output id
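One way to realize the composite key is a plain join; the `/` delimiter and field order here are assumptions, not part of the proposal:

```typescript
// Composite key for a persisted pipeline output, per the list above.
interface OutputKey {
  workspace: string;
  pipelineId: string;
  executionId: string;
  version: string;
  outputId: string;
}

function outputKey(k: OutputKey): string {
  return [k.workspace, k.pipelineId, k.executionId, k.version, k.outputId].join("/");
}
```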
## Execution model

### Same-run orchestration
- server resolves the pipeline set
- executor validates pipeline-level `depends`
- executor runs the pipeline DAG in dependency order
- upstream pipeline `outputs` are made available to downstream pipelines in-memory
- on success, server persists declared pipeline `outputs`
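The same-run flow reduces to a loop over the dependency-ordered pipeline list, threading produced outputs in-memory. `run` here is a stand-in for the real executor entry point:

```typescript
// Orchestration-run-local output flow: each pipeline sees only outputs
// produced earlier in the same run; persistence happens afterwards.
type Outputs = Record<string, Record<string, unknown>>;

function runOrchestration(
  order: { id: string; run: (upstream: Outputs) => Record<string, unknown> }[],
): Outputs {
  const produced: Outputs = {}; // in-memory, scoped to this run
  for (const p of order) {
    produced[p.id] = p.run(produced);
  }
  return produced; // the server persists declared outputs from here
}
```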
### Scheduled or delayed orchestration
- server identifies due pipelines
- server loads persisted upstream `outputs` needed for the orchestration run
- executor runs the selected pipeline set with hydrated upstream dependency inputs
- successful outputs are persisted again for future runs
Downstream pipelines should not read prior `PipelineExecutionResult.data` directly.
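Server-side hydration for a delayed run can be sketched as loading the latest persisted value for each declared upstream output. The store interface is an assumption:

```typescript
// Hydrates executor dependency inputs from persisted outputs; fails fast
// when a required upstream output has never been persisted.
interface OutputStore {
  latest(pipelineId: string, outputId: string): unknown | undefined;
}

function hydrateUpstreamOutputs(
  depends: { pipelineId: string; outputs: string[] }[],
  store: OutputStore,
): Record<string, Record<string, unknown>> {
  const hydrated: Record<string, Record<string, unknown>> = {};
  for (const dep of depends) {
    hydrated[dep.pipelineId] = {};
    for (const out of dep.outputs) {
      const value = store.latest(dep.pipelineId, out);
      if (value === undefined) {
        throw new Error(`no persisted output ${dep.pipelineId}.${out}`);
      }
      hydrated[dep.pipelineId][out] = value;
    }
  }
  return hydrated;
}
```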
## Acceptance criteria

- `pipeline-core` supports pipeline-level `depends`, `outputs`, and `schedule`
- `pipeline-executor` can validate and run a multi-pipeline DAG in dependency order
- downstream pipelines can consume explicit upstream `outputs`
- `pipeline-server` can persist pipeline outputs and launch cron-based orchestration runs
- existing single-pipeline execution continues to work unchanged
- route-level DAGs and route artifacts keep current behavior
## Initial defaults

- trigger mode is `onSuccess`
- version propagation is “same versions as upstream”
- schedules are server-managed cron schedules
- pipeline outputs are version-scoped by default
- missed-schedule backfill is out of scope for v1