feat(taskspawner): taskTemplate labels and annotations with template rendering #810
aslakknutsen wants to merge 1 commit into kelos-dev:main
Conversation
…rendering

Introduce spec.taskTemplate.labels and spec.taskTemplate.annotations on TaskSpawner. Each label and annotation value is expanded with source.RenderTemplate using the same work-item variables as branch and promptTemplate. Spawned Tasks still get kelos.dev/taskspawner; label values are fully rendered before applying metadata.

Annotation values are merged after rendering; GitHub source annotations are applied on top so reserved keys stay consistent.

Regenerate CRD and deepcopy. Add spawner unit tests for label and annotation rendering.

Made-with: Cursor
Signed-off-by: Aslak Knutsen <aslak@4fs.no>
e749d97 to b2a35bf (Compare)
1 issue found across 7 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="cmd/kelos-spawner/main_test.go">
<violation number="1" location="cmd/kelos-spawner/main_test.go:955">
P2: The new annotation-template test does not validate conflict precedence for reserved source annotations, so merge-order regressions could slip through.
</violation>
</file>
Since this is your first cubic review, here's how it works:
- cubic automatically reviews your code and comments on bugs and improvements
- Teach cubic by replying to its comments. cubic learns from your replies and gets better over time
- Add one-off context when rerunning by tagging @cubic-dev-ai with guidance or docs links (including llms.txt)
- Ask questions if you need clarification on any suggestion
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
    "example.com/issue": "{{.Number}}",
    "example.com/note":  "Issue-{{.Number}}",
P2: The new annotation-template test does not validate conflict precedence for reserved source annotations, so merge-order regressions could slip through.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At cmd/kelos-spawner/main_test.go, line 955:
<comment>The new annotation-template test does not validate conflict precedence for reserved source annotations, so merge-order regressions could slip through.</comment>
<file context>
@@ -856,6 +912,83 @@ func TestRunCycleWithSource_BranchTemplateRendered(t *testing.T) {
+func TestRunCycleWithSource_AnnotationTemplateRendered(t *testing.T) {
+ ts := newTaskSpawner("spawner", "default", nil)
+ ts.Spec.TaskTemplate.Annotations = map[string]string{
+ "example.com/issue": "{{.Number}}",
+ "example.com/note": "Issue-{{.Number}}",
+ }
</file context>
Suggested change:

    - "example.com/issue": "{{.Number}}",
    - "example.com/note": "Issue-{{.Number}}",
    + reporting.AnnotationSourceKind: "user-overridden",
    + reporting.AnnotationSourceNumber: "999",
    + "example.com/issue": "{{.Number}}",
    + "example.com/note": "Issue-{{.Number}}",
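The precedence this suggestion exercises can be sketched as a standalone snippet. Note this is a minimal illustration, not the spawner's actual code: mergeAnnotations and the literal kelos.dev/source-* keys are hypothetical stand-ins for the spawner's merge step and the reporting.AnnotationSource* constants.

```go
package main

import (
	"fmt"
	"maps"
)

// mergeAnnotations applies source-provided annotations on top of the
// user's already-rendered template annotations, so reserved keys
// always end up with the source's values.
func mergeAnnotations(rendered, source map[string]string) map[string]string {
	out := maps.Clone(rendered)
	for k, v := range source {
		out[k] = v
	}
	return out
}

func main() {
	// User template output, including an attempt to override reserved keys.
	rendered := map[string]string{
		"kelos.dev/source-kind":   "user-overridden", // hypothetical reserved key
		"kelos.dev/source-number": "999",
		"example.com/issue":       "42",
	}
	// Annotations the GitHub source wants to enforce.
	source := map[string]string{
		"kelos.dev/source-kind":   "Issue",
		"kelos.dev/source-number": "42",
	}
	got := mergeAnnotations(rendered, source)
	// Reserved keys win; non-conflicting user keys survive.
	fmt.Println(got["kelos.dev/source-kind"], got["kelos.dev/source-number"], got["example.com/issue"])
	// Output: Issue 42 42
}
```

A test asserting exactly this merged result would catch any future change that flips the merge order.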
Thanks for the PR.
@gjkim42 tl;dr: it makes it quick to experiment with and prototype new features by exposing additional metadata at the Task level for other controllers to read and react to, as opposed to expanding the Spec surface for everything. In my specific case I've built a set of features on top of Kelos; the ability to add a little more metadata to the Task makes this richer and easier.
gjkim42
left a comment
Looks great
Can we add those in podOverrides.metadata.labels and annotations?
@gjkim42 The "Render" would still need to happen in the TaskSpawner, so the metadata would be found on Task.Spec.podOverrides.metadata.labels/annotations instead of Task.metadata.labels/annotations, correct? That should be just fine for my use case.
What type of PR is this?
/kind feature
What this PR does / why we need it:
Introduce spec.taskTemplate.labels and spec.taskTemplate.annotations on TaskSpawner. Each label and annotation value is expanded with source.RenderTemplate using the same work-item variables as branch and promptTemplate. Spawned Tasks still get kelos.dev/taskspawner; label values are fully rendered before applying metadata.
Annotation values are merged after rendering; GitHub source annotations are applied on top so reserved keys stay consistent.
Regenerate CRD and deepcopy. Add spawner unit tests for label and annotation rendering.
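The render-then-merge order described above can be sketched as follows. This is a simplified standalone sketch: renderAll is a hypothetical stand-in for source.RenderTemplate, and the work-item variables and annotation keys are illustrative, not the spawner's actual definitions.

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// renderAll expands each metadata value as a Go template against the
// work-item variables, mirroring how each label/annotation value is
// rendered before being applied to the spawned Task.
func renderAll(in map[string]string, vars any) (map[string]string, error) {
	out := make(map[string]string, len(in))
	for k, v := range in {
		tmpl, err := template.New(k).Parse(v)
		if err != nil {
			return nil, err
		}
		var buf bytes.Buffer
		if err := tmpl.Execute(&buf, vars); err != nil {
			return nil, err
		}
		out[k] = buf.String()
	}
	return out, nil
}

func main() {
	vars := struct{ Number int }{Number: 42}

	// Step 1: render the user's taskTemplate.annotations.
	rendered, err := renderAll(map[string]string{
		"example.com/note":        "Issue-{{.Number}}",
		"kelos.dev/source-number": "999", // user attempt at a reserved key (hypothetical)
	}, vars)
	if err != nil {
		panic(err)
	}

	// Step 2: apply GitHub source annotations on top,
	// so reserved keys stay consistent.
	sourceAnnotations := map[string]string{"kelos.dev/source-number": "42"}
	for k, v := range sourceAnnotations {
		rendered[k] = v
	}

	fmt.Println(rendered["example.com/note"])        // Issue-42
	fmt.Println(rendered["kelos.dev/source-number"]) // 42
}
```

The same two-step order (render user values first, overlay source-owned keys last) is what keeps reserved annotations authoritative regardless of what templates the user writes.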
Which issue(s) this PR is related to:
N/A
Special notes for your reviewer:
Does this PR introduce a user-facing change?
Summary by cubic
Add spec.taskTemplate.labels and spec.taskTemplate.annotations to TaskSpawner with template rendering from work items. Spawned Tasks always get the kelos.dev/taskspawner label; GitHub source annotations override conflicts.
- Values are rendered with source.RenderTemplate using the same variables as branch and promptTemplate.
- Sets kelos.dev/taskspawner to the TaskSpawner name (overrides any user value).

Written for commit b2a35bf. Summary will update on new commits.