Commit 3064c12

chore: add canonical .claude/rules and .claude/skills from structured-coding-with-ai
1 parent 856cc36 commit 3064c12

---
name: git-workflow
description: Git workflow for AI-assisted advocacy development — atomic commits per subtask, ephemeral branches, PR curation into reviewable chunks, AI-Assisted tagging
---

# Git Workflow

## When to Use
- Before committing, branching, or creating a pull request
- When reviewing commit history or deciding on a merge strategy
- After an AI agent has generated a batch of changes that need to be broken into logical units

## Process
### Step 1: Create an Ephemeral Branch

Create a short-lived branch for the current task. Trunk-based development remains the goal — this branch is a safety net, not a long-lived workspace. If the agent has not produced mergeable work within one session, delete the branch and reconsider the approach. Name the branch for the task, not for a feature epic.
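The step above can be sketched in shell; the throwaway repo and the branch name `task/extract-storage-interface` are hypothetical examples:

```shell
# Minimal sketch: start an ephemeral, task-named branch.
set -e
repo=$(mktemp -d)            # throwaway repo so the sketch is self-contained
cd "$repo"
git init -q -b main
git config user.email "dev@example.com"
git config user.name "Dev"
git commit -q --allow-empty -m "chore: initial commit"

# Name the branch for the task, not a feature epic
git switch -q -c task/extract-storage-interface
git branch --show-current
```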
### Step 2: Implement One Subtask

Break the overall task into the smallest logical subtasks. Each subtask is one commit. If the task decomposes into "extract interface, implement adapter, update callers," those are three separate commits. Never let the agent complete an entire multi-step task before committing.
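A minimal sketch of that decomposition as three separate commits, with hypothetical file names in a throwaway repo:

```shell
# One commit per logical subtask: extract, implement, update callers.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email "dev@example.com"; git config user.name "Dev"

echo "interface" > storage_iface.py
git add storage_iface.py
git commit -q -m "refactor: extract storage interface"

echo "adapter" > s3_adapter.py
git add s3_adapter.py
git commit -q -m "feat: implement S3 adapter"

echo "callers" > ingest.py
git add ingest.py
git commit -q -m "refactor: update callers to use the interface"

git rev-list --count HEAD   # one commit per subtask
```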
### Step 3: Test Before Committing

Run the relevant test subset before each commit. Every commit must leave the codebase in a passing state. If tests fail, fix the issue before committing — do not push broken commits to shared branches. For advocacy code handling investigation or evidence data, also verify that no sensitive data has leaked into test output, logs, or error messages.
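One way to wire this as a simple gate, with `true` standing in for the project's real test command (e.g. a pytest subset):

```shell
# Sketch: commit only when the relevant test subset passes.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email "dev@example.com"; git config user.name "Dev"

echo "retry logic" > uploader.py
git add uploader.py

test_cmd="true"   # hypothetical placeholder for the real test runner
if $test_cmd; then
  git commit -q -m "feat: add retry to uploader"
else
  echo "tests failed; commit aborted" >&2
  exit 1
fi
git rev-list --count HEAD
```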
### Step 4: Write the Commit Message

Write commit messages that explain WHY, not WHAT — the code shows what changed. Keep the first line to 50 characters or fewer, in the imperative mood. Reference the issue or ticket. Add appropriate AI attribution trailers so the team knows which code was agent-generated.
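A sketch of such a message; the issue number (#123) and the `Assisted-by` trailer name are hypothetical conventions, not standards — use whatever your team has agreed on:

```shell
# Sketch: WHY-focused message with an AI attribution trailer.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email "dev@example.com"; git config user.name "Dev"

echo "retry" > uploader.py
git add uploader.py
git commit -q \
  -m "feat: add retry to evidence uploader" \
  -m "Transient network errors were killing ingestion jobs mid-batch;
retrying with backoff lets a batch survive a dropped connection.

Refs: #123" \
  -m "Assisted-by: Claude (AI agent)"

git log -1 --format=%B
```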
### Step 5: Repeat for Each Subtask

Continue the cycle: implement one subtask, test, commit. Each commit should be independently understandable. If you read the commit in isolation six months from now, you should know what it does and why.
### Step 6: Curate the Pull Request

PR curation is the critical human skill in AI-assisted development. By one industry measure, AI adoption has inflated average PR size by 154%. Do not submit the agent's full output as one PR. Split it into reviewable chunks:
- Target under 200 lines changed per PR, ideally under 100
- Use stacked PRs for large changes (PR1, PR2, PR3 — each independently reviewable)
- Each PR tells a coherent story, with a clear description explaining the reasoning
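The stacked-PR shape can be sketched as branches where each is based on the previous one, so each PR's diff contains only its own commits (branch names are hypothetical):

```shell
# Sketch: a three-branch stack, each PR reviewable on its own.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email "dev@example.com"; git config user.name "Dev"
git commit -q --allow-empty -m "chore: init"

git switch -q -c pr1-extract-interface main
git commit -q --allow-empty -m "refactor: extract interface"

git switch -q -c pr2-implement-adapter pr1-extract-interface
git commit -q --allow-empty -m "feat: implement adapter"

git switch -q -c pr3-update-callers pr2-implement-adapter
git commit -q --allow-empty -m "refactor: update callers"

# PR2 reviewed against PR1 shows exactly one commit of its own
git rev-list --count pr1-extract-interface..pr2-implement-adapter
```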
### Step 7: Tag and Request Review

- Tag every PR containing AI-generated code as **AI-Assisted**
- Require two human approvals for primarily AI-generated PRs
- Call out areas needing close review — especially security boundaries, error handling, and any code touching investigation or coalition data
### Step 8: Track Quality Signals
41+
- **Code Survival Rate** — how much AI-generated code remains unchanged 48 hours after merge. Low survival means the agent is generating code that humans immediately rewrite.
42+
- **Suggestion acceptance rate** — healthy range is 25-35%. Higher may indicate over-reliance on AI suggestions without critical review. In advocacy projects, over-reliance means unreviewed security assumptions.
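The survival-rate arithmetic, with hypothetical line counts; in practice the inputs would be derived from `git blame` against commits tagged AI-Assisted:

```shell
# Sketch of the survival-rate calculation (counts are hypothetical).
ai_lines_merged=420       # lines added by AI-assisted commits
ai_lines_surviving=310    # of those, still unchanged 48h after merge
survival=$(( 100 * ai_lines_surviving / ai_lines_merged ))
echo "code survival rate: ${survival}%"
```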
### Merge Strategy
Squash-merge ephemeral branches to keep trunk history clean. Delete branches immediately after merge. If a branch has lived longer than one working session, evaluate whether the approach needs to change rather than extending the branch's life.
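A sketch of the squash-merge-then-delete cycle in a throwaway repo (branch and file names are hypothetical):

```shell
# Sketch: collapse an ephemeral branch into one trunk commit, then delete it.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email "dev@example.com"; git config user.name "Dev"
git commit -q --allow-empty -m "chore: init"

git switch -q -c task/add-retry
echo "v1" > uploader.py; git add uploader.py; git commit -q -m "wip: first pass"
echo "v2" > uploader.py; git add uploader.py; git commit -q -m "feat: working retry"

git switch -q main
git merge -q --squash task/add-retry    # stage the branch as one change
git commit -q -m "feat: add retry to uploader"
git branch -q -D task/add-retry         # delete immediately after merge

git log --oneline main
```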
