225 changes: 42 additions & 183 deletions CLAUDE.md
@@ -1,189 +1,48 @@
# CLAUDE.md
# JadeFlow — AI-Native Workflow Orchestration

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Tech Stack
- Python 3.12+, Typer (CLI), SQLAlchemy + SQLite, FastAPI
- Package manager: uv
- Test runner: pytest
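The packaging glue this stack implies can be sketched as a minimal `pyproject.toml` — the real file isn't shown in this diff, so the entry-point path and dependency list are assumptions:

```toml
[project]
name = "jadeflow"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["typer", "sqlalchemy", "fastapi"]

[project.scripts]
jade = "jade.cli.app:app"   # exposes the `jade` CLI via the Typer app
```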

## Session Context (Jan 22, 2026)

This session created the `jadecli` project folder after the following work:
1. Fixed Claude Code hook configuration errors
2. Created merged AI-optimized dotfiles repo at https://github.com/agent2task/dotfiles
3. Added configs for Gemini CLI, Codex CLI, Ollama, CrewAI
4. Fixed WSL TMPDIR cross-filesystem issues for plugin installs
5. Installed taskmaster plugin (46 commands, 3 agents)

---

## Home Directory Overview (~/)

### Development Projects

| Directory | Purpose | Tech Stack |
|-----------|---------|------------|
| **jadecli/** | New project (this folder) | TBD |
| **claude-assist/** | Central brain for Claude Code workflows | Python 3.12+, PostgreSQL, Pydantic, FastAPI |
| **jarvis-tmux-mcp/** | FastMCP server with autonomous agents | Python 3.13, FastMCP 2.x, Textual, LanceDB |
| **defi-data-collection/** | DeFi data collection & trading platform | Python 3.11+, FastAPI, Databricks, Kafka |
| **toad-repo/** | TUI framework for AI/agent applications | Python, Textual, LangGraph, PostgreSQL |
| **dotfiles/** | Chezmoi-managed dotfiles | Shell, chezmoi |

### Configuration Directories

| Directory | Purpose |
|-----------|---------|
| **~/.claude/** | Claude Code config (settings, plugins, rules, hooks) |
| **~/.config/** | XDG app configs (ghostty, tmux, starship, nvim) |
| **~/.local/** | User binaries (mise, uv, zoxide) and tool data |
| **~/.cache/** | 7.5GB cache (uv 6.2GB, pip, pre-commit) |

### Empty/Placeholder

| Directory | Notes |
|-----------|-------|
| **~/projects/** | Contains alex-jobfinder, alexzh-august (empty) |
| **~/work/** | Same structure as projects (empty) |

---

## Key Tool Configs (~/.claude/)

### Rules (7 policy documents, 856 lines)
- **architecture-sync.md** - Schema/model/CLI/docs coherence
- **ralph-wigum-*.md** - RALPH WIGUM methodology (R-A-L-P-H W-I-G-U-M phases)
- **claude-assist-usage.md** - PostgreSQL memory system guidelines
- **security.md** - OWASP checks, secrets blocking
- **performance.md** - Data structures, N+1 avoidance
- **shell-preferences.md** - Always use zsh

### Commands (6 skills)
`/commit`, `/test`, `/sec`, `/perf`, `/db`, `/docs`

### Plugins Installed
- **ralph-loop** - Autonomous TDD loops
- **taskmaster** - Task management (46 commands, 3 agents)

### MCP Servers
- PostgreSQL at `localhost:5433/claude_memory`
- Sequential-Thinking server

---

## Development Environment

### Package Managers
- **uv** - Python packages (6.2GB cache)
- **mise** - Polyglot tools (1.3GB: go, node, python, rust, etc.)
- **npm** - Node packages

### Installed Tools (via mise)
bat, bun, delta, direnv, fd, fzf, go, jq, just, lazygit, node, pre-commit, python, ripgrep, rust, starship, uv, yq, zoxide

### Shell Environment
- **Shell**: zsh with oh-my-zsh
- **Prompt**: Starship (minimalist monochrome)
- **Terminal**: Ghostty (Dracula theme, 0.88 opacity, vim splits)
- **Multiplexer**: tmux (C-a prefix)

### Editor
- **VS Code** with Ghostty Dark theme, glass effect (0.90 opacity)
- **Neovim** available

### Rust Setup
- Toolchain: 1.92.0-x86_64-unknown-linux-gnu
- Global tools: delta, tokei
- Not primary language for this workspace

---

## Claude-Assist Integration

The user has a sophisticated PostgreSQL-backed memory system:

```bash
# Database
docker compose -f ~/claude-assist/docker/docker-compose.yml up -d

# CLI
claude-assist # TUI
da db status # Check database
da memory search "query" # BM25 full-text search
da session list # View sessions
da decision list # View decisions
```

### Key Tables
sessions, commits, decisions, tasks, ralph_loops, ralph_iterations, knowledge_base

---

## JARVIS-TMUX-MCP

FastMCP server for multi-client AI support:

```bash
cd ~/jarvis-tmux-mcp
just run # Start server
just test # Run tests
```

### Agents Available
Banner, Friday, Piper, Rocket, Rufus, Ultron

---

## Dotfiles (chezmoi)

```bash
chezmoi diff # Preview changes
chezmoi apply # Apply dotfiles
chezmoi add FILE # Track new file
```

### Key Mappings
- `dot_` prefix → leading `.` in the target name
- `executable_` prefix → target gets `chmod +x`
- `.tmpl` suffix → rendered as a template with `{{ .variable }}` data
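Concretely, names in the chezmoi source directory (`~/.local/share/chezmoi` by default) map to targets like this (file names illustrative):

```
dot_zshrc           → ~/.zshrc
executable_backup   → ~/backup       (mode +x)
dot_gitconfig.tmpl  → ~/.gitconfig   (template-rendered)
```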

---

## WSL Considerations

TMPDIR fix for cross-filesystem issues (already applied):
```bash
mkdir -p "$HOME/.cache/tmp"      # TMPDIR must exist, or tools fall back to /tmp
export TMPDIR="$HOME/.cache/tmp"
```

This prevents `EXDEV: cross-device link not permitted` errors when installing plugins.

---

## AI Tools Available

| Tool | Command | Notes |
|------|---------|-------|
| Claude Code | `claude` | Primary AI assistant |
| Gemini CLI | `gem` | Free tier: 1M-token context, 60 requests/min |
| Ollama | `ollama run llama3.2` | Local LLMs |
| CrewAI | `crewai` | Multi-agent orchestration |

### Custom Ollama Models
After running `ollama-setup`:
- `coder` - Coding assistant
- `reviewer` - Code review
- `architect` - System design

---

## Taskmaster Plugin Commands
## Commands
- Build: `uv build`
- Test: `pytest tests/ -v`
- Lint: `ruff check . && mypy jade/`
- Format: `ruff format .`

## Project Structure
```
/taskmaster:init-project # Initialize project
/taskmaster:list-tasks # View tasks
/taskmaster:add-task # Add task
/taskmaster:next-task # Get next task
/taskmaster:project-status # Overall status
jade/ — Main package (workflow orchestration engine)
cli/ — Typer CLI (jade init, run, status, logs, validate, graph)
core/ — DAG engine, executor, scheduler, state machine
db/ — SQLAlchemy models, migrations
config/ — Settings + feature flags
ai/ — Feature-flagged AI capabilities
plugins/ — Plugin base class, registry, builtins
api/ — FastAPI REST API (future)
jadecli/ — Legacy CLI toolkit (Click-based)
autopilot/ — Autonomous TDD module
tests/ — Test suite (mirrors jade/ structure)
examples/ — Example workflows
```
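A `depends_on` graph like the one the DAG engine consumes reduces to a topological sort; the standard library's `graphlib` illustrates the idea (a sketch of the technique, not JadeFlow's actual executor):

```python
from graphlib import TopologicalSorter

# Edges read "task -> set of tasks it depends on", mirroring depends_on.
deps = {
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']
```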

### Agents
- task-orchestrator
- task-executor
- task-checker
## Rules
- MUST follow TDD: write failing test -> implement -> verify pass
- MUST ask clarifying questions before implementing complex features
- MUST NOT modify test files during implementation phase
- MUST NOT touch files outside the current task's module
- MUST use Conventional Commits (feat:, fix:, docs:, chore:)
- SHOULD keep functions under 50 lines
- SHOULD keep files under 300 lines
- For module-specific patterns, see jade/<module>/CLAUDE.md
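The TDD rule above, in its smallest runnable form — using a hypothetical `slugify` helper that is not part of JadeFlow:

```python
# Step 1: write the test first. Running it at this point fails with
# NameError, because slugify does not exist yet.
def test_slugify():
    assert slugify("Hello World") == "hello-world"


# Step 2: implement just enough to make the test pass.
def slugify(text: str) -> str:
    return text.lower().replace(" ", "-")


# Step 3: verify — the test now passes silently.
test_slugify()
```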

## Verification (run after every change)
- `pytest tests/ -v` must pass
- `ruff check .` must pass
- `mypy jade/` must pass

## What Claude Gets Wrong (update this regularly)
- Tends to add error handling before asked — keep it simple first
- Over-engineers plugin systems — start with the simplest approach
- Forgets to update __init__.py exports when adding new modules
42 changes: 42 additions & 0 deletions examples/etl_pipeline.py
@@ -0,0 +1,42 @@
"""ETL Pipeline — extract, transform, load workflow."""

from jade.core import flow, task


@task(name="extract", retries=2, retry_delay=1.0)
def extract():
"""Simulate data extraction."""
data = {"rows": [1, 2, 3, 4, 5], "source": "database"}
print(f"Extracted {len(data['rows'])} rows")
return data


@task(name="transform", depends_on=["extract"])
def transform(upstream):
"""Transform extracted data."""
rows = upstream["extract"]["rows"]
transformed = [x * 10 for x in rows]
print(f"Transformed {len(transformed)} rows")
return {"rows": transformed}


@task(name="validate", depends_on=["transform"])
def validate(upstream):
"""Validate transformed data."""
rows = upstream["transform"]["rows"]
assert all(x >= 10 for x in rows), "Validation failed: values too small"
print(f"Validated {len(rows)} rows")
return {"valid": True, "count": len(rows)}


@task(name="load", depends_on=["validate"])
def load(upstream):
"""Load validated data to destination."""
count = upstream["validate"]["count"]
print(f"Loaded {count} rows to destination")
return {"loaded": count}


@flow(name="etl_pipeline", description="Extract, transform, validate, and load data")
def etl_pipeline():
return [extract, transform, validate, load]
21 changes: 21 additions & 0 deletions examples/hello_world.py
@@ -0,0 +1,21 @@
"""Hello World — simplest JadeFlow workflow."""

from jade.core import flow, task


@task(name="greet")
def greet():
print("Hello from JadeFlow!")
return {"message": "Hello, World!"}


@task(name="celebrate", depends_on=["greet"])
def celebrate(upstream):
msg = upstream["greet"]["message"]
print(f"Received: {msg}")
return {"status": "done"}


@flow(name="hello_world", description="A simple hello world workflow")
def hello_world():
return [greet, celebrate]
57 changes: 57 additions & 0 deletions examples/parallel_tasks.py
@@ -0,0 +1,57 @@
"""Parallel Tasks — fan-out / fan-in workflow pattern.

Note: currently runs sequentially but demonstrates the dependency pattern
for future parallel execution support.
"""

from jade.core import flow, task


@task(name="fetch_users")
def fetch_users():
"""Fetch user data."""
print("Fetching users...")
return {"users": ["alice", "bob", "charlie"]}


@task(name="fetch_orders")
def fetch_orders():
"""Fetch order data."""
print("Fetching orders...")
return {"orders": [101, 102, 103]}


@task(name="fetch_products")
def fetch_products():
"""Fetch product data."""
print("Fetching products...")
return {"products": ["widget", "gadget", "gizmo"]}


@task(name="merge_data", depends_on=["fetch_users", "fetch_orders", "fetch_products"])
def merge_data(upstream):
"""Merge all fetched data."""
users = upstream["fetch_users"]["users"]
orders = upstream["fetch_orders"]["orders"]
products = upstream["fetch_products"]["products"]
merged_msg = f"Merging {len(users)} users, {len(orders)} orders, {len(products)} products"
print(merged_msg)
return {
"users": users,
"orders": orders,
"products": products,
"total_records": len(users) + len(orders) + len(products),
}


@task(name="generate_report", depends_on=["merge_data"])
def generate_report(upstream):
"""Generate final report from merged data."""
total = upstream["merge_data"]["total_records"]
print(f"Report: {total} total records processed")
return {"report": f"Processed {total} records"}


@flow(name="parallel_pipeline", description="Fan-out/fan-in data pipeline")
def parallel_pipeline():
return [fetch_users, fetch_orders, fetch_products, merge_data, generate_report]
3 changes: 3 additions & 0 deletions jade/__init__.py
@@ -0,0 +1,3 @@
"""JadeFlow — AI-Native Workflow Orchestration Engine."""

__version__ = "0.1.0"
13 changes: 13 additions & 0 deletions jade/ai/CLAUDE.md
@@ -0,0 +1,13 @@
# jade/ai — AI Capabilities

## Status: Feature-flagged, not yet implemented

## Planned Features
- Natural language -> workflow generation
- AI-assisted error recovery
- Natural language queries against run history

## Boundaries
- ALL features MUST be behind feature flags
- MUST NOT be imported by jade.core or jade.db
- Only jade.cli may call into this module
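A minimal sketch of what that flag boundary could look like — the flag name `JADE_AI_ENABLED` and both function names are assumptions, not the real config API:

```python
import os


def ai_enabled() -> bool:
    # Hypothetical environment flag; the real flag would live in jade.config.
    return os.environ.get("JADE_AI_ENABLED", "0") == "1"


def generate_workflow(prompt: str) -> str:
    # Guard every AI entry point behind the flag.
    if not ai_enabled():
        raise RuntimeError("AI features are disabled (feature flag off)")
    return f"workflow for: {prompt}"  # placeholder for NL -> workflow generation
```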
1 change: 1 addition & 0 deletions jade/ai/__init__.py
@@ -0,0 +1 @@
"""AI capabilities (feature-flagged)."""
1 change: 1 addition & 0 deletions jade/api/__init__.py
@@ -0,0 +1 @@
"""FastAPI REST API (future)."""
14 changes: 14 additions & 0 deletions jade/cli/CLAUDE.md
@@ -0,0 +1,14 @@
# jade/cli — Typer CLI

## Files
- app.py — Main Typer application and entry point
- commands/ — One file per command group (init, run, status, logs, validate, graph)

## Interfaces
- `app` — The Typer app instance, used as entry point
- Each command file exports a function registered with @app.command()

## Boundaries
- MUST NOT contain business logic — delegate to jade.core
- MUST NOT directly access database — use jade.db repository
- CLI is a thin wrapper: parse args, call core, format output
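A dependency-free sketch of that boundary — argparse stands in for Typer so the snippet is self-contained, and `core_status` is a stand-in for a real jade.core call:

```python
import argparse


def core_status() -> dict:
    """Stand-in for business logic that belongs in jade.core."""
    return {"flows": 2, "state": "idle"}


def cmd_status(_args: argparse.Namespace) -> str:
    """CLI layer: call core, format output — no logic of its own."""
    info = core_status()
    return f"{info['flows']} flows, {info['state']}"


def main(argv: list[str]) -> str:
    parser = argparse.ArgumentParser(prog="jade")
    sub = parser.add_subparsers(dest="command", required=True)
    sub.add_parser("status").set_defaults(func=cmd_status)
    args = parser.parse_args(argv)
    return args.func(args)


print(main(["status"]))  # 2 flows, idle
```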
1 change: 1 addition & 0 deletions jade/cli/__init__.py
@@ -0,0 +1 @@
"""Typer-based CLI for JadeFlow."""