@@ -10,34 +10,32 @@ The application is structured around four core pillars:
 3. **The Handler Layer** (Execution & Routing)
 4. **The State Layer** (Configuration, Context & Managers)
 
-```
-┌─────────────────────────────────────────────────────────────────┐
-│ NullApp (app.py) │
-│ Main Orchestrator & Event Hub │
-├─────────────────────────────────────────────────────────────────┤
-│ ┌─────────────┐ ┌─────────────┐ ┌─────────────────────────┐ │
-│ │ Widgets │ │ Screens │ │ Handlers │ │
-│ │ (blocks/) │ │ (modals) │ │ Input│AI│CLI Executor │ │
-│ └──────┬──────┘ └──────┬──────┘ └───────────┬─────────────┘ │
-│ │ │ │ │
-│ └────────────────┴─────────────────────┘ │
-│ │ │
-├──────────────────────────┼──────────────────────────────────────┤
-│ ┌───────────────────────┴───────────────────────────────────┐ │
-│ │ Managers Layer │ │
-│ │ AIManager │ MCPManager │ ProcessManager │ AgentManager │ │
-│ └───────────────────────────────────────────────────────────┘ │
-├─────────────────────────────────────────────────────────────────┤
-│ ┌───────────────────────────────────────────────────────────┐ │
-│ │ Provider Layer (ai/) │ │
-│ │ Ollama │ OpenAI │ Anthropic │ Bedrock │ ... (10+ more) │ │
-│ └───────────────────────────────────────────────────────────┘ │
-├─────────────────────────────────────────────────────────────────┤
-│ ┌───────────────────────────────────────────────────────────┐ │
-│ │ Storage Layer │ │
-│ │ SQLite (null.db) │ Config │ Encryption │ │
-│ └───────────────────────────────────────────────────────────┘ │
-└─────────────────────────────────────────────────────────────────┘
+```mermaid
+graph TD
+    NullApp[NullApp<br>app.py<br>Main Orchestrator & Event Hub]
+
+    subgraph UI_Layer [UI Layer]
+        Widgets[Widgets<br>blocks/]
+        Screens[Screens<br>modals]
+        Handlers["Handlers<br>Input | AI | CLI Executor"]
+    end
+
+    subgraph Managers_Layer [Managers Layer]
+        Managers["AIManager | MCPManager | ProcessManager | AgentManager"]
+    end
+
+    subgraph Provider_Layer [Provider Layer ai/]
+        Providers["Ollama | OpenAI | Anthropic | Bedrock | ..."]
+    end
+
+    subgraph Storage_Layer [Storage Layer]
+        Storage["SQLite null.db | Config | Encryption"]
+    end
+
+    NullApp --> UI_Layer
+    UI_Layer --> Managers_Layer
+    Managers_Layer --> Provider_Layer
+    Provider_Layer --> Storage_Layer
 ```
 
 ---
@@ -81,21 +79,35 @@ class NullApp(App):
 
 Every interaction creates a **Block**, a distinct visual unit in the history.
 
-```
-BlockState (models.py) BaseBlockWidget (base.py)
-┌──────────────────────┐ ┌──────────────────────┐
-│ type: BlockType │───────>│ block: BlockState │
-│ content_input: str │ │ update_output() │
-│ content_output: str │ │ set_loading() │
-│ metadata: dict │ │ update_metadata() │
-│ tool_calls: list │ └──────────────────────┘
-│ iterations: list │ │
-└──────────────────────┘ │
- ┌──────────┴──────────┐
- ┌───────┴───────┐ ┌───────┴───────┐
- │ CommandBlock │ │AIResponseBlock│
- │ (CLI output) │ │ (AI response) │
- └───────────────┘ └───────────────┘
+```mermaid
+classDiagram
+    class BlockState {
+        +BlockType type
+        +str content_input
+        +str content_output
+        +dict metadata
+        +list tool_calls
+        +list iterations
+    }
+
+    class BaseBlockWidget {
+        +BlockState block
+        +update_output()
+        +set_loading()
+        +update_metadata()
+    }
+
+    class CommandBlock {
+        +CLI output
+    }
+
+    class AIResponseBlock {
+        +AI response
+    }
+
+    BaseBlockWidget *-- BlockState : binds
+    BaseBlockWidget <|-- CommandBlock
+    BaseBlockWidget <|-- AIResponseBlock
 ```
 
 **Block Types:**
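The state/widget pairing in the class diagram can be sketched in plain Python. This is a hypothetical simplification: field and method names are taken from the diagram, but the real `BaseBlockWidget` is a TUI widget, not a bare class.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class BlockType(Enum):
    COMMAND = auto()
    AI_RESPONSE = auto()


@dataclass
class BlockState:
    """Data for one history block (fields as in the diagram)."""
    type: BlockType
    content_input: str = ""
    content_output: str = ""
    metadata: dict = field(default_factory=dict)
    tool_calls: list = field(default_factory=list)
    iterations: list = field(default_factory=list)


class BaseBlockWidget:
    """Binds a BlockState; real subclasses render it in the TUI."""

    def __init__(self, block: BlockState) -> None:
        self.block = block
        self.loading = False

    def update_output(self, text: str) -> None:
        # Streaming appends partial text to the block's output.
        self.block.content_output += text

    def set_loading(self, value: bool) -> None:
        self.loading = value

    def update_metadata(self, **kwargs) -> None:
        self.block.metadata.update(kwargs)
```

The one-way binding (widget holds state, never the reverse) matches the composition arrow in the diagram.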
@@ -126,59 +138,78 @@ Handlers decouple execution logic from the UI.
 
 ### Input Routing (`handlers/input.py`)
 
-```
-User Input
-│
-▼
-InputHandler.handle_submission()
-│
-├── Starts with "/" ──────────> SlashCommandHandler
-│
-├── AI Mode enabled ──────────> AIExecutor
-│
-└── CLI Mode ─────────────────> CLIExecutor
+```mermaid
+graph TD
+    Input[User Input] --> Submit[InputHandler.handle_submission]
+    Submit -->|Starts with /| Slash[SlashCommandHandler]
+    Submit -->|AI Mode| AIExec[AIExecutor]
+    Submit -->|CLI Mode| CLIExec[CLIExecutor]
 ```
 
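The routing above reduces to a small dispatch function. A minimal sketch — the handler names come from the diagram, but the `ai_mode` flag is an assumed stand-in for however the app tracks its current mode:

```python
def route(text: str, ai_mode: bool) -> str:
    """Dispatch order from the diagram: slash commands win,
    then the current mode selects the executor."""
    if text.startswith("/"):
        return "SlashCommandHandler"
    if ai_mode:
        return "AIExecutor"
    return "CLIExecutor"
```

The ordering matters: a `/model` command must never reach the AI or CLI executors, so the slash check comes first.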
 ### AI Executor (`handlers/ai_executor.py`)
 
 This is the complexity hotspot: it manages three execution modes.
 
 **1. Standard Chat**
-```
-User Prompt → LLM → Stream Response → Update Block
+```mermaid
+sequenceDiagram
+    participant User
+    participant LLM
+    participant Block
+
+    User->>LLM: Prompt
+    LLM-->>Block: Stream Response
+    Block-->>Block: Update
 ```
 
 **2. Tool-Augmented Chat** (max 3 iterations)
-```
-User Prompt → LLM → Tool Call?
-│
-├── Yes → Execute Tool → Feed Result → LLM → ...
-│
-└── No → Stream Response → Done
+```mermaid
+sequenceDiagram
+    participant User
+    participant LLM
+    participant Tool
+
+    User->>LLM: Prompt
+    loop Up to 3 iterations
+        alt Tool call requested
+            LLM->>Tool: Execute tool
+            Tool-->>LLM: Feed result back
+        else No tool call
+            LLM-->>User: Stream response
+        end
+    end
 ```
 
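The tool-augmented loop above can be sketched as follows. The `llm` and `execute_tool` callables are hypothetical stand-ins for the real provider and MCP manager calls; only the loop shape (feed results back until the model answers or the cap is hit) comes from the diagram:

```python
def tool_chat(llm, execute_tool, prompt: str, max_iterations: int = 3) -> str:
    """Tool-augmented chat: re-prompt the LLM with each tool result
    until it answers in plain text or the iteration cap is reached."""
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_iterations):
        reply = llm(messages)  # stand-in; returns {"text": ..., "tool_call": ...}
        if reply.get("tool_call") is None:
            return reply["text"]                    # no tool call: done
        result = execute_tool(reply["tool_call"])   # execute the requested tool
        messages.append({"role": "tool", "content": result})  # feed result back
    return "[max iterations reached]"
```

Capping iterations keeps a confused model from looping on tool calls forever.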
 **3. Agent Mode** (max 10 iterations)
-```
-Task → LLM (with tools) → Think → Act → Observe → Repeat until done
-│ │ │
-│ │ └── Tool result
-│ └── Tool call
-└── Reasoning (ThinkingWidget)
+```mermaid
+graph TD
+    Task[Task] --> LLM
+    LLM -->|Think| Reasoning[Reasoning<br>ThinkingWidget]
+    LLM -->|Act| ToolCall[Tool call]
+    ToolCall -->|Observe| ToolResult[Tool result]
+    ToolResult --> LLM
+    Reasoning --> AgentBlock
+
+    subgraph Loop [Max 10 Iterations]
+        LLM
+        Reasoning
+        ToolCall
+        ToolResult
+    end
 ```
 
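The think/act/observe cycle can be sketched as a driver loop. Everything here except the loop structure and the 10-iteration cap is an assumption: `llm_step` and `run_tool` are hypothetical stand-ins, and the returned `trace` models what the ThinkingWidget would display:

```python
def run_agent(llm_step, run_tool, task: str, max_iterations: int = 10):
    """Agent loop: each step yields reasoning plus either an action
    to observe or a final answer. Stops at the iteration cap."""
    observations = []  # tool results fed back into the next step
    trace = []         # reasoning lines, shown in a ThinkingWidget
    for _ in range(max_iterations):
        step = llm_step(task, observations)  # {"thought", "action", "answer"}
        trace.append(step["thought"])        # Think
        if step.get("action") is None:
            return step["answer"], trace     # done
        observations.append(run_tool(step["action"]))  # Act, then Observe
    return None, trace                       # gave up after the cap
```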
 ### CLI Executor (`handlers/cli_executor.py`)
 
 Executes shell commands via PTY:
 
-```
-Command → ProcessManager.spawn() → PTY Process
-│
-┌───────────────────────┴───────────────────────┐
-│ │
-Simple Output TUI Application
-│ │
-CommandBlock TerminalBlock
-(Static rendering) (Full PTY emulation)
+```mermaid
+graph TD
+    Command --> Spawn[ProcessManager.spawn]
+    Spawn --> PTY[PTY Process]
+    PTY --> Simple[Simple Output]
+    PTY --> TUI[TUI Application]
+    Simple --> CmdBlock[CommandBlock<br>static rendering]
+    TUI --> TermBlock[TerminalBlock<br>full PTY emulation]
 ```
 
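Spawning a command attached to a pseudo-terminal looks roughly like this on POSIX systems, using only the standard library. It is a minimal sketch of what a `ProcessManager.spawn`-style helper does, not the project's actual implementation — the real manager keeps the process alive and forwards keystrokes for TUI apps rather than reading to EOF:

```python
import os
import pty
import subprocess


def run_in_pty(argv: list[str]) -> str:
    """Run a command with its stdio attached to a pseudo-terminal and
    return everything it printed (simple-output case only)."""
    master, slave = pty.openpty()
    proc = subprocess.Popen(argv, stdin=slave, stdout=slave, stderr=slave,
                            close_fds=True)
    os.close(slave)  # parent keeps only the master side
    chunks = []
    while True:
        try:
            data = os.read(master, 4096)
        except OSError:  # on Linux, EOF on a pty master raises EIO
            break
        if not data:
            break
        chunks.append(data)
    os.close(master)
    proc.wait()
    return b"".join(chunks).decode(errors="replace")
```

Programs detect the pty via `isatty()` and emit color and progress output exactly as they would in a real terminal, which is the point of spawning through a PTY instead of a plain pipe.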
 ---
@@ -187,20 +218,23 @@ Command → ProcessManager.spawn() → PTY Process
 
 ### Provider Architecture (`ai/`)
 
-```
-LLMProvider (base.py)
-│
-├── validate_connection() → bool
-├── list_models() → list[str]
-├── supports_tools() → bool
-└── generate() → AsyncGenerator[StreamChunk, None]
-
-StreamChunk
-│
-├── text: str # Partial response text
-├── tool_calls: list # Tool call requests
-├── is_complete: bool # Stream finished
-└── usage: TokenUsage # Token counts
+```mermaid
+classDiagram
+    class LLMProvider {
+        +validate_connection() bool
+        +list_models() list~str~
+        +supports_tools() bool
+        +generate() AsyncGenerator~StreamChunk~
+    }
+
+    class StreamChunk {
+        +str text
+        +list tool_calls
+        +bool is_complete
+        +TokenUsage usage
+    }
+
+    LLMProvider ..> StreamChunk : yields
 ```
 
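The provider contract above can be sketched as an abstract base class plus a toy implementation. Names mirror the diagram; the `EchoProvider`, the message shape, and the `usage` field's type are illustrative assumptions, not the project's actual code:

```python
import asyncio
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


@dataclass
class StreamChunk:
    """One streamed piece of a response (fields as in the diagram)."""
    text: str = ""
    tool_calls: list = field(default_factory=list)
    is_complete: bool = False
    usage: object = None  # token counts, provider-specific


class LLMProvider(ABC):
    @abstractmethod
    def validate_connection(self) -> bool: ...

    @abstractmethod
    def list_models(self) -> list: ...

    @abstractmethod
    def supports_tools(self) -> bool: ...

    @abstractmethod
    def generate(self, messages):
        """Async generator yielding StreamChunk values."""


class EchoProvider(LLMProvider):
    """Toy provider: streams the last user message back word by word."""

    def validate_connection(self) -> bool:
        return True

    def list_models(self) -> list:
        return ["echo-1"]

    def supports_tools(self) -> bool:
        return False

    async def generate(self, messages):
        for word in messages[-1]["content"].split():
            yield StreamChunk(text=word + " ")
        yield StreamChunk(is_complete=True)


async def collect(provider: LLMProvider, messages) -> str:
    """Drain a provider's stream into one string, as a block update loop would."""
    parts = []
    async for chunk in provider.generate(messages):
        if chunk.is_complete:
            break
        parts.append(chunk.text)
    return "".join(parts)
```

Because every provider yields the same `StreamChunk` shape, the AI executor can stream from Ollama, OpenAI, or Anthropic through one code path.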
 ### Supported Providers
@@ -248,20 +282,17 @@ Extracts reasoning from different model formats:
 
 ### MCP Integration (`mcp/`)
 
-```
-MCPManager
-│
-├── MCPConfig (mcp.json)
-│ │
-│ └── Server configs (command, args, env)
-│
-├── MCPClient (per server)
-│ │
-│ ├── connect() → Spawn process, handshake
-│ ├── list_tools() → Available tools
-│ └── call_tool(name, args) → Result
-│
-└── get_all_tools() → Merged tool list for LLM
+```mermaid
+graph TD
+    MCPManager --> Config[MCPConfig<br>mcp.json]
+    MCPManager --> Client[MCPClient<br>per server]
+    Config --> Client
+
+    Client --> Connect[connect<br>spawn process, handshake]
+    Client --> ListTools[list_tools]
+    Client --> CallTool[call_tool]
+
+    MCPManager --> Tools[get_all_tools<br>merged tool list for LLM]
 ```
 
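The `get_all_tools` merge step can be sketched like this. The per-server name prefix is an assumption for illustration — some disambiguation scheme is needed when two servers export a tool with the same name, but the project's actual format may differ:

```python
def merge_tools(servers: dict) -> dict:
    """Merge per-server tool lists into one flat mapping, prefixing each
    tool name with its server so the LLM addresses a unique tool."""
    merged = {}
    for server_name, tools in servers.items():
        for tool in tools:
            merged[f"{server_name}.{tool['name']}"] = tool
    return merged
```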
 ### Configuration (`config/`)
@@ -280,12 +311,12 @@ MCPManager
 
 Builds the message array for LLM calls:
 
-```python
-ContextManager.build_messages(history_blocks, max_tokens) -> ContextInfo
-│
-├── Converts blocks to messages
-├── Truncates long outputs
-└── Trims oldest messages if over limit
+```mermaid
+graph TD
+    Build[ContextManager.build_messages] --> Convert[Convert blocks to messages]
+    Build --> Truncate[Truncate long outputs]
+    Build --> Trim[Trim oldest messages]
+    Convert & Truncate & Trim --> Context[ContextInfo]
 ```
 
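The three steps above can be sketched end to end. This is a minimal sketch under stated assumptions: the block shape, the character limit, and the `len(text) // 4` token approximation are all illustrative — the real manager's accounting and `ContextInfo` return type will differ:

```python
def build_messages(blocks: list, max_tokens: int, max_output_chars: int = 200) -> list:
    """Convert blocks to messages, truncate long outputs, then drop the
    oldest messages until the (approximate) token budget is met."""
    messages = []
    for block in blocks:
        messages.append({"role": "user", "content": block["input"]})
        output = block["output"]
        if len(output) > max_output_chars:  # truncate long outputs
            output = output[:max_output_chars] + "\n[truncated]"
        messages.append({"role": "assistant", "content": output})

    def approx_tokens(msgs):
        # Crude heuristic: ~4 characters per token.
        return sum(len(m["content"]) // 4 for m in msgs)

    while messages and approx_tokens(messages) > max_tokens:
        messages.pop(0)  # trim oldest first, keeping recent context
    return messages
```

Trimming from the front preserves the most recent exchanges, which matter most for the next completion.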
 ---