feat: add --prompt flag for headless single-shot execution#79
velocitybolt wants to merge 2 commits into mindsdb:main from
Conversation
Enable programmatic usage of Anton via CLI flags:
--prompt/-p Run a single query and exit
--output-format Return text (default) or structured JSON
--stdin Read prompt from stdin for piping
Headless mode reuses the full ChatSession pipeline (memory, tools,
scratchpad, datasources) without any interactive elements — no
prompt_toolkit, no spinner, no escape watcher.
JSON output includes response text, tool calls, and token usage
for easy integration with orchestration platforms and CI pipelines.
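The JSON payload described above could be assembled along these lines (a sketch; the field names and `build_headless_payload` helper are illustrative, not the PR's actual schema):

```python
import json


def build_headless_payload(response_text: str, tool_calls: list, usage_data: dict) -> str:
    """Assemble the structured JSON result for --output-format json.

    Field names here are illustrative; the PR's actual schema may differ.
    """
    return json.dumps(
        {
            "response": response_text,
            "tool_calls": tool_calls,
            "usage": usage_data,
        },
        indent=2,
    )


print(build_headless_payload("hello", [{"name": "search"}], {"total_tokens": 42}))
```

An orchestrator or CI job can then parse this output directly instead of scraping interactive terminal output.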
Thank you @velocitybolt, this is a great initiative. @ZoranPandovski can you please review?
ZoranPandovski left a comment
Thanks @velocitybolt. Please check my comments.
anton/chat.py (Outdated)

    tool_calls: list[dict] = []
    usage_data: dict = {}

    async for event in session.turn_stream(prompt):
Let's add try/except and make sure we print the error to stderr
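A sketch of what that could look like, wrapping the stream loop (`session.turn_stream` is the PR's API; the `handle` callback and exit code are illustrative):

```python
import sys


async def run_turn(session, prompt, handle=print):
    """Stream one turn; on failure, print the error to stderr and exit non-zero.

    `session.turn_stream` comes from the PR; `handle` and the exit code
    are illustrative assumptions.
    """
    try:
        async for event in session.turn_stream(prompt):
            handle(event)
    except Exception as exc:
        print(f"error: {exc}", file=sys.stderr)
        raise SystemExit(1)
```

Keeping the error on stderr leaves stdout clean for the text or JSON result, which matters when the output is piped.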
anton/chat.py (Outdated)

    asyncio.run(_chat_loop(console, settings, resume=resume, first_run=first_run, desktop_first_run=desktop_first_run))


def run_headless(
Currently we are trying to decouple the chat.py script, so it would be better if you can migrate this under a new file in the commands/ dir.
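That move could look roughly like this (a hypothetical layout; the module path and function bodies are placeholders, not the PR's code):

```python
# anton/commands/headless.py -- hypothetical new home for the headless
# entry point, keeping chat.py decoupled as this review suggests.
import asyncio


async def _run(settings, prompt, output_format):
    # The real implementation would drive the ChatSession pipeline here.
    return None


def run_headless(settings, prompt, output_format="text"):
    """Single-shot entry point, kept out of chat.py."""
    return asyncio.run(_run(settings, prompt, output_format))
```

The CLI entry point in chat.py would then just import and call `run_headless`.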
    resume: bool = typer.Option(
        False, "--resume", "-r", help="Resume a previous chat session"
    ),
    prompt: str | None = typer.Option(
What will happen if a user passes --prompt and --stdin simultaneously? Maybe we catch this and return an error that they are mutually exclusive.
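One way to catch that (a sketch; the `validate_prompt_sources` helper and exit code are illustrative, and in the real typer CLI this could raise `typer.BadParameter` instead):

```python
import sys


def validate_prompt_sources(prompt, use_stdin):
    """Reject --prompt combined with --stdin.

    Illustrative helper; a typer app could raise typer.BadParameter here
    so the framework formats the error consistently.
    """
    if prompt is not None and use_stdin:
        print("error: --prompt and --stdin are mutually exclusive", file=sys.stderr)
        raise SystemExit(2)
```

Failing fast with a clear message is friendlier than silently preferring one source over the other.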
    prompt: str | None = typer.Option(
        None, "--prompt", "-p", help="Run a single prompt in headless mode and exit"
    ),
    output_format: str = typer.Option(
Maybe we add a guard in case this is used outside headless mode.
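Such a guard might look like this (a sketch; `check_output_format` and its signature are illustrative, assuming headless mode is implied by --prompt or --stdin):

```python
def check_output_format(output_format, prompt, use_stdin):
    """Reject a non-default --output-format outside headless mode.

    Illustrative guard: headless mode is assumed to be implied by
    --prompt or --stdin being set.
    """
    headless = prompt is not None or use_stdin
    if output_format != "text" and not headless:
        raise SystemExit("--output-format requires --prompt or --stdin")
```

This keeps the interactive session from silently accepting a flag that has no effect there.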
All contributors have signed the CLA ✍️ ✅
I have read the CLA Document and I hereby sign the CLA

recheck
@ZoranPandovski anything else we need here? Should this merge go into v2 instead?
Running Example:
