
Adding select_by_llm from HTML #3

Open
guyernest wants to merge 6 commits into main from
fix/polling-region-and-simplify-tool-secrets

Conversation

@guyernest
Owner

No description provided.

guyernest and others added 6 commits January 31, 2026 07:26
Add sequential script execution to Test screen for testing persistent
browser sessions. Users can now run two scripts back-to-back with the
browser kept alive between runs.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…er.py

The persistent browser session feature passes --server-mode to whichever
Python wrapper is selected. On Windows with browser_engine=computer_agent,
this routes to computer_agent_wrapper.py, which was missing the flag.

Delegates server mode to OpenAIPlaywrightExecutor.run_server_mode().

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
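The fix above can be sketched as follows. This is a hypothetical outline of the wrapper's entry point (the real computer_agent_wrapper.py is not shown in this PR); the point is that the wrapper must recognize --server-mode and delegate instead of choking on an unknown flag:

```python
import argparse

def main(argv=None):
    # Hypothetical sketch: accept --server-mode so the persistent browser
    # session feature can pass it to any wrapper without crashing.
    parser = argparse.ArgumentParser()
    parser.add_argument("--server-mode", action="store_true")
    # parse_known_args lets other wrapper-specific flags pass through.
    args, _rest = parser.parse_known_args(argv)
    if args.server_mode:
        # In the real wrapper this would delegate to
        # OpenAIPlaywrightExecutor.run_server_mode().
        pass
    return args.server_mode
```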
Two fixes for persistent browser session on Windows:
1. Rust: Compact JSON to single line before writing to stdin. Pretty-printed
   JSON with newlines breaks Python's readline()-based NDJSON protocol.
2. Python: Set UTF-8 encoding on stdin/stdout in server mode on Windows.
   Default cp1252 encoding causes OSError on pipe writes.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…erver mode (v0.5.2)

- Add select_by_llm action: extracts all <select> options via DOM, sends
  to text-only LLM for matching, uses select_option() to fire native
  change event (fixes Angular submit button staying disabled)
- Harden server mode: wrap execute_script in try/except to prevent
  unhandled errors from killing the persistent Python process
- Add browser health check with auto-recovery in server mode loop
- Use json.dumps(default=str) to handle non-serializable result objects
- Update address selection strategies: regex (fast) -> select_by_llm
  (reliable) -> vision LLM (fallback)
- Bump version to 0.5.2

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…e extraction (v0.5.4)

Replace CSS-selector-based extract_dom (72% failure rate) with extract_by_llm
that sends cleaned page HTML to the LLM for field extraction. Strips non-content
elements, truncates to 50K chars, and uses JSON mode for structured output.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
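The cleaning step can be sketched with the standard-library parser (a simplified stand-in for whatever extract_by_llm actually uses): drop script/style and other non-content elements, keep visible text, and truncate to 50K characters before sending to the LLM:

```python
from html.parser import HTMLParser

SKIP_TAGS = {"script", "style", "noscript", "svg", "head"}
MAX_CHARS = 50_000  # matches the 50K truncation described above

class ContentExtractor(HTMLParser):
    """Hypothetical sketch: keep only visible text for LLM extraction."""
    def __init__(self):
        super().__init__()
        self.depth = 0      # >0 while inside a skipped element
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in SKIP_TAGS:
            self.depth += 1
    def handle_endtag(self, tag):
        if tag in SKIP_TAGS and self.depth:
            self.depth -= 1
    def handle_data(self, data):
        if self.depth == 0 and data.strip():
            self.chunks.append(data.strip())

def clean_html(html: str) -> str:
    parser = ContentExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)[:MAX_CHARS]
```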
…(v0.5.5)

- Add Windows UTF-8 stderr encoding to prevent crashes from Unicode characters
- Fix escalate strategies that silently succeeded without doing anything; they now fail explicitly with NotImplementedError
- Add validation for empty strategy steps
- Truncate long error messages (Playwright timeouts) to 500 chars to prevent I/O issues
- Replace broken escalate fallback with working wait-and-retry select_by_llm

The escalate feature was never implemented in workflow_executor but strategies
with escalate were marking themselves as successful, causing workflows to
continue with unselected addresses and eventually crash.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
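The fail-loudly behavior can be sketched as follows (hypothetical names; the real dispatch lives in workflow_executor). Empty step lists are rejected up front, and an escalate step raises instead of reporting success:

```python
def run_strategy(strategy: dict) -> bool:
    # Hypothetical sketch of the hardened strategy runner.
    steps = strategy.get("steps")
    if not steps:
        # Validation for empty strategy steps, per the commit above.
        raise ValueError("strategy has no steps")
    for step in steps:
        if step["action"] == "escalate":
            # escalate was never implemented; fail explicitly rather than
            # marking the strategy successful and continuing with an
            # unselected address.
            raise NotImplementedError("escalate is not implemented")
        # ... dispatch other actions here ...
    return True
```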