Add documentation for using OpenHands LLM provider with the SDK (#186)
* Add documentation for using OpenHands LLM provider with the SDK
- Document how to use OpenHands API key with the SDK
- Show environment variable configuration (LLM_API_KEY, LLM_MODEL)
- Explain that openhands/ prefix auto-configures the base URL
- Include Python code examples
- List available models with openhands/ prefix
- Add network requirements note for firewall restrictions
* Fix gemini-3-pro-preview cached input cost to match LiteLLM DB
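The auto-configuration behavior mentioned above (the `openhands/` prefix selecting the base URL) can be sketched conceptually as follows. This is an illustrative model only, not the actual SDK implementation; the function name, parameter names, and the placeholder URL are all hypothetical:

```python
# Illustrative sketch of prefix-based base-URL auto-configuration.
# Everything here is hypothetical; the real SDK internals may differ.
OPENHANDS_PREFIX = "openhands/"
OPENHANDS_BASE_URL = "https://llm.example.invalid/v1"  # hypothetical placeholder URL

def resolve_base_url(model: str, explicit_base_url=None):
    """Return an API base URL: an explicit value wins; otherwise the
    openhands/ model prefix selects the OpenHands endpoint automatically."""
    if explicit_base_url is not None:
        return explicit_base_url
    if model.startswith(OPENHANDS_PREFIX):
        return OPENHANDS_BASE_URL
    # No prefix match: fall back to the provider's default endpoint.
    return None
```

The design point is that users only pick a model name; the prefix carries enough information to route requests without a separately configured endpoint.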
openhands/usage/llms/openhands-llms.mdx (+44 −1: 44 additions, 1 deletion)
@@ -33,6 +33,49 @@ When running OpenHands, you'll need to set the following in the OpenHands UI thr
When you use OpenHands as an LLM provider in the CLI, we may collect minimal usage metadata and send it to All Hands AI. For details, see our Privacy Policy: https://openhands.dev/privacy
</Note>

## Using OpenHands LLM Provider with the SDK

You can use your OpenHands API key with the [OpenHands SDK](https://docs.openhands.dev/sdk) to build custom agents and automation pipelines.

### Configuration

The SDK automatically configures the correct API endpoint when you use the `openhands/` model prefix. Simply set two environment variables:
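For example, the two variables named in this change (`LLM_API_KEY` and `LLM_MODEL`) can be set from Python. The key and model name below are hypothetical placeholders; substitute your real OpenHands API key and a model from the available `openhands/` list:

```python
import os

# Hypothetical values for illustration; use your real OpenHands API key
# and an actual model name carrying the openhands/ prefix.
os.environ["LLM_API_KEY"] = "your-openhands-api-key"
os.environ["LLM_MODEL"] = "openhands/claude-sonnet"

# With the openhands/ prefix, the SDK selects the OpenHands endpoint
# automatically, so no base URL needs to be configured.
```

Equivalently, these can be exported in your shell before launching the SDK-based program.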