109 changes: 82 additions & 27 deletions docs/chat_models.md
@@ -1,14 +1,22 @@
# Chat Models

UiPath provides two chat models, `UiPathAzureChatOpenAI` and `UiPathChat`. These are compatible with LangGraph as drop-in replacements. You do not need to add tokens from OpenAI or Anthropic; usage of these chat models consumes `Agent Units` on your account.
UiPath provides chat model classes compatible with LangChain and LangGraph as drop-in replacements. You do not need provider API keys — usage consumes Agent Units on your account.

## UiPathAzureChatOpenAI
To see the models available for your account, run:

`UiPathAzureChatOpenAI` can be used as a drop-in replacement for `ChatOpenAI` or `AzureChatOpenAI`.
```bash
uipath list-models
```

/// note
`UiPathChat` and `UiPathAzureChatOpenAI` are legacy classes and will be phased out in a future release. `UiPathAzureChatOpenAI` has been renamed to `UiPathChatOpenAI` (same class, new name). When you can, migrate existing code to one of the provider-specific classes below.
///

### Example usage
## UiPathChatOpenAI

Here is code using `ChatOpenAI`:
Drop-in replacement for `ChatOpenAI` or `AzureChatOpenAI`. This is the new name for `UiPathAzureChatOpenAI`.

Original code using `ChatOpenAI`:

```python
from langchain_openai import ChatOpenAI
@@ -26,13 +34,13 @@ llm = ChatOpenAI(
)
```

You can simply replace `ChatOpenAI` with `UiPathAzureChatOpenAI`; you don't have to provide an OpenAI token.
Swap `ChatOpenAI` for `UiPathChatOpenAI` — no OpenAI token needed:

```python
from uipath_langchain.chat.models import UiPathAzureChatOpenAI
from uipath_langchain.chat import UiPathChatOpenAI

llm = UiPathAzureChatOpenAI(
model="gpt-4.1-mini-2025-04-14",
llm = UiPathChatOpenAI(
model="<model>",
temperature=0,
max_tokens=4000,
timeout=30,
@@ -41,17 +49,11 @@ llm = UiPathAzureChatOpenAI(
)
```

Currently, the following models can be used with `UiPathAzureChatOpenAI` (this list can be updated in the future):

- `gpt-4`, `gpt-4-1106-Preview`, `gpt-4-32k`, `gpt-4-turbo-2024-04-09`, `gpt-4-vision-preview`, `gpt-4o-2024-05-13`, `gpt-4o-2024-08-06`, `gpt-4o-mini-2024-07-18`, `gpt-4.1-mini-2025-04-14`, `o3-mini-2025-01-31`

## UiPathChat
## UiPathChatAnthropicBedrock

`UiPathChat` is a more versatile class that can support models from different vendors, including OpenAI.
Drop-in replacement for `ChatAnthropic` (or `ChatBedrock` when using Anthropic models).

### Example usage

Given the following code:
Original code using `ChatAnthropic`:

```python
from langchain_anthropic import ChatAnthropic
@@ -66,13 +68,13 @@ llm = ChatAnthropic(
)
```

You can replace it with `UiPathChat` like so:
Swap for `UiPathChatAnthropicBedrock`:

```python
from uipath_langchain.chat.models import UiPathChat
from uipath_langchain.chat import UiPathChatAnthropicBedrock

llm = UiPathChat(
model="anthropic.claude-3-opus-20240229-v1:0",
llm = UiPathChatAnthropicBedrock(
model_id="<model>",
temperature=0,
max_tokens=1024,
timeout=None,
@@ -81,13 +83,66 @@ llm = UiPathChat(
)
```

Currently, the following models can be used with `UiPathChat` (this list may be updated in the future):
## UiPathChatBedrockConverse

- `anthropic.claude-3-5-sonnet-20240620-v1:0`, `anthropic.claude-3-5-sonnet-20241022-v2:0`, `anthropic.claude-3-7-sonnet-20250219-v1:0`, `anthropic.claude-3-haiku-20240307-v1:0`, `gemini-1.5-pro-001`, `gemini-2.0-flash-001`, `gpt-4o-2024-05-13`, `gpt-4o-2024-08-06`, `gpt-4o-2024-11-20`, `gpt-4o-mini-2024-07-18`, `o3-mini-2025-01-31`
Drop-in replacement for `ChatBedrockConverse` from `langchain_aws`. Supports models from multiple providers via the AWS Bedrock Converse API.

/// warning
Please note that you may get errors related to data residency, as some models are not available in all regions.
Original code using `ChatBedrockConverse`:

Example: `[Enforced Region] No model configuration found for product uipath-python-sdk in EU using model anthropic.claude-3-opus-20240229-v1:0`.
```python
from langchain_aws import ChatBedrockConverse

llm = ChatBedrockConverse(
model="anthropic.claude-3-5-sonnet-20240620-v1:0",
temperature=0,
max_tokens=1024,
# other params...
)
```

Swap for `UiPathChatBedrockConverse`:

```python
from uipath_langchain.chat import UiPathChatBedrockConverse

llm = UiPathChatBedrockConverse(
model="<model>",
temperature=0,
max_tokens=1024,
# other params...
)
```

## UiPathChatVertex

Drop-in replacement for `ChatGoogleGenerativeAI` or `ChatVertexAI`.

Original code using `ChatGoogleGenerativeAI`:

```python
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(
model="gemini-2.0-flash-001",
temperature=0,
max_tokens=1024,
# other params...
)
```

Swap for `UiPathChatVertex`:

```python
from uipath_langchain.chat import UiPathChatVertex

llm = UiPathChatVertex(
model="<model>",
temperature=0,
max_tokens=1024,
# other params...
)
```

/// warning
Some models may not be available in all regions due to data residency restrictions. Use `uipath list-models` to see which models are available in your region.
///
4 changes: 2 additions & 2 deletions src/uipath_langchain/_cli/_templates/main.py.template
@@ -1,9 +1,9 @@
from langchain_core.messages import HumanMessage, SystemMessage
from langgraph.graph import START, StateGraph, END
from uipath_langchain.chat import UiPathChat
from uipath_langchain.chat import UiPathChatOpenAI
from pydantic import BaseModel

llm = UiPathChat(model="gpt-4o-mini-2024-07-18")
llm = UiPathChatOpenAI(model="gpt-4o-2024-11-20")


class GraphState(BaseModel):
21 changes: 14 additions & 7 deletions src/uipath_langchain/_resources/REQUIRED_STRUCTURE.md
@@ -27,15 +27,22 @@ class Output(BaseModel):

### Required LLM Initialization

Unless the user explicitly requests a different LLM provider, always use `UiPathChat`:
Unless the user explicitly requests a different LLM provider, always use `UiPathChatOpenAI`:

```python
from uipath_langchain.chat import UiPathChat
from uipath_langchain.chat import UiPathChatOpenAI

llm = UiPathChat(model="gpt-4.1-mini-2025-04-14", temperature=0.7)
llm = UiPathChatOpenAI(model="gpt-4o-2024-11-20", temperature=0.7)
```

**Alternative LLMs** (only use if explicitly requested):
To list available models, run `uipath list-models`.

**Alternative UiPath chat models** (require the matching extra: `uipath-langchain[bedrock]` or `uipath-langchain[vertex]`):
- `UiPathChatAnthropicBedrock` for Anthropic models via AWS Bedrock
- `UiPathChatBedrockConverse` for the AWS Bedrock Converse API (multi-provider)
- `UiPathChatVertex` for Google Vertex AI models

**Alternative LangChain LLMs** (only use if explicitly requested):
- `ChatOpenAI` from `langchain_openai`
- `ChatAnthropic` from `langchain_anthropic`
- Other LangChain-compatible LLMs
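Structured output works the same as with the upstream LangChain classes. A sketch under the assumption that `llm` is the `UiPathChatOpenAI` instance initialized above; the model calls are commented out because they need a UiPath-authenticated environment:

```python
from pydantic import BaseModel

# Schema the model's reply will be parsed into
class Answer(BaseModel):
    summary: str
    confidence: float

# structured_llm = llm.with_structured_output(Answer)
# answer = structured_llm.invoke("Summarize the plan and rate your confidence.")
# `answer` is then an Answer instance with typed fields
```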
@@ -47,7 +54,7 @@ Every agent should follow this basic structure:
```python
from langchain_core.messages import SystemMessage, HumanMessage
from langgraph.graph import START, StateGraph, END
from uipath_langchain.chat import UiPathChat
from uipath_langchain.chat import UiPathChatOpenAI
from pydantic import BaseModel

# 1. Define Input, State, and Output models
@@ -61,8 +68,8 @@ class State(BaseModel):
class Output(BaseModel):
result: str

# 2. Initialize UiPathChat LLM
llm = UiPathChat(model="gpt-4.1-mini-2025-04-14", temperature=0.7)
# 2. Initialize the LLM
llm = UiPathChatOpenAI(model="gpt-4o-2024-11-20", temperature=0.7)

# 3. Define agent nodes (async functions)
async def process_node(state: State) -> State:
4 changes: 2 additions & 2 deletions testcases/init-flow/expected_traces.json
@@ -14,10 +14,10 @@
}
},
{
"name": "UiPathChat",
"name": "UiPathChatOpenAI",
"attributes": {
"openinference.span.kind": "LLM",
"llm.provider": "uipath",
"llm.provider": "openai",
"llm.model_name": "*"
}
}