Commit 8937f6b

fix: more info on system prompt for agents (#1736)
1 parent 2e9101c commit 8937f6b

File tree

1 file changed (+60, −0 lines)

src/oss/langchain/agents.mdx

Lines changed: 60 additions & 0 deletions
@@ -464,10 +464,70 @@ const agent = createAgent({

:::python
When no @[`system_prompt`] is provided, the agent will infer its task from the messages directly.

The @[`system_prompt`] parameter accepts either a `str` or a @[`SystemMessage`]. Using a `SystemMessage` gives you more control over the prompt structure, which is useful for provider-specific features like [Anthropic's prompt caching](/oss/integrations/chat/anthropic#prompt-caching):
```python wrap
from langchain.agents import create_agent
from langchain.messages import SystemMessage, HumanMessage

literary_agent = create_agent(
    model="anthropic:claude-sonnet-4-5",
    system_prompt=SystemMessage(
        content=[
            {
                "type": "text",
                "text": "You are an AI assistant tasked with analyzing literary works.",
            },
            {
                "type": "text",
                "text": "<the entire contents of 'Pride and Prejudice'>",
                "cache_control": {"type": "ephemeral"},
            },
        ]
    ),
)

result = literary_agent.invoke(
    {"messages": [HumanMessage("Analyze the major themes in 'Pride and Prejudice'.")]}
)
```
The `cache_control` field with `{"type": "ephemeral"}` tells Anthropic to cache that content block, reducing latency and costs for repeated requests that use the same system prompt.
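The content blocks above are plain Python dictionaries, so they can also be assembled programmatically. The sketch below shows a small helper (hypothetical, not part of LangChain) that builds the same two-block structure, marking only the large document block as cacheable:

```python
def build_cached_system_content(instructions: str, document: str) -> list[dict]:
    """Assemble Anthropic-style content blocks for a system prompt,
    marking the large document block as ephemeral so the provider can
    cache it. (Hypothetical helper, shown for illustration only.)"""
    return [
        # Small instruction block: left uncached.
        {"type": "text", "text": instructions},
        # Large document block: flagged for ephemeral caching.
        {
            "type": "text",
            "text": document,
            "cache_control": {"type": "ephemeral"},
        },
    ]

content = build_cached_system_content(
    "You are an AI assistant tasked with analyzing literary works.",
    "<the entire contents of 'Pride and Prejudice'>",
)
```

The resulting list can be passed as the `content` of a `SystemMessage`, exactly as in the example above.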
:::

:::js
When no `systemPrompt` is provided, the agent will infer its task from the messages directly.

The `systemPrompt` parameter accepts either a `string` or a `SystemMessage`. Using a `SystemMessage` gives you more control over the prompt structure, which is useful for provider-specific features like [Anthropic's prompt caching](/oss/integrations/chat/anthropic#prompt-caching):
```ts wrap
import { createAgent } from "langchain";
import { SystemMessage, HumanMessage } from "@langchain/core/messages";

const literaryAgent = createAgent({
  model: "anthropic:claude-sonnet-4-5",
  systemPrompt: new SystemMessage({
    content: [
      {
        type: "text",
        text: "You are an AI assistant tasked with analyzing literary works.",
      },
      {
        type: "text",
        text: "<the entire contents of 'Pride and Prejudice'>",
        cache_control: { type: "ephemeral" },
      },
    ],
  }),
});

const result = await literaryAgent.invoke({
  messages: [new HumanMessage("Analyze the major themes in 'Pride and Prejudice'.")],
});
```
The `cache_control` field with `{ type: "ephemeral" }` tells Anthropic to cache that content block, reducing latency and costs for repeated requests that use the same system prompt.
:::

#### Dynamic system prompt
