
Commit ffdb217

fix: unclosed <Tab> issue (#1778)
1 parent 808fbed commit ffdb217


2 files changed: +31 -32 lines


src/oss/python/integrations/tools/parallel_search.mdx

Lines changed: 1 addition & 1 deletion
@@ -235,7 +235,7 @@ if "search_metadata" in result:
  We can use our tool in a chain by first binding it to a [tool-calling model](/oss/langchain/tools/) and then calling it:

- import ChatModelTabs from "@theme/ChatModelTabs";
+ import ChatModelTabs from '/snippets/chat-model-tabs.mdx';

  <ChatModelTabs customVarName="llm" />
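The unchanged context line above describes binding the tool to a tool-calling model and then calling it. As a rough illustration of that pattern (not part of this commit), here is a minimal sketch; the `parallel_search` tool and the model name are placeholders standing in for the ones on the docs page:

```python
# A minimal sketch (not part of this commit) of the pattern described above:
# bind a tool to a tool-calling chat model, then invoke the model so it can
# decide to call the tool. `parallel_search` below is a hypothetical stand-in
# for the Parallel Search tool documented in parallel_search.mdx, and the
# model name is only an example.
from langchain.chat_models import init_chat_model
from langchain_core.tools import tool


@tool
def parallel_search(query: str) -> str:
    """Hypothetical stand-in: run a web search and return results as text."""
    return f"results for: {query}"


# Any tool-calling chat model works here; this plays the role of the `llm`
# produced by <ChatModelTabs customVarName="llm" />.
llm = init_chat_model("gpt-4o-mini", model_provider="openai")
llm_with_tools = llm.bind_tools([parallel_search])

response = llm_with_tools.invoke("What changed in the latest LangChain release?")
print(response.tool_calls)  # any tool calls the model chose to make
```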

src/snippets/chat-model-tabs.mdx

Lines changed: 30 additions & 31 deletions
@@ -134,43 +134,42 @@
  model = ChatBedrock(model="anthropic.claude-3-5-sonnet-20240620-v1:0")
  ```
  </CodeGroup>
+ </Tab>
+ <Tab title="HuggingFace">
+ 👉 Read the [HuggingFace chat model integration docs](/oss/python/integrations/chat/huggingface/)

- <Tab title="HuggingFace">
- 👉 Read the [HuggingFace chat model integration docs](/oss/python/integrations/chat/huggingface/)
-
- ```shell
- pip install -U "langchain[huggingface]"
- ```
+ ```shell
+ pip install -U "langchain[huggingface]"
+ ```

- <CodeGroup>
- ```python init_chat_model
- import os
- from langchain.chat_models import init_chat_model
+ <CodeGroup>
+ ```python init_chat_model
+ import os
+ from langchain.chat_models import init_chat_model

- os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."
+ os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."

- model = init_chat_model(
-     "microsoft/Phi-3-mini-4k-instruct",
-     model_provider="huggingface",
-     temperature=0.7,
-     max_tokens=1024,
- )
- ```
+ model = init_chat_model(
+     "microsoft/Phi-3-mini-4k-instruct",
+     model_provider="huggingface",
+     temperature=0.7,
+     max_tokens=1024,
+ )
+ ```

- ```python Model Class
- import os
- from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint
+ ```python Model Class
+ import os
+ from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

- os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."
+ os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."

- llm = HuggingFaceEndpoint(
-     repo_id="microsoft/Phi-3-mini-4k-instruct",
-     temperature=0.7,
-     max_length=1024,
- )
- model = ChatHuggingFace(llm=llm)
- ```
- </CodeGroup>
- </Tab>
+ llm = HuggingFaceEndpoint(
+     repo_id="microsoft/Phi-3-mini-4k-instruct",
+     temperature=0.7,
+     max_length=1024,
+ )
+ model = ChatHuggingFace(llm=llm)
+ ```
+ </CodeGroup>
  </Tab>
  </Tabs>
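The net effect of the change above: the preceding provider tab is now closed before the HuggingFace tab opens, rather than the HuggingFace tab being nested inside it. A rough sketch of the corrected snippet structure (tab title for the preceding tab is assumed from the ChatBedrock code above, and contents are elided):

```mdx
<Tabs>
  ...
  <Tab title="Bedrock">  {/* title assumed from the ChatBedrock example above */}
    <CodeGroup>
      ...
    </CodeGroup>
  </Tab>  {/* the closing tag this commit adds */}
  <Tab title="HuggingFace">
    <CodeGroup>
      ...
    </CodeGroup>
  </Tab>
</Tabs>
```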
