
Commit 86a8fa1

docs(ollama): fix references to renamed tutorial file (#513)

1 parent: ba19aad

2 files changed: +4 −4 lines

docs/inference/configure.md (2 additions, 2 deletions)

```diff
@@ -82,7 +82,7 @@ $ openshell provider create \
 Use `--config OPENAI_BASE_URL` to point to any OpenAI-compatible server running where the gateway runs. For host-backed local inference, use `host.openshell.internal` or the host's LAN IP. Avoid `127.0.0.1` and `localhost`. Set `OPENAI_API_KEY` to a dummy value if the server does not require authentication.
 
 :::{tip}
-For a self-contained setup, the Ollama community sandbox bundles Ollama inside the sandbox itself — no host-level provider needed. See {doc}`/tutorials/local-inference-ollama` for details.
+For a self-contained setup, the Ollama community sandbox bundles Ollama inside the sandbox itself — no host-level provider needed. See {doc}`/tutorials/inference-ollama` for details.
 :::
 
 Ollama also supports cloud-hosted models using the `:cloud` tag suffix (e.g., `qwen3.5:cloud`).
@@ -189,7 +189,7 @@ A successful response confirms the privacy router can reach the configured backe
 Explore related topics:
 
 - To understand the inference routing flow and supported API patterns, refer to {doc}`index`.
-- To follow a complete Ollama-based local setup, refer to {doc}`/tutorials/local-inference-ollama`.
+- To follow a complete Ollama-based local setup, refer to {doc}`/tutorials/inference-ollama`.
 - To follow a complete LM Studio-based local setup, refer to {doc}`/tutorials/local-inference-lmstudio`.
 - To control external endpoints, refer to [Policies](/sandboxes/policies.md).
 - To manage provider records, refer to {doc}`../sandboxes/manage-providers`.
```
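The hunk context above shows an `openshell provider create` invocation pointing `OPENAI_BASE_URL` at a host-backed server. A minimal sketch of that setup might look like the following; the host/port values and any flags beyond `--config` (which does appear in the diff) are assumptions, not confirmed `openshell` syntax:

```shell
# Hypothetical sketch: register an OpenAI-compatible provider that targets
# an Ollama server running on the host. 11434 is Ollama's default port and
# /v1 its OpenAI-compatible path; treat both as assumptions for other servers.
openshell provider create \
  --config OPENAI_BASE_URL=http://host.openshell.internal:11434/v1 \
  --config OPENAI_API_KEY=dummy
```

Per the changed paragraph, `host.openshell.internal` (or the host's LAN IP) is used instead of `127.0.0.1`/`localhost`, since those would resolve inside the gateway rather than to the host, and `OPENAI_API_KEY` is a placeholder when the server requires no authentication.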

docs/tutorials/index.md (2 additions, 2 deletions)

````diff
@@ -45,7 +45,7 @@ Launch Claude Code in a sandbox, diagnose a policy denial, and iterate on a cust
 :::
 
 :::{grid-item-card} Inference with Ollama
-:link: local-inference-ollama
+:link: inference-ollama
 :link-type: doc
 
 Route inference through Ollama using cloud-hosted or local models, and verify it from a sandbox.
@@ -68,6 +68,6 @@ Route inference to a local LM Studio server via the OpenAI or Anthropic compatib
 
 First Network Policy <first-network-policy>
 GitHub Push Access <github-sandbox>
-Inference with Ollama <local-inference-ollama>
+Inference with Ollama <inference-ollama>
 Local Inference with LM Studio <local-inference-lmstudio>
 ```
````
