
Commit 022d25e

Enhance documentation for llm-model resource setup
Added explanation for llm-model resource usage and score-compose commands.

Signed-off-by: Mathieu Benoit <mathieu-benoit@hotmail.fr>
1 parent d535a51 commit 022d25e

1 file changed: 3 additions & 1 deletion


content/blog/score-examples-hub-announcement/index.md

````diff
@@ -50,11 +50,13 @@ Run:
 score-compose init \
   --provisioners https://raw.githubusercontent.com/score-spec/community-provisioners/refs/heads/main/llm-model/score-compose/10-dmr-llm-model-via-service-provider.provisioners.yaml
 
-score-compose generate score.yaml
+score-compose generate score.yaml -o compose.yaml
 
 docker compose up -d --wait
 ```
 
+In the `score.yaml` file, the Developer can request an `llm-model` resource and use it in their app by injecting the corresponding `url` generated when this Score file is deployed. Running `score-compose init --provisioners` downloads the actual implementation of the `llm-model` resource locally (you can [look at its definition here](https://docs.score.dev/examples/resource-provisioners/community/llm-model/score-compose/template/dmr-llm-model-via-service-provider/); it uses Docker Model Runner). Then, `score-compose generate` renders both the `my-container` and `llm-model` services into a `compose.yaml` file. Finally, `docker compose up` deploys them.
+
 ## Score resources provisioners examples
 
 Find examples of resources provisioners with either [`score-compose`](https://docs.score.dev/examples/resource-provisioners?implementation=score-compose) or [`score-k8s`](https://docs.score.dev/examples/resource-provisioners?implementation=score-k8s):
````
