@@ -9,8 +9,8 @@ import TabItem from "@theme/TabItem";

This command allows you to manage various engines available within Cortex.

-
**Usage**:
+
<Tabs>
  <TabItem value="MacOs/Linux" label="MacOs/Linux">
  ```sh
@@ -24,26 +24,25 @@ This command allows you to manage various engines available within Cortex.
  </TabItem>
</Tabs>
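
For instance, a typical invocation pairs `cortex engines` with one of the subcommands documented below (a sketch, assuming Cortex is installed and on your `PATH`):

```sh
# Show the table of available engines (see `cortex engines list` below)
cortex engines list
```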

-
**Options**:

| Option | Description | Required | Default value | Example |
|--------|-------------|----------|---------------|---------|
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
{/* | `-vk`, `--vulkan` | Install Vulkan engine. | No | `false` | `-vk` | */}

- ---
- # Subcommands:
+
## `cortex engines list`
+
:::info
This CLI command calls the following API endpoint:
- [List Engines](/api-reference#tag/engines/get/v1/engines)
:::
- This command lists all of Cortex's engines.
-

+ This command lists all of Cortex's engines.

**Usage**:
+
<Tabs>
  <TabItem value="MacOs/Linux" label="MacOs/Linux">
  ```sh
@@ -58,6 +57,7 @@ This command lists all of Cortex's engines.
</Tabs>

For example, it returns the following:
+
```
+---+--------------+-------------------+---------+----------------------------+---------------+
| # | Name         | Supported Formats | Version | Variant                    | Status        |
@@ -66,18 +66,19 @@ For example, it returns the following:
+---+--------------+-------------------+---------+----------------------------+---------------+
| 2 | llama-cpp    | GGUF              | 0.1.34  | linux-amd64-avx2-cuda-12-0 | Ready         |
+---+--------------+-------------------+---------+----------------------------+---------------+
- | 3 | tensorrt-llm | TensorRT Engines  |         |                            | Not Installed |
- +---+--------------+-------------------+---------+----------------------------+---------------+
```

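Since this command is a wrapper over the List Engines endpoint referenced above, the same information can be fetched over HTTP. A minimal sketch, assuming the Cortex API server is running locally (the port below is an assumption; substitute the host and port your installation uses):

```sh
curl http://127.0.0.1:39281/v1/engines
```
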
## `cortex engines get`
+
:::info
This CLI command calls the following API endpoint:
- [Get Engine](/api-reference#tag/engines/get/v1/engines/{name})
:::
+
This command returns the details of an engine specified by `engine_name`.

**Usage**:
+
<Tabs>
  <TabItem value="MacOs/Linux" label="MacOs/Linux">
  ```sh
@@ -92,18 +93,19 @@ This command returns the details of an engine specified by `engine_name`.
</Tabs>

For example, it returns the following:
+
```
+-----------+-------------------+---------+-----------+--------+
| Name      | Supported Formats | Version | Variant   | Status |
+-----------+-------------------+---------+-----------+--------+
| llama-cpp | GGUF              | 0.1.37  | mac-arm64 | Ready  |
+-----------+-------------------+---------+-----------+--------+
```
+
:::info
To get an engine name, run the [`engines list`](/docs/cli/engines/list) command.
:::
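
As a concrete illustration, the sample output above could come from querying the installed `llama-cpp` engine by name (exact values vary with your platform and installed version):

```sh
cortex engines get llama-cpp
```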

-
**Options**:

| Option | Description | Required | Default value | Example |
@@ -114,16 +116,18 @@ To get an engine name, run the [`engines list`](/docs/cli/engines/list) command.


## `cortex engines install`
+
:::info
This CLI command calls the following API endpoint:
- [Init Engine](/api-reference#tag/engines/post/v1/engines/{name}/init)
:::
+
- This command downloads the required dependencies and installs the engine within Cortex. Currently, Cortex supports three engines:
+ This command downloads the required dependencies and installs the engine within Cortex. Currently, Cortex supports two engines:
- `llama-cpp`
- `onnxruntime`
- - `tensorrt-llm`

**Usage**:
+
<Tabs>
  <TabItem value="MacOs/Linux" label="MacOs/Linux">
  ```sh
@@ -133,7 +137,6 @@ This command downloads the required dependencies and installs the engine within
  <TabItem value="Windows" label="Windows">
  ```sh
  cortex.exe engines install [options] <engine_name>
-
  ```
  </TabItem>
</Tabs>
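
For example, to install the default build of the `llama-cpp` engine (a sketch; the exact variant downloaded depends on your OS and hardware):

```sh
cortex engines install llama-cpp
```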
@@ -150,6 +153,7 @@ This command downloads the required dependencies and installs the engine within
This command uninstalls the engine within Cortex.

**Usage**:
+
<Tabs>
  <TabItem value="MacOs/Linux" label="MacOs/Linux">
  ```sh
@@ -164,6 +168,7 @@ This command uninstalls the engine within Cortex.
</Tabs>

For example:
+
```bash
## Llama.cpp engine
cortex engines uninstall llama-cpp