This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Commit aae83ae

Update Readme and the URL for the API documentation (#779)
1 parent 2f32dc5 commit aae83ae

File tree: 4 files changed, +88 −86 lines

README.md

Lines changed: 39 additions & 12 deletions
````diff
@@ -25,13 +25,45 @@ Cortex currently supports 3 inference engines:
 - [Docs](https://cortex.jan.ai/docs/)
 
 ## Quickstart
+### Prerequisites
+Ensure that your system meets the following requirements to run Cortex:
+- **Dependencies**:
+  - **Node.js**: version 18 and above is required to run the installation.
+  - **NPM**: Needed to manage packages.
+  - **CPU Instruction Sets**: Available for download from the [Cortex GitHub Releases](https://github.com/janhq/cortex/releases) page.
+- **OS**:
+  - MacOSX 13.6 or higher.
+  - Windows 10 or higher.
+  - Ubuntu 22.04 and later.
+
+> Visit [Quickstart](https://cortex.jan.ai/docs/quickstart) to get started.
+
+### NPM
+``` bash
+# Install using NPM
+npm i -g cortexso
+# Install using Brew
+brew tap janhq/cortexso
+brew install cortexso
+# Run model
+cortex run llama3
+# To uninstall globally using NPM
+npm uninstall -g cortexso
+```
 
-Visit [Quickstart](https://cortex.jan.ai/docs/quickstart) to get started.
-
+### Homebrew
 ``` bash
-npm i -g @janhq/cortex
+# Install using Brew
+brew tap janhq/cortexso
+brew install cortexso
+# Run model
 cortex run llama3
+# To uninstall using Brew
+brew uninstall cortexso
+brew untap janhq/cortexso
 ```
+> You can also install Cortex using the Cortex Installer available on [GitHub Releases](https://github.com/janhq/cortex/releases).
+
 To run Cortex as an API server:
 ```bash
 cortex serve
@@ -90,18 +122,13 @@ See [CLI Reference Docs](https://cortex.jan.ai/docs/cli) for more information.
 models start       Start a specified model.
 models stop        Stop a specified model.
 models update      Update the configuration of a specified model.
-```
-
-## Uninstall Cortex
-
-Run the following command to uninstall Cortex globally on your machine:
-
-```
-# Uninstall globally using NPM
-npm uninstall -g @janhq/cortex
+benchmark          Benchmark and analyze the performance of a specific AI model using your system.
+presets            Show all the available model presets within Cortex.
+telemetry          Retrieve telemetry logs for monitoring and analysis.
 ```
 
 ## Contact Support
 - For support, please file a GitHub ticket.
 - For questions, join our Discord [here](https://discord.gg/FTk2MvZwJH).
 - For long-form inquiries, please email [hello@jan.ai](mailto:hello@jan.ai).
+
````
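The README above describes Cortex as an OpenAI-compatible server started with `cortex serve`. A client request against it can be sketched as below; this is a hedged illustration, and the `/v1/chat/completions` path and payload shape are assumptions drawn from OpenAI API compatibility, not details confirmed by this commit.

```typescript
// Hypothetical client sketch for the OpenAI-compatible Cortex server.
// Assumptions (not from this commit): the server follows the OpenAI
// `/v1/chat/completions` route and chat request schema.
const CORTEX_BASE_URL = "http://localhost:1337"; // printed by `cortex serve`

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the URL and fetch options for a chat completion request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: `${CORTEX_BASE_URL}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

const req = buildChatRequest("llama3", [{ role: "user", content: "Hello!" }]);
// Sending it requires a running server (`cortex serve`):
// const res = await fetch(req.url, req.init);
// const data = await res.json();
```

Per the README's compatibility claim, the same request shape should also work with any generic OpenAI client library pointed at `http://localhost:1337`.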

cortex-js/README.md

Lines changed: 47 additions & 72 deletions
````diff
@@ -1,4 +1,4 @@
-# Cortex - CLI
+# Cortex
 <p align="center">
   <img alt="cortex-cpplogo" src="https://raw.githubusercontent.com/janhq/cortex/dev/assets/cortex-banner.png">
 </p>
@@ -11,91 +11,71 @@
 > ⚠️ **Cortex is currently in Development**: Expect breaking changes and bugs!
 
 ## About
-Cortex is an openAI-compatible local AI server that developers can use to build LLM apps. It is packaged with a Docker-inspired command-line interface and a Typescript client library. It can be used as a standalone server, or imported as a library.
+Cortex is an OpenAI-compatible AI engine that developers can use to build LLM apps. It is packaged with a Docker-inspired command-line interface and client libraries. It can be used as a standalone server or imported as a library.
 
-Cortex currently supports two inference engines:
+Cortex currently supports 3 inference engines:
 
 - Llama.cpp
+- ONNX Runtime
 - TensorRT-LLM
 
-> Read more about Cortex at https://jan.ai/cortex
-
 ## Quicklinks
-Cortex
-- [Website](https://jan.ai/)
-- [GitHub](https://github.com/janhq/cortex)
-- [User Guides](https://jan.ai/cortex)
-- [API reference](https://jan.ai/api-reference)
-
-## Prerequisites
-
-### **Dependencies**
-
-Before installation, ensure that you have installed the following:
-- **Node.js**: version 18 and above is required to run the installation.
-- **NPM**: Needed to manage packages.
-- **CPU Instruction Sets**: Available for download from the [Cortex GitHub Releases](https://github.com/janhq/cortex/releases) page.
-
-
->💡 The **CPU instruction sets** are not required for the initial installation of Cortex. This dependency will be automatically installed during the Cortex initialization if they are not already on your system.
-
 
-### **Hardware**
+- [Homepage](https://cortex.jan.ai/)
+- [Docs](https://cortex.jan.ai/docs/)
 
+## Quickstart
+### Prerequisites
 Ensure that your system meets the following requirements to run Cortex:
-
+- **Dependencies**:
+  - **Node.js**: version 18 and above is required to run the installation.
+  - **NPM**: Needed to manage packages.
+  - **CPU Instruction Sets**: Available for download from the [Cortex GitHub Releases](https://github.com/janhq/cortex/releases) page.
 - **OS**:
   - MacOSX 13.6 or higher.
   - Windows 10 or higher.
   - Ubuntu 22.04 and later.
-- **RAM (CPU Mode):**
-  - 8GB for running up to 3B models.
-  - 16GB for running up to 7B models.
-  - 32GB for running up to 13B models.
-- **VRAM (GPU Mode):**
 
-  - 6GB can load the 3B model (int4) with `ngl` at 120 ~ full speed on CPU/ GPU.
-  - 8GB can load the 7B model (int4) with `ngl` at 120 ~ full speed on CPU/ GPU.
-  - 12GB can load the 13B model (int4) with `ngl` at 120 ~ full speed on CPU/ GPU.
+> Visit [Quickstart](https://cortex.jan.ai/docs/quickstart) to get started.
 
-- **Disk**: At least 10GB for app and model download.
-
-## Quickstart
-To install Cortex CLI, follow the steps below:
-1. Install the Cortex NPM package globally:
+### NPM
 ``` bash
+# Install using NPM
 npm i -g cortexso
-```
-> Cortex automatically detects your CPU and GPU, downloading the appropriate CPU instruction sets and required dependencies to optimize GPU performance.
-
-2. Download a GGUF model from Hugging Face:
-``` bash
-# Pull a model most compatible with your hardware
-cortex pull llama3
-
-# Pull a specific variant with `repo_name:branch`
-cortex pull llama3:7b
-
-# Pull a model with the HuggingFace `model_id`
-cortex pull microsoft/Phi-3-mini-4k-instruct-gguf
-```
-3. Load the model:
-``` bash
-cortex models start llama3:7b
+# Install using Brew
+brew tap janhq/cortexso
+brew install cortexso
+# Run model
+cortex run llama3
+# To uninstall globally using NPM
+npm uninstall -g cortexso
 ```
 
-4. Start chatting with the model:
+### Homebrew
 ``` bash
-cortex chat tell me a joke
+# Install using Brew
+brew tap janhq/cortexso
+brew install cortexso
+# Run model
+cortex run llama3
+# To uninstall using Brew
+brew uninstall cortexso
+brew untap janhq/cortexso
 ```
+> You can also install Cortex using the Cortex Installer available on [GitHub Releases](https://github.com/janhq/cortex/releases).
 
-
-## Run as an API server
 To run Cortex as an API server:
 ```bash
 cortex serve
+
+# Output
+# Started server at http://localhost:1337
+# Swagger UI available at http://localhost:1337/api
 ```
 
+You can now access the Cortex API server at `http://localhost:1337`,
+and the Swagger UI at `http://localhost:1337/api`.
+
 ## Build from Source
 
 To install Cortex from the source, follow the steps below:
@@ -120,9 +100,10 @@ chmod +x '[path-to]/cortex/cortex-js/dist/src/command.js'
 npm link
 ```
 
-## Cortex CLI Command
-The following CLI commands are currently available:
-> ⚠️ **Cortex is currently in Development**: More commands will be added soon!
+## Cortex CLI Commands
+
+The following CLI commands are currently available.
+See [CLI Reference Docs](https://cortex.jan.ai/docs/cli) for more information.
 
 ```bash
 
@@ -141,18 +122,12 @@ The following CLI commands are currently available:
 models start       Start a specified model.
 models stop        Stop a specified model.
 models update      Update the configuration of a specified model.
-engines            Execute a specified command related to engines.
-engines list       List all available engines.
+benchmark          Benchmark and analyze the performance of a specific AI model using your system.
+presets            Show all the available model presets within Cortex.
+telemetry          Retrieve telemetry logs for monitoring and analysis.
 ```
-## Uninstall Cortex
-
-Run the following command to uninstall Cortex globally on your machine:
 
-```
-# Uninstall globally using NPM
-npm uninstall -g cortexso
-```
 ## Contact Support
 - For support, please file a GitHub ticket.
 - For questions, join our Discord [here](https://discord.gg/FTk2MvZwJH).
-- For long-form inquiries, please email [hello@jan.ai](mailto:hello@jan.ai).
+- For long-form inquiries, please email [hello@jan.ai](mailto:hello@jan.ai).
````

cortex-js/src/app.ts

Lines changed: 1 addition & 1 deletion
```diff
@@ -23,7 +23,7 @@ export const getApp = async () => {
   const config = new DocumentBuilder()
     .setTitle('Cortex API')
     .setDescription(
-      'Cortex API provides a command-line interface (CLI) for seamless interaction with large language models (LLMs). Fully compatible with the [OpenAI API](https://platform.openai.com/docs/api-reference), it enables straightforward command execution and management of LLM interactions.',
+      'Cortex API provides a command-line interface (CLI) for seamless interaction with large language models (LLMs). It is fully compatible with the [OpenAI API](https://platform.openai.com/docs/api-reference) and enables straightforward command execution and management of LLM interactions.',
     )
     .setVersion('1.0')
     .addTag(
```

cortex-js/src/infrastructure/controllers/status.controller.ts

Lines changed: 1 addition & 1 deletion
```diff
@@ -8,7 +8,7 @@ export class StatusController {
 
   @ApiOperation({
     summary: "Get health status",
-    description: "Retrieves the health status of the Cortex's API endpoint server.",
+    description: "Retrieves the health status of your Cortex's system.",
   })
   @HttpCode(200)
   @ApiResponse({
```
