This repository has been archived by the owner on Oct 14, 2024. It is now read-only.

Commit

Merge pull request #195 from janhq/pena-patch

Update the CLI commands

irfanpena authored Sep 5, 2024
2 parents bb2b386 + 6215c31, commit 225e257
Showing 8 changed files with 179 additions and 167 deletions.
13 changes: 7 additions & 6 deletions docs/cli/chat.md
This command starts a chat session with a specified model, allowing you to interact with it directly.
## Usage

```bash
cortex chat <model_id> [options]
```
:::info
This command uses a `model_id` from the model that you have downloaded or available in your file system.

| Option | Description | Required | Default value | Example |
| ----------------------------- | ----------------------------------------------------------------------------------------------- | -------- | ------------- | ----------------------------- |
| `model_id` | Model ID to chat with. | No | - | `mistral` |
| `-m`, `--message <message>` | Message to send to the model. | No | - | `-m "Hello, model!"` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |

<!-- | `-t`, `--thread <thread_id>` | Thread ID. If not provided, will create new thread | No | - | `-t 98765` | -->
<!-- | `-a`, `--attach` | Attach to interactive chat session | No | `false` | `-a` |
| `-p`, `--preset <preset>` | Apply a chat preset to the chat session | No | - | `-p default` | -->
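Putting these options together, a typical invocation looks like the following (a sketch; it assumes a model with the id `mistral` has already been downloaded):

```bash
# Send a single message to the model
cortex chat mistral -m "Hello, model!"

# Omit -m to start an interactive chat session with the model
cortex chat mistral
```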

12 changes: 6 additions & 6 deletions docs/cli/cortex.md
## Usage

```bash
cortex [command] [options]
```

| Option | Description | Required | Default value | Example |
| ---------------------------- | ----------------------------------------- | -------- | ------------- | ----------------------------- |
| `-a`, `--address <address>` | Address to use. | No | - | `-a 192.168.1.1` |
| `-p`, `--port <port>` | Port to serve the application. | No | - | `-p 1337` |
| `-v`, `--version` | Show version. | No | - | `-v` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
| `--verbose` | Show the detailed command logs. | No | - | `--verbose` |
<!--| `--dataFolder <dataFolder>` | Set the data folder directory | No | - | `--dataFolder /path/to/data` | -->
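For instance, the global options above can be combined when starting Cortex (a sketch; the address and port values are illustrative, and exact flag behavior follows the usage line above):

```bash
# Serve Cortex on a specific address and port
cortex -a 127.0.0.1 -p 1337
```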


## Command Chaining
113 changes: 60 additions & 53 deletions docs/cli/engines/index.mdx
This command allows you to manage various engines available within Cortex.
**Usage**:

```bash
cortex engines <command|parameter> [options] [subcommand]
```

**Options**:

| Option | Description | Required | Default value | Example |
|-------------------|-------------------------------------------------------|----------|---------------|-----------------|
| `-vk`, `--vulkan` | Install Vulkan engine. | No | `false` | `-vk` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |

## `cortex engines get`
:::info
This CLI command calls the following API endpoint:
- [Get Engine](/api-reference#tag/engines/get/v1/engines/{name})
:::
This command returns the details of an engine, identified by its `engine_name`.



**Usage**:

```bash
cortex engines get <engine_name>
```
To get an engine name, run the [`engines list`](/docs/cli/engines/list) command first.

| Option | Description | Required | Default value | Example |
|-------------------|-------------------------------------------------------|----------|---------------|-----------------|
| `engine_name` | The name of the engine that you want to retrieve. | Yes | - | `cortex.llamacpp`|
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |

## `cortex engines list`
:::info
This CLI command calls the following API endpoint:
- [List Engines](/api-reference#tag/engines/get/v1/engines)
:::
This command lists all the Cortex's engines.



**Usage**:

```bash
cortex engines list [options]
```
For example, it returns the following:
```bash
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| (Index) | name                | description                                                                   | version | product name                 | status          |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 1       | cortex.onnx         | This extension enables chat completion API calls using the Onnx engine        | 0.0.1   | Onnx Inference Engine        | not_initialized |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 2       | cortex.llamacpp     | This extension enables chat completion API calls using the LlamaCPP engine    | 0.0.1   | LlamaCPP Inference Engine    | ready           |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 3       | cortex.tensorrt-llm | This extension enables chat completion API calls using the TensorrtLLM engine | 0.0.1   | TensorrtLLM Inference Engine | not_initialized |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
```

**Options**:

| Option | Description | Required | Default value | Example |
|---------------------------|----------------------------------------------------|----------|---------------|----------------------|
| `-h`, `--help` | Display help for command. | No | - | `-h` |
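Because `engines list` maps to the List Engines endpoint referenced above, the same data can be retrieved over HTTP; a sketch assuming a local Cortex server listening on port 1337 (the port used elsewhere in these docs):

```bash
# Query the List Engines endpoint directly (host and port are assumptions)
curl http://127.0.0.1:1337/v1/engines
```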


## `cortex engines install`
:::info
This CLI command calls the following API endpoint:
- [Init Engine](/api-reference#tag/engines/post/v1/engines/{name}/init)
:::
This command downloads the required dependencies and installs the engine within Cortex. Currently, Cortex supports three engines:
- `Llama.cpp`
- `Onnx`
- `Tensorrt-llm`

**Usage**:
```bash
cortex engines install <engine_name> [options]
```
For example:

```bash
## Llama.cpp engine
cortex engines install cortex.llamacpp

## ONNX engine
cortex engines install cortex.onnx

## Tensorrt-LLM engine
cortex engines install cortex.tensorrt-llm
```

**Options**:

| Option | Description | Required | Default value | Example |
|---------------------------|----------------------------------------------------|----------|---------------|----------------------|
| `engine_name` | The name of the engine you want to install. | Yes | - | - |
| `-h`, `--help` | Display help for command. | No | - | `-h` |

## `cortex engines uninstall`

This command uninstalls the engine within Cortex.

**Usage**:

```bash
cortex engines uninstall <engine_name> [options]
```
For example:
```bash
## Llama.cpp engine
cortex engines uninstall cortex.llamacpp

## ONNX engine
cortex engines uninstall cortex.onnx

## Tensorrt-LLM engine
cortex engines uninstall cortex.tensorrt-llm
```

**Options**:

| Option | Description | Required | Default value | Example |
|---------------------------|----------------------------------------------------|----------|---------------|----------------------|
| `engine_name` | The name of the engine you want to uninstall. | Yes | - | - |
| `-h`, `--help` | Display help for command. | No | - | `-h` |