From 17339e2687e627745b2b67b6fd51e9a588b72869 Mon Sep 17 00:00:00 2001 From: irfanpena Date: Fri, 30 Aug 2024 13:55:29 +0700 Subject: [PATCH 1/5] Update the CLI commands --- docs/cli/chat.md | 9 ++--- docs/cli/cortex.md | 4 +-- docs/cli/engines/index.mdx | 72 +++++++++++++++++++++----------------- docs/cli/models/index.md | 6 ++-- docs/cli/run.md | 24 +++---------- 5 files changed, 53 insertions(+), 62 deletions(-) diff --git a/docs/cli/chat.md b/docs/cli/chat.md index 497d373..9063c2f 100644 --- a/docs/cli/chat.md +++ b/docs/cli/chat.md @@ -20,7 +20,7 @@ This command starts a chat session with a specified model, allowing you to inter ## Usage ```bash -cortex chat [options] [model_id] [message] +cortex chat [model_id] [message] ``` :::info This command uses a `model_id` from the model that you have downloaded or available in your file system. @@ -31,9 +31,10 @@ This command uses a `model_id` from the model that you have downloaded or availa | Option | Description | Required | Default value | Example | | ----------------------------- | ----------------------------------------------------------------------------------------------- | -------- | ------------- | ----------------------------- | | `model_id` | Model ID to chat with. If there is no model_id provided, it will prompt to select from running models. | No | - | `mistral` | -| `-t`, `--thread ` | Thread ID. 
If not provided, will create new thread | No | - | `-t 98765` | | `-m`, `--message ` | Message to send to the model | No | - | `-m "Hello, model!"` | -| `-a`, `--attach` | Attach to interactive chat session | No | `false` | `-a` | -| `-p`, `--preset ` | Apply a chat preset to the chat session | No | - | `-p default` | | `-h`, `--help` | Display help information for the command | No | - | `-h` | + + + diff --git a/docs/cli/cortex.md b/docs/cli/cortex.md index 85c3b74..90b9651 100644 --- a/docs/cli/cortex.md +++ b/docs/cli/cortex.md @@ -27,10 +27,10 @@ cortex [command] [options] | ---------------------------- | ----------------------------------------- | -------- | ------------- | ----------------------------- | | `-a`, `--address
` | Address to use | No | - | `-a 192.168.1.1` | | `-p`, `--port ` | Port to serve the application | No | - | `-p 1337` | -| `-l`, `--logs` | Show logs | No | `false` | `-l` | -| `--dataFolder ` | Set the data folder directory | No | - | `--dataFolder /path/to/data` | | `-v`, `--version` | Show version | No | `false` | `-v` | | `-h`, `--help` | Display help information for the command | No | - | `-h` | + ## Command Chaining diff --git a/docs/cli/engines/index.mdx b/docs/cli/engines/index.mdx index 617f7d2..5e5587c 100644 --- a/docs/cli/engines/index.mdx +++ b/docs/cli/engines/index.mdx @@ -15,7 +15,7 @@ This command allows you to manage various engines available within Cortex. **Usage**: ```bash -cortex engines [options] [subcommand] +cortex engines [options] [subcommand] ``` **Options**: @@ -37,7 +37,7 @@ This command returns an engine detail defined by an engine `name`. **Usage**: ```bash -cortex engines get +cortex engines get ``` For example, it returns the following: ```bash @@ -62,30 +62,30 @@ To get an engine name, run the [`engines list`](/docs/cli/engines/list) command | `name` | The name of the engine that you want to retrieve. | Yes | - | `cortex.llamacpp`| | `-h`, `--help` | Display help information for the command. | No | - | `-h` | -## `cortex engines init` +## `cortex engines install` :::info This CLI command calls the following API endpoint: - [Init Engine](/api-reference#tag/engines/post/v1/engines/{name}/init) ::: -This command sets up and downloads the required dependencies to run the available engines within Cortex. Currently, Cortex supports three engines: +This command downloads the required dependencies and installs the engine within Cortex. 
Currently, Cortex supports three engines: - `Llama.cpp` - `Onnx` - `Tensorrt-llm` **Usage**: ```bash -cortex engines init [options] +cortex engines install [options] ``` For Example: ```bash ## Llama.cpp engine -cortex engines init cortex.llamacpp +cortex engines install cortex.llamacpp ## ONNX engine -cortex engines init cortex.onnx +cortex engines install cortex.onnx ## Tensorrt-LLM engine -cortex engines init cortex.tensorrt-llm +cortex engines install cortex.tensorrt-llm ``` @@ -93,8 +93,38 @@ cortex engines init cortex.tensorrt-llm | Option | Description | Required | Default value | Example | |---------------------------|----------------------------------------------------|----------|---------------|----------------------| -| `name` | The name of the engine you want to run. | Yes | - | - | +| `name` | The name of the engine you want to install. | Yes | - | - | | `-h`, `--help` | Display help for command. | No | - | `-h` | + +## `cortex engines uninstall` + +This command uninstalls the engine within Cortex. + +**Usage**: +```bash +cortex engines uninstall [options] +``` +For Example: +```bash +## Llama.cpp engine +cortex engines uninstall cortex.llamacpp + +## ONNX engine +cortex engines uninstall cortex.onnx + +## Tensorrt-LLM engine +cortex engines uninstall cortex.tensorrt-llm + +``` + +**Options**: + +| Option | Description | Required | Default value | Example | +|---------------------------|----------------------------------------------------|----------|---------------|----------------------| +| `name` | The name of the engine you want to uninstall. | Yes | - | - | +| `-h`, `--help` | Display help for command. 
| No | - | `-h` | + + ## `cortex engines list` :::info This CLI command calls the following API endpoint: @@ -129,27 +159,3 @@ For example, it returns the following: | Option | Description | Required | Default value | Example | |---------------------------|----------------------------------------------------|----------|---------------|----------------------| | `-h`, `--help` | Display help for command. | No | - | `-h` | - -## `cortex engines set` -:::info -This CLI command calls the following API endpoint: -- [Update Engine](/api-reference#tag/engines/patch/v1/engines/{name}) -::: - -This command updates an engine configuration. - - -**Usage**: - -```bash -cortex engines set -``` - -**Options**: - -| Option | Description | Required | Default value | Example | -|---------------------------|----------------------------------------------------|----------|---------------|----------------------| -| `name` | The name of the engine you want to update. | Yes | - | - | -| `config` | Configuration name. | Yes | - | `openai` | -| `value` | Configuration value. | Yes | - | `sk-xxxxx` | -| `-h`, `--help` | Display help for command. | No | - | `-h` | diff --git a/docs/cli/models/index.md b/docs/cli/models/index.md index 79f1eff..5aea605 100644 --- a/docs/cli/models/index.md +++ b/docs/cli/models/index.md @@ -109,8 +109,8 @@ For example, it returns the following: | Option | Description | Required | Default value | Example | |---------------------------|----------------------------------------------------|----------|---------------|----------------------| -| `-f`, `--format ` | Specify output format for the models list. | No | `json` | `-f json` | | `-h`, `--help` | Display help for command. 
| No | - | `-h` | + ## `cortex models remove` :::info @@ -168,9 +168,9 @@ cortex models start [model_id]:[engine] [options] | Option | Description | Required | Default value | Example | |---------------------------|---------------------------------------------------------------------------|----------|----------------------------------------------|------------------------| | `model_id` | The identifier of the model you want to start. | No | `Prompt to select from the available models` | `mistral` | -| `-a`, `--attach` | Attach to an interactive chat session. | No | `false` | `-a` | -| `-p`, `--preset ` | Apply a chat preset to the chat session. | No | `false` | `-p friendly` | | `-h`, `--help` | Display help information for the command. | No | - | `-h` | + ## `cortex models stop` :::info diff --git a/docs/cli/run.md b/docs/cli/run.md index 0f133be..3c4e079 100644 --- a/docs/cli/run.md +++ b/docs/cli/run.md @@ -18,9 +18,6 @@ This CLI command calls the following API endpoint: This command facilitates the initiation of an interactive chat shell with a specified machine-learning model. - -This command supports both local and remote models. - ## Usage ```bash @@ -32,22 +29,10 @@ cortex run [options] [model_id]:[engine] -c ``` ### `model_id` You can use the [Built-in models](/docs/hub/cortex-hub) or Supported [HuggingFace models](/docs/hub/hugging-face). -### Local Model -To run a local model in Cortex: -```bash -## Local model -cortex run mistral -``` + :::info This command downloads and installs the model if not already available in your file system, then starts it for interaction. ::: -### Remote Model -To run a remote model: -1. Set the API key by using the [`cortex configs set`](/docs/cli/configs/set) command. -2. 
Once you have set the API key, run the following command: -```bash -cortex run gpt-3.5-turbo -``` ## Options @@ -55,11 +40,10 @@ cortex run gpt-3.5-turbo | Option | Description | Required | Default value | Example | |-----------------------------|-----------------------------------------------------------------------------|----------|----------------------------------------------|------------------------| | `model_id` | The identifier of the model you want to chat with. | No | `Prompt to select from the available models` | `mistral` | -| `-t`, `--thread ` | Specify the Thread ID. Defaults to creating a new thread if none specified. | No | - | `-t jan_1717650808` | -| `-p`, `--preset` | Apply a chat preset to the chat session. | No | - | `-p friendly` | -| `-c`, `--chat` | Start a chat session after running the model. | No | - | `-c` | - | `-h`, `--help` | Display help information for the command. | No | - | `-h` | + From 18f60edb974e697ec55eb8e73aca0c2836d090de Mon Sep 17 00:00:00 2001 From: irfanpena Date: Mon, 2 Sep 2024 15:41:24 +0700 Subject: [PATCH 2/5] Update the CLI docs --- docs/cli/chat.md | 4 +-- docs/cli/cortex.md | 8 ++--- docs/cli/engines/index.mdx | 74 +++++++++++++++++++------------------- docs/cli/models/index.md | 48 ++++++++++++------------- 4 files changed, 67 insertions(+), 67 deletions(-) diff --git a/docs/cli/chat.md b/docs/cli/chat.md index 9063c2f..5ef4212 100644 --- a/docs/cli/chat.md +++ b/docs/cli/chat.md @@ -30,9 +30,9 @@ This command uses a `model_id` from the model that you have downloaded or availa | Option | Description | Required | Default value | Example | | ----------------------------- | ----------------------------------------------------------------------------------------------- | -------- | ------------- | ----------------------------- | -| `model_id` | Model ID to chat with. If there is no model_id provided, it will prompt to select from running models. | No | - | `mistral` | +| `model_id` | Model ID to chat with. 
| No | - | `mistral` |
| `-m`, `--message <message>` | Message to send to the model | No | - | `-m "Hello, model!"` |
-| `-h`, `--help` | Display help information for the command | No | - | `-h` |
+| `-h`, `--help` | Display help information for the command. | No | - | `-h` |


diff --git a/docs/cli/engines/index.mdx b/docs/cli/engines/index.mdx
index 5e5587c..8bfcf47 100644
--- a/docs/cli/engines/index.mdx
+++ b/docs/cli/engines/index.mdx
@@ -22,7 +22,7 @@ cortex engines [options] [subcommand]

| Option | Description | Required | Default value | Example |
|-------------------|-------------------------------------------------------|----------|---------------|-----------------|
-| `-vk`, `--vulkan` | Install Vulkan engine | No | `false` | `-vk` |
+| `-vk`, `--vulkan` | Install Vulkan engine. | No | `false` | `-vk` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |

## `cortex engines get`
@@ -62,6 +62,42 @@ To get an engine name, run the [`engines list`](/docs/cli/engines/list) command
| `name` | The name of the engine that you want to retrieve. | Yes | - | `cortex.llamacpp`|
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |

+## `cortex engines list`
+:::info
+This CLI command calls the following API endpoint:
+- [List Engines](/api-reference#tag/engines/get/v1/engines)
+:::
+This command lists all of Cortex's engines. 
+ + + +**Usage**: + +```bash +cortex engines list [options] +``` +For example, it returns the following: +```bash +┌─────────┬───────────────────────┬────────────────────────────────────────────────────────────────────────────┬─────────┬──────────────────────────────┐ +│ (index) │ name │ description │ version │ productName │ +├─────────┼───────────────────────┼────────────────────────────────────────────────────────────────────────────┼─────────┼──────────────────────────────┤ +│ 0 │ 'cortex.llamacpp' │ 'This extension enables chat completion API calls using the Cortex engine' │ '0.0.1' │ 'Cortex Inference Engine' │ +│ 1 │ 'cortex.onnx' │ 'This extension enables chat completion API calls using the Cortex engine' │ '0.0.1' │ 'Cortex Inference Engine' │ +│ 2 │ 'cortex.tensorrt-llm' │ 'This extension enables chat completion API calls using the Cortex engine' │ '0.0.1' │ 'Cortex Inference Engine' │ +│ 3 │ 'openai' │ 'This extension enables OpenAI chat completion API calls' │ '0.0.1' │ 'OpenAI Inference Engine' │ +│ 4 │ 'groq' │ 'This extension enables fast Groq chat completion API calls' │ '0.0.1' │ 'Groq Inference Engine' │ +│ 5 │ 'mistral' │ 'This extension enables Mistral chat completion API calls' │ '0.0.1' │ 'Mistral Inference Engine' │ +│ 6 │ 'anthropic' │ 'This extension enables Anthropic chat completion API calls' │ '0.0.1' │ 'Anthropic Inference Engine' │ +└─────────┴───────────────────────┴────────────────────────────────────────────────────────────────────────────┴─────────┴──────────────────────────────┘ +``` + +**Options**: + +| Option | Description | Required | Default value | Example | +|---------------------------|----------------------------------------------------|----------|---------------|----------------------| +| `-h`, `--help` | Display help for command. 
| No | - | `-h` | + + ## `cortex engines install` :::info This CLI command calls the following API endpoint: @@ -123,39 +159,3 @@ cortex engines uninstall cortex.tensorrt-llm |---------------------------|----------------------------------------------------|----------|---------------|----------------------| | `name` | The name of the engine you want to uninstall. | Yes | - | - | | `-h`, `--help` | Display help for command. | No | - | `-h` | - - -## `cortex engines list` -:::info -This CLI command calls the following API endpoint: -- [List Engines](/api-reference#tag/engines/get/v1/engines) -::: -This command lists all the Cortex's engines. - - - -**Usage**: - -```bash -cortex engines list [options] -``` -For example, it returns the following: -```bash -┌─────────┬───────────────────────┬────────────────────────────────────────────────────────────────────────────┬─────────┬──────────────────────────────┐ -│ (index) │ name │ description │ version │ productName │ -├─────────┼───────────────────────┼────────────────────────────────────────────────────────────────────────────┼─────────┼──────────────────────────────┤ -│ 0 │ 'cortex.llamacpp' │ 'This extension enables chat completion API calls using the Cortex engine' │ '0.0.1' │ 'Cortex Inference Engine' │ -│ 1 │ 'cortex.onnx' │ 'This extension enables chat completion API calls using the Cortex engine' │ '0.0.1' │ 'Cortex Inference Engine' │ -│ 2 │ 'cortex.tensorrt-llm' │ 'This extension enables chat completion API calls using the Cortex engine' │ '0.0.1' │ 'Cortex Inference Engine' │ -│ 3 │ 'openai' │ 'This extension enables OpenAI chat completion API calls' │ '0.0.1' │ 'OpenAI Inference Engine' │ -│ 4 │ 'groq' │ 'This extension enables fast Groq chat completion API calls' │ '0.0.1' │ 'Groq Inference Engine' │ -│ 5 │ 'mistral' │ 'This extension enables Mistral chat completion API calls' │ '0.0.1' │ 'Mistral Inference Engine' │ -│ 6 │ 'anthropic' │ 'This extension enables Anthropic chat completion API calls' │ '0.0.1' │ 
'Anthropic Inference Engine' │ -└─────────┴───────────────────────┴────────────────────────────────────────────────────────────────────────────┴─────────┴──────────────────────────────┘ -``` - -**Options**: - -| Option | Description | Required | Default value | Example | -|---------------------------|----------------------------------------------------|----------|---------------|----------------------| -| `-h`, `--help` | Display help for command. | No | - | `-h` | diff --git a/docs/cli/models/index.md b/docs/cli/models/index.md index 5aea605..87c90a9 100644 --- a/docs/cli/models/index.md +++ b/docs/cli/models/index.md @@ -112,29 +112,6 @@ For example, it returns the following: | `-h`, `--help` | Display help for command. | No | - | `-h` | -## `cortex models remove` -:::info -This CLI command calls the following API endpoint: -- [Delete Model](/api-reference#tag/models/delete/v1/models/{id}) -::: -This command deletes a local model defined by a `model_id`. - - - -**Usage**: - -```bash -cortex models remove -``` -:::info -This command uses a `model_id` from the model that you have downloaded or available in your file system. -::: -**Options**: -| Option | Description | Required | Default value | Example | -|---------------------------|-----------------------------------------------------------------------------|----------|----------------------|------------------------| -| `model_id` | The identifier of the model you want to remove. | Yes | - | `mistral` | -| `-h`, `--help` | Display help for command. 
| No | - | `-h` | - ## `cortex models start` :::info This CLI command calls the following API endpoint: @@ -219,4 +196,27 @@ This command uses a `model_id` from the model that you have downloaded or availa |-----------------------------|-------------------------------------------------------------------------------------------------------|----------|----------------------|-----------------------------------------------------------| | `model_id` | The identifier of the model you want to update. | Yes | - | `mistral` | | `-c`, `--options ` | Specify the options to update the model. Syntax: `-c option1=value1 option2=value2`. | Yes | - | `-c max_tokens=100 temperature=0.5` | -| `-h`, `--help` | Display help information for the command. | No | - | `-h` | \ No newline at end of file +| `-h`, `--help` | Display help information for the command. | No | - | `-h` | + +## `cortex models remove` +:::info +This CLI command calls the following API endpoint: +- [Delete Model](/api-reference#tag/models/delete/v1/models/{id}) +::: +This command deletes a local model defined by a `model_id`. + + + +**Usage**: + +```bash +cortex models remove +``` +:::info +This command uses a `model_id` from the model that you have downloaded or available in your file system. +::: +**Options**: +| Option | Description | Required | Default value | Example | +|---------------------------|-----------------------------------------------------------------------------|----------|----------------------|------------------------| +| `model_id` | The identifier of the model you want to remove. | Yes | - | `mistral` | +| `-h`, `--help` | Display help for command. 
| No | - | `-h` |
\ No newline at end of file

From 16a46fed648b21df78d26ab3baeacfdc87944b8c Mon Sep 17 00:00:00 2001
From: irfanpena
Date: Mon, 2 Sep 2024 15:57:10 +0700
Subject: [PATCH 3/5] Update the CLI docs

---
 docs/cli/chat.md         | 2 +-
 docs/cli/models/index.md | 6 +++---
 docs/cli/ps.md           | 2 +-
 docs/cli/pull.md         | 9 +--------
 docs/cli/run.md          | 15 +++++++++------
 docs/cli/stop.md         | 2 +-
 6 files changed, 16 insertions(+), 20 deletions(-)

diff --git a/docs/cli/chat.md b/docs/cli/chat.md
index 5ef4212..fa16af1 100644
--- a/docs/cli/chat.md
+++ b/docs/cli/chat.md
@@ -20,7 +20,7 @@ This command starts a chat session with a specified model, allowing you to inter
## Usage

```bash
-cortex chat [model_id] [message]
+cortex chat [options] [model_id]
```
:::info
This command uses a `model_id` from the model that you have downloaded or available in your file system.
:::
diff --git a/docs/cli/models/index.md b/docs/cli/models/index.md
index 87c90a9..f54ff72 100644
--- a/docs/cli/models/index.md
+++ b/docs/cli/models/index.md
@@ -125,13 +125,13 @@ This command starts a model defined by a `model_id`.

```bash
# Start a model
-cortex models start [model_id]
+cortex models start <model_id>

# Start a model with a preset
-cortex models start [model_id] [options]
+cortex models start <model_id> [options]

# Start with a specified engine
-cortex models start [model_id]:[engine] [options]
+cortex models start <model_id>:[engine] [options]
```

diff --git a/docs/cli/ps.md b/docs/cli/ps.md
index 436f8ea..0361313 100644
--- a/docs/cli/ps.md
+++ b/docs/cli/ps.md
@@ -21,7 +21,7 @@ This command shows the running model and its status. 
## Usage

```bash
-cortex ps
+cortex ps [options]
```

For example, it returns the following table:
diff --git a/docs/cli/pull.md b/docs/cli/pull.md
index 8f90a22..779cda4 100644
--- a/docs/cli/pull.md
+++ b/docs/cli/pull.md
@@ -18,17 +18,10 @@ This command downloads models from supported [model repositories](/docs/model-so
The downloaded model will be stored in the Cortex folder in your home data directory.

-
-## Alias
-
-The following alias is also available for downloading models:
-
-- `cortex download _`
-
## Usage

```bash
-cortex pull <model_id>
+cortex pull <model_id> [options]
```

## Options
diff --git a/docs/cli/run.md b/docs/cli/run.md
index 3c4e079..810ab25 100644
--- a/docs/cli/run.md
+++ b/docs/cli/run.md
@@ -11,7 +11,8 @@ slug: "run"
# `cortex run`
:::info
This CLI command calls the following API endpoint:
-- [Download Model](/api-reference#tag/models/post/v1/models/{modelId}/pull) (The command only calls this endpoint if the specified model is not already downloaded.)
+- [Download Model](/api-reference#tag/models/post/v1/models/{modelId}/pull) (The command only calls this endpoint if the specified model is not downloaded yet.)
+- Download Engine (The command only calls this endpoint if the specified engine is not downloaded yet.)
- [Start Model](/api-reference#tag/models/post/v1/models/{modelId}/start)
- [Chat Completions](/api-reference#tag/inference/post/v1/chat/completions) (The command makes a call to this endpoint if the `-c` option is used.)
:::
@@ -21,11 +22,11 @@ This command facilitates the initiation of an interactive chat shell with a spec
## Usage

```bash
-cortex run [options] [model_id]
+cortex run [options] <model_id>
# With a specified engine
-cortex run [options] [model_id]:[engine]
+cortex run [options] <model_id>:[engine]
# Start chatting once the model started
-cortex run [options] [model_id]:[engine] -c
+cortex run [options] <model_id>:[engine] -c
```
### `model_id`
You can use the [Built-in models](/docs/hub/cortex-hub) or Supported [HuggingFace models](/docs/hub/hugging-face). 
@@ -51,5 +52,7 @@ This command downloads and installs the model if not already available in your f `cortex run` command is a convenience wrapper that automatically executes a sequence of commands to simplify user interactions: -1. [`cortex start`](/docs/cli/models/start): This command starts the specified model, making it active and ready for interactions. -2. [`cortex chat`](/docs/cli/chat): Following model activation, this command opens an interactive chat shell where users can directly communicate with the model. +1. [`cortex pull`](/docs/cli/models/): This command pulls the specified model if the model is not yet downloaded. +2. [`cortex engines install`](/docs/cli/engines/): This command installs the specified engines if not yet downloaded. +3. [`cortex models start`](/docs/cli/models/): This command starts the specified model, making it active and ready for interactions. +4. [`cortex chat`](/docs/cli/chat): Following model activation, this command opens an interactive chat shell where users can directly communicate with the model. diff --git a/docs/cli/stop.md b/docs/cli/stop.md index 200367a..ae72e59 100644 --- a/docs/cli/stop.md +++ b/docs/cli/stop.md @@ -20,7 +20,7 @@ This command stops the API server. 
## Usage ```bash -cortex stop +cortex stop [options] ``` ## Options From 09835be9d0b2ab17090ce9c9ad40bb37cc00e24c Mon Sep 17 00:00:00 2001 From: irfanpena Date: Tue, 3 Sep 2024 13:17:42 +0700 Subject: [PATCH 4/5] Update the command outputs except ps (many commands are not working on my device) --- docs/cli/engines/index.mdx | 31 ++++++------ docs/cli/models/index.md | 96 ++++++++++++++++++++++++-------------- 2 files changed, 77 insertions(+), 50 deletions(-) diff --git a/docs/cli/engines/index.mdx b/docs/cli/engines/index.mdx index 8bfcf47..eb7b317 100644 --- a/docs/cli/engines/index.mdx +++ b/docs/cli/engines/index.mdx @@ -30,7 +30,7 @@ cortex engines [options] [subcommand] This CLI command calls the following API endpoint: - [Get Engine](/api-reference#tag/engines/get/v1/engines/{name}) ::: -This command returns an engine detail defined by an engine `name`. +This command returns an engine detail defined by an engine `engine_name`. @@ -59,7 +59,7 @@ To get an engine name, run the [`engines list`](/docs/cli/engines/list) command | Option | Description | Required | Default value | Example | |-------------------|-------------------------------------------------------|----------|---------------|-----------------| -| `name` | The name of the engine that you want to retrieve. | Yes | - | `cortex.llamacpp`| +| `engine_name` | The name of the engine that you want to retrieve. | Yes | - | `cortex.llamacpp`| | `-h`, `--help` | Display help information for the command. 
| No | - | `-h` |

## `cortex engines list`
@@ -78,17 +78,18 @@ cortex engines list [options]
```
For example, it returns the following:
```bash
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| (Index) | name                | description                                                                   | version | product name                 | status          |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 1       | cortex.onnx         | This extension enables chat completion API calls using the Onnx engine        | 0.0.1   | Onnx Inference Engine        | not_initialized |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 2       | cortex.llamacpp     | This extension enables chat completion API calls using the LlamaCPP engine    | 0.0.1   | LlamaCPP Inference Engine    | ready           |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 3       | cortex.tensorrt-llm | This extension enables chat completion API calls using the TensorrtLLM engine | 0.0.1   | TensorrtLLM Inference Engine | not_initialized |
+---------+---------------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
```
**Options**:

| Option | Description | Required | Default value | Example |
|---------------------------|----------------------------------------------------|----------|---------------|----------------------|
-| `name` | The name of the engine you want to install. | Yes | - | - |
+| `engine_name` | The name of the engine you want to install. | Yes | - | - |
| `-h`, `--help` | Display help for command. | No | - | `-h` |

## `cortex engines uninstall`
@@ -157,5 +158,5 @@ cortex engines uninstall cortex.tensorrt-llm
| Option | Description | Required | Default value | Example |
|---------------------------|----------------------------------------------------|----------|---------------|----------------------|
-| `name` | The name of the engine you want to uninstall. | Yes | - | - |
+| `engine_name` | The name of the engine you want to uninstall. | Yes | - | - |
| `-h`, `--help` | Display help for command. 
| No | - | `-h` | diff --git a/docs/cli/models/index.md b/docs/cli/models/index.md index f54ff72..38fff6b 100644 --- a/docs/cli/models/index.md +++ b/docs/cli/models/index.md @@ -42,27 +42,58 @@ cortex models get For example, it returns the following: ```bash -{ - name: 'tinyllama', - model: 'tinyllama', - version: 1, - files: [ 'C:\\Users\\ACER\\cortex\\models\\tinyllama\\model.gguf' ], - stop: [ '' ], - top_p: 0.95, - temperature: 0.7, - frequency_penalty: 0, - presence_penalty: 0, - max_tokens: 4096, - stream: true, - ngl: 33, - ctx_len: 4096, - engine: 'cortex.llamacpp', - prompt_template: '<|system|>\n{system_message}<|user|>\n{prompt}<|assistant|>', - id: 'tinyllama', - created: 1720659351720, - object: 'model', - owned_by: '' -} +ModelConfig Details: +------------------- +id: tinyllama +name: tinyllama 1B +model: tinyllama:1B +version: 1 +stop: [] +top_p: 0.95 +temperature: 0.7 +frequency_penalty: 0 +presence_penalty: 0 +max_tokens: 4096 +stream: true +ngl: 33 +ctx_len: 4096 +engine: cortex.llamacpp +prompt_template: + +<|system|> +{system_message} + + + + +<|user|> +{prompt} + + +<|assistant|> + + +system_template: + +<|system|> + +user_template: + + + + +<|user|> + +ai_template: + + +<|assistant|> + + +tp: 0 +text_model: false +files: [model_path] +created: 1725342964 ``` :::info This command uses a `model_id` from the model that you have downloaded or available in your file system. 
@@ -91,17 +122,13 @@ cortex models list [options] ``` For example, it returns the following: ```bash -┌─────────┬───────────────────────────────────────────────┬──────────────────────────────┬───────────┐ -│ (index) │ id │ engine │ version │ -├─────────┼───────────────────────────────────────────────┼──────────────────────────────┼───────────┤ -│ 0 │ 'gpt-3.5-turbo' │ 'openai' │ 1 │ -│ 1 │ 'gpt-4o' │ 'openai' │ 1 │ -│ 2 │ 'llama3:onnx' │ 'cortex.onnx' │ 1 │ -│ 3 │ 'llama3' │ 'cortex.llamacpp' │ undefined │ -│ 4 │ 'openhermes-2.5:tensorrt-llm-windows-ada' │ 'cortex.tensorrt-llm' │ 1 │ -│ 5 │ 'openhermes-2.5:tensorrt-llm' │ 'cortex.tensorrt-llm' │ 1 │ -│ 6 │ 'tinyllama' │ 'cortex.llamacpp' │ undefined │ -└─────────┴───────────────────────────────────────────────┴──────────────────────────────┴───────────┘ ++---------+----------------+-----------------+---------+ +| (Index) | ID | engine | version | ++---------+----------------+-----------------+---------+ +| 1 | tinyllama-gguf | cortex.llamacpp | 1 | ++---------+----------------+-----------------+---------+ +| 2 | tinyllama | cortex.llamacpp | 1 | ++---------+----------------+-----------------+---------+ ``` @@ -136,8 +163,7 @@ cortex models start :[engine] [options] :::info -- This command uses a `model_id` from the model that you have downloaded or available in your file system. -- Model preset is applied only at the start of the model and does not change during the chat session. +This command uses a `model_id` from the model that you have downloaded or available in your file system. ::: **Options**: @@ -164,7 +190,7 @@ This command stops a model defined by a `model_id`. cortex models stop ``` :::info -- This command uses a `model_id` from the model that you have started before. +This command uses a `model_id` from the model that you have started before. 
::: **Options**: From 6215c319e03194d05eb34587b70a2d5975120cec Mon Sep 17 00:00:00 2001 From: irfanpena Date: Thu, 5 Sep 2024 11:13:18 +0700 Subject: [PATCH 5/5] Updated per comments --- docs/cli/chat.md | 2 +- docs/cli/cortex.md | 6 +++--- docs/cli/run.md | 2 -- 3 files changed, 4 insertions(+), 6 deletions(-) diff --git a/docs/cli/chat.md b/docs/cli/chat.md index fa16af1..2f77f03 100644 --- a/docs/cli/chat.md +++ b/docs/cli/chat.md @@ -20,7 +20,7 @@ This command starts a chat session with a specified model, allowing you to inter ## Usage ```bash -cortex chat [options] [model_id] +cortex chat [options] ``` :::info This command uses a `model_id` from the model that you have downloaded or available in your file system. diff --git a/docs/cli/cortex.md b/docs/cli/cortex.md index 0f4df58..fd83190 100644 --- a/docs/cli/cortex.md +++ b/docs/cli/cortex.md @@ -27,10 +27,10 @@ cortex [command] [options] | ---------------------------- | ----------------------------------------- | -------- | ------------- | ----------------------------- | | `-a`, `--address
` | Address to use. | No | - | `-a 192.168.1.1` |
| `-p`, `--port <port>` | Port to serve the application. | No | - | `-p 1337` |
-| `-v`, `--version` | Show version. | No | `false` | `-v` |
+| `-v`, `--version` | Show version. | No | - | `-v` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
-
+| `--verbose` | Show the detailed command logs. | No | - | `--verbose` |
+

## Command Chaining

diff --git a/docs/cli/run.md b/docs/cli/run.md
index 810ab25..65b383a 100644
--- a/docs/cli/run.md
+++ b/docs/cli/run.md
@@ -25,8 +25,6 @@ This command facilitates the initiation of an interactive chat shell with a spec
cortex run [options] <model_id>
# With a specified engine
cortex run [options] <model_id>:[engine]
-# Start chatting once the model started
-cortex run [options] <model_id>:[engine] -c
```
### `model_id`
You can use the [Built-in models](/docs/hub/cortex-hub) or Supported [HuggingFace models](/docs/hub/hugging-face).
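Taken together, the run.md change in PATCH 3/5 describes `cortex run` as a wrapper around four commands. A minimal sketch of that sequence, echoed rather than executed so it can be reviewed without a Cortex install (`mistral` and `cortex.llamacpp` are the example names the patches themselves use):

```shell
# The four-step sequence that the updated run.md says `cortex run` wraps:
# pull the model, install its engine, start the model, then open a chat.
for step in \
  "cortex pull mistral" \
  "cortex engines install cortex.llamacpp" \
  "cortex models start mistral" \
  "cortex chat mistral"
do
  echo "$step"
done
```

Steps 1 and 2 are skipped when the model or engine is already downloaded, per the endpoint notes added to run.md.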