Add huggingface token for native llm (#827)
* add huggingface token for native llm

Signed-off-by: Xinyao Wang <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix bug

Signed-off-by: Xinyao Wang <[email protected]>

* fix bug

Signed-off-by: Xinyao Wang <[email protected]>

---------

Signed-off-by: Xinyao Wang <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
XinyaoWa and pre-commit-ci[bot] authored Oct 29, 2024
1 parent 6c670c9 commit 9fec226
Showing 3 changed files with 7 additions and 0 deletions.
1 change: 1 addition & 0 deletions comps/llms/text-generation/native/langchain/README.md
@@ -12,6 +12,7 @@ In order to start Native LLM service, you need to setup the following environmen

```bash
export LLM_NATIVE_MODEL="Qwen/Qwen2-7B-Instruct"
export HUGGINGFACEHUB_API_TOKEN="your_huggingface_token"
```

### 1.2 Build Docker Image
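The README now asks for a Hugging Face token alongside the model name. As a quick sanity check, a short script like the one below can confirm the exported token is valid before the service is started. This is only an illustrative sketch, not part of the commit; it assumes the standard `login` and `whoami` helpers from `huggingface_hub`.

```python
# Hypothetical pre-flight check for the exported token; not part of this commit.
import os

from huggingface_hub import login, whoami

token = os.getenv("HUGGINGFACEHUB_API_TOKEN", "")
if not token:
    raise SystemExit("HUGGINGFACEHUB_API_TOKEN is not set")

# login() caches the token so later hub calls (model downloads, etc.) can use it.
login(token=token)

# whoami() raises if the token is invalid; otherwise it returns account details.
print("Token belongs to:", whoami(token=token).get("name"))
```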
@@ -21,6 +21,7 @@ services:
HABANA_VISIBLE_DEVICES: all
OMPI_MCA_btl_vader_single_copy_mechanism: none
TOKENIZERS_PARALLELISM: false
HUGGINGFACEHUB_API_TOKEN: ${HUGGINGFACEHUB_API_TOKEN}
restart: unless-stopped

networks:
5 changes: 5 additions & 0 deletions comps/llms/text-generation/native/langchain/utils.py
@@ -26,6 +26,7 @@
from pathlib import Path

import torch
from huggingface_hub import login
from optimum.habana.checkpoint_utils import (
get_ds_injection_policy,
get_repo_root,
@@ -42,6 +43,10 @@
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer
from transformers.utils import check_min_version

HUGGINGFACEHUB_API_TOKEN = os.getenv("HUGGINGFACEHUB_API_TOKEN", "")
if HUGGINGFACEHUB_API_TOKEN != "":
login(token=HUGGINGFACEHUB_API_TOKEN)


def adjust_batch(batch, size):
curr_size = batch["input_ids"].shape[1]
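With the conditional `login()` added to `utils.py`, later downloads from the Hugging Face Hub in this service pick up the cached credential automatically. As a rough illustration (not code from this repository), a checkpoint can then be loaded without passing a token explicitly; the model name below is just the example from the README.

```python
# Illustrative sketch only; assumes login(token=...) has already run, as in utils.py above.
import os

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = os.getenv("LLM_NATIVE_MODEL", "Qwen/Qwen2-7B-Instruct")

# Because the token was registered via login(), gated or private repositories
# can be fetched here without an explicit token= argument.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
```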
