
Added validate_api_key, num_gpus; fixed dataset of llamafactory and suffix of GLM #361

Merged
merged 6 commits into LazyAGI:main on Nov 26, 2024

Conversation

@JingofXin (Contributor) commented Nov 25, 2024

  1. New Feature: Added a validate_api_key API for the Online Model to verify that the provided API key is valid.
  2. Enhancement: Introduced the num_gpus parameter for the Local Model training service, enabling multi-GPU training.
  3. Bug Fixes:
    • Fixed an issue where the dataset configuration for llamafactory was not unique; resolved by adding a UUID as a new level in the folder structure.
    • Fixed a bug where the llamafactory dataset required an ‘input’ field even when it was empty; an empty ‘input’ field is now added by default.
    • Fixed a limitation where the model suffix in GLM had to be fewer than 8 characters; a 7-character UUID is now used as the suffix.
    • Fixed an issue in dataset construction where the path check was missing a ‘not’ condition, which could lead to incorrect path handling.
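The dataset and suffix fixes above can be sketched as follows. This is an illustrative sketch only, not the PR's actual code; the function names (`make_dataset_dir`, `normalize_record`, `make_glm_suffix`) are hypothetical.

```python
import os
import uuid

def make_dataset_dir(root):
    # Fix: insert a UUID level into the folder structure so each
    # llamafactory dataset configuration gets a unique directory.
    path = os.path.join(root, uuid.uuid4().hex)
    os.makedirs(path, exist_ok=True)
    return path

def normalize_record(record):
    # Fix: llamafactory requires an 'input' field even when it is empty,
    # so add an empty one by default if it is missing.
    record.setdefault('input', '')
    return record

def make_glm_suffix():
    # Fix: GLM limits the model suffix to fewer than 8 characters,
    # so take the first 7 hex characters of a UUID.
    return uuid.uuid4().hex[:7]
```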

@JingofXin JingofXin changed the title Added validate_api_key, num_gpus; fixed dataset of llamafacotory and suffix of GLM Added validate_api_key, num_gpus; fixed dataset of llamafactory and suffix of GLM Nov 25, 2024
@@ -157,6 +158,13 @@ def _query_finetuned_jobs(self):
raise requests.RequestException('\n'.join([c.decode('utf-8') for c in r.iter_content(None)]))
return r.json()

def _validate_api_key(self):
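A minimal sketch of how such a validator might look, following the request/error-handling pattern of `_query_finetuned_jobs` above. The class name, base URL, endpoint path, and header layout here are assumptions for illustration, not the PR's actual implementation.

```python
import requests

class OnlineModelClient:
    """Hypothetical stand-in for the online model service client."""

    def __init__(self, api_key, base_url='https://api.example.com/v1'):
        self._api_key = api_key
        self._base_url = base_url

    def _validate_api_key(self):
        # Issue a cheap authenticated request; a 200 response means
        # the service accepted the key, anything else means it did not.
        headers = {'Authorization': f'Bearer {self._api_key}'}
        try:
            r = requests.get(f'{self._base_url}/models',
                             headers=headers, timeout=10)
        except requests.RequestException:
            return False
        return r.status_code == 200
```

A key advantage of returning a boolean rather than raising is that callers can probe the key before starting a long job and fall back gracefully.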
Review comment (Contributor): Could this be moved into the base class?

tests/charge_tests/test_engine.py (review thread resolved)
@wzh1994 wzh1994 merged commit 7f4e6a4 into LazyAGI:main Nov 26, 2024
13 checks passed
wzh1994 pushed a commit to wzh1994/LazyLLM that referenced this pull request Nov 26, 2024
2 participants