
Enable vLLM Profiling for ChatQnA #1124

Merged: 1 commit into opea-project:main on Nov 13, 2024

Conversation

@louie-tsai (Collaborator) commented on Nov 13, 2024

Description

Enable vLLM PyTorch profiling for ChatQnA.
This gives advanced users who want to do vLLM performance profiling a supported way to turn the profiling feature on.
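
For reference, here is a minimal client-side sketch of driving the profiler, assuming the ChatQnA vLLM container was started with vLLM's VLLM_TORCH_PROFILER_DIR environment variable set (which makes the server expose its start_profile/stop_profile endpoints). The host/port and model name below are illustrative assumptions, not values taken from this PR:

```python
import requests

# Assumed host/port for the ChatQnA vLLM service; adjust to your deployment.
VLLM_URL = "http://localhost:8007"

# Begin collecting a PyTorch profiler trace. This endpoint is active only
# when the server was launched with VLLM_TORCH_PROFILER_DIR set.
requests.post(f"{VLLM_URL}/start_profile").raise_for_status()

# Send some representative traffic so the trace captures real work.
requests.post(
    f"{VLLM_URL}/v1/completions",
    json={
        "model": "Intel/neural-chat-7b-v3-3",  # placeholder model id
        "prompt": "What is deep learning?",
        "max_tokens": 32,
    },
).raise_for_status()

# Stop profiling; trace files land under VLLM_TORCH_PROFILER_DIR inside the
# container and can be viewed with TensorBoard or ui.perfetto.dev.
requests.post(f"{VLLM_URL}/stop_profile").raise_for_status()
```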

Issues

n/a.

Type of change

List the type of change below. Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds new functionality)
  • Breaking change (fix or feature that would break existing design and interface)
  • Others (enhancement, documentation, validation, etc.)

Dependencies

n/a.

Tests

Manual Testing

@yinghu5 (Collaborator) left a comment:

Thank you for the good feature.

@yinghu5 merged commit 7adbba6 into opea-project:main on Nov 13, 2024
11 of 13 checks passed