
fix: correct minimum required cuda version for llamacpp #1286

Merged (3 commits into dev on Sep 23, 2024)

Conversation

vansangpfiev
Contributor

Describe Your Changes

https://github.com/janhq/cortex/issues/1046 is still in progress, so we need to specify the minimum required CUDA version for engines. This change lowers the minimum from 12.4 to 12.0 so that our dev branch works correctly.
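The change amounts to lowering a CUDA version floor. A minimal sketch of the kind of version check involved, with illustrative names that are not taken from the cortex codebase:

```python
# Hypothetical sketch of a minimum-CUDA-version gate; names are
# illustrative, not from cortex itself.
MIN_CUDA_VERSION = (12, 0)  # lowered from (12, 4) by this PR


def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '12.2' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))


def cuda_supported(detected: str, minimum=MIN_CUDA_VERSION) -> bool:
    """Return True when the detected CUDA version meets the minimum."""
    return parse_version(detected) >= minimum


# A 12.2 runtime passes the new 12.0 floor but would have
# failed the old 12.4 one.
assert cuda_supported("12.2")
assert not cuda_supported("12.2", minimum=(12, 4))
```

Comparing version strings as integer tuples (rather than lexically as strings) avoids pitfalls such as "12.10" sorting before "12.4".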

Fixes Issues

  • Closes #

Self Checklist

  • Added relevant comments, esp in complex areas
  • Updated docs (for bug fixes / features)
  • Created issues for follow-up changes or refactoring needed

@vansangpfiev vansangpfiev marked this pull request as ready for review September 23, 2024 01:20
@vansangpfiev vansangpfiev merged commit 1a4f16c into dev Sep 23, 2024
4 checks passed
@vansangpfiev vansangpfiev deleted the fix/llamacpp-cuda-version branch September 23, 2024 02:45