bug: can't load DeepSeek models #1843

Closed
2 of 7 tasks
vansangpfiev opened this issue Jan 3, 2025 · 1 comment
Assignees: vansangpfiev
Labels: type: bug (Something isn't working)

Comments

@vansangpfiev (Contributor)

Cortex version

v1.0.7

Describe the issue and expected behaviour

From Channeira:
  • Tested on 0.5.13-beta (needed to enable Vulkan support on the AMD GPU)
  • AppImage format
  • Experimental + Vulkan Support enabled
  • AMDGPU selected in the dropdown list (lavapipe unselected)

Description: the models fail to load a few seconds after the first prompt is sent.

Non-GGUF models (tested with Llama) work.

Tested and failing with https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF and https://huggingface.co/legraphista/DeepSeek-V2-Lite-Chat-IMat-GGUF.

app.log
cortex.log
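
Not part of the original report, but one quick way to narrow this down is to try loading the same GGUF outside the app. Below is a minimal sketch, assuming llama-cpp-python built with the Vulkan backend (e.g. CMAKE_ARGS="-DGGML_VULKAN=on"); the model filename is hypothetical and stands in for one of the DeepSeek GGUFs linked above.

```python
# Minimal load check, assuming llama-cpp-python with the Vulkan backend.
# MODEL_PATH is a hypothetical local filename; substitute the GGUF file
# downloaded from one of the repos linked above.
from llama_cpp import Llama

MODEL_PATH = "DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf"  # hypothetical path

# Offload all layers to the GPU; if this constructor aborts, the failure is in
# the engine/GGUF itself rather than in the Jan/Cortex UI layer.
llm = Llama(model_path=MODEL_PATH, n_gpu_layers=-1, n_ctx=4096)

# Send a first prompt, mirroring the point at which the reported crash occurs.
out = llm("Write a hello world program in Python.", max_tokens=64)
print(out["choices"][0]["text"])
```

If this reproduces the crash, the issue is likely in the Vulkan engine or the GGUF conversion; if it works, the problem is more likely in how the app loads the model.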

Steps to Reproduce

No response

Screenshots / Logs

No response

What is your OS?

  • Windows
  • Mac Silicon
  • Mac Intel
  • Linux / Ubuntu

What engine are you running?

  • cortex.llamacpp (default)
  • cortex.tensorrt-llm (Nvidia GPUs)
  • cortex.onnx (NPUs, DirectML)

Hardware Specs (e.g. OS version, GPU)

No response

@vansangpfiev vansangpfiev added the type: bug Something isn't working label Jan 3, 2025
@vansangpfiev vansangpfiev self-assigned this Jan 3, 2025
@github-project-automation github-project-automation bot moved this to Investigating in Menlo Jan 3, 2025
@vansangpfiev vansangpfiev moved this from Investigating to In Progress in Menlo Jan 3, 2025
@vansangpfiev vansangpfiev moved this from In Progress to Eng Review in Menlo Jan 3, 2025
@vansangpfiev vansangpfiev moved this from Eng Review to QA in Menlo Jan 8, 2025
@TC117 commented Jan 24, 2025

Working in llama-cpp 0.1.49

@TC117 TC117 closed this as completed Jan 24, 2025