This repository was archived by the owner on Jun 24, 2024. It is now read-only.

Error #8

Closed
dillfrescott opened this issue Apr 15, 2023 · 4 comments

Comments

dillfrescott commented Apr 15, 2023

C:\Users\micro\Downloads\llamacord>cargo run --release
    Finished release [optimized] target(s) in 0.16s
     Running `target\release\llamacord.exe`
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: InvalidMagic { path: "models/vicuna-13b-free-q4_0.bin" }', src\main.rs:116:14
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
philpax (Collaborator) commented Apr 18, 2023

The version of llama-rs that llamacord uses doesn't support GGJT format models yet, which is what Vicuna uses. Hoping to have this fixed soon! (rustformers/llm#93)
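
For anyone hitting the same `InvalidMagic` error: the loader rejects the file because the four-byte magic at the start of the model doesn't match a container format it recognizes. Below is a minimal diagnostic sketch, not the llama-rs or llamacord code; the magic constants are the commonly documented GGML-family values, so treat them as an assumption and verify against your loader's source.

```rust
// Hypothetical helper, not part of llamacord/llama-rs: peek at the first four
// bytes of a model file to see which GGML-family container it appears to be.
// The magic values below are the commonly documented ones (assumption).
use std::fs::File;
use std::io::Read;

const GGML_MAGIC: u32 = 0x6767_6d6c; // "ggml" - legacy unversioned format
const GGMF_MAGIC: u32 = 0x6767_6d66; // "ggmf" - versioned, pre-mmap format
const GGJT_MAGIC: u32 = 0x6767_6a74; // "ggjt" - mmap-able format used by newer conversions

fn main() -> std::io::Result<()> {
    let path = "models/vicuna-13b-free-q4_0.bin";
    let mut magic_bytes = [0u8; 4];
    File::open(path)?.read_exact(&mut magic_bytes)?;

    // The magic is written as a little-endian u32 at offset 0.
    match u32::from_le_bytes(magic_bytes) {
        GGML_MAGIC => println!("{path}: legacy ggml container"),
        GGMF_MAGIC => println!("{path}: ggmf container"),
        GGJT_MAGIC => println!("{path}: ggjt container (needs a loader with GGJT support)"),
        other => println!("{path}: unrecognized magic 0x{other:08x}"),
    }
    Ok(())
}
```

If the file reports a `ggjt` magic, the error above is expected with the older loader, and updating to a llama-rs version that supports GGJT (tracked in rustformers/llm#93) is the fix.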

dillfrescott (Author) commented

Oh, okay!

philpax (Collaborator) commented May 7, 2023

This should now work! Let me know if it doesn't :)

philpax closed this as completed May 7, 2023
dillfrescott (Author) commented

Thank you so much!
