segfault when using latest llama #1
Comments
Looks like llama_batch's definition has changed: seq_id is now a multi-dimensional array (one array of sequence ids per token) instead of a flat array.
Thanks for your work!
No problem. It would be great to be able to run llama from OCaml! I swapped over to the new multi-dimensional array for seq_id, and I'm no longer getting segfaults, but I'm now hitting a different error.
I'm not confident that I'm binding to the Bigarray correctly. I also tried a nested pointer in the ctypes and then coercing it before building the Bigarray, but that gave the same results. Looks like this issue was fixed by using a helper function in common.cpp: ggerganov/llama.cpp#3803
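For context, the layout change behind these crashes can be sketched as a ctypes struct binding. The field names below follow llama.h around the time of ggerganov/llama.cpp#3803, but this is an illustrative sketch, not this repo's actual binding code; verify every field, in order, against the header you build.

```ocaml
(* Sketch: binding the post-change llama_batch with ctypes.
   Any field that is missing, reordered, or mistyped shifts every
   later offset, which is exactly the kind of silent layout drift
   that shows up as a segfault at runtime. *)
open Ctypes

type batch
let batch : batch structure typ = structure "llama_batch"
let n_tokens = field batch "n_tokens" int32_t
let token    = field batch "token"    (ptr int32_t)       (* llama_token *  *)
let embd     = field batch "embd"     (ptr float)
let pos      = field batch "pos"      (ptr int32_t)       (* llama_pos *    *)
let n_seq_id = field batch "n_seq_id" (ptr int32_t)
let seq_id   = field batch "seq_id"   (ptr (ptr int32_t)) (* was: ptr int32_t *)
let logits   = field batch "logits"   (ptr int8_t)
(* Trailing convenience fields (e.g. all_pos_0) are omitted here for
   brevity; a real binding must declare every field before sealing. *)
let () = seal batch
```

The key line is `seq_id`: the old flat `ptr int32_t` must become `ptr (ptr int32_t)` to match the C side's `llama_seq_id **`.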
Setting the type of seq_id to a nested pointer means use sites need code like:

```ocaml
let seq_id = Ctypes.CArray.get seq_id 0 in
let seq_id = Ctypes.CArray.from_ptr seq_id 1 in
Ctypes.CArray.set seq_id 0 0l;
```

I'm not that familiar with Bigarray, so I'm not sure this nested pointer can even be represented with it. It doesn't give a better API either, since ctypes now leaks into the use sites instead of the more generic and more user-friendly Bigarray.
I introduced a helper function to avoid leaking ctypes: 7174abd. The simple example works now, but the repl fails with another error.
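A helper along these lines might look like the sketch below. It is in the spirit of llama.cpp's `llama_batch_add` helper from common.cpp (added in ggerganov/llama.cpp#3803); the names here are illustrative assumptions, not this repo's actual API.

```ocaml
(* Sketch: hide the nested seq_id pointer behind a plain function so
   callers never touch ctypes directly. *)
open Ctypes

(* Write sequence id [id] for token [i]: batch.seq_id[i][0] = id *)
let set_seq_id (seq_ids : int32 ptr ptr) ~i ~id =
  let row = !@ (seq_ids +@ i) in  (* seq_ids[i], itself a pointer *)
  row <-@ id                       (* seq_ids[i][0] = id          *)
```

Callers then write `set_seq_id seq_ids ~i:0 ~id:0l` and the double dereference stays an implementation detail.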
Looks like this last one is a missing call to
I updated this lib to the latest llama, which I can build and run locally, but I'm getting a segfault when trying to run the simple example with my changes: master...joprice:llama-cpp-ocaml:update-llama-cpp. The master branch runs for me without issue, so I assume it's something with the bindings going out of sync that I didn't catch.