Segmentation fault (core dumped) when using Transformers.js (#4619)
Confirming that I can duplicate this.
Bump. Transformers.js works with Bun (v0.6.9) on an M2 Mac, but does not work with Bun (v1.0.2) on Ubuntu.
I believe so. Transformers.js uses onnxruntime-node, so any issues faced there would impact it too.
Anyone know what the cause of this is yet? Is this an issue with Bun, or an issue somewhere else?
Fixing this is 100% planned. They are duplicates of this issue.
Yes. It is a bug in our NAPI implementation.
Hey @Jarred-Sumner - Any progress on this type of NAPI issue?
If you happen to only need Transformers.js, I maintain a fork that runs on Bun.
That is exactly my use case. Is your fork here? https://github.com/sroussey/transformers.js The readme doesn't mention anything about Bun; what exactly changes from the original to make it Bun-compatible?
You can look at the publish branch (@sroussey/transformers on npm). Changes:
And 3: a change to set the logging level, as it complains a lot.
You can also check out our v3 development branch (huggingface/transformers.js#545), which uses the latest version of onnxruntime-node and should run with Bun. Please let me know if you run into any issues!
Thanks, both!
Back on a computer and not a phone. Here are the changes; they are very minimal! I do not recommend using my fork for longer than needed. I keep it up to date, but only until the official one works on Bun and deals with the noisy error log.

The change to onnx to 1.16.0 fixed the crash. However, later versions have introduced bugs, so I don't plan to update that. The WASM thing is because sometimes the native code errors and it retries with WASM, but not setting the thread count to 1 will error again. Once you work with Transformers.js on the command line and build a TUI interface, you will hate all the error logs about the models, so I put in a way to tap into changing that.

That is it. I may fix some of the type stuff in the future, if that matters to you. I made PRs for these things, if I remember right; it has been a while. It may get back into the main version before v3.

BTW: v3 should have WebGPU enabled, as it is in onnx 1.17, but Bun does not have WebGPU. If you are using CUDA on Windows, it won't matter, as the native onnx should use that. The CoreML stuff for native Mac is not as well developed. It will be interesting to see how Node with WebGPU (or Deno) compares in WebGPU mode vs native. Native should win, but with less support for Apple devices in onnx, who knows. Hopefully, though, Bun won't miss out!
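The single-threaded WASM workaround described above can be sketched roughly like this. This is a sketch, not the fork's actual code; it assumes the `env` configuration object exported by transformers.js, and the fork's exact knobs may differ.

```javascript
// Sketch (assumptions: the `env` export of transformers.js and its
// `backends.onnx.wasm.numThreads` setting; not the fork's actual diff).
import { env } from '@xenova/transformers';

// When the native onnxruntime-node backend errors and the library retries
// with the WASM backend, multi-threaded WASM can fail again under Bun,
// so pin the WASM backend to a single thread:
env.backends.onnx.wasm.numThreads = 1;
```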
v3 seems to work great for me with no modifications needed. :) There is no npm package yet, so I published @sroussey/[email protected]. Running some tests, all looks good. I have not tried switching to WASM, but I prefer native anyhow.
Just a quick note - v3 also works for me.
@jkanavin-kdi for me running |
Okay, in a debug build.
Now that Transformers.js v3 is out, many models are now functional. However, as @nektro points out, certain models are still broken. I am considering switching the Transformers.js CI to use Bun, but there are a few blockers. Here are some examples which throw errors:

```js
import { pipeline } from '@huggingface/transformers';

await pipeline('text-classification', 'hf-internal-testing/tiny-random-BertForSequenceClassification').then((pipe) => pipe('I love transformers!'));
await pipeline('fill-mask', 'hf-internal-testing/tiny-random-BertForMaskedLM').then((pipe) => pipe('Hello, my name is [MASK].'));
```

Would be great to have these working!
What version of Bun is running?
1.0.0+822a00c4d508b54f650933a73ca5f4a3af9a7983
What platform is your computer?
Linux 5.15.0-1041-azure x86_64 x86_64
What steps can reproduce the bug?
Using Transformers.js causes a segmentation fault. See docs for information about the library.
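A minimal reproduction consistent with the crash output shown below might look like the following. This is a hypothetical sketch; the reporter's actual `index.js` is not included in the issue, and the default sentiment-analysis model happens to match the one named in the crash output.

```javascript
// index.js - hypothetical minimal reproduction (the reporter's actual
// script is not shown in the issue report).
import { pipeline } from '@xenova/transformers';

// With no model specified, the library logs a warning and falls back to
// "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
const classifier = await pipeline('sentiment-analysis');
console.log(await classifier('I love Transformers.js!'));
```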
What is the expected behavior?
It should output the same as when running with Node.js:
What do you see instead?

```
$ bun run index.js
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
Segmentation fault (core dumped)
```
Additional information
When running for the first time, you might run into an issue with the `sharp` dependency. You can fix it by running their recommended command: `npm install --ignore-scripts=false --foreground-scripts --verbose sharp`. I have tested running Transformers.js with and without sharp, so I don't believe this to be the cause of the error.