
Segmentation fault (core dumped) when using Transformers.js #4619

Open
xenova opened this issue Sep 8, 2023 · 18 comments
Labels
crash An issue that could cause a crash napi Compatibility with the native layer of Node.js

Comments

@xenova

xenova commented Sep 8, 2023

What version of Bun is running?

1.0.0+822a00c4d508b54f650933a73ca5f4a3af9a7983

What platform is your computer?

Linux 5.15.0-1041-azure x86_64 x86_64

What steps can reproduce the bug?

Using Transformers.js causes a segmentation fault. See docs for information about the library.

import { pipeline } from '@xenova/transformers';

// Allocate a pipeline for sentiment-analysis
let pipe = await pipeline('sentiment-analysis');

let out = await pipe('I love transformers!');
console.log(out);

What is the expected behavior?

It should output the same as when running with Node.js:

$ node audio-processing/index.js
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
[ { label: 'POSITIVE', score: 0.999788761138916 } ]

What do you see instead?

$ bun run index.js
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
Segmentation fault (core dumped)

Additional information

When running for the first time, you might run into an issue with the sharp dependency.

$ bun run index.js 
32 |     if (loadedModule) {
33 |       const [, loadedPackage] = loadedModule.match(/node_modules[\\/]([^\\/]+)[\\/]/);
34 |       help.push(`- Ensure the version of sharp aligns with the ${loadedPackage} package: "npm ls sharp"`);
35 |     }
36 |   }
37 |   throw new Error(help.join('\n'));
            ^
error: 
Something went wrong installing the "sharp" module

Cannot find module "../build/Release/sharp-linux-x64.node" from "/workspaces/dev/node_modules/sharp/lib/sharp.js"

Possible solutions:
- Install with verbose logging and look for errors: "npm install --ignore-scripts=false --foreground-scripts --verbose sharp"
- Install for the current linux-x64 runtime: "npm install --platform=linux --arch=x64 sharp"
- Consult the installation documentation: https://sharp.pixelplumbing.com/install
      at /workspaces/dev/node_modules/sharp/lib/sharp.js:37:8
      at globalThis (/workspaces/dev/node_modules/sharp/lib/sharp.js:37:33)
      at require (:1:20)
      at /workspaces/dev/node_modules/sharp/lib/constructor.js:11:0
      at globalThis (/workspaces/dev/node_modules/sharp/lib/constructor.js:439:17)
      at require (:1:20)
      at /workspaces/dev/node_modules/sharp/lib/index.js:6:6
      at globalThis (/workspaces/dev/node_modules/sharp/lib/index.js:16:17)

You can fix it by running their recommended command: npm install --ignore-scripts=false --foreground-scripts --verbose sharp. I have tested running Transformers.js both with and without sharp installed, so I don't believe it is the cause of the segmentation fault.

@xenova xenova added the bug Something isn't working label Sep 8, 2023
@Electroid Electroid added the crash An issue that could cause a crash label Sep 8, 2023
@codegod100

Confirming that I can reproduce this.

@johnhorsema

Bump. Bun (v0.6.9) with Transformers.js works on an M2 Mac, but Bun (v1.0.2) under Ubuntu doesn't.
Also, is this related to #3574?

@xenova
Author

xenova commented Sep 20, 2023

Also is this related to #3574?

I believe so. Transformers.js uses onnxruntime-node, so any issues faced there would impact it too.
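
A minimal sketch that isolates the dependency (the model path here is a placeholder): if the NAPI bug is at fault, loading any ONNX model directly through onnxruntime-node should hit the same crash under Bun, with no Transformers.js involved.

import * as ort from 'onnxruntime-node';

// Loading a model through the native binding exercises the same
// NAPI code path that Transformers.js uses under the hood.
const session = await ort.InferenceSession.create('./model.onnx');
console.log(session.inputNames);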

@calumk

calumk commented Dec 5, 2023

Anyone know what the cause of this is yet?
I see two of the referenced issues are closed as "not planned"

Is this an issue with Bun, or an issue somewhere else?

@Jarred-Sumner
Collaborator

Anyone know what the cause of this is yet? I see two of the referenced issues are closed as "not planned"

Fixing this is 100% planned. They are duplicates of this issue.

Is this an issue with Bun?

Yes. It is a bug in our NAPI implementation.

@calumk

calumk commented Mar 14, 2024

Hey @Jarred-Sumner - Any progress on this type of NAPI issue?

@sroussey
Contributor

If you happen to only need transformers.js, I maintain a fork that runs on Bun.

@calumk

calumk commented Mar 14, 2024

That is exactly my use case.

Is your fork here? https://github.com/sroussey/transformers.js

The readme doesn't mention anything about Bun. What exactly changed from the original to make it Bun-compatible?

@sroussey
Contributor

You can look at the publish branch. It's published as @sroussey/transformers on npm.

Changes:

  1. Upgrade onnxruntime to 1.16.0
  2. Set WASM threads to 1 (needed for Node as well if you fall back to it; see the sketch below)
  3. Set the logging level, since it complains a lot
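
A minimal sketch of change 2, applied through the transformers.js env object (field names follow the @xenova/transformers docs; the exact path may differ between versions):

import { env, pipeline } from '@xenova/transformers';

// Force single-threaded WASM so the fallback backend doesn't error again.
env.backends.onnx.wasm.numThreads = 1;

const pipe = await pipeline('sentiment-analysis');
console.log(await pipe('I love transformers!'));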

@xenova
Author

xenova commented Mar 14, 2024

You can also check out our v3 development branch (huggingface/transformers.js#545), which uses the latest version of onnxruntime-node and should run with bun. Please let me know if you run into any issues!

@calumk

calumk commented Mar 14, 2024

Thanks, both!

@sroussey
Contributor

Back on a computer and not a phone.

Here are the changes:
https://github.com/sroussey/transformers.js/compare/main...sroussey:transformers.js:publish?expand=1

They are very minimal!

I do not recommend using my fork for longer than needed. I will keep it up to date, but only until the official package works on Bun and deals with the noisy error log.

The upgrade of onnxruntime to 1.16.0 is what fixed the crash. However, later versions have introduced bugs, so I don't plan to update it further.

The WASM change is because the native code sometimes errors and it retries with WASM; if the thread count isn't set to 1, the retry errors again.

Once you work with transformers.js on the command line and build a TUI, you will hate all the error logs about the models, so I added a way to change the logging level.
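
On the logging point, a hedged sketch: logSeverityLevel is a standard onnxruntime session option (0 = verbose through 4 = fatal); the model path is a placeholder, and how the fork exposes this through transformers.js may differ:

import * as ort from 'onnxruntime-node';

// Only report errors (3) and fatals (4), suppressing the warning spam.
const session = await ort.InferenceSession.create('./model.onnx', {
  logSeverityLevel: 3,
});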

That is it.

I may fix some of the type stuff in the future, if that matters to you.

I made PRs upstream for these changes, if I remember right; it has been a while. They may get back into the main version before v3.

BTW: v3 should have WebGPU enabled, since it is in onnxruntime 1.17, but Bun does not have WebGPU. If you are using CUDA on Windows, that won't matter, as the native onnxruntime should use it. The CoreML support for native macOS is not as well developed. It will be interesting to see how Node with WebGPU (or Deno) compares in WebGPU mode versus native. Native should win, but with onnxruntime's weaker support for Apple devices, who knows. Hopefully native does win, so Bun won't miss out!

@sroussey
Contributor

sroussey commented Mar 16, 2024

v3 seems to work great for me with no modifications needed. :)

There is not an npm package yet, so I published @sroussey/[email protected]

Running some tests, all looks good:
[screenshot: test results]

I have not tried switching to wasm, but I prefer native anyhow.
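
For anyone who does want to test the WASM path explicitly, a hedged sketch using the v3 device option (option naming may have shifted between the alphas):

import { pipeline } from '@huggingface/transformers';

// Request the WASM execution provider instead of the native binding;
// passing null keeps the task's default model.
const pipe = await pipeline('sentiment-analysis', null, { device: 'wasm' });
console.log(await pipe('I love transformers!'));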

@jkanavin-kdi

Just a quick note: v3 also works for me. bun add github:xenova/transformers.js#v3 works fine to grab it without having to publish to npm though :)

@nektro nektro added the napi Compatibility with the native layer of Node.js label Aug 15, 2024
@di-sukharev

@jkanavin-kdi For me, running bun add github:xenova/transformers.js#v3 adds "@huggingface/transformers": "github:xenova/transformers.js#v3" to the package.json dependencies, but TypeScript yields an error (screenshot below).

[screenshot: TypeScript error]

@di-sukharev

Okay, "@huggingface/transformers": "^3.0.0-alpha.14" was installed by running bun add @huggingface/transformers, and it works fine. Here is the repo.

@nektro
Member

nektro commented Oct 22, 2024

in a debug build

❯ ~/src/bun3/build/debug/bun-debug index.js                
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
22 |         __classPrivateFieldSet(this, _OnnxruntimeSessionHandler_inferenceSession, new binding_1.binding.InferenceSession(), "f");
23 |         if (typeof pathOrBuffer === 'string') {
24 |             __classPrivateFieldGet(this, _OnnxruntimeSessionHandler_inferenceSession, "f").loadModel(pathOrBuffer, options);
25 |         }
26 |         else {
27 |             __classPrivateFieldGet(this, _OnnxruntimeSessionHandler_inferenceSession, "f").loadModel(pathOrBuffer.buffer, pathOrBuffer.byteOffset, pathOrBuffer.byteLength, options);
                                                                                                ^
error: Error
      at new OnnxruntimeSessionHandler (/Users/meghandenny/src/test/node_modules/onnxruntime-node/dist/backend.js:27:92)
      at /Users/meghandenny/src/test/node_modules/onnxruntime-node/dist/backend.js:64:29

Something went wrong during model construction (most likely a missing operation). Using `wasm` as a fallback. 
[
  {
    label: "POSITIVE",
    score: 0.999788761138916,
  }
]
^C

@nektro nektro removed the bug Something isn't working label Oct 26, 2024
@xenova
Author

xenova commented Dec 2, 2024

Now that Transformers.js v3 is out, many models are functional. However, as @nektro points out, certain models are still broken.

I am considering switching the Transformers.js CI to use Bun, but there are a few blockers. Here are some examples that throw errors:

import { pipeline } from '@huggingface/transformers';

await pipeline('text-classification', 'hf-internal-testing/tiny-random-BertForSequenceClassification').then((pipe) => pipe('I love transformers!'));
await pipeline('fill-mask', 'hf-internal-testing/tiny-random-BertForMaskedLM').then((pipe) => pipe('Hello, my name is [MASK].'));

Would be great to have these working!
