
feat: allow to preload ML models when running inference #224

Merged
merged 3 commits into from
Sep 28, 2023

Conversation

Angelmmiguel
Contributor

@Angelmmiguel Angelmmiguel commented Sep 27, 2023

Introduce ML model preloading by leveraging the "Named models" feature from WASI-NN. Workers no longer need to load the model files and send them to the host; the host loads them directly. I added a new example that uses this feature and produces the same result as the rust-wasi-nn example.

I also introduced some new configuration properties for the wasi_nn feature:

```toml
name = "wasi-nn-mobilenet"
version = "1"

[[features.wasi_nn.preload_models]]
backend = "openvino"
provider = { type = "local", dir = "./_models/mobilenet/" }
```

The preload_models array allows configuring multiple models. The provider property configures how wws retrieves the models. Currently, only the local provider is available, but we plan to add more.
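The provider table maps naturally onto a tagged Rust enum on the host side. The following is a minimal, self-contained sketch of that idea; `ModelProvider` and `resolve_model_dir` are illustrative names, not the actual wws types:

```rust
use std::path::PathBuf;

// Hypothetical stand-in for the provider configuration. A real
// implementation would derive Deserialize with a serde `type` tag,
// matching `provider = { type = "local", dir = "..." }`.
#[derive(Debug)]
enum ModelProvider {
    /// Load the model files from a local directory.
    Local { dir: PathBuf },
}

// Resolve where the model files should be read from for a given provider.
fn resolve_model_dir(provider: &ModelProvider) -> PathBuf {
    match provider {
        ModelProvider::Local { dir } => dir.clone(),
    }
}

fn main() {
    let provider = ModelProvider::Local {
        dir: PathBuf::from("./_models/mobilenet/"),
    };
    println!("loading model from {}", resolve_model_dir(&provider).display());
}
```

Keeping the provider as an enum means adding a future remote provider is a new variant plus a new match arm, without touching existing call sites.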

Limitations

Even though we're preloading the models on the host, this process still happens at the request level. Based on some comments, the Wasmtime WASI-NN library is not protected against concurrent access. To avoid multiple workers accessing the same context, I'll keep this at the request level for now.
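One standard way to share a single preloaded context safely despite a backend that is not safe for concurrent access is to serialize requests with a mutex. This is a generic sketch under that assumption; `ModelContext` and `run_inferences` are placeholders, not Wasmtime WASI-NN types:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Illustrative stand-in for a preloaded model context; the real
// Wasmtime WASI-NN types are not part of this sketch.
struct ModelContext {
    inferences: u32,
}

// Run `requests` concurrent "inferences" against one shared context.
// The mutex serializes access, since the backend cannot be used
// concurrently from multiple workers.
fn run_inferences(requests: u32) -> u32 {
    let ctx = Arc::new(Mutex::new(ModelContext { inferences: 0 }));
    let handles: Vec<_> = (0..requests)
        .map(|_| {
            let ctx = Arc::clone(&ctx);
            thread::spawn(move || {
                let mut guard = ctx.lock().unwrap();
                guard.inferences += 1; // stand-in for running inference
            })
        })
        .collect();
    for handle in handles {
        handle.join().unwrap();
    }
    let count = ctx.lock().unwrap().inferences;
    count
}

fn main() {
    println!("completed {} inferences", run_inferences(4));
}
```

The trade-off is throughput: a single lock turns concurrent requests into a queue, which is why loading per request remains the simpler (if slower) option until the library guarantees thread safety.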

In the future, my goal is to load models once at worker initialization.
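Loading once at initialization maps onto Rust's standard lazy one-time initialization. A minimal sketch with `std::sync::OnceLock` (the model type and loader are placeholders, not wws code):

```rust
use std::sync::OnceLock;

// Hypothetical cached model handle, initialized at most once per worker.
static MODEL: OnceLock<String> = OnceLock::new();

// The expensive work (reading model files, initializing the backend)
// would live here; it runs only on the first call.
fn load_model() -> String {
    String::from("mobilenet")
}

// Every request gets the same cached instance after the first call.
fn get_model() -> &'static str {
    MODEL.get_or_init(load_model)
}

fn main() {
    println!("model: {}", get_model());
    println!("model: {}", get_model()); // reuses the cached instance
}
```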

Closes #215

@Angelmmiguel Angelmmiguel added the 🚀 enhancement New feature or request label Sep 27, 2023
@Angelmmiguel Angelmmiguel added this to the v1.6.0 milestone Sep 27, 2023
@Angelmmiguel Angelmmiguel requested a review from a team September 27, 2023 15:16
@Angelmmiguel Angelmmiguel self-assigned this Sep 27, 2023
Contributor

@ereslibre ereslibre left a comment

Super!, LGTM :)


```rust
/// Available providers to load Wasi NN models.
#[derive(Deserialize, Clone, Debug)]
#[serde(rename_all = "lowercase", tag = "type")]
```

Nice :)

@Angelmmiguel Angelmmiguel merged commit 0b160a4 into main Sep 28, 2023
5 checks passed

Successfully merging this pull request may close these issues.

Add named models support on AI workers