Is your feature request related to a problem? Please describe.
WASI-NN is a standard for inferencing in WebAssembly. From the README:
wasi-nn is a WASI API for performing ML inference. Its name derives from the fact that ML models are also known as neural networks (nn). ML models are typically trained using a large data set, resulting in one or more files that describe the model's weights. The model is then used to compute an "inference," e.g., the probabilities of classifying an image as a set of tags. This API is concerned initially with inference, not training.
The goal is to provide AI / ML capabilities to Wasm Workers Server, opening up new use cases. This feature will enable developers to add ML capabilities to their applications. With wasi-nn, models are loaded from the Wasm module using the Graph primitive. Developers then convert their input data to the Tensor primitive, send it to an ML backend on the host, and retrieve the output.
We will include an example so you can try it out and learn from it.
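As a rough sketch of what such a worker could look like, here is an illustrative example assuming the wasi-nn crate's GraphBuilder-style API (the crate version, method and variant names, model files, and tensor shape below are only assumptions, not part of this proposal):

```rust
// Illustrative sketch only: run one inference from inside a worker.
// Assumes a recent wasi-nn crate with the GraphBuilder API; names may differ
// between crate versions. Model files and shapes are made up.
use wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn infer(input: &[f32]) -> Vec<f32> {
    // Load the model files bundled with the worker into a Graph.
    let graph = GraphBuilder::new(GraphEncoding::Openvino, ExecutionTarget::CPU)
        .build_from_files(["model.xml", "model.bin"])
        .expect("failed to load the model");

    // Convert the input data into a Tensor and send it to the host backend.
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");
    ctx.set_input(0, TensorType::F32, &[1, 3, 224, 224], input)
        .expect("failed to set the input tensor");
    ctx.compute().expect("inference failed");

    // Retrieve the output computed by the host.
    let mut output = vec![0f32; 1001];
    ctx.get_output(0, &mut output[..])
        .expect("failed to read the output tensor");
    output
}
```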
Describe the solution you'd like
Configuration
Following the capability-based security approach of wws, workers won't be able to call WASI-NN APIs by default. You will need to enable them in the worker's config file:
name = "nn"version = "1"
[features]
[features.wasi_nn]
allowed_backends = ["openvino"]
Language support
For the initial release, WASI-NN bindings will be available only for Rust, through the wasi-nn crate. In the future, we will add support for more languages, such as JavaScript and Python.
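In practice, a Rust worker would simply add the crate as a dependency (the version pin below is only an example):

```toml
[dependencies]
wasi-nn = "0.6"
```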
Machine Learning Backends
The initial host implementation will rely on the wasmtime-wasi-nn crate. This crate only supports OpenVINO™ as the machine learning backend. However, in the future we will add new backends using different approaches.
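For reference, wiring the host side together could look roughly like the sketch below. It assumes the pre-component-model wasmtime-wasi-nn API, where add_to_linker takes an accessor closure for a WasiNnCtx; the exact constructor and function signatures have changed across wasmtime releases, and the state struct and helper below are hypothetical:

```rust
// Rough sketch of exposing wasi-nn to workers from the wws host.
// Assumes an older wasmtime-wasi-nn release; check the crate docs for the
// exact API of the version in use.
use anyhow::Result;
use wasmtime::Linker;
use wasmtime_wasi_nn::WasiNnCtx;

/// Hypothetical per-worker store data holding the wasi-nn context.
/// Elsewhere, the context would be built when creating the Store
/// (constructor signature varies by version), e.g. `WasiNnCtx::new()?`.
struct WorkerState {
    wasi_nn: WasiNnCtx,
}

fn add_wasi_nn(linker: &mut Linker<WorkerState>, allowed_backends: &[String]) -> Result<()> {
    // Only expose the wasi-nn imports when the worker's config allows a backend.
    if allowed_backends.iter().any(|b| b == "openvino") {
        wasmtime_wasi_nn::add_to_linker(linker, |state: &mut WorkerState| &mut state.wasi_nn)?;
    }
    Ok(())
}
```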
Tasks
- Add WASI-NN bindings
- Add configuration properties
- Add example
- Add documentation