Load onnx model from Stream of bytes #7254

Merged
michaelgsharp merged 1 commit into dotnet:main from onnx-byte on Oct 9, 2024

Conversation

michaelgsharp
Member

@michaelgsharp michaelgsharp commented Oct 2, 2024

Fixes #6591 by adding an API overload that allows the ONNX model to be passed in as a Stream of bytes.
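
For orientation, here is a minimal usage sketch of the new capability. The overload and parameter names below are assumptions modeled on the existing file-path based ApplyOnnxModel overloads; they may not match the exact API added in this PR.

// Assumed usage: an ApplyOnnxModel overload that accepts a Stream instead of a file path.
// Parameter names are illustrative, not taken from this PR.
using System.IO;
using Microsoft.ML;

var mlContext = new MLContext();

// The model can now come from any Stream source (file, embedded resource, network, ...).
using Stream modelStream = File.OpenRead("model.onnx");

var pipeline = mlContext.Transforms.ApplyOnnxModel(
    outputColumnNames: new[] { "output" },
    inputColumnNames: new[] { "input" },
    modelBytes: modelStream);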

Reviewed code:

var tempModelFile = Path.Combine(((IHostEnvironmentInternal)env).TempFilePath, Path.GetRandomFileName());
using (var fileStream = File.Create(tempModelFile))
{
    modelBytes.Seek(0, SeekOrigin.Begin);
    modelBytes.CopyTo(fileStream);
}
return new OnnxModel(tempModelFile, gpuDeviceId, fallbackToCpu,

Member (on modelBytes.Seek(0, SeekOrigin.Begin)):

Wouldn't you support streams that cannot seek?

Member:

@michaelgsharp what are your thoughts on this?
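
One way the non-seekable case could be handled (a sketch only, not what this PR does) is to guard the rewind and copy from the stream's current position when seeking is unsupported:

// Sketch: only rewind when the stream supports seeking (network streams, for example, do not).
if (modelBytes.CanSeek)
{
    modelBytes.Seek(0, SeekOrigin.Begin);
}
modelBytes.CopyTo(fileStream);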
Member (on return new OnnxModel(...)):

I am wondering if we can have an OnnxModel constructor that works directly with Streams, or would the native layer not allow that? I am thinking about whether we can avoid writing to a temp file.

Member Author:

That's something I want to do as well, but this will at least unblock people, and then I can come back and look into it. I am not sure whether ONNX supports that or not (though it probably does).

Member
@tarekgh tarekgh Oct 7, 2024

Just a suggestion you may decide not to apply :-)

Instead of exposing CreateFromStream, would it make sense to expose a new constructor for OnnxModel that takes the stream and move the implementation provided here into that constructor? That would avoid exposing an extra method if we later decide to support streams in OnnxModel. This is a minor point, though; I am fine either way.
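
For context, ONNX Runtime's InferenceSession also accepts an in-memory model (a byte[] overload), so a stream-taking constructor could plausibly avoid the temp file altogether. A minimal sketch follows, assuming Microsoft.ML.OnnxRuntime is used directly; the helper class and method names are hypothetical and not part of this PR.

using System.IO;
using Microsoft.ML.OnnxRuntime;

public static class OnnxStreamLoader
{
    // Hypothetical helper: buffers the stream in memory and builds the session
    // from bytes, so no temporary file is written to disk.
    public static InferenceSession CreateSession(Stream modelBytes, SessionOptions options)
    {
        using var buffer = new MemoryStream();
        modelBytes.CopyTo(buffer); // works for non-seekable streams as well
        return new InferenceSession(buffer.ToArray(), options);
    }
}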


codecov bot commented Oct 7, 2024

Codecov Report

Attention: Patch coverage is 64.22764% with 44 lines in your changes missing coverage. Please review.

Project coverage is 68.78%. Comparing base (be1e428) to head (9f5e86c).
Report is 10 commits behind head on main.

Files with missing lines Patch % Lines
src/Microsoft.ML.OnnxTransformer/OnnxCatalog.cs 13.79% 25 Missing ⚠️
src/Microsoft.ML.OnnxTransformer/OnnxTransform.cs 61.36% 16 Missing and 1 partial ⚠️
...osoft.ML.OnnxTransformerTest/OnnxTransformTests.cs 95.00% 0 Missing and 2 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #7254      +/-   ##
==========================================
+ Coverage   68.77%   68.78%   +0.01%     
==========================================
  Files        1462     1463       +1     
  Lines      272261   272407     +146     
  Branches    28176    28183       +7     
==========================================
+ Hits       187254   187386     +132     
- Misses      77764    77778      +14     
  Partials     7243     7243              
Flag Coverage Δ
Debug 68.78% <64.22%> (+0.01%) ⬆️
production 63.29% <49.39%> (+<0.01%) ⬆️
test 89.05% <95.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Files with missing lines Coverage Δ
src/Microsoft.ML.OnnxTransformer/OnnxUtils.cs 84.82% <100.00%> (+0.61%) ⬆️
...osoft.ML.OnnxTransformerTest/OnnxTransformTests.cs 95.54% <95.00%> (-0.04%) ⬇️
src/Microsoft.ML.OnnxTransformer/OnnxTransform.cs 87.35% <61.36%> (-2.48%) ⬇️
src/Microsoft.ML.OnnxTransformer/OnnxCatalog.cs 58.90% <13.79%> (-29.74%) ⬇️

... and 9 files with indirect coverage changes

@michaelgsharp michaelgsharp merged commit e794342 into dotnet:main Oct 9, 2024
25 checks passed
@github-actions github-actions bot locked and limited conversation to collaborators Nov 9, 2024
@michaelgsharp michaelgsharp deleted the onnx-byte branch November 18, 2024 20:04
Successfully merging this pull request may close these issues.

Load ONNX from in-memory binary instead of file path