
[request] onnxruntime/1.13.1 #16699

Closed
elejke opened this issue Mar 25, 2023 · 5 comments · Fixed by #16849

Comments

elejke (Contributor) commented Mar 25, 2023

Package Name/Version

onnxruntime/1.13.1

Webpage

https://www.onnxruntime.ai/

Source code

https://github.com/microsoft/onnxruntime

Description of the library/tool

ONNX Runtime is a cross-platform inference and training machine-learning accelerator compatible with deep learning frameworks such as PyTorch and TensorFlow/Keras, as well as classical machine learning libraries such as scikit-learn, and more.

There was one good attempt to create a recipe by @CAMOBAP; I even managed to build it now on macOS for version 1.7.1, and test_package passes correctly.

On Windows it has some errors, but it looks like @fdgStilla already found a solution for them (Microsoft's wil library should be added as a build requirement).
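For illustration only, a minimal sketch of what that build requirement could look like in the recipe. The wil reference and version are placeholders, since (as noted later in this thread) wil has no CCI recipe yet:

```python
# Hypothetical fragment of an onnxruntime conanfile.py (Conan 1.x style).
# "wil/1.0.0" is a placeholder reference: a wil recipe would first have to
# be contributed to conan-center-index before this could resolve.
def build_requirements(self):
    if self.settings.os == "Windows":
        self.build_requires("wil/1.0.0")
```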

As of now, 1.7.1 is too old, and it would be perfect to update the recipe and finally get it into Conan Center.

This is the most popular library for neural network inference on all platforms!

SpaceIm (Contributor) commented Mar 26, 2023

duplicate of #4806

elejke (Contributor, Author) commented Mar 26, 2023

> duplicate of #4806

I have managed to reproduce @CAMOBAP's results on Mac / Linux / Windows (with some bug fixes).

But the main problem with this package is that it seems to pin its subprojects to more or less arbitrary commits. That is why I think building such a new version will run into many problems. You can close this if needed; I will bump #4806.

gmeeker (Contributor) commented Mar 29, 2023

@elejke I have a partial but working recipe of onnxruntime 1.14.1 here:
https://github.com/gmeeker/conan-center-index/tree/feature/onnxruntime

I wrote this before I noticed #4806, and it's not nearly as complete as that effort (very few options; I just tried to get the CPU engine working). However, I mostly got through the major CMake changes in recent onnxruntime releases. They started using FetchContent (requiring CMake 3.24), so the original recipe is out of date (onnxruntime_PREFER_SYSTEM_LIB is gone).

If you're willing to take this over, that'd be great. This has reached the limit of my knowledge of CMake, and I'm not available at the moment. It's using the new Conan CMake tooling. The dependencies are mostly solved, except the following (see the sketch below for how these might map to Conan requirements):

  • onnx is still downloaded (protoc fails if we use the Conan onnx package)
  • boost should use the Conan package
  • Microsoft's wil is not in CCI and is downloaded
  • Eigen is still downloaded (I might have missed some FetchContent patches)

The patches are quite extensive, and I'm not sure how to handle that. Maybe some of them can get merged upstream in onnxruntime, like the package-name calls to FetchContent.
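For orientation, here is a minimal, hypothetical sketch of how the dependencies listed above might be expressed in a Conan recipe. The package names come from conan-center-index, the versions are placeholders, and this is not gmeeker's actual recipe:

```python
# Hypothetical requirements() fragment of an onnxruntime conanfile.py;
# versions are placeholders, not tested against any draft recipe.
def requirements(self):
    # Target state: the draft still downloads onnx, because protoc fails
    # when the CCI onnx package is used (see the list above).
    self.requires("onnx/1.13.1")
    self.requires("boost/1.81.0")
    self.requires("eigen/3.4.0")
    # Microsoft's wil has no CCI recipe yet, so it still has to be
    # downloaded (or a wil recipe contributed to CCI first).
```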

elejke (Contributor, Author) commented Mar 29, 2023

@gmeeker

For a start: #16701

🙂

Great work! I was disappointed when I discovered the recent changes in the ORT build workflow and concluded that reproducing the 1.7.1 results gets us almost nothing toward making it work with >=1.13.1, but your recipe looks promising!!

Did you check it on all desktop OSes?

@CAMOBAP's recipe works for cross-building from Linux to all Android ABIs after small fixes.

Would you like to open a PR with it and let others help push it into CCI finally?

gmeeker mentioned this issue Apr 1, 2023
gmeeker (Contributor) commented Apr 1, 2023

Okay, I've put up #16842, but it could use a lot of help. I've tested macOS, Windows, and Linux, but not iOS or Android, nor, of course, all the execution-provider options that should eventually go in there.

snnn pushed a commit to microsoft/onnxruntime that referenced this issue Apr 4, 2023
Rework some external targets to ease building with `-DFETCHCONTENT_FULLY_DISCONNECTED=ON` (#15323)

### Description
Rework some external targets to ease building with
`-DFETCHCONTENT_FULLY_DISCONNECTED=ON`
This will allow package managers to more easily provide an onnxruntime
package by reducing the amount of patching needed downstream at each
version.

### Motivation and Context
Availability of onnxruntime in some C++ package managers
#7150
conan-io/conan-center-index#16699
microsoft/vcpkg#20548

My initial intent is to get this in conan but the PR would most likely
be useful (though not tested) to vcpkg as well (and maybe others).
I tried to get only a first batch of not too specific patches (i.e. not
specific to conan).

The first commit reworks `flatbuffers` and just extends what @snnn did
in #13991
The second commit reworks `pytorch_cpuinfo`
The third commit reworks `google_nsync`
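To make the intent concrete: with the reworked targets, a downstream Conan recipe could, in principle, switch off FetchContent downloads and let Conan-generated find_package() config files satisfy dependencies such as flatbuffers, cpuinfo, and nsync. A hedged sketch under that assumption, not taken from any actual recipe:

```python
# Hypothetical, minimal downstream recipe fragment -- not the actual CCI recipe.
from conan import ConanFile
from conan.tools.cmake import CMakeDeps, CMakeToolchain

class OnnxRuntimeSketchConan(ConanFile):
    name = "onnxruntime"
    settings = "os", "arch", "compiler", "build_type"

    def generate(self):
        tc = CMakeToolchain(self)
        # Keep FetchContent from downloading anything; with the reworked
        # external targets, find_package() is expected to resolve the
        # dependencies (flatbuffers, cpuinfo, nsync) instead.
        tc.cache_variables["FETCHCONTENT_FULLY_DISCONNECTED"] = "ON"
        tc.generate()
        # Emit the CMake config files for the Conan-provided dependencies
        # that find_package() will pick up.
        CMakeDeps(self).generate()
```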
adityagoel4512 pushed a commit to adityagoel4512/onnxruntime that referenced this issue Apr 5, 2023
Rework some external targets to ease building with `-DFETCHCONTENT_FULLY_DISCONNECTED=ON` (microsoft#15323)