Faxu documentation #16
Merged
@@ -2,34 +2,71 @@
[![Build Status](https://dev.azure.com/onnxruntime/onnxruntime/_apis/build/status/Microsoft.onnxruntime)](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=1)

-ONNX Runtime is the runtime for [ONNX](https://github.com/onnx/onnx).
+# Introduction
+ONNX Runtime is an open-source scoring engine for Open Neural Network Exchange (ONNX) models.

-# Engineering Design
-[Engineering Design](docs/HighLevelDesign.md)
+ONNX is an open format for machine learning (ML) models that is supported by various ML and DNN frameworks and tools. This format makes it easier to interoperate between frameworks and to maximize the reach of your hardware optimization investments. Learn more about ONNX on [https://onnx.ai](https://onnx.ai) or view the [Github Repo](https://github.com/onnx/onnx).

+# Why use ONNX Runtime
+## Run any ONNX model
+ONNX Runtime provides comprehensive support of the ONNX spec and can be used to run all models based on ONNX v1.2.1 and higher. See ONNX version release details [here](https://github.com/onnx/onnx/releases).

-# API
-| API | CPU package | GPU package |
+In order to support popular and leading AI models, the runtime stays up-to-date with evolving ONNX operators and functionalities.
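As an illustration of the version support described above, a minimal sketch (assuming the `onnx` Python package is installed and `model.onnx` is a placeholder path) that checks which IR and opset versions a model declares:

```python
# Sketch: inspect the IR and opset versions an ONNX model declares, to confirm
# it falls in the supported range (ONNX v1.2.1 and higher, per the text above).
import onnx

model = onnx.load("model.onnx")     # placeholder path to an exported ONNX model
onnx.checker.check_model(model)     # validate the model against the ONNX spec

print("IR version:", model.ir_version)
for opset in model.opset_import:
    print("domain:", opset.domain or "ai.onnx", "version:", opset.version)
```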

+## Cross Platform
+ONNX Runtime offers:
+* APIs for Python, C#, and C
+* Available for Linux, Windows, and Mac

+See API documentation and package installation instructions [below](#Installation).

+## High Performance
+You can use the ONNX Runtime with both CPU and GPU hardware. You can also plug in additional execution providers to ONNX Runtime. With many graph optimizations and various accelerators, ONNX Runtime can often provide lower latency and higher efficiency compared to other runtimes. This provides smoother end-to-end customer experiences and lower costs from improved machine utilization.

+Currently ONNX Runtime supports CUDA, MKL, and MKL-DNN for computation acceleration, with more coming soon. To add an execution provider, please refer to [this page](docs/AddingExecutionProvider.md).
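A small, hedged sketch of checking which hardware target an installed build uses (the `get_available_providers` call is guarded because not every release exposes it):

```python
# Sketch: report the compute target of the installed onnxruntime package
# (CPU-only "onnxruntime" vs. CUDA-enabled "onnxruntime-gpu").
import onnxruntime as ort

print("Device:", ort.get_device())  # typically "CPU" or "GPU"

# Newer releases also list the registered execution providers; guard the call
# since older packages may not provide it.
if hasattr(ort, "get_available_providers"):
    print("Execution providers:", ort.get_available_providers())
```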

+# Getting Started
+If you need a model:
+* Check out the [ONNX Model Zoo](https://github.com/onnx/models) for ready-to-use pre-trained models.
+* To get an ONNX model by exporting from various frameworks, see [ONNX Tutorials](https://github.com/onnx/tutorials).

+If you already have an ONNX model, just [install the runtime](#Installation) for your machine to try it out. One easy way to operationalize the model on the cloud is by using [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning-service). See a how-to guide [here](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-build-deploy-onnx).
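To illustrate the export route mentioned in Getting Started, a minimal sketch using PyTorch's exporter (the ResNet-18 model and the input shape are placeholders; see the linked ONNX Tutorials for other frameworks):

```python
# Sketch: export a trained PyTorch model to ONNX so it can be scored by ONNX Runtime.
import torch
import torchvision

model = torchvision.models.resnet18(pretrained=True)   # any trained torch.nn.Module works
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)               # example input with the expected shape
torch.onnx.export(model, dummy_input, "resnet18.onnx")  # writes the ONNX graph to disk
```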

+# Installation
+## APIs and Official Builds
+| API Documentation | CPU package | GPU package |
|-----|-------------|-------------|
| [Python](https://docs.microsoft.com/en-us/python/api/overview/azure/onnx/intro?view=azure-onnx-py) | [Windows](TODO)<br>[Linux](https://pypi.org/project/onnxruntime/)<br>[Mac](TODO)| [Windows](TODO)<br>[Linux](https://pypi.org/project/onnxruntime-gpu/) |
-| [C#](docs/CSharp_API.md) | [Windows](TODO)| Not available |
-| [C](docs/C_API.md) | [Windows](TODO)<br>[Linux](TODO) | Not available |
+| [C#](docs/CSharp_API.md) | [Windows](TODO)| Coming Soon |
+| [C](docs/C_API.md) | [Windows](TODO)<br>[Linux](TODO) | Coming Soon |
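A minimal end-to-end sketch of the Python package from the table above (install with `pip install onnxruntime` or `pip install onnxruntime-gpu`; the model path, input name, and shape are placeholders):

```python
# Sketch: load an ONNX model and run one inference with the Python API.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("resnet18.onnx")     # placeholder model path
input_name = session.get_inputs()[0].name           # input name is model-specific
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)

outputs = session.run(None, {input_name: batch})    # None -> return all model outputs
print("First output shape:", outputs[0].shape)
```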

-# Build
-[Build](BUILD.md)
+## Build Details
+For details on the build configurations and information on how to create a build, see [Build ONNX Runtime](BUILD.md).

-# Contribute
-[Contribute](CONTRIBUTING.md)
+## Versioning
+See more details on API and ABI Versioning and ONNX Compatibility in [Versioning](docs/Versioning.md).

-# Versioning
-[Versioning](docs/Versioning.md)
+# Design and Key Features
+For an overview of the high level architecture and key decisions in the technical design of ONNX Runtime, see [Engineering Design](docs/HighLevelDesign.md).

+ONNX Runtime is built with an extensible design that makes it versatile to support a wide array of models with high performance.

+* [Add a custom operator/kernel](AddingCustomOp.md)
+* [Add an execution provider](AddingExecutionProvider.md)
+* [Add a new graph transform](../include/onnxruntime/core/graph/graph_transformer.h)
+* [Add a new rewrite rule](../include/onnxruntime/core/graph/rewrite_rule.h)

+# Contribute
+We welcome your contributions! Please see the [contribution guidelines](CONTRIBUTING.md).

-# Feedback
-* File a bug in [GitHub Issues](https://github.com/Microsoft/onnxruntime/issues)
+## Feedback
+For any feedback or to report a bug, please file a [GitHub Issue](https://github.com/Microsoft/onnxruntime/issues).

-# Code of Conduct
+## Code of Conduct
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
or contact [[email protected]](mailto:[email protected]) with any additional questions or comments.

# License
-[LICENSE](LICENSE)
+[MIT License](LICENSE)
Linux and Mac C# should also be coming soon