From 5f73a9453deb3479db46346911d1ecf8b1969225 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Fri, 2 Aug 2024 14:59:26 -0700 Subject: [PATCH 01/11] update android build docs, remove some outdated info --- docs/build/android.md | 7 ++----- 1 file changed, 2 insertions(+), 5 deletions(-) diff --git a/docs/build/android.md b/docs/build/android.md index 9d86082bb492b..427035deb1037 100644 --- a/docs/build/android.md +++ b/docs/build/android.md @@ -81,9 +81,8 @@ Resources: * Install the NDK * Find the available NDK versions by running `sdkmanager --list` * Install - * you can install a specific version or the latest (called 'ndk-bundle') e.g. `sdkmanager --install "ndk;21.1.6352462"` + * install the desired version, e.g., `sdkmanager --install "ndk;21.1.6352462"` * NDK path in our example with this install would be `.../Android/ndk/21.1.6352462` - * NOTE: If you install the ndk-bundle package the path will be `.../Android/ndk-bundle` as there's no version number ## Android Build Instructions @@ -116,14 +115,12 @@ To build on Windows with `--build_java` enabled you must also: * set JAVA_HOME to the path to your JDK install * this could be the JDK from Android Studio, or a [standalone JDK install](https://www.oracle.com/java/technologies/javase-downloads.html) * e.g. Powershell: `$env:JAVA_HOME="C:\Program Files\Java\jdk-15"` CMD: `set JAVA_HOME=C:\Program Files\Java\jdk-15` -* install [Gradle version 6.8.3](https://gradle.org/install/) and add the directory to the PATH - * e.g. 
Powershell: `$env:PATH="$env:PATH;C:\Gradle\gradle-6.6.1\bin"` CMD: `set PATH=%PATH%;C:\Gradle\gradle-6.6.1\bin` * run the build from an admin window * the Java build needs permissions to create a symlink, which requires an admin window #### Note: Proguard rules for R8 minimization Android app builds to work -For Android consumers using the library with R8-minimized builds, currently you need to add the following line to your `proguard-rules.pro` file inside your Android project to use package `com.microsoft.onnxruntime:onnxruntime-android` (for Full build) or `com.microsoft.onnxruntime:onnxruntime-mobile` (for Mobile build) to avoid runtime crashes: +For Android consumers using the library with R8-minimized builds, currently you need to add the following line to your `proguard-rules.pro` file inside your Android project to use package `com.microsoft.onnxruntime:onnxruntime-android` to avoid runtime crashes: ``` -keep class ai.onnxruntime.** { *; } From b618c2e4e2e84a9484384c09de6f63aabd610adf Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Fri, 2 Aug 2024 16:48:04 -0700 Subject: [PATCH 02/11] remove references to mobile packages --- docs/build/custom.md | 28 +++++++++--------- docs/get-started/with-obj-c.md | 4 +-- docs/install/index.md | 45 ++++++++--------------------- docs/tutorials/mobile/deploy-ios.md | 2 +- docs/tutorials/mobile/index.md | 36 ++++------------------- 5 files changed, 34 insertions(+), 81 deletions(-) diff --git a/docs/build/custom.md b/docs/build/custom.md index e270feac445a1..a3a94327d142c 100644 --- a/docs/build/custom.md +++ b/docs/build/custom.md @@ -157,7 +157,7 @@ Find them [here](https://github.com/microsoft/onnxruntime/tags). ## Custom build packages -In this section, `ops.config` is a [configuration file](../reference/operators/reduced-operator-config-file.md) that specifies the opsets, op kernels, and types to include. 
See the configuration file used by the pre-built mobile packages at [tools/ci_build/github/android/mobile_package.required_operators.config](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/android/mobile_package.required_operators.config). +In this section, `ops.config` is a [configuration file](../reference/operators/reduced-operator-config-file.md) that specifies the opsets, op kernels, and types to include. ### Web @@ -182,9 +182,9 @@ To produce pods for an iOS build, use the [build_and_assemble_apple_pods.py](htt This will do a custom build and create the pod package files for it in /path/to/staging/dir. - The build options are specified with the file provided to the `--build-settings-file` option. See the current build options used by the pre-built mobile package at [tools/ci_build/github/apple/default_mobile_ios_framework_build_settings.json](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/apple/default_mobile_ios_framework_build_settings.json). You can use this file directly. + The build options are specified with the file provided to the `--build-settings-file` option. See the current build options used by the pre-built package at [tools/ci_build/github/apple/default_full_apple_framework_build_settings.json](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/apple/default_full_apple_framework_build_settings.json). You can use this file directly. - The reduced set of ops in the custom build is specified with the file provided to the `--include_ops_by_config` option. See the current op config used by the pre-built mobile package at [tools/ci_build/github/android/mobile_package.required_operators.config](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/android/mobile_package.required_operators.config) (Android and iOS pre-built mobile packages share the same config file). You can use this file directly. 
+ The reduced set of ops in the custom build is specified with the [configuration file](../reference/operators/reduced-operator-config-file.md) provided to the `--include_ops_by_config` option. This is optional. The default package does not include the training APIs. To create a training package, add `--enable_training_apis` in the build options file provided to `--build-settings-file` and add the `--variant Training` option when calling `build_and_assemble_apple_pods.py`. @@ -202,15 +202,15 @@ To produce pods for an iOS build, use the [build_and_assemble_apple_pods.py](htt 3. Use the local pods. - For example, update the Podfile to use the local onnxruntime-mobile-objc pod instead of the released one: + For example, update the Podfile to use the local onnxruntime-objc pod instead of the released one: ```diff - - pod 'onnxruntime-mobile-objc' - + pod 'onnxruntime-mobile-objc', :path => "/path/to/staging/dir/onnxruntime-mobile-objc" - + pod 'onnxruntime-mobile-c', :path => "/path/to/staging/dir/onnxruntime-mobile-c" + - pod 'onnxruntime-objc' + + pod 'onnxruntime-objc', :path => "/path/to/staging/dir/onnxruntime-objc" + + pod 'onnxruntime-c', :path => "/path/to/staging/dir/onnxruntime-c" ``` - Note: The onnxruntime-mobile-objc pod depends on the onnxruntime-mobile-c pod. If the released onnxruntime-mobile-objc pod is used, this dependency is automatically handled. However, if a local onnxruntime-mobile-objc pod is used, the local onnxruntime-mobile-c pod that it depends on also needs to be specified in the Podfile. + Note: The onnxruntime-objc pod depends on the onnxruntime-c pod. If the released onnxruntime-objc pod is used, this dependency is automatically handled. However, if a local onnxruntime-objc pod is used, the local onnxruntime-c pod that it depends on also needs to be specified in the Podfile. 
### Android @@ -236,23 +236,23 @@ Note: In the steps below, replace `` with the ONNX Runtime version Specify the ONNX Runtime version you want to use with the `--onnxruntime_branch_or_tag` option. The script uses a separate copy of the ONNX Runtime repo in a Docker container so this is independent from the containing ONNX Runtime repo's version. - The build options are specified with the file provided to the `--build_settings` option. See the current build options used by the pre-built mobile package at [tools/ci_build/github/android/default_mobile_aar_build_settings.json](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/android/default_mobile_aar_build_settings.json). + The build options are specified with the file provided to the `--build_settings` option. See the current build options used by the pre-built package at [tools/ci_build/github/android/default_full_aar_build_settings.json](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/android/default_full_aar_build_settings.json). - The reduced set of ops in the custom build is specified with the file provided to the `--include_ops_by_config` option. See the current op config used by the pre-built mobile package at [tools/ci_build/github/android/mobile_package.required_operators.config](https://github.com/microsoft/onnxruntime/blob/main/tools/ci_build/github/android/mobile_package.required_operators.config). + The reduced set of ops in the custom build is specified with the [configuration file](../reference/operators/reduced-operator-config-file.md) provided to the `--include_ops_by_config` option. - The `--build_settings` and `--include_ops_by_config` options are both optional and will default to what is used to build the pre-built mobile package. Not specifying either will result in a package like the pre-built mobile package. 
+ The `--build_settings` and `--include_ops_by_config` options are both optional and default to the values used to build the pre-built package. Omitting both produces a package equivalent to the pre-built one.
diff --git a/docs/install/index.md b/docs/install/index.md index a7839874857b1..303c551f667c6 100644 --- a/docs/install/index.md +++ b/docs/install/index.md @@ -147,20 +147,10 @@ dotnet add package Microsoft.AI.MachineLearning ## Install on web and mobile -Unless stated otherwise, the installation instructions in this section refer to pre-built packages that include support -for selected operators and ONNX opset versions based on the requirements of popular models. These packages may be -referred to as "mobile packages". If you use mobile packages, your model must only use the -supported [opsets and operators](../reference/operators/mobile_package_op_type_support_1.14.md). +The pre-built packages have full support for all ONNX opsets and operators. -Another type of pre-built package has full support for all ONNX opsets and operators, at the cost of larger binary size. -These packages are referred to as "full packages". - -If the pre-built mobile package supports your model/s but is too large, you can create -a [custom build](../build/custom.md). A custom build can include just the opsets and operators in your model/s to reduce -the size. - -If the pre-built mobile package does not include the opsets or operators in your model/s, you can either use the full -package if available, or create a custom build. +If the pre-built package is too large, you can create a [custom build](../build/custom.md). +A custom build can include just the opsets and operators in your model/s to reduce the size. ### JavaScript Installs @@ -190,18 +180,14 @@ npm install onnxruntime-react-native ### Install on iOS -In your CocoaPods `Podfile`, add the `onnxruntime-c`, `onnxruntime-mobile-c`, `onnxruntime-objc`, -or `onnxruntime-mobile-objc` pod, depending on whether you want to use a full or mobile package and which API you want -to use. +In your CocoaPods `Podfile`, add the `onnxruntime-c` or `onnxruntime-objc` pod, depending on which API you want to use. #### C/C++ ```ruby use_frameworks! 
- # choose one of the two below: - pod 'onnxruntime-c' # full package - #pod 'onnxruntime-mobile-c' # mobile package + pod 'onnxruntime-c' ``` #### Objective-C @@ -209,9 +195,7 @@ to use. ```ruby use_frameworks! - # choose one of the two below: - pod 'onnxruntime-objc' # full package - #pod 'onnxruntime-mobile-objc' # mobile package + pod 'onnxruntime-objc' ``` Run `pod install`. @@ -238,19 +222,14 @@ In your Android Studio Project, make the following changes to: ```gradle dependencies { - // choose one of the two below: - implementation 'com.microsoft.onnxruntime:onnxruntime-android:latest.release' // full package - //implementation 'com.microsoft.onnxruntime:onnxruntime-mobile:latest.release' // mobile package + implementation 'com.microsoft.onnxruntime:onnxruntime-android:latest.release' } ``` #### C/C++ -Download the [onnxruntime-android](https://mvnrepository.com/artifact/com.microsoft.onnxruntime/onnxruntime-android) ( -full package) or [onnxruntime-mobile](https://mvnrepository.com/artifact/com.microsoft.onnxruntime/onnxruntime-mobile) ( -mobile package) AAR hosted at MavenCentral, change the file extension from `.aar` to `.zip`, and unzip it. Include the -header files from the `headers` folder, and the relevant `libonnxruntime.so` dynamic library from the `jni` folder in -your NDK project. +Download the [onnxruntime-android](https://mvnrepository.com/artifact/com.microsoft.onnxruntime/onnxruntime-android) AAR hosted at MavenCentral, change the file extension from `.aar` to `.zip`, and unzip it. +Include the header files from the `headers` folder, and the relevant `libonnxruntime.so` dynamic library from the `jni` folder in your NDK project. 
#### Custom build @@ -438,9 +417,9 @@ below: | WinML | [**Microsoft.AI.MachineLearning**](https://www.nuget.org/packages/Microsoft.AI.MachineLearning) | [ort-nightly (dev)](https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT-Nightly/NuGet/Microsoft.AI.MachineLearning/overview) | [View](https://docs.microsoft.com/en-us/windows/ai/windows-ml/port-app-to-nuget#prerequisites) | | Java | CPU: [**com.microsoft.onnxruntime:onnxruntime**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime) | | [View](../api/java) | | | GPU (CUDA/TensorRT): [**com.microsoft.onnxruntime:onnxruntime_gpu**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime_gpu) | | [View](../api/java) | -| Android | [**com.microsoft.onnxruntime:onnxruntime-mobile**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime-mobile) | | [View](../install/index.md#install-on-ios) | -| iOS (C/C++) | CocoaPods: **onnxruntime-mobile-c** | | [View](../install/index.md#install-on-ios) | -| Objective-C | CocoaPods: **onnxruntime-mobile-objc** | | [View](../install/index.md#install-on-ios) | +| Android | [**com.microsoft.onnxruntime:onnxruntime-android**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime-android) | | [View](../install/index.md#install-on-android) | +| iOS (C/C++) | CocoaPods: **onnxruntime-c** | | [View](../install/index.md#install-on-ios) | +| Objective-C | CocoaPods: **onnxruntime-objc** | | [View](../install/index.md#install-on-ios) | | React Native | [**onnxruntime-react-native** (latest)](https://www.npmjs.com/package/onnxruntime-react-native) | [onnxruntime-react-native (dev)](https://www.npmjs.com/package/onnxruntime-react-native?activeTab=versions) | [View](../api/js) | | Node.js | [**onnxruntime-node** (latest)](https://www.npmjs.com/package/onnxruntime-node) | [onnxruntime-node (dev)](https://www.npmjs.com/package/onnxruntime-node?activeTab=versions) | [View](../api/js) | | Web | [**onnxruntime-web** 
(latest)](https://www.npmjs.com/package/onnxruntime-web) | [onnxruntime-web (dev)](https://www.npmjs.com/package/onnxruntime-web?activeTab=versions) | [View](../api/js) | diff --git a/docs/tutorials/mobile/deploy-ios.md b/docs/tutorials/mobile/deploy-ios.md index 6667c138d49ca..b60eaa63550b2 100644 --- a/docs/tutorials/mobile/deploy-ios.md +++ b/docs/tutorials/mobile/deploy-ios.md @@ -88,7 +88,7 @@ Here's an example screenshot of the app: pod install ``` - The `Podfile` contains the `onnxruntime-mobile-objc` dependency, which is the pod containing the Objective C API. + The `Podfile` contains the `onnxruntime-objc` dependency, which is the pod containing the Objective C API. At the end of this step, you should see a file called `ORTObjectDetection.xcworkspace` in the `mobile/examples/object_detection/ios` directory. diff --git a/docs/tutorials/mobile/index.md b/docs/tutorials/mobile/index.md index c87e68bf69d5d..b4c664406345b 100644 --- a/docs/tutorials/mobile/index.md +++ b/docs/tutorials/mobile/index.md @@ -55,7 +55,7 @@ Accelerators are called Execution Providers in ONNX Runtime. If the model is quantized, start with the CPU Execution Provider. If the model is not quantized start with XNNPACK. These are the simplest and most consistent as everything is running on CPU. -If CPU/XNNPACK do not meet the application's performance results, then try NNAPI/CoreML. Performance with these execution providers is device and model specific. If the model is broken into multiple partitions due to the model using operators that ONNX Runtime, NNAPI/CoreML. or the device doesn't support (e.g. older NNAPI versions), performance may degrade. +If CPU/XNNPACK do not meet the application's performance results, then try NNAPI/CoreML. Performance with these execution providers is device and model specific. 
If the model is split into multiple partitions because it uses operators that the execution provider doesn't support (e.g., due to older NNAPI versions), performance may degrade.
- - The process to build a [custom runtime](../../build/custom.md) uses the same build scripts as standard ONNX Runtime, with some extra parameters. - -To give an idea of the binary size difference between the full packages and the mobile optimized packages: - -[ONNX Runtime 1.13.1 Android](https://central.sonatype.com/namespace/com.microsoft.onnxruntime) library file size - -|Architecture|Package|Size| -|-|-|-| -|arm64|onnxruntime-android|12.2 MB| -||onnxruntime-mobile|3.2 MB| -|arm32|onnxruntime-android|8.4 MB| -||onnxruntime-mobile|2.3 MB| -||custom (MobileNet)|_Coming soon_| - -The iOS package is a static framework and so the library package size is not a good indication of the actual contribution to the application binary size. The above sizes for Android are good estimates for iOS. +TODO compare full package vs custom build binary size From ca4d2a5e4651e7ad72e8abf828741ae9d564fad9 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Fri, 2 Aug 2024 16:59:01 -0700 Subject: [PATCH 03/11] update table.svelte --- src/routes/getting-started/table.svelte | 28 ++++++++++++------------- 1 file changed, 14 insertions(+), 14 deletions(-) diff --git a/src/routes/getting-started/table.svelte b/src/routes/getting-started/table.svelte index 3e54164299012..b47ec1fa21fc6 100644 --- a/src/routes/getting-started/table.svelte +++ b/src/routes/getting-started/table.svelte @@ -409,7 +409,7 @@ "Follow build instructions from here", 'android,Java,ARM64,NNAPI': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,X86,NNAPI': "Follow build instructions from here", @@ -421,7 +421,7 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'android,Java,X64,NNAPI': - "Add a 
dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,X64,NNAPI': "Follow build instructions from here", @@ -433,7 +433,7 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'android,Java,X86,NNAPI': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,ARM32,NNAPI': "Follow build instructions from here", @@ -445,7 +445,7 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'android,Java,ARM32,NNAPI': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,ARM64,DefaultCPU': "Follow build instructions from here", @@ -454,7 +454,7 @@ "Follow build instructions from here", 'android,Java,ARM64,DefaultCPU': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,ARM32,DefaultCPU': "Follow build instructions from here", @@ -466,7 +466,7 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'android,Java,ARM32,DefaultCPU': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or 
com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,X86,DefaultCPU': "Follow build instructions from here", @@ -478,7 +478,7 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'android,Java,X86,DefaultCPU': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C-API,X64,DefaultCPU': "Follow build instructions from here", @@ -490,7 +490,7 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'android,Java,X64,DefaultCPU': - "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android or com.microsoft.onnxruntime:onnxruntime-mobile using Maven/Gradle and refer to the mobile deployment guide", + "Add a dependency on com.microsoft.onnxruntime:onnxruntime-android using Maven/Gradle and refer to the mobile deployment guide", 'android,C#,ARM64,DefaultCPU': "Install Nuget package Microsoft.ML.OnnxRuntime.", @@ -499,22 +499,22 @@ "Install Nuget package Microsoft.ML.OnnxRuntime.", 'ios,C-API,ARM64,DefaultCPU': - "Add 'onnxruntime-c' or 'onnxruntime-mobile-c' using CocoaPods and refer to the mobile deployment guide", + "Add 'onnxruntime-c' using CocoaPods and refer to the mobile deployment guide", 'ios,C++,ARM64,DefaultCPU': - "Add 'onnxruntime-c' or 'onnxruntime-mobile-c' using CocoaPods and refer to the mobile deployment guide", + "Add 'onnxruntime-c' using CocoaPods and refer to the mobile deployment guide", 'ios,C-API,ARM64,CoreML': - "Add 'onnxruntime-c' or 'onnxruntime-mobile-c' using CocoaPods and refer to the mobile deployment guide", + "Add 'onnxruntime-c' using CocoaPods and refer to the mobile 
deployment guide", 'ios,C++,ARM64,CoreML': - "Add 'onnxruntime-c' or 'onnxruntime-mobile-c' using CocoaPods and refer to the mobile deployment guide", + "Add 'onnxruntime-c' using CocoaPods and refer to the mobile deployment guide", 'ios,objectivec,ARM64,DefaultCPU': - "Add 'onnxruntime-objc' or 'onnxruntime-mobile-objc' using CocoaPods and refer to the mobile deployment guide", + "Add 'onnxruntime-objc' using CocoaPods and refer to the mobile deployment guide", 'ios,objectivec,ARM64,CoreML': - "Add 'onnxruntime-objc' or 'onnxruntime-mobile-objc' using CocoaPods and refer to the mobile deployment guide", + "Add 'onnxruntime-objc' using CocoaPods and refer to the mobile deployment guide", 'ios,C#,ARM64,DefaultCPU': "Install Nuget package Microsoft.ML.OnnxRuntime.", From 64a502dec0ba2ef8079c762a8f3bd953b01f9c97 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Mon, 5 Aug 2024 13:28:27 -0700 Subject: [PATCH 04/11] update docs for mobile package ops --- docs/reference/operators/ContribOperators.md | 2 +- docs/reference/operators/MobileOps.md | 7 +++++-- docs/reference/operators/add-custom-op.md | 2 +- docs/reference/operators/reduced-operator-config-file.md | 2 +- 4 files changed, 8 insertions(+), 5 deletions(-) diff --git a/docs/reference/operators/ContribOperators.md b/docs/reference/operators/ContribOperators.md index cd4ed0dc23e74..2d61940495c43 100644 --- a/docs/reference/operators/ContribOperators.md +++ b/docs/reference/operators/ContribOperators.md @@ -2,7 +2,7 @@ title: Contrib operators parent: Operators grand_parent: Reference -nav_order: 3 +nav_order: 2 --- # Contrib ops diff --git a/docs/reference/operators/MobileOps.md b/docs/reference/operators/MobileOps.md index 59258c7026e34..7096199b9e3e8 100644 --- a/docs/reference/operators/MobileOps.md +++ b/docs/reference/operators/MobileOps.md @@ -2,11 +2,14 @@ title: ORT Mobile operators parent: Operators grand_parent: Reference -nav_order: 2 +nav_exclude: true --- + 
+# **IMPORTANT: Starting from release 1.19, the ORT Mobile pre-built packages with reduced operator support are no longer available. Please use the packages with full operator support or create a custom build.**
| Release | Documentation | |---------|---------------| diff --git a/docs/reference/operators/add-custom-op.md b/docs/reference/operators/add-custom-op.md index b4b43b2324eb5..360df90360528 100644 --- a/docs/reference/operators/add-custom-op.md +++ b/docs/reference/operators/add-custom-op.md @@ -2,7 +2,7 @@ title: Custom operators parent: Operators grand_parent: Reference -nav_order: 4 +nav_order: 3 --- # Custom operators diff --git a/docs/reference/operators/reduced-operator-config-file.md b/docs/reference/operators/reduced-operator-config-file.md index 5bdf9d98a0e08..2bf6caf1f7810 100644 --- a/docs/reference/operators/reduced-operator-config-file.md +++ b/docs/reference/operators/reduced-operator-config-file.md @@ -3,7 +3,7 @@ title: Reduced operator config file description: Specification of the reduced operator config file, used to reduce the size of the ONNX Runtime parent: Operators grand_parent: Reference -nav_order: 5 +nav_order: 4 --- From ecedd53db6f49195bdf66b361e5f3c41f71dc91e Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Mon, 5 Aug 2024 16:23:00 -0700 Subject: [PATCH 05/11] update onnx model opset updater description --- docs/tutorials/mobile/helpers/index.md | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/docs/tutorials/mobile/helpers/index.md b/docs/tutorials/mobile/helpers/index.md index 0fa071f00f5d7..c97b155ddddc6 100644 --- a/docs/tutorials/mobile/helpers/index.md +++ b/docs/tutorials/mobile/helpers/index.md @@ -28,9 +28,8 @@ See [here](./model-usability-checker.md) for more details. ## ONNX model opset updater -The ORT Mobile pre-built package only supports the most recent ONNX opsets in order to minimize binary size. Most ONNX models can be updated to a newer ONNX opset using this tool. It is recommended to use the latest opset the pre-built package supports, which is currently opset 15. 
- -The ONNX opsets supported by the pre-built package are documented [here](../../../reference/operators/MobileOps.md). +If you are doing a custom build, you may want to update your target models to the same ONNX opset or opsets so that the custom build can support a smaller number of opsets in order to reduce binary size. +Most ONNX models can be updated to a newer ONNX opset using this tool. Usage: From b24a2094008eded28f217acdcf2484d78a380263 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Mon, 5 Aug 2024 19:55:44 -0700 Subject: [PATCH 06/11] update model usability checker example output --- .../mobile/helpers/model-usability-checker.md | 92 +++++++++---------- 1 file changed, 45 insertions(+), 47 deletions(-) diff --git a/docs/tutorials/mobile/helpers/model-usability-checker.md b/docs/tutorials/mobile/helpers/model-usability-checker.md index 8ff83d777d29d..24e9d00c91476 100644 --- a/docs/tutorials/mobile/helpers/model-usability-checker.md +++ b/docs/tutorials/mobile/helpers/model-usability-checker.md @@ -21,19 +21,16 @@ The model usability checker analyzes an ONNX model regarding its suitability for ``` python -m onnxruntime.tools.check_onnx_model_mobile_usability --help -usage: check_onnx_model_mobile_usability.py [-h] [--config_path CONFIG_PATH] [--log_level {debug,info,warning,error}] model_path +usage: check_onnx_model_mobile_usability.py [-h] [--log_level {debug,info}] model_path -Analyze an ONNX model to determine how well it will work in mobile scenarios, and whether it is likely to be able to use the pre-built ONNX Runtime Mobile Android or iOS package. +Analyze an ONNX model to determine how well it will work in mobile scenarios. positional arguments: model_path Path to ONNX model to check optional arguments: -h, --help show this help message and exit - --config_path CONFIG_PATH - Path to required operators and types configuration used to build the pre-built ORT mobile package. 
(default: - \tools\mobile_helpers\mobile_package.required_operators.config) - --log_level {debug,info,warning,error} + --log_level {debug,info} Logging level (default: info) ``` @@ -43,12 +40,19 @@ The script will check if the operators in the model are supported by ORT's NNAPI Example output from this check looks like: ``` -INFO: Checking mobilenet_v1_1.0_224_quant.onnx for usability with ORT Mobile. +INFO: Checking resnet50-v1-7.onnx for usability with ORT Mobile. INFO: Checking NNAPI +INFO: 1 partitions with a total of 121/122 nodes can be handled by the NNAPI EP. +INFO: Partition sizes: [121] +INFO: Unsupported nodes due to operator=0 +INFO: Caveats that have not been checked and may result in a node not actually being supported: + ai.onnx:Conv:Only 2D Conv is supported. Weights and bias should be constant. + ai.onnx:Gemm:If input B is not constant, transB should be 1. + ai.onnx:GlobalAveragePool:Only 2D Pool is supported. + ai.onnx:MaxPool:Only 2D Pool is supported. +INFO: Unsupported nodes due to input having a dynamic shape=1 +INFO: NNAPI should work well for this model as there is one partition covering 99.2% of the nodes in the model. INFO: Model should perform well with NNAPI as is: YES -INFO: Checking CoreML -INFO: Model should perform well with CoreML as is: NO -INFO: Re-run with log level of DEBUG for more details on the NNAPI/CoreML issues. ``` If the model has dynamic input shapes an additional check is made to estimate whether making the shapes of fixed size would help. See [onnxruntime.tools.make_dynamic_shape_fixed](./make-dynamic-shape-fixed.md) for more information. @@ -56,56 +60,50 @@ If the model has dynamic input shapes an additional check is made to estimate wh Example output from this check: ``` -INFO: Checking abs_free_dimensions.onnx for usability with ORT Mobile. -INFO: Checking NNAPI -INFO: Model should perform well with NNAPI as is: NO +INFO: Checking resnet50-v1-7.onnx for usability with ORT Mobile. +... 
+INFO: Checking CoreML MLProgram +INFO: 2 partitions with a total of 120/122 nodes can be handled by the CoreML MLProgram EP. +INFO: Partition sizes: [119, 1] +INFO: Unsupported nodes due to operator=1 +INFO: Unsupported ops: ai.onnx:Flatten +INFO: Caveats that have not been checked and may result in a node not actually being supported: + ai.onnx:Conv:Only 1D/2D Conv is supported. Bias if provided must be constant. + ai.onnx:Gemm:Input B must be constant. + ai.onnx:GlobalAveragePool:Only 2D Pool is supported currently. 3D and 5D support can be added if needed. + ai.onnx:MaxPool:Only 2D Pool is supported currently. 3D and 5D support can be added if needed. +INFO: Unsupported nodes due to input having a dynamic shape=1 +INFO: CoreML MLProgram can be considered for this model as there are two partitions covering 98.4% of the nodes. Performance testing is required to validate. +INFO: Model should perform well with CoreML MLProgram as is: MAYBE +INFO: -------- INFO: Checking if model will perform better if the dynamic shapes are fixed... -INFO: Model should perform well with NNAPI if modified to have fixed input shapes: YES +INFO: Partition information if the model was updated to make the shapes fixed: +INFO: 2 partitions with a total of 121/122 nodes can be handled by the CoreML MLProgram EP. +INFO: Partition sizes: [120, 1] +INFO: Unsupported nodes due to operator=1 +INFO: Unsupported ops: ai.onnx:Flatten +INFO: Caveats that have not been checked and may result in a node not actually being supported: + ai.onnx:Conv:Only 1D/2D Conv is supported. Bias if provided must be constant. + ai.onnx:Gemm:Input B must be constant. + ai.onnx:GlobalAveragePool:Only 2D Pool is supported currently. 3D and 5D support can be added if needed. + ai.onnx:MaxPool:Only 2D Pool is supported currently. 3D and 5D support can be added if needed. +INFO: CoreML MLProgram can be considered for this model as there are two partitions covering 99.2% of the nodes. 
Performance testing is required to validate. +INFO: Model should perform well with CoreML MLProgram if modified to have fixed input shapes: MAYBE INFO: Shapes can be altered using python -m onnxruntime.tools.make_dynamic_shape_fixed ``` -Setting the log level to `debug` will result in significant amounts of diagnostic output that provides in-depth information on why the recommendations were made. +There is diagnostic output that provides in-depth information on why the recommendations were made. This includes - information on individual operators that are supported or unsupported by the NNAPI and CoreML EPs - information on how many groups (a.k.a. partitions) the supported operators are broken into - the more groups the worse performance will be as we have to switch between the NPU (Neural Processing Unit) and CPU each time we switch between a supported and unsupported group of nodes -## Use with ORT Mobile Pre-Built package - -The ONNX opset and operators used in the model are checked to determine if they are supported by the ORT Mobile pre-built package. - -Example output if the model can be used as-is: -``` -INFO: Checking if pre-built ORT Mobile package can be used with mobilenet_v1_1.0_224_quant.onnx once model is - converted from ONNX to ORT format using onnxruntime.tools.convert_onnx_models_to_ort... -INFO: Model should work with the pre-built package. -``` - -If the model uses an old ONNX opset, information will be provided on how to update it. -See [onnxruntime.tools.update_onnx_opset](./index.md#onnx-model-opset-updater) for more information. - -Example output: -``` -INFO: Checking if pre-built ORT Mobile package can be used with abs_free_dimensions.onnx once model is converted - from ONNX to ORT format using onnxruntime.tools.convert_onnx_models_to_ort... -INFO: Model uses ONNX opset 9. -INFO: The pre-built package only supports ONNX opsets [12, 13, 14, 15]. 
-INFO: Please try updating the ONNX model opset to a supported version using - python -m onnxruntime.tools.onnx_model_utils.update_onnx_opset ... -``` - ## Recommendation -Finally the script will provide information on how to [convert the model to the ORT format](../../../../docs/performance/model-optimizations/ort-format-models.md) required by ORT Mobile, and recommend which of the two ORT format models to use. +Finally, the script will provide a recommendation on what EP to use. ``` -INFO: Run `python -m onnxruntime.tools.convert_onnx_models_to_ort ...` to convert the ONNX model to ORT format. - By default, the conversion tool will create an ORT format model with saved optimizations which can potentially be - applied at runtime (with a .with_runtime_opt.ort file extension) for use with NNAPI or CoreML, and a fully - optimized ORT format model (with a .ort file extension) for use with the CPU EP. -INFO: As NNAPI or CoreML may provide benefits with this model it is recommended to compare the performance of - the .with_runtime_opt.ort model using the NNAPI EP on Android, and the CoreML EP on iOS, against the - performance of the .ort model using the CPU EP. +INFO: As NNAPI or CoreML may provide benefits with this model it is recommended to compare the performance of the model using the NNAPI EP on Android, and the CoreML EP on iOS, against the performance using the CPU EP. 
``` From 6900c16ab9b16b2881f68613d1ddef34f9bac716 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Tue, 6 Aug 2024 11:13:28 -0700 Subject: [PATCH 07/11] add aar binary size measurements --- docs/tutorials/mobile/index.md | 10 +++++++++- 1 file changed, 9 insertions(+), 1 deletion(-) diff --git a/docs/tutorials/mobile/index.md b/docs/tutorials/mobile/index.md index b4c664406345b..6889d62251374 100644 --- a/docs/tutorials/mobile/index.md +++ b/docs/tutorials/mobile/index.md @@ -86,4 +86,12 @@ Refer to the process to build a [custom runtime](../../build/custom.md). One of the outputs of the ORT format conversion is a build configuration file, containing a list of operators from your model(s) and their types. You can use this configuration file as input to the custom runtime binary build. -TODO compare full package vs custom build binary size +To give an idea of the binary size difference between the pre-built package and a custom build: + +File|Pre-built package size (bytes)|Custom build size (bytes) +-|-|- +`jni/arm64-v8a/libonnxruntime.so`, uncompressed|16276832|4079536 +`jni/x86_64/libonnxruntime.so`, uncompressed|18222208|4464568 +AAR|24415212|8234421 + +This custom build supports the operators needed to run a ResNet50 model. It also has limited framework support (built with `--minimal_build=extended`), only supporting ORT format models. It has support for the NNAPI and XNNPACK execution providers. 
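A note on the partition statistics the usability checker reports in the example output earlier in this series: the coverage percentage is simply the ratio of nodes the EP can handle to total nodes in the model. A minimal sketch using the ResNet50 figures from that output (the helper name here is ours, not part of `onnxruntime.tools`):

```python
def ep_coverage(handled_nodes: int, total_nodes: int) -> float:
    """Percentage of model nodes covered by an execution provider's partitions."""
    return 100.0 * handled_nodes / total_nodes

# Figures from the example checker output: ResNet50 has 122 nodes in total.
print(f"NNAPI:  {ep_coverage(121, 122):.1f}%")   # one partition of 121 nodes -> 99.2%
print(f"CoreML: {ep_coverage(120, 122):.1f}%")   # two partitions (119 + 1 nodes) -> 98.4%
```

Fewer, larger partitions are better: each boundary between a supported and unsupported group forces a switch between the NPU and CPU.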
From a8831c2f5dccbcedf3bc732cc847a7ffbe8cfc25 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Tue, 6 Aug 2024 12:39:29 -0700 Subject: [PATCH 08/11] update measurements with build using more similar settings --- docs/tutorials/mobile/index.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/tutorials/mobile/index.md b/docs/tutorials/mobile/index.md index 6889d62251374..bb64c8c25fc2c 100644 --- a/docs/tutorials/mobile/index.md +++ b/docs/tutorials/mobile/index.md @@ -90,8 +90,8 @@ To give an idea of the binary size difference between the pre-built package and File|Pre-built package size (bytes)|Custom build size (bytes) -|-|- -`jni/arm64-v8a/libonnxruntime.so`, uncompressed|16276832|4079536 -`jni/x86_64/libonnxruntime.so`, uncompressed|18222208|4464568 -AAR|24415212|8234421 +AAR|24415212|7532309 +`jni/arm64-v8a/libonnxruntime.so`, uncompressed|16276832|3962832 +`jni/x86_64/libonnxruntime.so`, uncompressed|18222208|4240864 This custom build supports the operators needed to run a ResNet50 model. It also has limited framework support (built with `--minimal_build=extended`), only supporting ORT format models. It has support for the NNAPI and XNNPACK execution providers. 
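To put the revised measurements above in perspective, the relative savings follow directly from the table; a quick sketch using the numbers as given (roughly a 69% smaller AAR and ~76% smaller native libraries):

```python
# Sizes in bytes, copied from the table above (pre-built package vs. ResNet50 custom build).
sizes = {
    "AAR": (24415212, 7532309),
    "jni/arm64-v8a/libonnxruntime.so": (16276832, 3962832),
    "jni/x86_64/libonnxruntime.so": (18222208, 4240864),
}

for name, (prebuilt, custom) in sizes.items():
    saving = 100.0 * (1 - custom / prebuilt)
    print(f"{name}: {saving:.1f}% smaller")
```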
From 0c77a29c1784a13531b4b80baec4d46bb417c59a Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Thu, 8 Aug 2024 18:39:10 -0700 Subject: [PATCH 09/11] Update docs/install/index.md --- docs/install/index.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/install/index.md b/docs/install/index.md index 303c551f667c6..cda3f74556af0 100644 --- a/docs/install/index.md +++ b/docs/install/index.md @@ -417,9 +417,9 @@ below: | WinML | [**Microsoft.AI.MachineLearning**](https://www.nuget.org/packages/Microsoft.AI.MachineLearning) | [ort-nightly (dev)](https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT-Nightly/NuGet/Microsoft.AI.MachineLearning/overview) | [View](https://docs.microsoft.com/en-us/windows/ai/windows-ml/port-app-to-nuget#prerequisites) | | Java | CPU: [**com.microsoft.onnxruntime:onnxruntime**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime) | | [View](../api/java) | | | GPU (CUDA/TensorRT): [**com.microsoft.onnxruntime:onnxruntime_gpu**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime_gpu) | | [View](../api/java) | -| Android | [**com.microsoft.onnxruntime:onnxruntime-android**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime-android) | | [View](../install/index.md#install-on-android) | -| iOS (C/C++) | CocoaPods: **onnxruntime-c** | | [View](../install/index.md#install-on-ios) | -| Objective-C | CocoaPods: **onnxruntime-objc** | | [View](../install/index.md#install-on-ios) | +| Android | [**com.microsoft.onnxruntime:onnxruntime-android**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime-android) | | [View](../install/index.md#install-on-android) | +| iOS (C/C++) | CocoaPods: **onnxruntime-c** | | [View](../install/index.md#install-on-ios) | +| Objective-C | CocoaPods: **onnxruntime-objc** | | [View](../install/index.md#install-on-ios) | | React Native | 
[**onnxruntime-react-native** (latest)](https://www.npmjs.com/package/onnxruntime-react-native) | [onnxruntime-react-native (dev)](https://www.npmjs.com/package/onnxruntime-react-native?activeTab=versions) | [View](../api/js) | | Node.js | [**onnxruntime-node** (latest)](https://www.npmjs.com/package/onnxruntime-node) | [onnxruntime-node (dev)](https://www.npmjs.com/package/onnxruntime-node?activeTab=versions) | [View](../api/js) | | Web | [**onnxruntime-web** (latest)](https://www.npmjs.com/package/onnxruntime-web) | [onnxruntime-web (dev)](https://www.npmjs.com/package/onnxruntime-web?activeTab=versions) | [View](../api/js) | From 2d78756ef059cac1f833de342a14ca1631480674 Mon Sep 17 00:00:00 2001 From: Edward Chen <18449977+edgchen1@users.noreply.github.com> Date: Thu, 8 Aug 2024 18:40:54 -0700 Subject: [PATCH 10/11] Update docs/install/index.md --- docs/install/index.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/install/index.md b/docs/install/index.md index cda3f74556af0..d9e14b1609697 100644 --- a/docs/install/index.md +++ b/docs/install/index.md @@ -417,7 +417,7 @@ below: | WinML | [**Microsoft.AI.MachineLearning**](https://www.nuget.org/packages/Microsoft.AI.MachineLearning) | [ort-nightly (dev)](https://aiinfra.visualstudio.com/PublicPackages/_artifacts/feed/ORT-Nightly/NuGet/Microsoft.AI.MachineLearning/overview) | [View](https://docs.microsoft.com/en-us/windows/ai/windows-ml/port-app-to-nuget#prerequisites) | | Java | CPU: [**com.microsoft.onnxruntime:onnxruntime**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime) | | [View](../api/java) | | | GPU (CUDA/TensorRT): [**com.microsoft.onnxruntime:onnxruntime_gpu**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime_gpu) | | [View](../api/java) | -| Android | [**com.microsoft.onnxruntime:onnxruntime-android**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime-android) | | [View](../install/index.md#install-on-android) 
|
+| Android | [**com.microsoft.onnxruntime:onnxruntime-android**](https://search.maven.org/artifact/com.microsoft.onnxruntime/onnxruntime-android) | | [View](../install/index.md#install-on-android) |
 | iOS (C/C++) | CocoaPods: **onnxruntime-c** | | [View](../install/index.md#install-on-ios) |
 | Objective-C | CocoaPods: **onnxruntime-objc** | | [View](../install/index.md#install-on-ios) |
 | React Native | [**onnxruntime-react-native** (latest)](https://www.npmjs.com/package/onnxruntime-react-native) | [onnxruntime-react-native (dev)](https://www.npmjs.com/package/onnxruntime-react-native?activeTab=versions) | [View](../api/js) |

From 514307c48155da22969f5fb651a1d4b5f7d481dd Mon Sep 17 00:00:00 2001
From: Edward Chen <18449977+edgchen1@users.noreply.github.com>
Date: Thu, 8 Aug 2024 19:20:02 -0700
Subject: [PATCH 11/11] address comments

---
 docs/build/android.md          | 13 +++++++------
 docs/tutorials/mobile/index.md |  4 ++--
 2 files changed, 9 insertions(+), 8 deletions(-)

diff --git a/docs/build/android.md b/docs/build/android.md
index 427035deb1037..70a6a21b29d15 100644
--- a/docs/build/android.md
+++ b/docs/build/android.md
@@ -23,6 +23,8 @@ The SDK and NDK packages can be installed via Android Studio or the sdkmanager c
 Android Studio is more convenient but a larger installation.
 The command line tools are smaller and usage can be scripted, but are a little more complicated to setup. They also require a Java runtime environment to be available.
 
+Generally, you'll want to use the latest stable NDK version. We'll refer to the version that you use as `<NDK version>` from here on.
+
 Resources:
 
 * [API levels](https://developer.android.com/guide/topics/manifest/uses-sdk-element.html)
@@ -36,7 +38,7 @@ Resources:
 
 2. Install any additional SDK Platforms if necessary
    * File->Settings->Appearance & Behavior->System Settings->Android SDK to see what is currently installed
-   * Note that the SDK path you need to use as --android_sdk_path when building ORT is also on this configuration page
+   * Note that the SDK path you need to use as `--android_sdk_path` when building ORT is also on this configuration page
    * Most likely you don't require additional SDK Platform packages as the latest platform can target earlier API levels.
 
 3. Install an NDK version
@@ -44,8 +46,7 @@ Resources:
    * File->Settings->Appearance & Behavior->System Settings->Android SDK
      * 'SDK Tools' tab
        * Select 'Show package details' checkbox at the bottom to see specific versions. By default the latest will be installed which should be fine.
-   * The NDK path will be the 'ndk/{version}' subdirectory of the SDK path shown
-     * e.g. if 21.1.6352462 is installed it will be {SDK path}/ndk/21.1.6352462
+   * The NDK path will be the `ndk/<NDK version>` subdirectory of the SDK path shown
 
 ### sdkmanager from command line tools
 
@@ -81,8 +82,8 @@ Resources:
 * Install the NDK
   * Find the available NDK versions by running `sdkmanager --list`
   * Install
-    * install the desired version, e.g., `sdkmanager --install "ndk;21.1.6352462"`
-    * NDK path in our example with this install would be `.../Android/ndk/21.1.6352462`
+    * install the desired version, e.g., `sdkmanager --install "ndk;<NDK version>"`
+    * NDK path in our example with this install would be `.../Android/ndk/<NDK version>`
 
 ## Android Build Instructions
 
@@ -97,7 +98,7 @@ The [Ninja](https://ninja-build.org/) generator needs to be used to build on Win
 e.g. using the paths from our example
 
 ```
-./build.bat --android --android_sdk_path .../Android --android_ndk_path .../Android/ndk/21.1.6352462 --android_abi arm64-v8a --android_api 27 --cmake_generator Ninja
+./build.bat --android --android_sdk_path .../Android --android_ndk_path .../Android/ndk/<NDK version> --android_abi arm64-v8a --android_api 27 --cmake_generator Ninja
 ```
 
 ### Cross compiling on Linux and macOS

diff --git a/docs/tutorials/mobile/index.md b/docs/tutorials/mobile/index.md
index bb64c8c25fc2c..852320521d4c5 100644
--- a/docs/tutorials/mobile/index.md
+++ b/docs/tutorials/mobile/index.md
@@ -88,10 +88,10 @@ One of the outputs of the ORT format conversion is a build configuration file, c
 
 To give an idea of the binary size difference between the pre-built package and a custom build:
 
-File|Pre-built package size (bytes)|Custom build size (bytes)
+File|1.18.0 pre-built package size (bytes)|1.18.0 custom build size (bytes)
 -|-|-
 AAR|24415212|7532309
 `jni/arm64-v8a/libonnxruntime.so`, uncompressed|16276832|3962832
 `jni/x86_64/libonnxruntime.so`, uncompressed|18222208|4240864
 
-This custom build supports the operators needed to run a ResNet50 model. It also has limited framework support (built with `--minimal_build=extended`), only supporting ORT format models. It has support for the NNAPI and XNNPACK execution providers.
+This custom build supports the operators needed to run a ResNet50 model. It requires the use of ORT format models (as it was built with `--minimal_build=extended`). It has support for the NNAPI and XNNPACK execution providers.
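To make the path conventions in the final android.md revision concrete, here is a small sketch that assembles the build invocation from an SDK path and NDK version. The SDK path is a placeholder, and the NDK version is the example one used earlier in these docs; substitute your own values:

```python
# Placeholder SDK location; substitute the path you pass as --android_sdk_path.
sdk_path = "/home/me/Android"
# Example NDK version from these docs; use whatever `sdkmanager --list` reports as installed.
ndk_version = "21.1.6352462"

# Per the docs, the NDK lives in the 'ndk/<NDK version>' subdirectory of the SDK path.
ndk_path = f"{sdk_path}/ndk/{ndk_version}"

# The Android build invocation from docs/build/android.md (build.bat on Windows).
build_cmd = [
    "./build.bat", "--android",
    "--android_sdk_path", sdk_path,
    "--android_ndk_path", ndk_path,
    "--android_abi", "arm64-v8a",
    "--android_api", "27",
    "--cmake_generator", "Ninja",
]
print(" ".join(build_cmd))
```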