OpenVINO-EP v4.0 Release PR with OpenVINO 2022.1 #11025

Merged: 101 commits, Apr 6, 2022

Commits
133fb01
Enabling ov-ep for 2022.1 Release
MaajidKhan Jan 17, 2022
948941f
Fix for output mismatch b/w OpenVINO and ONNX
MaajidKhan Jan 17, 2022
140c599
Enabling Adobe ops
MaajidKhan Jan 17, 2022
d5908a6
Removing irrelevant conditions
MaajidKhan Jan 17, 2022
72cc91b
Enable upsample op
MaajidKhan Jan 19, 2022
4b55164
Enable Adobe proxy-e model
MaajidKhan Jan 20, 2022
155765b
Removing any extra conditions for Opset13 ops
MaajidKhan Feb 2, 2022
8455317
Opset13 changes
MaajidKhan Jan 24, 2022
7ad2a02
Exception handling for devices
hdgx Jan 24, 2022
47116d2
Added comments
hdgx Jan 24, 2022
34a9c7a
Implement GPU Throttling feature
MaajidKhan Feb 2, 2022
dc24bad
Renaming the runtime config option
MaajidKhan Feb 9, 2022
117e3e9
Added the user to video and users group
mayavijx Dec 16, 2021
9cdbdf3
Handling_GPU.0_GPU.1
hdgx Feb 17, 2022
2c94fc3
Handling special conditions
MaajidKhan Feb 18, 2022
e4a455b
Modification to include new api 2.0 changes in the code
MaajidKhan Feb 23, 2022
a14b8c0
Added opset13 changes
MaajidKhan Feb 23, 2022
e264ca9
Enabling ov-ep for 2022.1 Release
MaajidKhan Jan 17, 2022
049d8c9
Fix for output mismatch b/w OpenVINO and ONNX
MaajidKhan Jan 17, 2022
e0cc70f
Enabling Adobe ops
MaajidKhan Jan 17, 2022
d0141be
Removing irrelevant conditions
MaajidKhan Jan 17, 2022
ec96986
Enable upsample op
MaajidKhan Jan 19, 2022
b9e87d6
Enable Adobe proxy-e model
MaajidKhan Jan 20, 2022
ca70463
Removing any extra conditions for Opset13 ops
MaajidKhan Feb 2, 2022
fde21af
Opset13 changes
MaajidKhan Jan 24, 2022
c9f7ea6
Exception handling for devices
hdgx Jan 24, 2022
d95f8df
Added comments
hdgx Jan 24, 2022
1f663ba
Implement GPU Throttling feature
MaajidKhan Feb 2, 2022
682c466
Renaming the runtime config option
MaajidKhan Feb 9, 2022
d5c3a16
Added the user to video and users group
mayavijx Dec 16, 2021
90c2307
Handling_GPU.0_GPU.1
hdgx Feb 17, 2022
dcebe0a
Handling special conditions
MaajidKhan Feb 18, 2022
94eae53
Added opset13 changes
MaajidKhan Feb 23, 2022
dfca9a5
Log comments updated
MaajidKhan Mar 2, 2022
cf8dbf3
Changes to enable 2.0 api
MaajidKhan Mar 2, 2022
9e69fbc
Merge remote-tracking branch 'origin/enable_2022.1_branch' into sahar…
MaajidKhan Mar 2, 2022
335ba34
Enabling ov-ep for 2022.1 Release
MaajidKhan Jan 17, 2022
1e95f5c
Fix for output mismatch b/w OpenVINO and ONNX
MaajidKhan Jan 17, 2022
0cdbe24
Enabling Adobe ops
MaajidKhan Jan 17, 2022
6c1cbf3
Removing irrelevant conditions
MaajidKhan Jan 17, 2022
fff2c74
Enable upsample op
MaajidKhan Jan 19, 2022
01f229c
Enable Adobe proxy-e model
MaajidKhan Jan 20, 2022
2e1cbe5
Removing any extra conditions for Opset13 ops
MaajidKhan Feb 2, 2022
b819b34
Opset13 changes
MaajidKhan Jan 24, 2022
c4c0f39
Exception handling for devices
hdgx Jan 24, 2022
23b51c8
Added comments
hdgx Jan 24, 2022
59cf128
Implement GPU Throttling feature
MaajidKhan Feb 2, 2022
4eb5a08
Renaming the runtime config option
MaajidKhan Feb 9, 2022
2e87c12
Added the user to video and users group
mayavijx Dec 16, 2021
e2e88e3
Handling_GPU.0_GPU.1
hdgx Feb 17, 2022
067e3dc
Handling special conditions
MaajidKhan Feb 18, 2022
d83517f
Added opset13 changes
MaajidKhan Feb 23, 2022
7a9b39b
Merge remote-tracking branch 'origin/enable_2022.1_branch' into sahar…
MaajidKhan Mar 2, 2022
30696d0
Merge branch 'enable_2022.1_branch' of https://github.com/intel/onnxr…
MaajidKhan Mar 2, 2022
ebdcb21
Merge branch 'enable_2022.1_branch' into sahar/2.0_api
MaajidKhan Mar 2, 2022
1ca62ca
Merge pull request #141 from intel/sahar/2.0_api
sfatimar Mar 2, 2022
9841ba9
Fix build issue
MaajidKhan Mar 2, 2022
e67d02b
Merge remote-tracking branch 'msft/master' into enable_2022.1_mar2
MaajidKhan Mar 3, 2022
9578cef
Fixes issues
MaajidKhan Mar 3, 2022
a231d35
commit to make openvino_2021.4 compatible
MaajidKhan Mar 3, 2022
2ea9e49
Fixed IO Buffer Optimization
MaajidKhan Mar 4, 2022
adb69e4
Fix output names issue
MaajidKhan Mar 4, 2022
035d11c
Merge branch 'master' into enable_2022.1_branch
MaajidKhan Mar 4, 2022
9aeaae4
Fix 2021.3 branch
MaajidKhan Mar 9, 2022
d30e522
Merge pull request #146 from intel/sahar/fix_2021.3
mohsinmx Mar 9, 2022
e6ef10b
Bug Fix for Multiple inputs/outputs
MaajidKhan Mar 12, 2022
05fdbab
Add comments for the changes made
MaajidKhan Mar 14, 2022
28a8d08
Merge pull request #147 from intel/multiple_inps_outs_bug_fix
sfatimar Mar 14, 2022
7361b90
IO Buffer Changes
MaajidKhan Mar 16, 2022
be5de99
Merge branch 'enable_2022.1_branch' into sahar/io_buffer_20
MaajidKhan Mar 16, 2022
459aaec
Commit for Disabling GPU Throttling for 2021.4
MaajidKhan Mar 16, 2022
c25a2ec
Merge pull request #151 from intel/sahar/io_buffer_20
sfatimar Mar 16, 2022
3c21187
Updated branch
MaajidKhan Mar 16, 2022
20a492b
Merge branch 'enable_2022.1_branch' of https://github.com/intel/onnxr…
MaajidKhan Mar 16, 2022
0c6a86f
Fix windows build
MaajidKhan Mar 7, 2022
e10d979
Fixed CPP Unit tests for CPU
MaajidKhan Mar 7, 2022
d5ffcb4
Fixed first set of GPU Tests
MaajidKhan Mar 11, 2022
f250a84
Fixed additional failing tests on GPU
MaajidKhan Mar 14, 2022
7c763a0
Added Expand op support for CPU
MaajidKhan Mar 15, 2022
b341aee
Added condition for squeeze op
MaajidKhan Mar 16, 2022
3af3377
Add support for LessOrEqual op function
MaajidKhan Mar 17, 2022
1c41635
OV Interface wait for replaced by indefinite wait call
MaajidKhan Mar 18, 2022
72ebda5
Merge branch 'master' into enable_2022.1_branch
MaajidKhan Mar 18, 2022
d29123b
use names from ONNX model to access OV tensors
MaajidKhan Mar 20, 2022
8cd78ad
Fixes Myriad unit tests and other issues
MaajidKhan Mar 20, 2022
dc247de
Fix segfault issue
MaajidKhan Mar 21, 2022
eb54b2d
Fixed build issue with ov 2021.4 with I/O buffer
MaajidKhan Mar 22, 2022
8c00cb6
Disables performance counters for I/O Buffer
MaajidKhan Mar 22, 2022
4f47954
Fixed inputs/outputs mismatch for HDDL with 2022.1
amiraqx Mar 22, 2022
2e87293
Fix to enable GPU FP16
MaajidKhan Mar 23, 2022
63cfc77
Merge branch 'master' into enable_2022.1_branch
MaajidKhan Mar 24, 2022
01f67c4
Enabled mlperf_ssd_mobilenet_300 model fully on CPU
MaajidKhan Mar 24, 2022
4151eff
Added ov version specific dll packaging for nuget
mayavijx Mar 25, 2022
d9880cb
Fixed conditions for few ops
MaajidKhan Mar 28, 2022
08f9381
Dockerfile updates
mayavijx Mar 28, 2022
499a331
Merge branch 'master' into enable_2022.1_branch
MaajidKhan Mar 28, 2022
92b83b3
Updated License Info
MaajidKhan Mar 30, 2022
1fc1620
Merge branch 'master' into enable_2022.1_mar28
MaajidKhan Apr 5, 2022
3b35966
Disabling mlperf_ssd_mobilenet_300 model
MaajidKhan Apr 5, 2022
2561683
Disabling failing python CPU Tests
MaajidKhan Apr 5, 2022
e3e5958
Fixed flake8 python errors
MaajidKhan Apr 5, 2022

Files changed

7 changes: 5 additions & 2 deletions cmake/CMakeLists.txt
@@ -1553,9 +1553,12 @@ if (onnxruntime_USE_OPENVINO)
elseif (${VER} MATCHES "2021.4" OR $ENV{INTEL_OPENVINO_DIR} MATCHES "2021.4")
set(OPENVINO_VERSION "2021.4")
add_definitions(-DOPENVINO_2021_4=1)
elseif (${VER} MATCHES "2022.1" OR $ENV{INTEL_OPENVINO_DIR} MATCHES "2022.1")
set(OPENVINO_VERSION "2022.1")
add_definitions(-DOPENVINO_2022_1=1)
elseif ($ENV{INTEL_OPENVINO_DIR} MATCHES "openvino")
set(OPENVINO_VERSION "2021.4")
add_definitions(-DOPENVINO_2021_4=1)
set(OPENVINO_VERSION "2022.1")
add_definitions(-DOPENVINO_2022_1=1)
else()
message(FATAL_ERROR "Unsupported OpenVINO version: ${INTEL_OPENVINO_DIR}")
endif()
13 changes: 9 additions & 4 deletions cmake/onnxruntime_providers.cmake
@@ -749,16 +749,21 @@ if (onnxruntime_USE_OPENVINO)
# Header paths
find_package(InferenceEngine REQUIRED)
find_package(ngraph REQUIRED)


if (OPENVINO_2022_1)
find_package(OpenVINO REQUIRED COMPONENTS Runtime ONNX)
list (OV_20_LIBS openvino::frontend::onnx openvino::runtime)
endif()

if (WIN32)
unset(CMAKE_MAP_IMPORTED_CONFIG_RELWITHDEBINFO)
endif()

if ((DEFINED ENV{OPENCL_LIBS}) AND (DEFINED ENV{OPENCL_INCS}))
add_definitions(-DIO_BUFFER_ENABLED=1)
list(APPEND OPENVINO_LIB_LIST $ENV{OPENCL_LIBS} ${InferenceEngine_LIBRARIES} ${NGRAPH_LIBRARIES} ngraph::onnx_importer ${PYTHON_LIBRARIES})
list(APPEND OPENVINO_LIB_LIST $ENV{OPENCL_LIBS} ${OV_20_LIBS} ${InferenceEngine_LIBRARIES} ${NGRAPH_LIBRARIES} ngraph::onnx_importer ${PYTHON_LIBRARIES})
else()
list(APPEND OPENVINO_LIB_LIST ${InferenceEngine_LIBRARIES} ${NGRAPH_LIBRARIES} ngraph::onnx_importer ${PYTHON_LIBRARIES})
list(APPEND OPENVINO_LIB_LIST ${OV_20_LIBS} ${InferenceEngine_LIBRARIES} ${NGRAPH_LIBRARIES} ngraph::onnx_importer ${PYTHON_LIBRARIES})
endif()

source_group(TREE ${ONNXRUNTIME_ROOT}/core FILES ${onnxruntime_providers_openvino_cc_srcs})
@@ -771,7 +776,7 @@ if (onnxruntime_USE_OPENVINO)
target_compile_options(onnxruntime_providers_openvino PRIVATE "-Wno-parentheses")
endif()
add_dependencies(onnxruntime_providers_openvino onnxruntime_providers_shared ${onnxruntime_EXTERNAL_DEPENDENCIES})
target_include_directories(onnxruntime_providers_openvino SYSTEM PUBLIC ${ONNXRUNTIME_ROOT} ${CMAKE_CURRENT_BINARY_DIR} ${eigen_INCLUDE_DIRS} ${OPENVINO_INCLUDE_DIR_LIST} ${PYTHON_INCLUDE_DIRS} $ENV{OPENCL_INCS})
target_include_directories(onnxruntime_providers_openvino SYSTEM PUBLIC ${ONNXRUNTIME_ROOT} ${CMAKE_CURRENT_BINARY_DIR} ${eigen_INCLUDE_DIRS} ${OpenVINO_INCLUDE_DIR} ${OPENVINO_INCLUDE_DIR_LIST} ${PYTHON_INCLUDE_DIRS} $ENV{OPENCL_INCS})
target_link_libraries(onnxruntime_providers_openvino ${ONNXRUNTIME_PROVIDERS_SHARED} ${OPENVINO_LIB_LIST} absl::raw_hash_set absl::hash)

if(MSVC)
10 changes: 5 additions & 5 deletions dockerfiles/Dockerfile.openvino
@@ -3,7 +3,7 @@
# SPDX-License-Identifier: MIT
#--------------------------------------------------------------------------

ARG OPENVINO_VERSION=2021.4.2
ARG OPENVINO_VERSION=2022.1.0


# Build stage
@@ -17,8 +17,7 @@ ARG DEVICE=CPU_FP32
ARG ONNXRUNTIME_REPO=https://github.com/microsoft/onnxruntime.git
ARG ONNXRUNTIME_BRANCH=master

ENV InferenceEngine_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/share
ENV ngraph_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/ngraph/cmake
ENV InferenceEngine_DIR=${INTEL_OPENVINO_DIR}/runtime/cmake

USER root
RUN apt update; apt install -y git protobuf-compiler libprotobuf-dev
@@ -52,10 +51,11 @@ RUN apt update; apt install -y unattended-upgrades && \
ARG BUILD_UID=1001
ARG BUILD_USER=onnxruntimedev
RUN adduser --uid $BUILD_UID $BUILD_USER
RUN usermod -a -G video,users ${BUILD_USER}
ENV WORKDIR_PATH /home/${BUILD_USER}
WORKDIR ${WORKDIR_PATH}

USER ${BUILD_USER}
ENV PATH=${WORKDIR_PATH}/miniconda/bin:${WORKDIR_PATH}/cmake-dir/bin:$PATH
ENV IE_PLUGINS_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/lib/intel64
ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INTEL_OPENVINO_DIR}/inference_engine/external/gna/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/mkltiny_lnx/lib:$INTEL_OPENVINO_DIR/deployment_tools/ngraph/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/omp/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
ENV IE_PLUGINS_PATH=${INTEL_OPENVINO_DIR}/runtime/lib/intel64
ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INTEL_OPENVINO_DIR}/runtime/3rdparty/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
42 changes: 18 additions & 24 deletions dockerfiles/Dockerfile.openvino-csharp
@@ -15,15 +15,13 @@ ARG MY_ROOT=/code
ENV PATH /opt/miniconda/bin:/code/cmake-3.21.0-linux-x86_64/bin:$PATH
ENV LD_LIBRARY_PATH=/opt/miniconda/lib:/usr/lib:/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH

ENV INTEL_OPENVINO_DIR=/opt/intel/openvino_2021.4.752
ENV InferenceEngine_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/share
ENV IE_PLUGINS_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/lib/intel64
ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INTEL_OPENVINO_DIR}/inference_engine/external/gna/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/mkltiny_lnx/lib:$INTEL_OPENVINO_DIR/deployment_tools/ngraph/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/omp/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
ENV OpenCV_DIR=${INTEL_OPENVINO_DIR}/opencv/share/OpenCV
ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/opencv/lib:${INTEL_OPENVINO_DIR}/opencv/share/OpenCV/3rdparty/lib:${LD_LIBRARY_PATH}
ENV HDDL_INSTALL_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl
ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl/lib:$LD_LIBRARY_PATH
ENV ngraph_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/ngraph/cmake
ENV INTEL_OPENVINO_DIR=/opt/intel/openvino_2022.1.0.643
ENV InferenceEngine_DIR=${INTEL_OPENVINO_DIR}/runtime/cmake
ENV IE_PLUGINS_PATH=${INTEL_OPENVINO_DIR}/runtime/lib/intel64
ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INTEL_OPENVINO_DIR}/runtime/3rdparty/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
ENV HDDL_INSTALL_DIR=${INTEL_OPENVINO_DIR}/runtime/3rdparty/hddl
ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/runtime/3rdparty/hddl/lib:$LD_LIBRARY_PATH
ENV ngraph_DIR=${INTEL_OPENVINO_DIR}/runtime/cmake
ENV LANG en_US.UTF-8
ENV DEBIAN_FRONTEND=noninteractive

@@ -49,19 +47,15 @@ RUN apt update -y && \
/bin/mkdir -p '/usr/local/lib/pkgconfig' && \
# Install OpenVINO
cd ${MY_ROOT} && \
wget https://apt.repos.intel.com/openvino/2021/GPG-PUB-KEY-INTEL-OPENVINO-2021 && \
apt-key add GPG-PUB-KEY-INTEL-OPENVINO-2021 && rm GPG-PUB-KEY-INTEL-OPENVINO-2021 && \
cd /etc/apt/sources.list.d && \
echo "deb https://apt.repos.intel.com/openvino/2021 all main">intel-openvino-2021.list && \
wget https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB && \
apt-key add GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB && rm GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB && \
echo "deb https://apt.repos.intel.com/openvino/2022 bionic main" | tee /etc/apt/sources.list.d/intel-openvino-2022.list && \
apt update -y && \
apt -y install intel-openvino-dev-ubuntu18-2021.4.752 && \
apt -y install openvino-2022.1.0 && \
cd ${INTEL_OPENVINO_DIR}/install_dependencies && ./install_openvino_dependencies.sh -y && \
cd ${INTEL_OPENVINO_DIR} && rm -rf documentation data_processing && \
cd deployment_tools/ && rm -rf model_optimizer open_model_zoo demo tools && \
cd inference_engine && rm -rf samples && \
cd /opt/libusb-1.0.22/ && \
/usr/bin/install -c -m 644 libusb-1.0.pc '/usr/local/lib/pkgconfig' && \
cp /opt/intel/openvino_2021/deployment_tools/inference_engine/external/97-myriad-usbboot.rules /etc/udev/rules.d/ && \
cp ${INTEL_OPENVINO_DIR}/runtime/3rdparty/97-myriad-usbboot.rules /etc/udev/rules.d/ && \
ldconfig && \
# Install GPU runtime and drivers
cd ${MY_ROOT} && \
@@ -70,18 +64,18 @@ RUN apt update -y && \
apt update -y && \
apt install -y --no-install-recommends ocl-icd-libopencl1 && \
rm -rf /var/lib/apt/lists/* && \
wget "https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-gmmlib_19.3.2_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-igc-core_1.0.2597_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-igc-opencl_1.0.2597_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-opencl_19.41.14441_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/19.41.14441/intel-ocloc_19.41.14441_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/21.38.21026/intel-gmmlib_21.2.1_amd64.deb" && \
wget "https://github.com/intel/intel-graphics-compiler/releases/download/igc-1.0.8708/intel-igc-core_1.0.8708_amd64.deb" && \
wget "https://github.com/intel/intel-graphics-compiler/releases/download/igc-1.0.8708/intel-igc-opencl_1.0.8708_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/21.38.21026/intel-opencl_21.38.21026_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/21.38.21026/intel-ocloc_21.38.21026_amd64.deb" && \
wget "https://github.com/intel/compute-runtime/releases/download/21.38.21026/intel-level-zero-gpu_1.2.21026_amd64.deb" && \
dpkg -i /tmp/opencl/*.deb && \
ldconfig && \
rm -rf /tmp/opencl && \
# Install Mono
cd ${MY_ROOT} && \
apt install -y gnupg ca-certificates && \
#apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF && \
curl https://download.mono-project.com/repo/xamarin.gpg | apt-key add - && \
echo "deb https://download.mono-project.com/repo/ubuntu stable-bionic main" | sudo tee /etc/apt/sources.list.d/mono-official-stable.list && \
apt update -y && \
5 changes: 5 additions & 0 deletions dockerfiles/README.md
@@ -152,6 +152,11 @@ If the `device_type` runtime config option is not explicitly specified, CPU will
```
docker run -it --rm --device-cgroup-rule='c 189:* rmw' -v /dev/bus/usb:/dev/bus/usb --device /dev/dri:/dev/dri onnxruntime-gpu:latest
```
If your host system is Ubuntu 20, use the below command to run. Please find the alternative steps [here](https://github.com/openvinotoolkit/docker_ci/blob/master/configure_gpu_ubuntu20.md).
```
docker run -it --rm --device-cgroup-rule='c 189:* rmw' -v /dev/bus/usb:/dev/bus/usb --device /dev/dri:/dev/dri --group-add=$(stat -c "%g" /dev/dri/render*) onnxruntime-gpu:latest
```

### OpenVINO on Myriad VPU Accelerator

1. Build the docker image from the DockerFile in this repository.
3 changes: 2 additions & 1 deletion include/onnxruntime/core/session/onnxruntime_c_api.h
@@ -503,7 +503,7 @@ typedef struct OrtMIGraphXProviderOptions {
*/
typedef struct OrtOpenVINOProviderOptions {
#ifdef __cplusplus
OrtOpenVINOProviderOptions() : device_type{}, enable_vpu_fast_compile{}, device_id{}, num_of_threads{}, use_compiled_network{}, blob_dump_path{}, context{} {}
OrtOpenVINOProviderOptions() : device_type{}, enable_vpu_fast_compile{}, device_id{}, num_of_threads{}, use_compiled_network{}, blob_dump_path{}, context{}, enable_opencl_throttling{} {}
Review comment (Member):
Using the C API, we're supposed to maintain backwards compatibility (with the ABI), i.e. a user can drop in a new version of the onnxruntime shared lib (built with the OpenVINO EP) and it would still work. But here, applications built against an earlier ORT version would be using a version of the OrtOpenVINOProviderOptions struct that did not contain this new field.
Do you know whether OpenVINO EP users rely on the ABI, or will they always recompile their application with the latest ORT headers/library?
This is why we have moved towards using opaque structs to avoid this issue; see #7808 as an example.
We create a V2 version of the struct, which the user does not manipulate directly. They use APIs to construct it and pass the struct in as an option, and the APIs use key/value strings for EP options.

Reply (Contributor, Author):
Hi George. As of today, none of our users rely on the ABI; they usually recompile their application with the latest ORT headers/library for the version of OpenVINO they want to work with. However, moving forward we would also like to switch to the opaque structs mentioned above to get backward-compatibility support.
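
For illustration only, here is a minimal sketch of the opaque-struct/key-value pattern the reviewer describes, modeled on the TensorRT V2 example in #7808. The `*_OpenVINO_V2` names and the `OrtOpenVINOProviderOptionsV2` type are hypothetical: they did not exist in the ORT C API at the time of this PR and are shown only to make the ABI argument concrete.

```c
/* Hypothetical sketch: the *_OpenVINO_V2 names mirror the TensorRT V2 pattern
 * from #7808 and are NOT part of the ORT C API in this PR.
 * Error handling omitted for brevity. */
#include <onnxruntime_c_api.h>

void append_openvino_v2(const OrtApi* ort, OrtSessionOptions* so) {
  OrtOpenVINOProviderOptionsV2* ov = NULL;   /* opaque: the app never sees the struct layout */
  ort->CreateOpenVINOProviderOptionsV2(&ov);

  /* New options arrive as key/value strings, so adding one (e.g. the GPU
   * throttling flag from this PR) does not change any struct layout or ABI. */
  const char* keys[]   = {"device_type", "enable_opencl_throttling"};
  const char* values[] = {"GPU_FP16",    "true"};
  ort->UpdateOpenVINOProviderOptionsV2(ov, keys, values, 2);

  ort->SessionOptionsAppendExecutionProvider_OpenVINO_V2(so, ov);
  ort->ReleaseOpenVINOProviderOptionsV2(ov);
}
```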

#endif
/** \brief Device type string
*
@@ -516,6 +516,7 @@ typedef struct OrtOpenVINOProviderOptions {
unsigned char use_compiled_network; ///< 0 = disabled, nonzero = enabled
const char* blob_dump_path; // path is set to empty by default
void* context;
unsigned char enable_opencl_throttling; ///< 0 = disabled, nonzero = enabled
} OrtOpenVINOProviderOptions;

struct OrtApi;
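
As a usage note, the following is a minimal sketch of how an application rebuilt against these updated headers might set the new `enable_opencl_throttling` field through the existing (non-opaque) options struct. The surrounding session setup and the specific option values are assumptions for illustration, not taken from this PR.

```c
/* Minimal sketch, assuming an OrtApi* obtained via OrtGetApiBase() and an
 * existing OrtSessionOptions*; only the provider-options handling is the point.
 * Error handling omitted for brevity. */
#include <string.h>
#include <onnxruntime_c_api.h>

void enable_openvino_gpu(const OrtApi* ort, OrtSessionOptions* so) {
  OrtOpenVINOProviderOptions ov;
  memset(&ov, 0, sizeof(ov));        /* zero every field, old and new */
  ov.device_type = "GPU_FP16";       /* target the integrated GPU in FP16 */
  ov.num_of_threads = 8;
  ov.enable_opencl_throttling = 1;   /* new field in this PR: 0 = disabled, nonzero = enabled */

  /* Hand the options struct to the OpenVINO execution provider. */
  ort->SessionOptionsAppendExecutionProvider_OpenVINO(so, &ov);
}
```
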
2 changes: 1 addition & 1 deletion onnxruntime/core/providers/openvino/backend_manager.cc
@@ -1,4 +1,4 @@
// Copyright(C) 2019 Intel Corporation
// Copyright (C) 2019-2022 Intel Corporation
// Licensed under the MIT License

#include "core/providers/shared_library/provider_api.h"
5 changes: 2 additions & 3 deletions onnxruntime/core/providers/openvino/backend_manager.h
@@ -1,10 +1,9 @@
// Copyright(C) 2019 Intel Corporation
// Copyright (C) 2019-2022 Intel Corporation
// Licensed under the MIT License

#pragma once

#include <inference_engine.hpp>

#include "ov_interface.h"
#include "contexts.h"
#include "ibackend.h"
