libxop.so

libxop.so strives to bundle the custom operators of all backend inference engines, such as ONNXRuntime and libtorch.
Since it is practically impossible to use the library for inference on edge devices, and since its main purpose is to support model conversion across DNN frameworks and graph optimization, only an x86 build has been developed.
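
As a rough sketch of how such a library is typically consumed (the library path and model file names below are illustrative assumptions, not files shipped by this repository), the custom ops can be registered with both backends like this:

```python
# Minimal sketch: registering libxop.so with ONNXRuntime and libtorch.
# The paths and model names are placeholders; adjust them to your setup.
import onnxruntime as ort
import torch

# ONNXRuntime: register the custom-op library on the session options
# before creating the inference session.
sess_options = ort.SessionOptions()
sess_options.register_custom_ops_library("./libxop.so")
session = ort.InferenceSession("model_with_custom_ops.onnx", sess_options)

# libtorch / PyTorch: load the same library so the custom ops become
# visible under the torch.ops namespace, then load the scripted model.
torch.ops.load_library("./libxop.so")
scripted_model = torch.jit.load("model_with_custom_ops.pt")
```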

Architecture


Roadmap

Useful, easy to use, indispensable

Requirements

  • cmake >= 3.15.5
  • CUDA 11.4 (recommended)
  • g++ >= 7.5 (9.3 recommended)
  • Ubuntu >= 18.04 (20.04 recommended)
  • onnxruntime-linux-x64-1.8.1

Build and Installation (docker)

git clone -b develop http://10.94.119.155/team/percep/porting/libraries/xop.git   
docker pull 10.95.61.122:80/devops/dds_cross_compile:v3.3.1 

docker run -it -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=unix$DISPLAY --privileged --network host -v /home/igs:/root/code --gpus all --name dds-conan-v3.3.1 6e5b2467c5be bash

# inside the docker container
cd xop
sh ./scripts/build_project.sh x86_64
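
Once the build finishes, the resulting libxop.so can be plugged into the graph-optimization workflow described above. A minimal sketch with ONNXRuntime, assuming a hypothetical output path for the built library and an example model:

```python
# Sketch: offline graph optimization with the built libxop.so.
# The library and model paths are assumptions, not fixed repository outputs.
import onnxruntime as ort

sess_options = ort.SessionOptions()
sess_options.register_custom_ops_library("./build/libxop.so")

# Apply ONNXRuntime's graph optimizations and write the optimized model to disk.
sess_options.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_EXTENDED
sess_options.optimized_model_filepath = "model_optimized.onnx"
ort.InferenceSession("model_with_custom_ops.onnx", sess_options)
```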

Release

Known Issues

None.

Reference

Copyright and License

xop is provided under the [Apache-2.0 license](LICENSE).