mirror of
https://github.com/saymrwulf/onnxruntime.git
synced 2026-05-14 20:48:00 +00:00
docker updated to support openvino R1.1 (#1475)
* docker updated to support openvino R1.1
* Update README.md: updated the README to download openvino R1.1
This commit is contained in:
parent 91d32c9060 · commit c0f927c57c
2 changed files with 11 additions and 11 deletions
```diff
@@ -6,7 +6,7 @@
 FROM ubuntu:16.04
 
 RUN apt update && \
-    apt -y install python3.5 python3-pip zip x11-apps lsb-core wget cpio sudo libboost-python-dev libpng-dev zlib1g-dev git libnuma1 ocl-icd-libopencl1 clinfo libboost-filesystem1.58.0 libboost-thread1.58.0 protobuf-compiler libprotoc-dev libusb-1.0-0-dev&& pip3 install numpy networkx opencv-python pytest && locale-gen en_US.UTF-8 && update-locale LANG=en_US.UTF-8
+    apt -y install python3.5 python3-pip zip x11-apps lsb-core wget cpio sudo libboost-python-dev libpng-dev zlib1g-dev git libnuma1 ocl-icd-libopencl1 clinfo libboost-filesystem1.58.0 libboost-thread1.58.0 protobuf-compiler libprotoc-dev libusb-1.0-0-dev && pip3 install numpy networkx opencv-python pytest && locale-gen en_US.UTF-8 && update-locale LANG=en_US.UTF-8
 
 ARG DEVICE=CPU_FP32
 ARG ONNXRUNTIME_REPO=https://github.com/microsoft/onnxruntime
```
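The only textual change in the hunk above is a space inserted before `&&`. A POSIX shell tokenizes `&&` as an operator with or without surrounding whitespace, so the old and new forms of the `RUN` line behave identically; the edit is readability-only. A quick check:

```shell
# "&&" is a shell operator token whether or not it is padded with spaces,
# so both command lists below run their second command the same way.
a=$(true&& echo ok)
b=$(true && echo ok)
echo "$a $b"   # → ok ok
```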
```diff
@@ -28,20 +28,20 @@ RUN tar -xzf l_openvino_toolkit*.tgz && \
     cd /opt/intel/openvino/deployment_tools/model_optimizer/install_prerequisites && ./install_prerequisites_onnx.sh
 
 ENV LD_LIBRARY_PATH=/usr/lib:/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH
-ENV INTEL_OPENVINO_DIR=/opt/intel/openvino
-ENV INTEL_CVSDK_DIR=/opt/intel/openvino
-ENV LD_LIBRARY_PATH=${INSTALL_OPENVINO_DIR}/deployment_tools/model_optimizer/model_optimizer_caffe/bin:${LD_LIBRARY_PATH}
-ENV ModelOptimizer_ROOT_DIR=${INSTALL_OPENVINO_DIR}/deployment_tools/model_optimizer/model_optimizer_caffe
+ENV INTEL_OPENVINO_DIR=/opt/intel/openvino_2019.1.144
+ENV INTEL_CVSDK_DIR=/opt/intel/openvino_2019.1.144
+ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/model_optimizer/model_optimizer_caffe/bin:${LD_LIBRARY_PATH}
+ENV ModelOptimizer_ROOT_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/model_optimizer/model_optimizer_caffe
 ENV InferenceEngine_DIR=${INTEL_CVSDK_DIR}/deployment_tools/inference_engine/share
 ENV IE_PLUGINS_PATH=${INTEL_CVSDK_DIR}/deployment_tools/inference_engine/lib/intel64
-ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INSTALL_OPENVINO_DIR}/deployment_tools/inference_engine/external/cldnn/lib:${INSTALL_OPENVINO_DIR}/inference_engine/external/gna/lib:${INSTALL_OPENVINO_DIR}/deployment_tools/inference_engine/external/mkltiny_lnx/lib:${INSTALL_OPENVINO_DIR}/deployment_tools/inference_engine/external/omp/lib:${INSTALL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
-ENV OpenCV_DIR=${INSTALL_OPENVINO_DIR}/opencv/share/OpenCV
-ENV LD_LIBRARY_PATH=${INSTALL_OPENVINO_DIR}/opencv/lib:${INSTALL_OPENVINO_DIR}/opencv/share/OpenCV/3rdparty/lib:${LD_LIBRARY_PATH}
+ENV LD_LIBRARY_PATH=/opt/intel/opencl:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/cldnn/lib:${INTEL_OPENVINO_DIR}/inference_engine/external/gna/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/mkltiny_lnx/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/omp/lib:${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/tbb/lib:${IE_PLUGINS_PATH}:${LD_LIBRARY_PATH}
+ENV OpenCV_DIR=${INTEL_OPENVINO_DIR}/opencv/share/OpenCV
+ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/opencv/lib:${INTEL_OPENVINO_DIR}/opencv/share/OpenCV/3rdparty/lib:${LD_LIBRARY_PATH}
 ENV PATH=${INTEL_CVSDK_DIR}/deployment_tools/model_optimizer:$PATH
 ENV PYTHONPATH=${INTEL_CVSDK_DIR}/deployment_tools/model_optimizer:$PYTHONPATH
 ENV PYTHONPATH=$INTEL_CVSDK_DIR/python/python3.5:${INTEL_CVSDK_DIR}/python/python3.5/ubuntu16:${PYTHONPATH}
-ENV HDDL_INSTALL_DIR=${INSTALL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl
-ENV LD_LIBRARY_PATH=${INSTALL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl/lib:$LD_LIBRARY_PATH
+ENV HDDL_INSTALL_DIR=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl
+ENV LD_LIBRARY_PATH=${INTEL_OPENVINO_DIR}/deployment_tools/inference_engine/external/hddl/lib:$LD_LIBRARY_PATH
 
 RUN wget https://github.com/intel/compute-runtime/releases/download/19.15.12831/intel-gmmlib_19.1.1_amd64.deb
 RUN wget https://github.com/intel/compute-runtime/releases/download/19.15.12831/intel-igc-core_1.0.2-1787_amd64.deb
```
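Beyond pinning the versioned install directory `/opt/intel/openvino_2019.1.144`, this hunk fixes what looks like a latent bug: the removed lines built paths from `INSTALL_OPENVINO_DIR`, a variable that is not set anywhere in the visible diff, while the variable the Dockerfile actually defines is `INTEL_OPENVINO_DIR`. An undefined variable expands to the empty string in shell-style parameter expansion, so the old `LD_LIBRARY_PATH` entries silently lost their prefix. A minimal demonstration (the path is the one the new Dockerfile sets):

```shell
# INTEL_OPENVINO_DIR is defined; INSTALL_OPENVINO_DIR deliberately is not.
INTEL_OPENVINO_DIR=/opt/intel/openvino_2019.1.144
old_path="${INSTALL_OPENVINO_DIR}/opencv/lib"   # undefined var expands empty
new_path="${INTEL_OPENVINO_DIR}/opencv/lib"
echo "$old_path"   # → /opencv/lib (prefix silently dropped)
echo "$new_path"   # → /opt/intel/openvino_2019.1.144/opencv/lib
```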
````diff
@@ -87,7 +87,7 @@
 
 Retrieve your docker image in one of the following ways.
 
-- For building the docker image, download OpenVINO online installer version 2018 R5.0.1 from [here](https://software.intel.com/en-us/openvino-toolkit/choose-download) and copy the openvino tar file in the same directory and build the image. The online installer size is only 16MB and the components needed for the accelerators are mentioned in the dockerfile. Providing the argument device enables onnxruntime for that particular device. You can also provide arguments ONNXRUNTIME_REPO and ONNXRUNTIME_BRANCH to test that particular repo and branch. Default values are http://github.com/microsoft/onnxruntime and repo is master
+- For building the docker image, download OpenVINO online installer version 2019 R1.1 from [here](https://software.intel.com/en-us/openvino-toolkit/choose-download) and copy the openvino tar file in the same directory and build the image. The online installer size is only 16MB and the components needed for the accelerators are mentioned in the dockerfile. Providing the argument device enables onnxruntime for that particular device. You can also provide arguments ONNXRUNTIME_REPO and ONNXRUNTIME_BRANCH to test that particular repo and branch. Default values are http://github.com/microsoft/onnxruntime and repo is master
 ```
 docker build -t onnxruntime --build-arg DEVICE=$DEVICE .
 ```
````
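The README paragraph above also mentions the optional `ONNXRUNTIME_REPO` and `ONNXRUNTIME_BRANCH` build args, which the quoted `docker build` line does not show. A sketch of the full invocation (the repo/branch values shown are the documented defaults; the command is assembled as arguments so it can be inspected before running):

```shell
# Illustrative sketch: the README's build command extended with all three
# build args. DEVICE's default is CPU_FP32 per the Dockerfile's ARG DEVICE.
DEVICE=CPU_FP32
set -- docker build -t onnxruntime \
  --build-arg DEVICE="$DEVICE" \
  --build-arg ONNXRUNTIME_REPO=https://github.com/microsoft/onnxruntime \
  --build-arg ONNXRUNTIME_BRANCH=master .
echo "$@"   # print the full command; replace echo "$@" with "$@" to run it
```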