> Mirror of https://github.com/saymrwulf/onnxruntime.git, synced 2026-05-14 20:48:00 +00:00
# Quick-start Docker containers for ONNX Runtime
## nGraph Version (Preview)
**Ubuntu 16.04, Python Bindings**
- Build the docker image from the Dockerfile in this repository.

  ```bash
  # If you have a Linux machine, preface this command with "sudo"
  docker build -t onnxruntime-ngraph -f Dockerfile.ngraph .
  ```
- Run the Docker image.

  ```bash
  # If you have a Linux machine, preface this command with "sudo"
  docker run -it onnxruntime-ngraph
  ```
## ONNX Runtime Server (Preview)
**Ubuntu 16.04**
- Build the docker image from the Dockerfile in this repository.

  ```bash
  docker build -t {docker_image_name} -f Dockerfile.server .
  ```
- Run the ONNX Runtime server with the image built in the previous step.

  ```bash
  docker run -v {localModelAbsoluteFolder}:{dockerModelAbsoluteFolder} -e MODEL_ABSOLUTE_PATH={dockerModelAbsolutePath} -p {your_local_port}:8001 {docker_image_name}
  ```
- Send HTTP requests to the container running ONNX Runtime Server.

  Send HTTP requests to the Docker container through the bound local port. See the full usage document for details.

  ```bash
  curl -X POST -d "@request.json" -H "Content-Type: application/json" http://0.0.0.0:{your_local_port}/v1/models/mymodel/versions/3:predict
  ```
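A `request.json` payload for the `curl` step above can be generated with a short script. This is a minimal sketch, not a definitive schema: the field names (`inputs`, `dims`, `dataType`, `rawData`) follow the JSON mapping of the server's predict request as I understand it, and the input name `Input3` plus the 1x1x28x28 shape are placeholder assumptions for an MNIST-style model. Substitute your own model's input name, shape, and tensor data.

```python
# Sketch: build a request.json payload for the ONNX Runtime Server
# predict endpoint. Field names are assumptions based on the server's
# JSON request mapping; "Input3" and the 1x1x28x28 shape are placeholders
# for an MNIST-style model -- adjust both for your model.
import base64
import json
import struct

# Flatten a dummy 1x1x28x28 float32 tensor and encode its raw bytes.
pixels = [0.0] * (1 * 1 * 28 * 28)
raw = struct.pack(f"<{len(pixels)}f", *pixels)

request = {
    "inputs": {
        "Input3": {                       # model input name (placeholder)
            "dims": ["1", "1", "28", "28"],  # dims as strings (protobuf JSON convention)
            "dataType": 1,                # 1 = FLOAT in the ONNX TensorProto enum
            "rawData": base64.b64encode(raw).decode("ascii"),
        }
    }
}

with open("request.json", "w") as f:
    json.dump(request, f)
```

With the file written, the `curl` command above posts it to the container's bound port.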
## OpenVINO Version (Preview)
**Ubuntu 16.04, Python Bindings**
- Build the docker image from the Dockerfile in this repository.

  ```bash
  # If you have a Linux machine, preface this command with "sudo"
  docker build -t onnxruntime-openvino -f Dockerfile.openvino .
  ```

  To use GPU_FP32:

  ```bash
  docker build -t onnxruntime-openvino --build-arg TARGET_DEVICE=GPU_FP32 -f Dockerfile.openvino .
  ```
- Run the Docker image.

  ```bash
  # If you have a Linux machine, preface this command with "sudo"
  docker run -it onnxruntime-openvino
  ```