# Quick-start Docker containers for ONNX Runtime

## nGraph Version (Preview)

**Ubuntu 16.04, Python Bindings**

1. Build the Docker image from the Dockerfile in this repository.

   ```shell
   # If you have a Linux machine, preface this command with "sudo"
   docker build -t onnxruntime-ngraph -f Dockerfile.ngraph .
   ```

2. Run the Docker image.

   ```shell
   # If you have a Linux machine, preface this command with "sudo"
   docker run -it onnxruntime-ngraph
   ```

## ONNX Runtime Server (Preview)

**Ubuntu 16.04**

1. Build the Docker image from the Dockerfile in this repository.

   ```shell
   docker build -t {docker_image_name} -f Dockerfile.server .
   ```

2. Run the ONNX Runtime Server with the image created in step 1.

   ```shell
   docker run -v {localModelAbsoluteFolder}:{dockerModelAbsoluteFolder} -e MODEL_ABSOLUTE_PATH={dockerModelAbsolutePath} -p {your_local_port}:8001 {docker_image_name}
   ```

3. Send HTTP requests to the container running ONNX Runtime Server.

Send HTTP requests to the Docker container through the bound local port. See the full usage document for the complete request format.

```shell
curl -X POST -d "@request.json" -H "Content-Type: application/json" http://0.0.0.0:{your_local_port}/v1/models/mymodel/versions/3:predict
```
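The same request can be sent from Python using only the standard library. A sketch assuming the same hypothetical port, model name, and version as the curl command above, with the payload passed as a dict instead of a `request.json` file:

```python
# POST a JSON payload to the server, mirroring the curl command above.
# The default port (8001), model name ("mymodel"), and version (3) are
# hypothetical placeholders; adjust them to match your deployment.
import json
import urllib.request


def predict(payload: dict, port: int = 8001) -> dict:
    """Send a predict request and return the decoded JSON response."""
    url = f"http://0.0.0.0:{port}/v1/models/mymodel/versions/3:predict"
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The payload must follow the request schema described in the usage document; this helper only handles the HTTP transport.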