Mirror of https://github.com/saymrwulf/onnxruntime.git, synced 2026-05-14 20:48:00 +00:00.
# ONNX Runtime Samples and Tutorials
Here you will find various samples, tutorials, and reference implementations for using ONNX Runtime. For a list of available Dockerfiles and published images to help you get started, see this page.
Contents:

- General
- Integrations
  - Azure Machine Learning
  - Azure IoT Edge
  - Azure Media Services
  - Azure SQL Edge and Managed Instance
  - Windows Machine Learning
  - ML.NET
  - Huggingface
## General

### Python

#### Inference only
- Basic
- Resnet50
- ONNX-Ecosystem Docker image samples
- ONNX Runtime Server: SSD Single Shot MultiBox Detector
- NUPHAR EP samples
#### Inference with model conversion
- SKL tutorials
- Keras - Basic
- SSD Mobilenet (Tensorflow)
- BERT-SQuAD (PyTorch) on CPU
- BERT-SQuAD (PyTorch) on GPU
- BERT-SQuAD (Keras)
- BERT-SQuAD (Tensorflow)
- GPT2 (PyTorch)
- EfficientDet (Tensorflow)
- EfficientNet-Edge (Tensorflow)
- EfficientNet-Lite (Tensorflow)
- EfficientNet (Keras)
- MNIST (Keras)
#### Quantization
#### Other
### C#

### C/C++
- C: SqueezeNet
- C++: model-explorer - single and batch processing
- C++: SqueezeNet
- C++: MNIST
### Java

### Node.js

## Integrations

### Azure Machine Learning

Inference and deploy through AzureML.
For additional information on training in AzureML, please see the AzureML Training Notebooks.
- Inferencing on CPU using ONNX Model Zoo models
- Inferencing on CPU with PyTorch model training
- Inferencing on CPU with model conversion for an existing (CoreML) model
- Inferencing on GPU with the TensorRT Execution Provider (AKS)
### Azure IoT Edge

Inference and deploy with Azure IoT Edge.

### Azure Media Services

### Azure SQL

Deploy an ONNX model in Azure SQL Edge.

### Windows Machine Learning

Examples of inferencing with ONNX Runtime through Windows Machine Learning.

### ML.NET

Object detection with ONNX Runtime in ML.NET.