ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator

ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms. Learn more →
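
As a minimal sketch of what inference looks like with the Python API (the model file name, input name "input", and input shape below are illustrative assumptions, not part of any specific model):

```python
import numpy as np
import onnxruntime as ort

# Create an inference session; ONNX Runtime dispatches to the requested
# execution provider (CPU here; GPU or other EPs can be listed if installed).
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Dummy input matching the model's expected input name and shape
# ("input" and (1, 3, 224, 224) are placeholder assumptions).
x = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run the model; passing None for the output names returns all outputs.
outputs = session.run(None, {"input": x})
print(outputs[0].shape)
```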

ONNX Runtime training can accelerate model training on multi-node NVIDIA GPUs for transformer models, with a one-line addition to existing PyTorch training scripts. Learn more →
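
In practice, the one-line addition referred to above is wrapping an existing torch.nn.Module with ORTModule. A rough sketch, assuming the torch-ort package is installed; the model and training loop below are placeholders:

```python
import torch
from torch_ort import ORTModule  # provided by the torch-ort package

# Placeholder PyTorch model; any torch.nn.Module works in principle.
model = torch.nn.Sequential(
    torch.nn.Linear(784, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
)

# The one-line change: route forward and backward passes through ONNX Runtime.
model = ORTModule(model)

# The rest of the training script stays as it was (sketched here).
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = torch.nn.CrossEntropyLoss()

x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```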

Get Started & Resources

Build Pipeline Status

Build status badges for Windows, Linux, Mac, Android, iOS, and WebAssembly, covering CPU, GPU, and execution provider (EP) builds, are shown on the project's GitHub page.

Data/Telemetry

Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the privacy statement for more details.

Contributions and Feedback

We welcome contributions! Please see the contribution guidelines.

For feature requests or bug reports, please file a GitHub Issue.

For general discussion or questions, please use GitHub Discussions.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

License

This project is licensed under the MIT License.