ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Ashrit Shetty 4b5b5f7101
Update win-ort-main to tip main 250123 (#23473)
### Description
This PR is to update the win-ort-main branch to the tip main branch as
of 2025-01-23.

### PR List
ddf0d377a7 [QNN EP] Add LoggingManager::HasDefaultLogger() to provider bridge API (#23467)
05fbbdf91f [QNN EP] Make QNN EP a shared library (#23120)
1336566d7f Add custom vcpkg ports (#23456)
2e1173c411 Update the compile flags for vcpkg packages (#23455)
1f628a9858 [Mobile] Add BrowserStack Android MAUI Test (#23383)
009cae0ec8 [js/webgpu] Optimize ConvTranspose (Continue) (#23429)
04a4a694cb Use onnx_protobuf.h to suppress some GCC warnings (#23453)
2e3b62b4b0 Suppress some strict-aliasing related warnings in WebGPU EP (#23454)
b708f9b1dc Bump ruff from 0.9.1 to 0.9.2 (#23427)
c0afc66b2a [WebNN] Remove workarounds for TFLite backend (#23406)
8a821ff7f9 Bump vite from 6.0.7 to 6.0.11 in /js/web/test/e2e/exports/testcases/vite-default (#23446)
220c1a203e Make ORT and Dawn use the same protobuf/abseil source code (#23447)
b7b5792147 Change MacOS-13 to ubuntu on for android-java-api-aar-test.yml. (#23444)
19d0d2a30f WIP: Dp4MatMulNBits accuracy level 4 matmul for WebGPU EP (#23365)
95b8effbc4 [QNN EP]: Clean up QNN logging resources if an error occurs during initialization (#23435)
626134c5b5 Bump clang-format from 19.1.6 to 19.1.7 (#23428)
0cf975301f Fix eigen external deps (#23439)
f9440aedce Moving RN_CI Android Testing to Linux (#23422)
1aa5902ff4 [QNN EP] workaround for QNN validation bug for Tanh with uint16 quantized output (#23432)
7f5582a0e2 Seperate RN andriod and IOS into 2 separated Stages. (#23400)
73deac2e7f Implement some missing element wise Add/Sub/Mul/Div/Neg operations for CPU and CUDA EPs (#23090)
949fe42af4 Upgrade Java version from react-native/android to Java 17 (#23066)
0892c23463 Update Qnn SDK default version to 2.30 (#23411)
94c099bcec Fix type cast build error (#23423)
d633e571d1 [WebNN EP] Fix AddInitializersToSkip issues (#23354)
e988ef00e2 [QNN EP] Fix regression for MatMul with two quantized/dynamic uint16 inputs (#23419)
7538795f6b Update onnxruntime binary size checks ci pipeline's docker image (#23405)
6c5ea41cad Revert "[QNN EP] Clean up correctly from a partial setup (#23320)" (#23420)
e866804bbe Enable comprehension simplification in ruff rules (#23414)
0a5f1f392c bugfix: string_view of invalid memory (#23417)
4cc38e0277 fix crash when first input of BatchNormalization is 1-D (#23387)
033441487f Target py310 and modernize codebase with ruff (#23401)
87341ac010 [QNN EP] Fix segfault when unregistering HTP shared memory handles (#23402)

### Motivation and Context
This update most notably includes the change to make QNN EP a shared library (#23120).

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Adrian Lizarraga <adlizarraga@microsoft.com>
Co-authored-by: Justin Chu <justinchuby@users.noreply.github.com>
Co-authored-by: Yulong Wang <7679871+fs-eire@users.noreply.github.com>
Co-authored-by: Edward Chen <18449977+edgchen1@users.noreply.github.com>
Co-authored-by: Changming Sun <chasun@microsoft.com>
Co-authored-by: Peishen Yan <peishen.yan@intel.com>
Co-authored-by: Tianlei Wu <tlwu@microsoft.com>
Co-authored-by: Hector Li <hecli@microsoft.com>
Co-authored-by: Jian Chen <cjian@microsoft.com>
Co-authored-by: Alexis Tsogias <1114095+Zyrin@users.noreply.github.com>
Co-authored-by: junchao-zhao <68935141+junchao-loongson@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: sushraja-msft <44513542+sushraja-msft@users.noreply.github.com>
Co-authored-by: Wanming Lin <wanming.lin@intel.com>
Co-authored-by: Jiajia Qin <jiajiaqin@microsoft.com>
Co-authored-by: Caroline Zhu <wolfivyaura@gmail.com>
2025-01-23 09:12:03 -08:00
Top-level layout: .config, .devcontainer, .gdn, .github, .pipelines, .vscode, cgmanifests, cmake, csharp, dockerfiles, docs, include/onnxruntime/core, java, js, objectivec, onnxruntime, orttraining, rust, samples, tools, winml; build scripts build.bat, build.sh, and build_arm64x.bat; and top-level files including .clang-format, .clang-tidy, .dockerignore, .gitattributes, .gitignore, .gitmodules, .lintrunner.toml, CITATION.cff, CODEOWNERS, CONTRIBUTING.md, CPPLINT.cfg, lgtm.yml, LICENSE, NuGet.config, ort.wprp, ORT_icon_for_light_bg.png, packages.config, pyproject.toml, README.md, requirements*.txt, SECURITY.md, setup.py, ThirdPartyNotices.txt, and VERSION_NUMBER.

ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras, as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with a range of hardware, drivers, and operating systems, and delivers optimal performance by leveraging hardware accelerators where applicable, alongside graph optimizations and transforms. Learn more →
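A minimal inference sketch with the ONNX Runtime Python API is shown below. The model path, input shape, and choice of CPUExecutionProvider are illustrative assumptions, not part of this README; the `onnxruntime` package is assumed to be installed.

```python
def run_inference(model_path, input_array):
    """Run a single forward pass with ONNX Runtime and return all model outputs."""
    import onnxruntime as ort  # assumed installed: pip install onnxruntime
    session = ort.InferenceSession(model_path, providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name  # use the model's declared input name
    return session.run(None, {input_name: input_array})  # None -> return all outputs

if __name__ == "__main__":
    import numpy as np
    # "model.onnx" and the (1, 3, 224, 224) shape are placeholders for your model.
    outputs = run_inference("model.onnx", np.random.rand(1, 3, 224, 224).astype(np.float32))
    print([o.shape for o in outputs])
```

The same session can be reused across calls; creating an `InferenceSession` is the expensive step, while `run` is cheap.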

ONNX Runtime training can accelerate model training on multi-node NVIDIA GPUs for transformer models, with a one-line addition to existing PyTorch training scripts. Learn more →
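The "one-line addition" refers to wrapping an existing model in `ORTModule`; the sketch below assumes PyTorch and the onnxruntime-training package (which provides `torch_ort`) are installed.

```python
def accelerate_with_ort(pytorch_model):
    """Wrap a torch.nn.Module so forward/backward passes run through ONNX Runtime."""
    from torch_ort import ORTModule  # assumed installed: pip install torch-ort
    # The wrapped model is a drop-in replacement: the optimizer, loss function,
    # and the rest of the training loop stay exactly as they were.
    return ORTModule(pytorch_model)
```

In practice this is the single changed line in a training script: `model = ORTModule(model)`.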

Get Started & Resources

Built-in Pipeline Status

System / Inference / Training build-status badges for: Windows, Linux, Mac, Android, iOS, Web, and Other.

This project is tested with BrowserStack.

Third-party Pipeline Status

System / Inference / Training build-status badge for: Linux.

Releases

The current release and past releases can be found here: https://github.com/microsoft/onnxruntime/releases.

For details on the upcoming release, including release dates, announcements, features, and guidance on submitting feature requests, please visit the release roadmap: https://onnxruntime.ai/roadmap.

Data/Telemetry

Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the privacy statement for more details.

Contributions and Feedback

We welcome contributions! Please see the contribution guidelines.

For feature requests or bug reports, please file a GitHub Issue.

For general discussion or questions, please use GitHub Discussions.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

License

This project is licensed under the MIT License.