ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator

ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, XGBoost, etc. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms. Learn more →
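As a rough illustration, a model exported to ONNX from any of these frameworks can be run through the Python API in a few lines. This is a minimal sketch rather than official sample code: the file name model.onnx, the input shape, and the CUDA provider are placeholder assumptions, and the session falls back to CPU when no accelerator is available.

```python
# Minimal inference sketch. "model.onnx" is a placeholder for a model
# exported from PyTorch, TensorFlow/Keras, scikit-learn, etc.
import numpy as np
import onnxruntime as ort

# Prefer a hardware-accelerated execution provider when the build has one.
preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in ort.get_available_providers()]

session = ort.InferenceSession("model.onnx", providers=providers)

# Input names and shapes come from the exported graph; query them at runtime.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumed shape

outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```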

ONNX Runtime training can accelerate the model training time on multi-node NVIDIA GPUs for transformer models with a one-line addition for existing PyTorch training scripts. Learn more →
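The "one-line addition" wraps an existing torch.nn.Module. Below is a minimal sketch assuming the onnxruntime-training package is installed and exposes ORTModule under onnxruntime.training (the torch-ort package provides an equivalent wrapper); the toy model and single training step stand in for an existing PyTorch script.

```python
# Minimal sketch: accelerating an existing PyTorch training script.
import torch
from onnxruntime.training import ORTModule  # assumes onnxruntime-training is installed

# Stand-in for the model defined in an existing script.
model = torch.nn.Sequential(
    torch.nn.Linear(784, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
)

model = ORTModule(model)  # the one-line addition

# The rest of the training loop is unchanged.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = torch.nn.CrossEntropyLoss()

x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```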

Get Started & Resources

Builtin Pipeline Status

[Build status badge table: Windows, Linux, Mac, Android, iOS, Web, and Other platforms, with columns for Inference and Training pipelines.]

This project is tested with BrowserStack.

Third-party Pipeline Status

[Build status badge table: Linux, with columns for Inference and Training pipelines.]

Releases

The current release and past releases can be found here: https://github.com/microsoft/onnxruntime/releases.

For details on the upcoming release, including release dates, announcements, features, and guidance on submitting feature requests, please visit the release roadmap: https://onnxruntime.ai/roadmap.

Data/Telemetry

Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the privacy statement for more details.

Contributions and Feedback

We welcome contributions! Please see the contribution guidelines.

For feature requests or bug reports, please file a GitHub Issue.

For general discussion or questions, please use GitHub Discussions.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

License

This project is licensed under the MIT License.