onnxruntime/js/node
Yi Zhang dae77e6014
Support building Windows CUDA with Ninja (#20176)
### How to run it locally
1. conda install ninja
2. "C:\Program Files\Microsoft Visual
Studio\2022\Enterprise\VC\Auxiliary\Build\vcvarsall.bat" x64
3. python.exe {ort_repo}\tools\ci_build\build.py --config RelWithDebInfo
--build_dir {ort_repo}\build_cuda --skip_submodule_sync --build_csharp
--update --parallel --cmake_generator "Ninja" --build_shared_lib
--enable_onnx_tests --enable_pybind --build_java --build_nodejs
--use_cuda "--cuda_home=C:\Program Files\NVIDIA GPU Computing
Toolkit\CUDA\v11.8" --enable_cuda_profiling --cmake_extra_defines
CMAKE_CUDA_ARCHITECTURES=60
4. cd build_cuda\RelWithDebInfo
5. cmake --build . -j16

### Motivation and Context
In packaging pipelines, we often hit an intermittent issue where building
with CUDA on Windows takes too much time, even though moving the build to a
CPU machine has already reduced it considerably.
We plan to build with Ninja instead of MSBuild in the packaging pipelines
so that nvcc can run in parallel.
This is the first step: supporting it locally.
2024-04-03 11:19:31 +08:00
lib [node] Switch to setImmediate to avoid starving the Node.js event loop (#19610) 2024-02-22 18:53:50 -08:00
script Support building Windows CUDA with Ninja (#20176) 2024-04-03 11:19:31 +08:00
src [js/node] support manually dispose session (#18655) 2023-12-19 16:20:00 -08:00
test [js] update a few packages (#18499) 2023-11-17 22:40:51 -08:00
.gitignore [node.js binding] aggregate binaries for multiple platforms in single NPM package (#9501) 2021-10-25 20:16:10 -07:00
.npmignore [node.js binding] aggregate binaries for multiple platforms in single NPM package (#9501) 2021-10-25 20:16:10 -07:00
CMakeLists.txt Support building Windows CUDA with Ninja (#20176) 2024-04-03 11:19:31 +08:00
package-lock.json Bump follow-redirects from 1.15.4 to 1.15.6 in /js/node (#19951) 2024-03-16 18:54:53 -07:00
package.json [ORT 1.17.0 release] Bump up version to 1.18.0 (#19170) 2024-01-17 11:18:32 -08:00
README.md link to docs incorrect for js/web/node (#18960) 2024-01-03 17:30:24 -08:00
tsconfig.json [js/web] fix typescript type check (#18343) 2023-11-10 16:03:38 -08:00

ONNX Runtime Node.js Binding

ONNX Runtime Node.js binding enables Node.js applications to run ONNX model inference.

Usage

Install the latest stable version:

npm install onnxruntime-node
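
After installing, a minimal inference script looks roughly like the sketch below. The model path ("model.onnx"), the tensor shape, and the input data are placeholders for illustration, not values from this package's documentation:

```javascript
// Minimal inference sketch with onnxruntime-node.
// "model.onnx" and the [1, 4] float32 input are placeholders; substitute
// your own model file and a tensor matching its input signature.
async function main() {
  const ort = require('onnxruntime-node');

  // Load the model and create an inference session.
  const session = await ort.InferenceSession.create('model.onnx');

  // Build a feeds object keyed by the model's input names.
  const input = new ort.Tensor('float32', Float32Array.from([1, 2, 3, 4]), [1, 4]);
  const feeds = { [session.inputNames[0]]: input };

  // Run inference and read the first output tensor's data.
  const results = await session.run(feeds);
  console.log(results[session.outputNames[0]].data);
}

main().catch(console.error);
```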

Refer to ONNX Runtime JavaScript examples for samples and tutorials.

Requirements

ONNX Runtime works on Node.js v12.x+ or Electron v5.x+.

The following platforms are supported with pre-built binaries:

  • Windows x64 CPU NAPI_v3
  • Linux x64 CPU NAPI_v3
  • macOS x64 CPU NAPI_v3

To use on platforms without pre-built binaries, you can build the Node.js binding from source and consume it by running npm install <onnxruntime_repo_root>/js/node/. See also the instructions for building the ONNX Runtime Node.js binding locally.
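
The from-source route can be sketched as follows. The clone URL, flags, and paths are illustrative; the linked build instructions are authoritative (the --build_nodejs and --build_shared_lib flags also appear in the build command earlier in this page):

```shell
# Clone with submodules, build the Node.js binding, then install it locally.
git clone --recursive https://github.com/microsoft/onnxruntime.git
cd onnxruntime
./build.sh --config Release --build_shared_lib --build_nodejs --parallel

# Consume the locally built binding from your application's directory:
npm install /path/to/onnxruntime/js/node/
```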

GPU Support

Currently, the Windows version supports only the DirectML (DML) execution provider; Linux x64 can use CUDA and TensorRT.
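
Requesting a GPU provider happens at session creation time. A hedged sketch, where "model.onnx" is a placeholder and provider availability depends on your platform and installed drivers:

```javascript
// Sketch: selecting a GPU execution provider when creating a session.
// Per the support matrix above: DML on Windows, CUDA (or TensorRT) on Linux x64.
async function createGpuSession() {
  const ort = require('onnxruntime-node');
  const providers = process.platform === 'win32' ? ['dml'] : ['cuda'];
  return ort.InferenceSession.create('model.onnx', {
    executionProviders: providers,
  });
}

createGpuSession().catch(console.error);
```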

License

License information can be found here.