onnxruntime-react-native

ONNX Runtime React Native provides a JavaScript library for running ONNX models in React Native apps.

Why ONNX models

The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. Its biggest advantage is interoperability across different open-source AI frameworks, which offers more flexibility when adopting or switching between frameworks.

Why ONNX Runtime React Native

With ONNX Runtime React Native, React Native developers can score pre-trained ONNX models directly in React Native apps by leveraging ONNX Runtime Mobile, providing a lightweight inference solution for Android and iOS.

Installation

yarn add onnxruntime-react-native

Usage

import { InferenceSession } from "onnxruntime-react-native";

// load a model
const session: InferenceSession = await InferenceSession.create(modelPath);
// input as InferenceSession.OnnxValueMapType
const result = await session.run(input, ['num_detection:0', 'detection_classes:0']);
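Model inputs are flat typed arrays paired with a shape. As a hedged sketch (not part of the library), the helper below shows one common preprocessing step: converting interleaved RGB pixel bytes into a normalized NCHW Float32Array, which could then back a Tensor in the input map above. The function name and layout choice are illustrative assumptions.

```typescript
// Sketch: convert interleaved RGB bytes (HWC layout) into a normalized
// Float32Array in NCHW order, the layout many vision models expect.
// The resulting array would be wrapped in a Tensor with shape [1, 3, H, W].
function toNCHW(rgb: Uint8Array, height: number, width: number): Float32Array {
  const out = new Float32Array(3 * height * width);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const src = (y * width + x) * 3; // interleaved RGB offset
      for (let c = 0; c < 3; c++) {
        // channel-major destination index, scaled to [0, 1]
        out[c * height * width + y * width + x] = rgb[src + c] / 255;
      }
    }
  }
  return out;
}

// 1x2 image: one red pixel, one blue pixel
const pixels = new Uint8Array([255, 0, 0, 0, 0, 255]);
const nchw = toNCHW(pixels, 1, 2);
// nchw is Float32Array [1, 0, 0, 0, 0, 1]
```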

Refer to ONNX Runtime JavaScript examples for samples and tutorials. Unlike the Node.js and web versions of the library, the React Native library does not support the following features:

  • Unsigned data types in Tensor
  • Model loading using ArrayBuffer
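Given the first limitation, one defensive pattern (a sketch, not part of the library's API) is to reject unsigned element types up front, before constructing a Tensor. The helper name below is illustrative.

```typescript
// Sketch: guard against unsigned tensor element types, which the
// React Native build of the library does not support.
function assertSignedTensorType(type: string): void {
  if (type.startsWith("uint")) {
    throw new Error(`Tensor type "${type}" is not supported by onnxruntime-react-native`);
  }
}

assertSignedTensorType("float32"); // passes silently
// assertSignedTensorType("uint32"); // would throw
```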

Operator and type support

ONNX Runtime React Native currently supports most operators used by popular models. Refer to ONNX Runtime Mobile Package Operator and Type Support.

License

License information can be found here.