
onnxruntime-react-native

ONNX Runtime React Native provides a JavaScript library for running ONNX models in a React Native app.

Why ONNX models

The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. The biggest advantage of ONNX is that it enables interoperability across different open source AI frameworks, giving developers more flexibility when choosing or switching between frameworks.

Why ONNX Runtime React Native

With ONNX Runtime React Native, developers can run inference on pre-trained ONNX models directly in their React Native apps. By leveraging ONNX Runtime, it provides a lightweight inference solution for Android and iOS.

Installation

yarn add onnxruntime-react-native

Usage

import { InferenceSession } from "onnxruntime-react-native";

// load a model
const session: InferenceSession = await InferenceSession.create(modelPath);
// input as InferenceSession.OnnxValueMapType
const result = await session.run(input, ['num_detection:0', 'detection_classes:0']);
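The `input` object above maps model input names to Tensor instances, and a tensor's flat data buffer must match its declared shape. A minimal sketch of preparing such an input (the input name, shape, and `elementCount` helper here are hypothetical, for illustration only):

```typescript
// Hypothetical helper: verify that a flat buffer's length matches the tensor
// shape before wrapping it in a Tensor, since a mismatch otherwise only
// surfaces later, when session.run() is called.
function elementCount(dims: number[]): number {
  return dims.reduce((product, d) => product * d, 1);
}

// A batch of one 300x300 RGB image stored as uint8 (shape is model-specific).
const dims = [1, 300, 300, 3];
const pixels = new Uint8Array(elementCount(dims));
// ...fill `pixels` with image data...

// With the real package this would become:
//   import { Tensor } from 'onnxruntime-react-native';
//   const input = { 'image_tensor:0': new Tensor('uint8', pixels, dims) };
//   const result = await session.run(input);
```

Note that uint8 inputs are only supported on Android (see the feature list below); on iOS, a supported signed or floating-point type would be needed instead.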

Refer to ONNX Runtime JavaScript examples for samples and tutorials. The ONNX Runtime React Native library does not currently support the following features:

  • Tensors with unsigned data types, with the exception of uint8 on Android devices
  • Model loading using ArrayBuffer

Operator and type support

ONNX Runtime React Native version 1.13 supports both ONNX and ORT format models, and includes all operators and types.

Previous versions of the ONNX Runtime React Native package used the ONNX Runtime Mobile package and supported only the operators and types used in popular mobile models. See here for the list of supported operators and types.

License

License information can be found here.