**Description**: Use the full ORT package for onnxruntime-react-native. The params required for the mobile build are left in comments so they're easily discovered if we need to create onnxruntime-react-native-mobile in the future.

**Motivation and Context**: Removes a barrier to using ORT with React Native: the mobile package that was being used supports a limited range of opsets/operators/types and requires ORT format models. The full package will run any model.
# onnxruntime-react-native
ONNX Runtime React Native provides a JavaScript library for running ONNX models in React Native apps.
## Why ONNX models

The Open Neural Network Exchange (ONNX) is an open standard for representing machine learning models. The biggest advantage of ONNX is that it allows interoperability across different open-source AI frameworks, which offers more flexibility in framework adoption.
## Why ONNX Runtime React Native

With ONNX Runtime React Native, React Native developers can score pre-trained ONNX models directly in React Native apps by leveraging ONNX Runtime Mobile, providing a lightweight inference solution for Android and iOS.
## Installation

```sh
yarn add onnxruntime-react-native
```
## Usage

```ts
import { InferenceSession } from "onnxruntime-react-native";

// load a model
const session: InferenceSession = await InferenceSession.create(modelPath);

// input as InferenceSession.OnnxValueMapType
const result = await session.run(input, ['num_detection:0', 'detection_classes:0']);
Refer to ONNX Runtime JavaScript examples for samples and tutorials. Unlike the Node.js and web variants of ONNX Runtime JavaScript, the React Native library does not support the following features:

- Unsigned data types in Tensor
- Model loading using ArrayBuffer
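To illustrate how an input feed is typically prepared, the sketch below sizes a Float32Array from the tensor dims and shows in comments how it would back a Tensor passed to `session.run`. The shape, model path, and input/output names here are hypothetical placeholders; the actual values depend on your model.

```typescript
// Hypothetical input shape for an image model: [batch, channels, height, width].
const dims = [1, 3, 224, 224];

// The flat data buffer must hold exactly the product of the dims.
const size = dims.reduce((a, b) => a * b, 1);
const data = new Float32Array(size); // fill with preprocessed pixel values

// With onnxruntime-react-native, this buffer would back a Tensor in the feed:
//
//   import { InferenceSession, Tensor } from "onnxruntime-react-native";
//   const session = await InferenceSession.create(modelPath);
//   const feeds = { "input:0": new Tensor("float32", data, dims) };
//   const result = await session.run(feeds, ["output:0"]);

console.log(size); // 1 * 3 * 224 * 224 = 150528
```

Passing a buffer whose length does not match the product of the dims will cause the Tensor constructor to reject it, so computing the size from the dims up front avoids a common mistake.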
## Operator and type support

ONNX Runtime React Native currently supports most operators used by popular models. Refer to ONNX Runtime Mobile Package Operator and Type.
## License

License information can be found here.