A couple of places in onnxruntime used the `float_t` type alias as an
alternative to `float`. However, this is not entirely correct:
`float_t` is an implementation-defined type alias, which may be `float`,
`double`, `long double`, or some other implementation-defined type,
depending on the value of the `FLT_EVAL_METHOD` macro:
https://en.cppreference.com/w/c/numeric/math/float_t
On most major platforms and compilers (Clang, GCC, MSVC) this is a
purely cosmetic change with no behavioral impact. However, the icpx
compiler (and the legacy icc) may substitute `long double` for
`float_t`, resulting in a linker error (unresolved reference) against
the base onnx library, which only provides `ParseData` instantiations
for `float` and `double`, as seen
[here](9264e09367/onnx/defs/tensor_proto_util.cc (L133-L134)).
Overall, this PR cleans up the implementation-defined behaviour and
enables building onnxruntime with icpx.
### Description
<!-- Describe your changes. -->
- Create `OnnxruntimeJSIHelper` native module to provide two JSI
functions
- `jsiOnnxruntimeStoreArrayBuffer`: Store buffer in Blob Manager &
return blob object (iOS: RCTBlobManager, Android: BlobModule)
- `jsiOnnxruntimeResolveArrayBuffer`: Use blob object to get buffer
- Part of the implementation is based on
[react-native-blob-jsi-helper](https://github.com/mrousavy/react-native-blob-jsi-helper)
- Replace base64 encode/decode
- `loadModelFromBlob`: Rename from `loadModelFromBase64EncodedBuffer`
- `run`: Use blob object to replace input.data & results[].data
In [this
context](https://github.com/microsoft/onnxruntime/issues/16031#issuecomment-1556527812),
this change saves a lot of time and avoids blocking the JS thread while
decoding the return value: for that case, the time drops from ~3700ms to
5~20ms (the resolve function only takes fractions of a millisecond).
### Motivation and Context
<!-- - Why is this change required? What problem does it solve?
- If it fixes an open issue, please link to the issue here. -->
It’s related to #16031, but it is not a full migration to JSI.
It just uses JSI through BlobManager to replace the slow part (base64
encode/decode).
Rewriting everything in JSI could be complicated by issues like type
conversion and threading, so this PR might be considered a minor change.
/cc @skottmckay
### Description
<!-- Describe your changes. -->
As title.
### Motivation and Context
<!-- - Why is this change required? What problem does it solve?
- If it fixes an open issue, please link to the issue here. -->
The Uint8 type might be required for some models used in sample
applications, and this matches the data types supported by
onnxruntime-react-native on Android.
Co-authored-by: rachguo <rachguo@rachguos-Mac-mini.local>
Co-authored-by: rachguo <rachguo@rachguos-Mini.attlocal.net>
* onnxruntime react native binding
* add react native backend
* fix lint comments
* fix react native backend for ios
* remove unnecessary files to check in
* move onnxruntime-common to devDependency
* create two podspec files for iphoneos and iphonesimulator
* revise README.md and add third party notices for react native
* rename a package
* rename a package and revise README
* add a license into package.json
* revise README and comments
* fix typo
* fix lint errors
* fix lint errors
* add a prepack script. touch index.tsx and App.tsx to resolve CI issue
* remove an unsupported tsx format from clang-format
* fix a typo and add steps to publish a react native npm package
* resolve comments
* fix clang format
* remove promise wrap. change prepack to typescript