NNAPI Execution Provider
Android Neural Networks API (NNAPI) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported by onnxruntime via DNNLibrary.
Minimum requirements
The NNAPI EP requires Android devices running Android 8.1 (API level 27) or higher.
Build NNAPI EP
For build instructions, please see the BUILD page.
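As a rough sketch of what the build invocation can look like, the NNAPI EP is enabled by passing `--use_nnapi` to the Android cross-compilation build. The SDK/NDK paths, ABI, and API level below are placeholder assumptions, not values from this page; consult BUILD.md for the authoritative flags:

```shell
# Hypothetical paths -- substitute your own Android SDK/NDK locations.
./build.sh --android \
    --android_sdk_path /path/to/android/sdk \
    --android_ndk_path /path/to/android/ndk \
    --android_abi arm64-v8a \
    --android_api 27 \
    --use_nnapi
```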
Using NNAPI EP in C/C++
To use the NNAPI EP for inferencing, register it with the inference session before loading the model:

```cpp
InferenceSession session_object{so};  // `so` is a previously configured SessionOptions
session_object.RegisterExecutionProvider(
    std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```
See the C API documentation for details.
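For the C API, registration goes through the NNAPI provider factory rather than the C++ `RegisterExecutionProvider` call above. The following is a minimal sketch, assuming the `OrtSessionOptionsAppendExecutionProvider_Nnapi` factory function declared in `nnapi_provider_factory.h`; it only links against an ONNX Runtime build configured with `--use_nnapi`, and error handling is omitted for brevity:

```c
#include "onnxruntime_c_api.h"
#include "nnapi_provider_factory.h"

/* Sketch: create a session with the NNAPI EP appended via the C API.
 * `g_ort` is the OrtApi pointer obtained from OrtGetApiBase(). */
void create_nnapi_session(const OrtApi* g_ort, OrtEnv* env,
                          const char* model_path, OrtSession** session) {
  OrtSessionOptions* session_options;
  g_ort->CreateSessionOptions(&session_options);

  /* 0 = default NNAPI flags. */
  OrtSessionOptionsAppendExecutionProvider_Nnapi(session_options, 0);

  g_ort->CreateSession(env, model_path, session_options, session);
  g_ort->ReleaseSessionOptions(session_options);
}
```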