# NNAPI Execution Provider
[Android Neural Networks API (NNAPI)](https://developer.android.com/ndk/guides/neuralnetworks) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported by onnxruntime via [DNNLibrary](https://github.com/JDAI-CV/DNNLibrary).
## Minimum requirements
The NNAPI EP requires Android devices with Android 8.1 or higher.
## Build NNAPI EP
For build instructions, please see the [BUILD page](../../BUILD.md#Android-NNAPI).
## Using NNAPI EP in C/C++
To use the NNAPI EP for inferencing, register it as shown below.
```cpp
// Create a logging manager and environment for the session.
std::string log_id = "Foo";
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    static_cast<Severity>(lm_info.default_warning_level),
    false,
    LoggingManager::InstanceType::Default,
    &log_id);
Environment::Create(std::move(logging_manager), env);

// Create the session and register the NNAPI execution provider
// before loading the model.
InferenceSession session_object{so, env};
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```
The C API details are [here](../C_API.md#c-api).
## Performance