# NNAPI Execution Provider
[Android Neural Networks API (NNAPI)](https://developer.android.com/ndk/guides/neuralnetworks) is a unified interface to CPU, GPU, and NN accelerators on Android. ONNX Runtime supports it via [DNNLibrary](https://github.com/JDAI-CV/DNNLibrary).
## Minimum requirements
The NNAPI EP requires Android devices running Android 8.1 (API level 27) or higher.
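For an app that also runs on older devices, EP registration can be gated on the device API level at runtime. A minimal sketch using the NDK's `android_get_device_api_level()` helper (available in recent NDKs on all API levels):

```cpp
#include <android/api-level.h>

// Only register the NNAPI EP on devices that actually ship NNAPI.
bool CanUseNnapi() {
  return android_get_device_api_level() >= 27;  // Android 8.1 == API level 27
}
```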
## Build NNAPI EP
For build instructions, please see the [BUILD page](../../BUILD.md#Android-NNAPI).
## Using NNAPI EP in C/C++
To use the NNAPI EP for inferencing, register it with the `InferenceSession` as shown below.
```cpp
// SessionOptions `so`, logging configuration `lm_info`, and
// `model_file_name` are assumed to be defined by the surrounding code.
std::string log_id = "Foo";
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    static_cast<Severity>(lm_info.default_warning_level),
    false,
    LoggingManager::InstanceType::Default,
    &log_id);

// The environment takes ownership of the logging manager.
std::unique_ptr<Environment> env;
Environment::Create(std::move(logging_manager), env);

// Register the NNAPI execution provider before loading the model.
InferenceSession session_object{so, *env};
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```
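After `Load` succeeds, the session is initialized (which is where graph partitioning between the NNAPI EP and the CPU EP happens) and can then be run. A minimal sketch of that flow, assuming the internal `Run` overload that takes a `NameMLValMap`; `feeds` must be populated with input `OrtValue`s by the caller, and `"output"` is a hypothetical output name:

```cpp
// Initialize triggers graph partitioning; NNAPI-supported subgraphs are
// assigned to the NNAPI EP and the rest fall back to the CPU EP.
common::Status rc = session_object.Initialize();
if (!rc.IsOK()) {
  // handle the error
}

RunOptions run_options;
NameMLValMap feeds;                                // input name -> OrtValue, filled by the caller
std::vector<std::string> output_names{"output"};   // hypothetical output name
std::vector<OrtValue> fetches;
rc = session_object.Run(run_options, feeds, output_names, &fetches);
```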
The C API details are [here](../C_API.md#c-api).
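With the C API, the EP is appended to the session options before the session is created. A sketch of that pattern, assuming the `OrtSessionOptionsAppendExecutionProvider_Nnapi` factory function from the NNAPI provider factory header (check the header shipped with your build for the exact name and path); error handling is omitted for brevity:

```cpp
#include "onnxruntime_c_api.h"
#include "nnapi_provider_factory.h"  // assumed location of the NNAPI factory declaration

const OrtApi* g_ort = OrtGetApiBase()->GetApi(ORT_API_VERSION);

OrtEnv* ort_env = nullptr;
g_ort->CreateEnv(ORT_LOGGING_LEVEL_WARNING, "Foo", &ort_env);

OrtSessionOptions* session_options = nullptr;
g_ort->CreateSessionOptions(&session_options);

// Append the NNAPI EP; nodes NNAPI cannot run fall back to the default CPU EP.
OrtSessionOptionsAppendExecutionProvider_Nnapi(session_options);

OrtSession* session = nullptr;
g_ort->CreateSession(ort_env, "model.onnx", session_options, &session);
```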
## Performance
![NNAPI EP on RK3399](./images/nnapi-ep-rk3399.png)
![NNAPI EP on OnePlus 6T](./images/nnapi-ep-oneplus6t.png)
![NNAPI EP on Huawei Honor V10](./images/nnapi-ep-huaweihonorv10.png)