# NNAPI Execution Provider
[Android Neural Networks API (NNAPI)](https://developer.android.com/ndk/guides/neuralnetworks) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported by onnxruntime via [DNNLibrary](https://github.com/JDAI-CV/DNNLibrary).
## Minimum requirements
The NNAPI EP requires devices running Android 8.1 (API level 27) or higher.
## Build NNAPI EP
For build instructions, please see the [BUILD page](../../BUILD.md#Android-NNAPI).
## Using NNAPI EP in C/C++
To use the NNAPI EP for inferencing, register it with the inference session as shown below.
```cpp
string log_id = "Foo";
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    static_cast<Severity>(lm_info.default_warning_level),
    false,
    LoggingManager::InstanceType::Default,
    &log_id);
Environment::Create(std::move(logging_manager), env);
InferenceSession session_object{so, env};
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```
The C API details are [here](../C_API.md#c-api).
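As a rough sketch of how registration looks through the C API: the factory function `OrtSessionOptionsAppendExecutionProvider_Nnapi` is declared in `nnapi_provider_factory.h` in NNAPI-enabled Android builds, though its exact signature may vary between ONNX Runtime versions, and the `create_nnapi_session` helper below is purely illustrative. Error handling is omitted for brevity.

```c
// Illustrative sketch only: assumes an Android build of ONNX Runtime
// with the NNAPI EP enabled; verify the factory function signature
// against the headers shipped with your version.
#include "onnxruntime_c_api.h"
#include "nnapi_provider_factory.h"
#include <stddef.h>

void create_nnapi_session(const char* model_path, OrtSession** session) {
  const OrtApi* ort = OrtGetApiBase()->GetApi(ORT_API_VERSION);

  OrtEnv* env = NULL;
  ort->CreateEnv(ORT_LOGGING_LEVEL_WARNING, "Foo", &env);

  OrtSessionOptions* opts = NULL;
  ort->CreateSessionOptions(&opts);

  // Register the NNAPI EP before creating the session so that
  // supported subgraphs are assigned to it; the rest falls back to CPU.
  OrtSessionOptionsAppendExecutionProvider_Nnapi(opts);

  ort->CreateSession(env, model_path, opts, session);
}
```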
## Performance