# NNAPI Execution Provider

Android Neural Networks API (NNAPI) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported by onnxruntime via DNNLibrary.

## Minimum requirements

The NNAPI EP requires an Android device running Android 8.1 (API level 27) or higher.
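
If the same application binary may also run on older devices, the device API level can be checked at runtime before the NNAPI EP is registered. The sketch below is only an illustration, not part of onnxruntime: it reads the API level via the NDK's `__system_property_get` and returns whether NNAPI is available so the caller can decide whether to register the EP.

```c++
#include <sys/system_properties.h>  // __system_property_get (Android NDK)

#include <cstdlib>

// Returns true if the device runs Android 8.1 (API level 27) or newer,
// i.e. if the NNAPI EP can be used at all.
static bool DeviceSupportsNnapi() {
  char sdk_str[PROP_VALUE_MAX] = {};
  if (__system_property_get("ro.build.version.sdk", sdk_str) <= 0) {
    return false;  // property not readable: assume no NNAPI support
  }
  return std::atoi(sdk_str) >= 27;
}
```

An application would typically call such a check before registering the provider and simply skip NNAPI registration on unsupported devices.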

## Build NNAPI EP

For build instructions, please see the BUILD page.

## Using NNAPI EP in C/C++

To use the NNAPI EP for inferencing, register it with the `InferenceSession` as shown below.

```c++
// Set up logging and create the environment.
std::string log_id = "Foo";
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    Severity::kWARNING,
    false,
    LoggingManager::InstanceType::Default,
    &log_id);
std::unique_ptr<Environment> env;
Environment::Create(std::move(logging_manager), env);

// Create the session, register the NNAPI execution provider, then load the model.
SessionOptions so;
InferenceSession session_object{so, *env};
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
auto status = session_object.Load(model_file_name);  // path to the .onnx model
```

The C API details are here.
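
For reference, registration through the C API follows the same pattern as for other execution providers. The sketch below is illustrative only: the header name `nnapi_provider_factory.h` and the factory function `OrtSessionOptionsAppendExecutionProvider_Nnapi` are assumptions, and the exact names and signature should be taken from the headers shipped with your build. Error checking of the returned `OrtStatus*` values is omitted for brevity.

```c
#include <onnxruntime_c_api.h>
#include <nnapi_provider_factory.h>  /* assumed header for the NNAPI factory function */

int main(void) {
  const OrtApi* ort = OrtGetApiBase()->GetApi(ORT_API_VERSION);

  OrtEnv* env = NULL;
  ort->CreateEnv(ORT_LOGGING_LEVEL_WARNING, "Foo", &env);

  OrtSessionOptions* session_options = NULL;
  ort->CreateSessionOptions(&session_options);

  /* Append the NNAPI EP before creating the session (assumed factory function). */
  OrtSessionOptionsAppendExecutionProvider_Nnapi(session_options);

  OrtSession* session = NULL;
  ort->CreateSession(env, "model.onnx", session_options, &session);

  /* ... run inference ... */

  ort->ReleaseSession(session);
  ort->ReleaseSessionOptions(session_options);
  ort->ReleaseEnv(env);
  return 0;
}
```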

## Performance

*(Performance charts: NNAPI EP on RK3399, NNAPI EP on OnePlus 6T, and NNAPI EP on Huawei Honor V10.)*