onnxruntime/docs/execution_providers/NNAPI-ExecutionProvider.md

# NNAPI Execution Provider

The Android Neural Networks API (NNAPI) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported in onnxruntime via DNNLibrary.

## Minimum requirements

The NNAPI EP requires Android devices running Android 8.1 (API level 27) or higher.

## Build NNAPI EP

For build instructions, please see the BUILD page.

## Using NNAPI EP in C/C++

To use the NNAPI EP for inference, register it with the inference session before loading the model:

```c++
InferenceSession session_object{so};
// Register the NNAPI execution provider; registration must happen
// before the model is loaded.
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```

The C API details are here.

## Performance

Benchmark charts are provided for the NNAPI EP on the following devices:

- RK3399
- OnePlus 6T
- Huawei Honor V10