onnxruntime/docs/execution_providers/NNAPI-ExecutionProvider.md
2019-12-09 14:37:03 -08:00


# NNAPI Execution Provider

Android Neural Networks API (NNAPI) is a unified interface to CPU, GPU, and NN accelerators on Android. It is supported by onnxruntime via DNNLibrary.

## Minimum requirements

The NNAPI EP requires Android devices running Android 8.1 (API level 27) or higher.

## Build NNAPI EP

### Prerequisites

To build onnxruntime with the NNAPI EP, first install the Android NDK (see BUILD.md).

### Build Instructions

The basic commands are as follows. There are additional parameters for building the Android version; see BUILD.md for details.

#### Cross compiling on Windows

```
./build.bat --android --android_ndk_path <android ndk path> --dnnlibrary
```

#### Cross compiling on Linux

```
./build.sh --android --android_ndk_path <android ndk path> --dnnlibrary
```
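Beyond the minimal invocation above, the build script accepts further Android-related options. The flag names below (`--android_abi`, `--android_api`, `--config`) are taken from the general build script and may vary by version, so treat this as an illustrative sketch and check BUILD.md for the authoritative list:

```shell
# Illustrative sketch: a Release cross-compile for 64-bit ARM
# targeting API level 27 (the NNAPI EP's minimum).
# Flag names are assumptions based on the general build script.
./build.sh --android \
    --android_ndk_path <android ndk path> \
    --android_abi arm64-v8a \
    --android_api 27 \
    --config Release \
    --dnnlibrary
```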

## Using NNAPI EP in C/C++

To use the NNAPI EP for inference, register it with the inference session as shown below.

```cpp
InferenceSession session_object{so};  // `so` is a previously configured SessionOptions
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```

The C API details are here.
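For the C API, registration would plausibly go through a provider factory function, following the pattern other execution providers use. The header name `nnapi_provider_factory.h` and the signature of `OrtSessionOptionsAppendExecutionProvider_Nnapi` are assumptions and may differ in your version of onnxruntime; consult the C API documentation:

```c
/* Hedged sketch: registering the NNAPI EP through the C API.
 * The header path and factory function signature follow the pattern
 * used by other execution providers and are assumptions here. */
#include "onnxruntime_c_api.h"
#include "nnapi_provider_factory.h"

OrtStatus* register_nnapi(OrtSessionOptions* session_options) {
  /* Appends the NNAPI EP so it is preferred over the default CPU EP. */
  return OrtSessionOptionsAppendExecutionProvider_Nnapi(session_options);
}
```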

## Performance

Performance charts (figures) are available for:

- NNAPI EP on RK3399
- NNAPI EP on OnePlus 6T
- NNAPI EP on Huawei Honor V10