# NNAPI Execution Provider

[Android Neural Networks API (NNAPI)](https://developer.android.com/ndk/guides/neuralnetworks) is a unified interface to CPU, GPU, and NN accelerators on Android.

## Minimum requirements

The NNAPI EP requires Android devices with Android 8.1 or higher. Android 9 or higher is recommended for optimal performance.

## Build NNAPI EP

For build instructions, please see the [BUILD page](../../BUILD.md#Android-NNAPI-Execution-Provider).

## Using NNAPI EP in C/C++

To use the NNAPI EP for inferencing, register it as shown below.

```
std::string log_id = "Foo";
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    static_cast<Severity>(lm_info.default_warning_level),
    false,
    LoggingManager::InstanceType::Default,
    &log_id);
Environment::Create(std::move(logging_manager), env);
InferenceSession session_object{so, env};
session_object.RegisterExecutionProvider(std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
```

The C API details are [here](../C_API.md#c-api).
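
As an alternative to the internal C++ API shown above, the NNAPI EP can also be registered through the public C/C++ API via `OrtSessionOptionsAppendExecutionProvider_Nnapi` (declared in `nnapi_provider_factory.h`). The sketch below is illustrative, not authoritative: the model path is a placeholder, and the `uint32_t nnapi_flags` argument is only present in more recent ONNX Runtime releases (earlier versions take the session options alone). It assumes an Android build of ONNX Runtime with the NNAPI EP enabled.

```c++
#include <onnxruntime_cxx_api.h>
#include <nnapi_provider_factory.h>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "Foo");
  Ort::SessionOptions so;

  // Append the NNAPI EP to the session options before creating the session.
  // nnapi_flags = 0 selects default behavior; the flag set is version dependent.
  uint32_t nnapi_flags = 0;
  Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_Nnapi(so, nnapi_flags));

  // "model.onnx" is a placeholder path for illustration.
  Ort::Session session(env, "model.onnx", so);
  return 0;
}
```

Nodes that NNAPI cannot handle fall back to other registered providers (ultimately the CPU EP), so registration order matters: append NNAPI before any lower-priority providers.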