NNAPI Execution Provider
Android Neural Networks API (NNAPI) is a unified interface to CPU, GPU, and NN accelerators on Android.
Minimum requirements
The NNAPI EP requires Android devices running Android 8.1 or higher. Android 9 or higher is recommended for optimal performance.
Build NNAPI EP
For build instructions, please see the BUILD page.
Using NNAPI EP in C/C++
To use the NNAPI EP for inference, register it as shown below.
std::string log_id = "Foo";
// lm_info, so, env, status, and model_file_name are assumed to be
// declared elsewhere (logging configuration, SessionOptions,
// Environment, a Status variable, and the model path, respectively).
auto logging_manager = std::make_unique<LoggingManager>(
    std::unique_ptr<ISink>{new CLogSink{}},
    static_cast<Severity>(lm_info.default_warning_level),
    false,
    LoggingManager::InstanceType::Default,
    &log_id);
Environment::Create(std::move(logging_manager), env);
InferenceSession session_object{so, env};
session_object.RegisterExecutionProvider(
    std::make_unique<::onnxruntime::NnapiExecutionProvider>());
status = session_object.Load(model_file_name);
The C API details are here.