### Description

This PR updates the win-ort-main branch to the tip of the main branch as of 2025-01-23.

### PR List

- ddf0d377a7 [QNN EP] Add LoggingManager::HasDefaultLogger() to provider bridge API (#23467)
- 05fbbdf91f [QNN EP] Make QNN EP a shared library (#23120)
- 1336566d7f Add custom vcpkg ports (#23456)
- 2e1173c411 Update the compile flags for vcpkg packages (#23455)
- 1f628a9858 [Mobile] Add BrowserStack Android MAUI Test (#23383)
- 009cae0ec8 [js/webgpu] Optimize ConvTranspose (Continue) (#23429)
- 04a4a694cb Use onnx_protobuf.h to suppress some GCC warnings (#23453)
- 2e3b62b4b0 Suppress some strict-aliasing related warnings in WebGPU EP (#23454)
- b708f9b1dc Bump ruff from 0.9.1 to 0.9.2 (#23427)
- c0afc66b2a [WebNN] Remove workarounds for TFLite backend (#23406)
- 8a821ff7f9 Bump vite from 6.0.7 to 6.0.11 in /js/web/test/e2e/exports/testcases/vite-default (#23446)
- 220c1a203e Make ORT and Dawn use the same protobuf/abseil source code (#23447)
- b7b5792147 Change MacOS-13 to ubuntu on for android-java-api-aar-test.yml. (#23444)
- 19d0d2a30f WIP: Dp4MatMulNBits accuracy level 4 matmul for WebGPU EP (#23365)
- 95b8effbc4 [QNN EP]: Clean up QNN logging resources if an error occurs during initialization (#23435)
- 626134c5b5 Bump clang-format from 19.1.6 to 19.1.7 (#23428)
- 0cf975301f Fix eigen external deps (#23439)
- f9440aedce Moving RN_CI Android Testing to Linux (#23422)
- 1aa5902ff4 [QNN EP] workaround for QNN validation bug for Tanh with uint16 quantized output (#23432)
- 7f5582a0e2 Seperate RN andriod and IOS into 2 separated Stages. (#23400)
- 73deac2e7f Implement some missing element wise Add/Sub/Mul/Div/Neg operations for CPU and CUDA EPs (#23090)
- 949fe42af4 Upgrade Java version from react-native/android to Java 17 (#23066)
- 0892c23463 Update Qnn SDK default version to 2.30 (#23411)
- 94c099bcec Fix type cast build error (#23423)
- d633e571d1 [WebNN EP] Fix AddInitializersToSkip issues (#23354)
- e988ef00e2 [QNN EP] Fix regression for MatMul with two quantized/dynamic uint16 inputs (#23419)
- 7538795f6b Update onnxruntime binary size checks ci pipeline's docker image (#23405)
- 6c5ea41cad Revert "[QNN EP] Clean up correctly from a partial setup (#23320)" (#23420)
- e866804bbe Enable comprehension simplification in ruff rules (#23414)
- 0a5f1f392c bugfix: string_view of invalid memory (#23417)
- 4cc38e0277 fix crash when first input of BatchNormalization is 1-D (#23387)
- 033441487f Target py310 and modernize codebase with ruff (#23401)
- 87341ac010 [QNN EP] Fix segfault when unregistering HTP shared memory handles (#23402)

### Motivation and Context

This update includes the change to make QNN EP a shared library.
---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Adrian Lizarraga <adlizarraga@microsoft.com>
Co-authored-by: Justin Chu <justinchuby@users.noreply.github.com>
Co-authored-by: Yulong Wang <7679871+fs-eire@users.noreply.github.com>
Co-authored-by: Edward Chen <18449977+edgchen1@users.noreply.github.com>
Co-authored-by: Changming Sun <chasun@microsoft.com>
Co-authored-by: Peishen Yan <peishen.yan@intel.com>
Co-authored-by: Tianlei Wu <tlwu@microsoft.com>
Co-authored-by: Hector Li <hecli@microsoft.com>
Co-authored-by: Jian Chen <cjian@microsoft.com>
Co-authored-by: Alexis Tsogias <1114095+Zyrin@users.noreply.github.com>
Co-authored-by: junchao-zhao <68935141+junchao-loongson@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: sushraja-msft <44513542+sushraja-msft@users.noreply.github.com>
Co-authored-by: Wanming Lin <wanming.lin@intel.com>
Co-authored-by: Jiajia Qin <jiajiaqin@microsoft.com>
Co-authored-by: Caroline Zhu <wolfivyaura@gmail.com>
# ORT C# Managed Library
The solution files here are used to produce nuget packages for the C# bindings.
Note that the project naming is currently confusing and needs updating.
- The Microsoft.ML.OnnxRuntime project produces the Microsoft.ML.OnnxRuntime.Managed nuget package.
- The Microsoft.ML.OnnxRuntime nuget package contains the native (i.e. C++) code for various platforms.
## Solution files
The main solution file is OnnxRuntime.CSharp.sln, which includes the desktop and Xamarin mobile projects. OnnxRuntime.DesktopOnly.CSharp.sln is a copy of it with all the mobile projects removed, because there is no way to selectively exclude a csproj from an sln when Xamarin isn't available.
If changes are required, either update the main solution first and copy the relevant changes across, or copy the entire file and remove the mobile projects (anything with iOS, Android, or Droid in the name).
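One quick way to see which entries would need removing is to grep the solution for the mobile project names. This is a sketch; the sln lines below are illustrative stand-ins written to a sample file so the command can be demonstrated anywhere, but on a real checkout you would grep `OnnxRuntime.CSharp.sln` directly.

```shell
# Create a stand-in solution file with one desktop and one mobile project entry.
cat > sample.sln <<'EOF'
Project("{GUID}") = "Microsoft.ML.OnnxRuntime", "src\Microsoft.ML.OnnxRuntime\Microsoft.ML.OnnxRuntime.csproj"
Project("{GUID}") = "Microsoft.ML.OnnxRuntime.Tests.Droid", "test\Microsoft.ML.OnnxRuntime.Tests.Droid\Microsoft.ML.OnnxRuntime.Tests.Droid.csproj"
EOF
# Entries you'd delete when producing the desktop-only solution:
grep -iE 'ios|android|droid' sample.sln
```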
## Development setup

### Requirements

#### Windows
NOTE: This solution is primarily used by ORT developers to create the managed Microsoft.ML.OnnxRuntime.Managed nuget package, so the requirements are quite specific.
Visual Studio 2022 v17.2.4 or later, with Xamarin workloads
- v17.2.4 installs dotnet sdk 6.0.301
- in theory you could use an earlier VS version and download dotnet SDK 6.0.300+ from https://dotnet.microsoft.com/en-us/download/dotnet/6.0, but this is untested
There's no good way to use Visual Studio 2022 17.3 Preview in a CI, so we currently have to build the pre-.net6 targets using VS and the .net6 targets using dotnet. We can't build them all with dotnet because the Xamarin targets require msbuild, and we can't package with dotnet because packaging also requires msbuild.
Once the official VS 2022 release supports .net6 and is available in the CI, we can revert to the original, simpler setup of building everything with msbuild.
To test packaging locally you will also need nuget.exe. Download it from https://www.nuget.org/downloads, put it in a folder (e.g. C:\Program Files (x86)\NuGet), and add that folder to your PATH.
#### Linux

- Install the .NET SDK.
- Install Mono:

  ```
  wget http://download.mono-project.com/repo/xamarin.gpg && sudo apt-key add xamarin.gpg && rm xamarin.gpg
  echo "deb https://download.mono-project.com/repo/ubuntu stable-bionic main" | sudo tee /etc/apt/sources.list.d/mono-official-stable.list
  sudo apt update -y && sudo apt install -y mono-devel
  ```

- Install nuget.exe and create a `nuget` wrapper script:

  ```
  wget https://dist.nuget.org/win-x86-commandline/latest/nuget.exe && sudo mv nuget.exe /usr/local/bin/nuget.exe
  echo 'mono /usr/local/bin/nuget.exe $@' | sudo tee /usr/local/bin/nuget
  sudo chmod a+x /usr/local/bin/nuget
  ```
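The `nuget` wrapper is just a one-line shim that forwards its arguments to `mono nuget.exe`. The forwarding behaviour can be sanity-checked without mono installed; in this sketch `echo` plays the role of the wrapped executable, and the shim is written to the current directory purely for illustration.

```shell
# Create a shim that forwards all of its arguments to a wrapped command
# (echo here, standing in for 'mono /usr/local/bin/nuget.exe').
printf '#!/bin/sh\nexec echo "$@"\n' > ./nuget-shim
chmod +x ./nuget-shim
./nuget-shim restore MySolution.sln   # prints: restore MySolution.sln
```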
## Magic incantations to build the nuget managed package locally

### Windows
If you're starting with VS 2022 17.2.4 you should have dotnet sdk 6.0.301.

The `IncludeMobileTargets` property determines whether the MAUI targets are included. It is set to `true` for the real package build.

Make sure all the required workloads are installed if the MAUI targets are included:

```
dotnet workload install maui android ios maccatalyst
```
Restore and build the managed package:

```
msbuild -t:restore .\src\Microsoft.ML.OnnxRuntime\Microsoft.ML.OnnxRuntime.csproj -p:IncludeMobileTargets=true
msbuild -t:build .\src\Microsoft.ML.OnnxRuntime\Microsoft.ML.OnnxRuntime.csproj -p:IncludeMobileTargets=true
```

msbuild needs to run twice: once to restore, which creates required configs such as Microsoft.ML.OnnxRuntime\obj\project.assets.json, and once to build using those configs.
Create the nuget package:

```
msbuild .\OnnxRuntime.CSharp.proj -t:CreatePackage -p:OrtPackageId=Microsoft.ML.OnnxRuntime -p:Configuration=Debug -p:Platform="Any CPU"
```
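A .nupkg is just a zip archive, so you can sanity-check what the CreatePackage step produced by listing its contents and confirming the managed assembly landed under `lib/<tfm>/`. This sketch builds a tiny stand-in package with Python's zipfile module so it runs anywhere; on a real package you would point the listing (or `unzip -l`) at the file in your build output instead.

```shell
# Build a stand-in .nupkg containing one managed assembly entry.
python3 - <<'EOF'
import zipfile
with zipfile.ZipFile('demo.nupkg', 'w') as z:
    z.writestr('lib/netstandard2.0/Microsoft.ML.OnnxRuntime.dll', b'')
EOF
# List its contents, as you would for the real package.
python3 -m zipfile -l demo.nupkg
```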
### Linux

For example, to build a CUDA GPU package, run:

```
./build.sh \
  --config="Release" \
  --cmake_generator Ninja \
  --use_cuda \
  --cuda_home=/usr/local/cuda \
  --cudnn_home=/usr \
  --build_nuget \
  --msbuild_extra_options \
    /p:SelectedTargets=Net6 \
    /p:Net6Targets=net6.0 \
    /p:TargetFrameworks=netstandard2.0 \
    /p:IsLinuxBuild=true
```
Note: to build a pure CPU development package, you need to add `/p:OrtPackageId="Microsoft.ML.OnnxRuntime"` to `--msbuild_extra_options`. Otherwise the build will try to create Xamarin mobile targets, which may not be properly configured on your devbox.
A .nupkg file will be produced at your build root, e.g. build/Release.
To consume the package in your .net project:

```
nuget add <path/to/packages.nuget> -Source ./packages/
dotnet add package microsoft.ml.onnxruntime.managed -s ./packages --prerelease
dotnet add package microsoft.ml.onnxruntime.gpu -s ./packages --prerelease
```
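If you'd rather not pass `-s ./packages` on every command, a local folder-based package source can be declared once in a NuGet.config next to the project. This is a minimal sketch; `local-packages` and `./packages` are illustrative names, not values the build requires.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- illustrative: a folder-based feed holding the locally built .nupkg files -->
    <add key="local-packages" value="./packages" />
  </packageSources>
</configuration>
```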