Microsoft.ML.OnnxRuntime.QNN 1.23.2
Prefix Reserved
A newer version of this package is available; see the version list below for details.
.NET CLI
dotnet add package Microsoft.ML.OnnxRuntime.QNN --version 1.23.2

Package Manager
NuGet\Install-Package Microsoft.ML.OnnxRuntime.QNN -Version 1.23.2

PackageReference
<PackageReference Include="Microsoft.ML.OnnxRuntime.QNN" Version="1.23.2" />

Central Package Management (PackageVersion in Directory.Packages.props, PackageReference in the project file)
<PackageVersion Include="Microsoft.ML.OnnxRuntime.QNN" Version="1.23.2" />
<PackageReference Include="Microsoft.ML.OnnxRuntime.QNN" />

Paket CLI
paket add Microsoft.ML.OnnxRuntime.QNN --version 1.23.2

Script & Interactive
#r "nuget: Microsoft.ML.OnnxRuntime.QNN, 1.23.2"

File-based apps
#:package Microsoft.ML.OnnxRuntime.QNN@1.23.2

Cake
#addin nuget:?package=Microsoft.ML.OnnxRuntime.QNN&version=1.23.2
#tool nuget:?package=Microsoft.ML.OnnxRuntime.QNN&version=1.23.2
About

ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, XGBoost, etc. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms.
Learn more at https://onnxruntime.ai/.
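
The managed API is shared across the native packages: you create a SessionOptions, register the execution provider you want, and run an InferenceSession. Below is a minimal sketch of using this package's QNN Execution Provider with CPU fallback; the model path, input name, tensor shape, and the "backend_path" provider option value are illustrative assumptions, not taken from this page.

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class QnnSample
{
    static void Main()
    {
        using var options = new SessionOptions();

        // Register the QNN Execution Provider. The provider options here
        // (e.g. "backend_path" pointing at the HTP backend library) are an
        // assumption for illustration; check the QNN EP documentation for
        // the options valid on your device.
        options.AppendExecutionProvider("QNN", new Dictionary<string, string>
        {
            { "backend_path", "QnnHtp.dll" } // hypothetical backend choice
        });

        // Nodes the QNN EP cannot handle fall back to the bundled CPU EP.
        using var session = new InferenceSession("model.onnx", options); // hypothetical model

        // Hypothetical single float input named "input" with shape [1, 3, 224, 224].
        var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor("input", input)
        };

        using var results = session.Run(inputs);
        foreach (var result in results)
        {
            System.Console.WriteLine($"{result.Name}: {result.AsTensor<float>().Length} values");
        }
    }
}
```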
NuGet Packages
ONNX Runtime Native packages
Microsoft.ML.OnnxRuntime
- Native libraries for all supported platforms
- CPU Execution Provider
- CoreML Execution Provider on macOS/iOS
- XNNPACK Execution Provider on Android/iOS
Microsoft.ML.OnnxRuntime.Gpu
- Windows and Linux
- TensorRT Execution Provider
- CUDA Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.DirectML
- Windows
- DirectML Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.QNN
- 64-bit Windows
- QNN Execution Provider
- CPU Execution Provider
Intel.ML.OnnxRuntime.OpenVino
- 64-bit Windows
- OpenVINO Execution Provider
- CPU Execution Provider
Other packages
Microsoft.ML.OnnxRuntime.Managed
- C# language bindings
Microsoft.ML.OnnxRuntime.Extensions
- Custom operators for pre/post processing on all supported platforms.
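
If a model uses these pre/post-processing custom operators, they are registered on the same SessionOptions before the session is created. A minimal sketch, assuming the Extensions package's RegisterOrtExtensions() helper (verify the exact method name and namespace against the Microsoft.ML.OnnxRuntime.Extensions documentation):

```csharp
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Extensions;

// Register the ONNX Runtime Extensions custom operators (tokenizers,
// image decode/encode, etc.) so models that contain them can load.
// RegisterOrtExtensions() is assumed to be the extension method exposed
// by the Microsoft.ML.OnnxRuntime.Extensions package; confirm against
// the package documentation.
using var options = new SessionOptions();
options.RegisterOrtExtensions();

using var session = new InferenceSession("model_with_custom_ops.onnx", options); // hypothetical model
```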
| Product | Compatible and additional computed target framework versions |
|---|---|
| native | native is compatible. |
Dependencies

.NETCoreApp 0.0
- Microsoft.ML.OnnxRuntime.Managed (>= 1.23.2)

.NETFramework 0.0
- Microsoft.ML.OnnxRuntime.Managed (>= 1.23.2)

.NETStandard 0.0
- Microsoft.ML.OnnxRuntime.Managed (>= 1.23.2)
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last Updated |
|---|---|---|
| 1.24.2-rc.1 | 0 | 2/13/2026 |
| 1.24.1 | 0 | 2/4/2026 |
| 1.24.0 | 0 | 1/29/2026 |
| 1.23.2 | 0 | 10/22/2025 |
| 1.23.1 | 0 | 10/3/2025 |
| 1.23.0 | 0 | 9/22/2025 |
| 1.23.0-dev-20250723-0512-3b... | 0 | 7/23/2025 |
| 1.23.0-dev-20250722-0517-af... | 0 | 7/22/2025 |
| 1.23.0-dev-20250721-0809-19... | 0 | 7/21/2025 |
| 1.23.0-dev-20250719-0510-16... | 0 | 7/19/2025 |
| 1.23.0-dev-20250718-0526-e3... | 0 | 7/18/2025 |
| 1.23.0-dev-20250716-0530-09... | 0 | 7/16/2025 |
| 1.23.0-dev-20250715-0528-9d... | 0 | 7/15/2025 |
| 1.23.0-dev-20250714-0526-2b... | 0 | 7/14/2025 |
| 1.23.0-dev-20250714-0317-2b... | 0 | 7/14/2025 |
| 1.23.0-dev-20250712-0523-aa... | 0 | 7/12/2025 |
| 1.23.0-dev-20250711-0525-56... | 0 | 7/11/2025 |
| 1.23.0-dev-20250710-0524-d2... | 0 | 7/10/2025 |
| 1.23.0-dev-20250708-0526-0c... | 0 | 7/8/2025 |
| 1.23.0-dev-20250707-0526-a3... | 0 | 7/7/2025 |
Release Def:
Branch: refs/heads/rel-1.23.2
Commit: a83fc4d58cb48eb68890dd689f94f28288cf2278
Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=974987