onnxruntime/csharp
Yulong Wang 718068f020
update C# API to optimize inference latency (#3171)
* rename PinnedOnnxValue to FixedBufferOnnxValue and fix build break

* add more test cases

* add conditions on string tensors for pre-allocated outputs

* change to random inputs

* fix spelling

* resolve comments

* resolve comments

* remove FixedBufferOnnxValueTests.cs

* fix trivial typos in doc
2020-04-08 11:57:40 -07:00
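The commit above introduces `FixedBufferOnnxValue`, which lets callers pin input and output buffers once and reuse them across `InferenceSession.Run` calls to cut per-inference allocation overhead (note the commit's caveat that string tensors are not supported as pre-allocated outputs). A minimal sketch of that usage pattern follows; the model path `model.onnx`, the tensor names `data` and `softmax`, and the shapes are hypothetical placeholders, not part of this repository:

```csharp
// Sketch: reusing pinned buffers across Run() calls with FixedBufferOnnxValue.
// Assumes a model "model.onnx" with a float input "data" and output "softmax".
using System;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

float[] inputBuffer  = new float[1 * 3 * 224 * 224];
float[] outputBuffer = new float[1 * 1000];

var inputTensor  = new DenseTensor<float>(inputBuffer,  new[] { 1, 3, 224, 224 });
var outputTensor = new DenseTensor<float>(outputBuffer, new[] { 1, 1000 });

using var session     = new InferenceSession("model.onnx");
using var inputValue  = FixedBufferOnnxValue.CreateFromTensor(inputTensor);
using var outputValue = FixedBufferOnnxValue.CreateFromTensor(outputTensor);

for (int i = 0; i < 10; i++)
{
    // ... refill inputBuffer in place with the next sample ...

    // Run() reads from the pinned input buffer and writes results directly
    // into outputBuffer; no new output OnnxValue is allocated per call.
    session.Run(new[] { "data" },    new[] { inputValue },
                new[] { "softmax" }, new[] { outputValue });
}
```

The design choice is the usual pinned-buffer trade: the caller takes on buffer lifetime management (hence the `using` declarations) in exchange for avoiding tensor allocation and copying on every inference.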
Name | Last commit message | Last commit date
sample/Microsoft.ML.OnnxRuntime.InferenceSample | Renaming MKL-DNN as DNNL (#2515) | 2019-12-03 07:34:23 -08:00
src/Microsoft.ML.OnnxRuntime | update C# API to optimize inference latency (#3171) | 2020-04-08 11:57:40 -07:00
test | update C# API to optimize inference latency (#3171) | 2020-04-08 11:57:40 -07:00
testdata | Enforce shape validation. (#1716) | 2019-09-02 20:00:37 -07:00
tools/Microsoft.ML.OnnxRuntime.PerfTool | update default optimization level + fix gemm_activation fusion (#2791) | 2020-01-13 14:05:38 -08:00
Directory.Build.props.in | Conditionally export execution provider apis in csharp (#1724) | 2019-09-09 11:17:44 -07:00
Nuget.CSharp.config | Initial bootstrap commit. | 2018-11-19 16:48:22 -08:00
OnnxRuntime.CSharp.proj | Enable DML Nuget Package for x64 or x86 architectures (#3120) | 2020-03-02 20:18:46 -08:00
OnnxRuntime.CSharp.sln | Create a separate Nuget hosting just managed assemblies (#3020) | 2020-02-27 18:00:17 -08:00
OnnxRuntime.snk | Copy System.Numerics.Tensors sources from dotnet/corefx into onnxruntime (#1605) | 2019-08-15 17:28:47 -07:00