onnxruntime/js/web/lib
Guenther Schmuelling 4a5f13b681
fix resize for fp16 (#19110)
Resize for fp16 has two issues: scales are always f32, and roi can be f32
or f16.

scales:
This is fixed.

roi:
This is fixed for the case where roi is not passed as an optional input
with f16. Fixing the remaining case would require a much larger change,
and I did not want to risk that shortly before a release. For all
practical purposes, passing roi as an f16 input should be rare, and we
can fix it in the near future.
2024-01-12 13:44:28 -08:00
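The scales issue described above can be sketched as follows: in the ONNX Resize operator, the scales input is always tensor(float), even when the data tensor is float16, so a reader must interpret the scales bytes as f32 regardless of the data type. This is a minimal illustrative sketch, not onnxruntime's actual code; the names readScales and DataType are assumptions.

```typescript
// Hedged sketch of the scales fix: the `scales` input of Resize is always
// f32 bytes, even when the data tensor X is f16. Interpreting those bytes
// with the data tensor's element type would misparse them.
// `DataType` and `readScales` are illustrative names, not onnxruntime APIs.

type DataType = 'float16' | 'float32';

// Always view the raw scales buffer as Float32Array, ignoring the
// data tensor's type.
const readScales = (raw: ArrayBuffer, _dataType: DataType): number[] =>
  Array.from(new Float32Array(raw));

// Example: scales [1, 1, 2, 2] stored as f32 bytes while X is f16.
const scalesBytes = new Float32Array([1, 1, 2, 2]).buffer;
console.log(readScales(scalesBytes, 'float16')); // [1, 1, 2, 2]
```

By contrast, roi shares the data tensor's type (it may be f32 or f16), which is why reading it correctly for an f16 input needs the larger change deferred in this commit.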
onnxjs                     [js] optimize eslint config (#18460)                                  2023-11-20 12:00:56 -08:00
wasm                       fix resize for fp16 (#19110)                                          2024-01-12 13:44:28 -08:00
backend-onnxjs.ts          [js/web/training] runTrainStep implementation (#18006)                2023-11-02 08:32:50 -07:00
backend-wasm-inference.ts  Add "glue" between training WASM artifacts and training web (#17474)  2023-10-12 11:16:56 -07:00
backend-wasm-training.ts   [js/web/training] runTrainStep implementation (#18006)                2023-11-02 08:32:50 -07:00
backend-wasm.ts            [js/webgpu] Introduce trace support (#18928)                          2024-01-03 10:13:17 -08:00
build-def.d.ts             [WebNN] Enable npm unit tests (#18486)                                2024-01-09 10:10:57 -08:00
index.ts                   [WebNN] Enable npm unit tests (#18486)                                2024-01-09 10:10:57 -08:00
version.ts                 Bump Up Version to 1.17.0 (#17587)                                    2023-09-20 11:02:58 +08:00