Mirror of https://github.com/saymrwulf/pytorch.git, synced 2026-05-15 21:00:47 +00:00
The previous PR in this stack uncovered an error in the forward-over-backward for this function. In this PR, we fix that error and also fix the gradgrad implementation (making it more stable and faster by using `logsigmoid`). We also move the double backward for this function to `FunctionsManual`, as there is no reason for it to live in `native_functions`.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/80083
Approved by: https://github.com/zou3519
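The stability gain from `logsigmoid` comes from a standard identity: computing `log(sigmoid(x))` naively overflows or loses precision for large `|x|`, while the rewritten form `min(x, 0) - log1p(exp(-|x|))` never exponentiates a large positive argument. The sketch below illustrates the idea in plain Python (it is not PyTorch's actual kernel, just a minimal demonstration of the technique):

```python
import math

def log_sigmoid(x: float) -> float:
    # Stable evaluation of log(sigmoid(x)) via the identity
    #   log sigmoid(x) = min(x, 0) - log(1 + exp(-|x|))
    # exp(-|x|) is always <= 1, so it can never overflow.
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def log_sigmoid_naive(x: float) -> float:
    # Direct evaluation: exp(-x) overflows for large negative x.
    return math.log(1.0 / (1.0 + math.exp(-x)))

# The naive form fails for x = -800 (exp(800) overflows a double),
# while the stable form returns the correct value, -800.
print(log_sigmoid(-800.0))
```

A double-backward formula expressed in terms of `logsigmoid` inherits this stability, since intermediate quantities like `sigmoid(x)` can be recovered as `exp(logsigmoid(x))` without ever forming an overflowing exponential.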
Directory contents at this commit:

- custom_build
- lightweight_dispatch
- model_test
- nnc
- test_bytecode.py
- test_lite_script_module.py
- test_lite_script_type.py
- test_quantize_fx_lite_script_module.py
- test_upgrader_bytecode_table_example.cpp
- test_upgrader_codegen.py
- test_upgraders.py