Mirror of https://github.com/saymrwulf/pytorch.git, synced 2026-05-15 21:00:47 +00:00
Fixes https://github.com/pytorch/pytorch/issues/96887

We now error out in BOTH cases: when the graph is created and when it is not. This is still BC-breaking, but less severe, because the change is limited to functions that use `setup_context`. As a result, the `setup_context` and non-`setup_context` versions diverge in behavior:

- With the non-`setup_context` version, saved variables are assumed to have the `grad_fn` of the inputs.
- With the `setup_context` version, we now raise an error in this case.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/97212
Approved by: https://github.com/zou3519
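For context, a minimal sketch of the two `torch.autograd.Function` styles the commit message contrasts. `SquareLegacy` and `SquareSetupCtx` are hypothetical names for illustration; in the `setup_context` style, `forward()` does no saving and a separate `setup_context(ctx, inputs, output)` staticmethod saves tensors for backward. The error behavior this PR adds (rejecting saved variables whose `grad_fn` situation is ambiguous) is not reproduced here; this only shows the API split.

```python
import torch

# Legacy style: forward() both computes the output and saves tensors on ctx.
class SquareLegacy(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return 2 * x * grad_out

# setup_context style: forward() is save-free; setup_context() receives the
# inputs and output and performs the saving. It is this path on which the PR
# makes saving behavior stricter.
class SquareSetupCtx(torch.autograd.Function):
    @staticmethod
    def forward(x):
        return x * x

    @staticmethod
    def setup_context(ctx, inputs, output):
        (x,) = inputs
        ctx.save_for_backward(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return 2 * x * grad_out

x = torch.tensor(3.0, requires_grad=True)
y = SquareSetupCtx.apply(x)
y.backward()
print(x.grad.item())  # 6.0
```

Both classes compute the same gradient; the difference is only where the saving happens, which is what lets the two styles diverge in how saved variables relate to the inputs' `grad_fn`.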