pytorch/functorch/_src
Ivan Yashchuk 2cfc4cb367 Add optional recomputable_ops argument for the min cut partitioner (#86686)
`min_cut_rematerialization_partition` has a default, hard-coded set of operations that are allowed to be recomputed in the backward pass.
This PR makes that set customizable: users can control which operations may be recomputed by passing `recomputable_ops` instead of relying on the default.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/86686
Approved by: https://github.com/Chillee
2022-10-14 12:15:30 +00:00
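The idea behind the change can be sketched without the full partitioner: an optional caller-supplied allow-list overrides a hard-coded default when deciding whether an op may be recomputed. This is an illustrative sketch only; the names (`DEFAULT_RECOMPUTABLE_OPS`, `is_recomputable`) are hypothetical and do not match the actual implementation in `partitioners.py`.

```python
# Hypothetical sketch of the recomputable_ops override pattern; not the
# actual functorch implementation.

# Stand-in for the hard-coded default set of recomputable operations.
DEFAULT_RECOMPUTABLE_OPS = {"add", "mul", "relu", "sigmoid"}

def is_recomputable(op_name, recomputable_ops=None):
    """Return True if op_name may be recomputed in the backward pass.

    If recomputable_ops is None, fall back to the default hard-coded set;
    otherwise the caller-supplied set takes full control of the policy.
    """
    ops = DEFAULT_RECOMPUTABLE_OPS if recomputable_ops is None else recomputable_ops
    return op_name in ops

# Default policy: cheap pointwise ops are recomputed, matmul is saved.
print(is_recomputable("relu"))                # True
print(is_recomputable("matmul"))              # False

# Custom policy: the caller's set fully replaces the default.
print(is_recomputable("matmul", {"matmul"}))  # True
print(is_recomputable("relu", {"matmul"}))    # False
```

In the real API the set would be passed through to the partition function (e.g. via `functools.partial`) when configuring AOTAutograd compilation.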
__init__.py
aot_autograd.py Unified symbolic shape variables between AOTAutograd and Inductor (#86659) 2022-10-14 00:24:43 +00:00
benchmark_utils.py
compile_utils.py
compilers.py removed compile cache and static argnums (#85783) 2022-09-28 08:33:59 +00:00
config.py Unified symbolic shape variables between AOTAutograd and Inductor (#86659) 2022-10-14 00:24:43 +00:00
eager_transforms.py Disallow saved tensor hooks in functorch transforms (#85972) 2022-09-30 20:03:58 +00:00
fx_minifier.py
make_functional.py
named_members_polyfill.py
partitioners.py Add optional recomputable_ops argument for the min cut partitioner (#86686) 2022-10-14 12:15:30 +00:00
python_key.py
pytree_hacks.py
top_operators_github_usage.py
vmap.py Change torch.autograd.graph.disable_saved_tensors_hooks to be public API (#85994) 2022-10-03 16:25:01 +00:00