pytorch/tools/autograd
templates/
__init__.py
deprecated.yaml
derivatives.yaml
gen_annotated_fn_args.py
gen_autograd.py
gen_autograd_functions.py
gen_python_functions.py
gen_trace_type.py
gen_variable_factories.py
gen_variable_type.py
load_derivatives.py
README.md

If you add a file to this directory, you MUST update torch/CMakeLists.txt and add the file as a dependency to the add_custom_command call.
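As an illustrative sketch only: the real add_custom_command in torch/CMakeLists.txt has different outputs, commands, and variable names, and the file name shown is hypothetical. The point is that any new file in this directory must appear in that call's DEPENDS list so the build re-runs code generation when the file changes:

```cmake
# Hypothetical sketch -- variable names, outputs, and paths are illustrative,
# not the actual contents of torch/CMakeLists.txt.
add_custom_command(
  OUTPUT ${GENERATED_SOURCES}
  COMMAND "${PYTHON_EXECUTABLE}" tools/setup_helpers/generate_code.py
  DEPENDS
    "${TORCH_ROOT}/tools/autograd/gen_autograd.py"
    "${TORCH_ROOT}/tools/autograd/derivatives.yaml"
    # A newly added file must be listed here as well, e.g.:
    "${TORCH_ROOT}/tools/autograd/my_new_helper.py"
  WORKING_DIRECTORY "${TORCH_ROOT}")
```

If the new file is omitted from DEPENDS, edits to it will not trigger regeneration, leading to stale generated code rather than a build failure.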