pytorch/docs/source/notes
Latest commit: 25fba4a019 by Felix Divo, 2022-02-11 20:34:13 +00:00
[DOC] Add link to "double backward" from "extending pytorch" page (#72584)

Summary: Linking to that (lesser-known?) feature is probably the most user-friendly option.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/72584
Reviewed By: soulitzer
Differential Revision: D34173999
Pulled By: albanD
fbshipit-source-id: 99fff7a55412faf54888f8317ab2388f4d7d30e4
(cherry picked from commit 2191ee76570b8c22a16aca75f947ece38e6ca3cf)
amp_examples.rst
autograd.rst [DOC] Add link to "double backward" from "extending pytorch" page (#72584) 2022-02-11 20:34:13 +00:00
broadcasting.rst
cpu_threading_runtimes.svg
cpu_threading_torchscript_inference.rst
cpu_threading_torchscript_inference.svg
cuda.rst Fixes jiterator cache macro include + updates CUDA note with cache variables (#71452) 2022-01-19 03:45:05 +00:00
ddp.rst [Docs][BE] DDP doc fix (#71363) 2022-01-18 22:24:51 +00:00
extending.rst [DOC] Add link to "double backward" from "extending pytorch" page (#72584) 2022-02-11 20:34:13 +00:00
faq.rst
gradcheck.rst
hip.rst
large_scale_deployments.rst
modules.rst
multiprocessing.rst
numerical_accuracy.rst
randomness.rst
serialization.rst
windows.rst Remove remaining THC code (#69039) 2021-12-08 12:18:08 -08:00