mirror of
https://github.com/saymrwulf/pytorch.git
synced 2026-05-14 20:57:59 +00:00
Fix typo in Reproducibility docs (#141341)
Fixes trivial issue in the docs.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/141341
Approved by: https://github.com/svekars
This commit is contained in:
parent
42ab61241e
commit
2bbd984aa2
1 changed file with 2 additions and 2 deletions
@@ -153,8 +153,8 @@ because the output will be nondeterministic. But there is nothing to actually
 prevent such invalid code from being run. So for safety,
 :attr:`torch.utils.deterministic.fill_uninitialized_memory` is set to ``True``
 by default, which will fill the uninitialized memory with a known value if
-:code:`torch.use_deterministic_algorithms(True)` is set. This will to prevent
-the possibility of this kind of nondeterministic behavior.
+:code:`torch.use_deterministic_algorithms(True)` is set. This will prevent the
+possibility of this kind of nondeterministic behavior.
 
 However, filling uninitialized memory is detrimental to performance. So if your
 program is valid and does not use uninitialized memory as the input to an