pytorch/torch/utils/data
yl-to 1b55e2b043 add prefetch_factor for multiprocessing prefetching process (#41130)
Summary:
Fixes https://github.com/pytorch/pytorch/issues/40604
Adds a parameter to DataLoader to configure the per-worker prefetch count.
Before this change, the prefetching process always prefetched 2 * num_workers data items; this commit makes that configurable, e.g. you can specify prefetching 10 * num_workers data items.
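A minimal usage sketch (not part of the original commit message), assuming the new knob is exposed as the keyword-only prefetch_factor argument of torch.utils.data.DataLoader, as the commit title suggests; the dataset and numbers below are illustrative only:

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    # A toy map-style dataset; any Dataset works the same way.
    dataset = TensorDataset(torch.arange(1000).float())

    if __name__ == "__main__":
        # prefetch_factor controls how many items each worker loads ahead of time.
        # With num_workers=4 and prefetch_factor=10, up to 10 * 4 = 40 items are
        # prefetched across all workers (previously hard-coded to 2 * num_workers).
        loader = DataLoader(
            dataset,
            batch_size=32,
            num_workers=4,
            prefetch_factor=10,  # only meaningful when num_workers > 0
        )

        for batch in loader:
            pass  # training / evaluation step goes here

The __main__ guard is only there because multiprocessing workers require it on spawn-based platforms; the prefetch_factor argument itself is the only part tied to this change.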

Pull Request resolved: https://github.com/pytorch/pytorch/pull/41130

Reviewed By: izdeby

Differential Revision: D22705288

Pulled By: albanD

fbshipit-source-id: 2c483fce409735fef1351eb5aa0b033f8e596561
2020-07-24 08:38:13 -07:00
_utils Remove duplicate assignment in collate.py (#40655) 2020-07-06 12:37:59 -07:00
__init__.py
__init__.pyi
dataloader.py add prefetch_factor for multiprocessing prefetching process (#41130) 2020-07-24 08:38:13 -07:00
dataset.py type annotations for dataloader, dataset, sampler (#39392) 2020-07-07 07:16:18 -07:00
distributed.py Fixes formatting of vander, count_nonzero, DistributedSampler documentation (#41025) 2020-07-06 14:26:13 -07:00
distributed.pyi
sampler.py Patch for #40026 RandomSampler generates samples one at a time when replacement=True (#41682) 2020-07-22 13:45:09 -07:00