Mirror of https://github.com/saymrwulf/pytorch.git, synced 2026-05-15 21:00:47 +00:00
Summary: fix https://github.com/pytorch/pytorch/issues/40604. Add a parameter to `DataLoader` to configure the per-worker prefetch count. Before this change, the prefetching logic always prefetched 2 * num_workers data items; this commit makes that factor configurable, e.g. you can specify that 10 * num_workers data items be prefetched. Pull Request resolved: https://github.com/pytorch/pytorch/pull/41130 Reviewed By: izdeby Differential Revision: D22705288 Pulled By: albanD fbshipit-source-id: 2c483fce409735fef1351eb5aa0b033f8e596561
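The commit exposes the prefetch factor as a `DataLoader` constructor argument (`prefetch_factor`, defaulting to the old hard-coded 2), so the total number of items kept in flight is `prefetch_factor * num_workers`. A minimal sketch of that bookkeeping, written without importing PyTorch; the function name `total_prefetched` is illustrative, not part of the PyTorch API:

```python
def total_prefetched(num_workers: int, prefetch_factor: int = 2) -> int:
    """Total items a multi-process loader keeps in flight across workers.

    Before this commit the factor was hard-coded to 2; afterwards it is a
    DataLoader argument, with the default preserved at 2.
    """
    if num_workers <= 0:
        raise ValueError("prefetching requires num_workers > 0")
    if prefetch_factor < 1:
        raise ValueError("prefetch_factor must be >= 1")
    return prefetch_factor * num_workers

# Default matches the previous hard-coded behavior: 2 * num_workers.
assert total_prefetched(num_workers=4) == 8
# With the new argument, e.g. a factor of 10: 10 * num_workers.
assert total_prefetched(num_workers=4, prefetch_factor=10) == 40
```

In PyTorch itself this surfaces as, for example, `DataLoader(dataset, num_workers=4, prefetch_factor=10)`; the argument is only meaningful when `num_workers > 0`.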
Files in this directory:

- _utils
- __init__.py
- __init__.pyi
- dataloader.py
- dataset.py
- distributed.py
- distributed.pyi
- sampler.py