Mirror of https://github.com/saymrwulf/pytorch.git
Summary: After QAT has completed, or when a pre-tuned weight observer is provided via a tunable PTQ algorithm, the weight observer should not be re-run and overwrite the already-tuned weight quantization parameters; for static QAT this should never happen. Dynamic QAT likewise does not require the weight observer to be re-run, by design. This is a fix.
Test Plan: Signals
Differential Revision: D57747749
Pull Request resolved: https://github.com/pytorch/pytorch/pull/127309
Approved by: https://github.com/jerryzh168
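For context, here is a minimal eager-mode QAT sketch of the flow the summary describes. It is not code from the PR: the `SmallNet` module, the dummy data, and the abbreviated training loop are illustrative assumptions. It shows that by the time `convert()` is called after QAT, the weight observers already carry tuned statistics, and (per the fix) they should not be re-run over the weights.

```python
import torch
import torch.nn as nn
from torch.ao.quantization import (
    DeQuantStub,
    QuantStub,
    convert,
    get_default_qat_qconfig,
    prepare_qat,
)


# Hypothetical toy module, used only to illustrate the eager-mode QAT flow.
class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = QuantStub()      # quantize activations at the input
        self.fc = nn.Linear(8, 4)
        self.dequant = DeQuantStub()  # back to float at the output

    def forward(self, x):
        return self.dequant(self.fc(self.quant(x)))


model = SmallNet().train()
model.qconfig = get_default_qat_qconfig("fbgemm")

# Attach fake-quant modules and observers for QAT.
prepared = prepare_qat(model)

# Placeholder for a real QAT loop (optimizer steps omitted); forward passes
# update observer statistics alongside the fake-quantized weights.
for _ in range(3):
    prepared(torch.randn(2, 8))

# After QAT, conversion should use the already-tuned observer state; the
# point of the fix described above is that the weight observer is not
# supposed to be run again over the weights at this stage.
prepared.eval()
quantized = convert(prepared)
print(quantized)
```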
Files in this directory:

- __init__.py
- test_bias_correction_eager.py
- test_equalize_eager.py
- test_fuse_eager.py
- test_model_numerics.py
- test_numeric_suite_eager.py
- test_quantize_eager_ptq.py
- test_quantize_eager_qat.py