pytorch/test/quantization
Vasiliy Kuznetsov 2912ad1324 ns for fx: move linear activation test case to new API (#53777)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/53777

Moves the linear activation test case to the new Numeric Suite (NS) API.

Test Plan:
```
python test/test_quantization.py TestFXNumericSuiteCoreAPIsModels.test_compare_activations_linear
```

Imported from OSS

Reviewed By: hx89

Differential Revision: D26967107

fbshipit-source-id: 83c4401b2bf79d15227b7fb3e59c54276ec5626b
2021-03-12 10:02:52 -08:00
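Conceptually, the activation comparison this test exercises works by attaching loggers to matching layers of a float model and its quantized counterpart, running both on the same inputs, and scoring how close the recorded activations are (e.g. with SQNR). The sketch below is illustrative only: it uses hypothetical stand-in classes (`OutputLogger`, `sqnr`) and plain Python instead of the actual `torch.ao.ns` Numeric Suite API.

```python
import math

class OutputLogger:
    """Illustrative stand-in for an NS logger: records each output of a wrapped layer."""
    def __init__(self, fn):
        self.fn = fn
        self.stats = []
    def __call__(self, x):
        y = self.fn(x)
        self.stats.append(y)
        return y

def sqnr(ref, quant):
    """Signal-to-quantized-noise ratio (dB) between two recorded activation lists."""
    signal = sum(r * r for r in ref)
    noise = sum((r - q) ** 2 for r, q in zip(ref, quant))
    return float("inf") if noise == 0 else 10 * math.log10(signal / noise)

# Two "models": a float linear layer and a crudely quantized version of it
# (hypothetical weight/bias values, rounding to a coarse grid to mimic quantization).
w, b = 0.5, 0.1
float_linear = OutputLogger(lambda x: w * x + b)
quant_linear = OutputLogger(lambda x: round((w * x + b) * 16) / 16)

for x in [0.3, 1.7, -2.2]:
    float_linear(x)
    quant_linear(x)

# A high SQNR means the quantized activations closely track the float ones.
print(round(sqnr(float_linear.stats, quant_linear.stats), 1))
```

In the real test, the layer pairing, logger insertion, and result extraction are handled by the NS-for-FX APIs rather than written by hand.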
serialized
__init__.py
test_backward_compatibility.py
test_bias_correction.py Numeric Suite: Swap with shadow modules only for quantized part of model (#51052) 2021-02-04 11:40:30 -08:00
test_equalize.py
test_fusion_passes.py
test_numeric_suite.py Numeric Suite: Swap with shadow modules only for quantized part of model (#51052) 2021-02-04 11:40:30 -08:00
test_numeric_suite_fx.py ns for fx: move linear activation test case to new API (#53777) 2021-03-12 10:02:52 -08:00
test_qat_module.py [reland][quant][fix] Add bias once in conv_fused (#48593) (#48661) 2020-12-02 10:17:43 -08:00
test_quantize.py quant: fix conv transpose with qconfig == None (#52844) 2021-02-25 11:52:30 -08:00
test_quantize_fx.py [not for land] fix fx quant for quant_layer -> stack -> sum (#53196) 2021-03-12 07:43:50 -08:00
test_quantize_jit.py [quantization] Add some support for 3d operations (#50003) 2021-03-10 16:40:35 -08:00
test_quantized_functional.py [reland][quant] Remove nn.quantized.ReLU module and nn.quantized.functional.relu (#47415) (#48038) 2020-11-17 09:52:21 -08:00
test_quantized_module.py [quant] Reference option for conv module (#52316) 2021-02-24 14:54:02 -08:00
test_quantized_op.py [quant][fix] MHA tensor assignment fix (#53031) 2021-03-03 14:49:19 -08:00
test_quantized_tensor.py [quant] PerChannelFloatQParams support for quint4x2 dtype (#45594) 2020-10-01 23:59:53 -07:00
test_workflow_module.py Fake Quantization support for f16 and f32 (#52612) 2021-02-23 10:49:12 -08:00