pytorch/caffe2/python/layers
Peiyao Zhou 46fefc98e2 Change dper3 loss module to match dper2 (#28265)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/28265

Fixes the difference between dper3 and dper2 when regressionLoss is used.

Test Plan:
Tested using dper2 model id f134632386.
Comparison tool output before the change:
```
FOUND OP DIFFERENT WITH DPER2!!!
OP is of type ExpandDims
OP inputs ['supervision:label']
OP outputs ['sparse_nn/regression_loss/mean_squared_error_loss/ExpandDims:0']
===============================
Finished all dper3 ops, number of good ops 11, bad ops 1, skipped 26
run_comparison for dper2 / dper3 nets running time: 0.0020143985748291016
result type: <class 'NoneType'> result: None
```

After the change:

```
FOUND OP DIFFERENT WITH DPER2!!!
OP is of type ExpandDims
OP inputs ['sparse_nn_2/regression_loss_2/mean_squared_error_loss_8/Squeeze:0_grad']
OP outputs ['sparse_nn_2/over_arch_2/linear_2/FC_grad']
===============================
Finished all dper3 ops, number of good ops 19, bad ops 1, skipped 16
run_comparison for dper2 / dper3 nets running time: 0.0017991065979003906
result type: <class 'NoneType'> result: None
```

dper2 label part of net: P111794577
dper3 label part of net after the change: P116817194
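The op flagged before the change was an `ExpandDims` applied to `supervision:label` inside the mean squared error loss. For illustration only, the shape alignment that such an `ExpandDims` performs can be sketched with NumPy (hypothetical shapes and values; the actual dper loss modules operate on Caffe2 blobs, not NumPy arrays):

```python
import numpy as np

# Model predictions typically come out as shape (N, 1),
# while raw labels arrive as a flat vector of shape (N,).
pred = np.array([[0.5], [1.2], [0.9]], dtype=np.float32)
label = np.array([0.0, 1.0, 1.0], dtype=np.float32)

# ExpandDims(label, dims=[1]) aligns the label with the prediction
# shape so the elementwise squared error broadcasts correctly.
label_expanded = np.expand_dims(label, axis=1)  # shape (3, 1)

mse = np.mean((pred - label_expanded) ** 2)
```

With the sample values above, the per-element errors are 0.5, 0.2, and -0.1, giving a mean squared error of 0.1.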

Reviewed By: kennyhorror

Differential Revision: D17795740

fbshipit-source-id: 9faf96f5140f5a1efdf2985820bda3ca400f61fa
2019-10-18 10:08:38 -07:00
__init__.py
adaptive_weight.py
add_bias.py
arc_cosine_feature_map.py
batch_huber_loss.py
batch_lr_loss.py Exponential decay of the weight of task loss (#27508) 2019-10-08 09:15:41 -07:00
batch_mse_loss.py Change dper3 loss module to match dper2 (#28265) 2019-10-18 10:08:38 -07:00
batch_normalization.py
batch_sigmoid_cross_entropy_loss.py
batch_softmax_loss.py fix loss_weight for self_supervision 2019-10-15 10:40:48 -07:00
blob_weighted_sum.py
bpr_loss.py Add BPR loss to TTSN (#24439) 2019-08-15 23:20:15 -07:00
bucket_weighted.py add feature name into module and update position weighted to match dper2 2019-10-14 08:06:19 -07:00
build_index.py
concat.py
constant_weight.py
conv.py
dropout.py
fc.py Integrate FC fp16 exporter into Dper2 (#26582) 2019-09-29 10:19:28 -07:00
fc_without_bias.py
feature_sparse_to_dense.py Return list of AccessedFeatures from get_accessed_features (#23983) 2019-08-14 10:50:27 -07:00
functional.py
gather_record.py
homotopy_weight.py
label_smooth.py
last_n_window_collector.py
layer_normalization.py
layers.py Return list of AccessedFeatures from get_accessed_features (#23983) 2019-08-14 10:50:27 -07:00
margin_rank_loss.py
merge_id_lists.py
pairwise_similarity.py
position_weighted.py
random_fourier_features.py
reservoir_sampling.py
sampling_train.py
sampling_trainable_mixin.py
select_record_by_context.py
semi_random_features.py
sparse_dropout_with_replacement.py hook up dropout sparse with replacement operator 2019-07-23 14:34:25 -07:00
sparse_feature_hash.py Refactor and expose metadata of tum_history layer for online prediction 2019-08-15 00:27:11 -07:00
sparse_lookup.py Fix predict net issue with LRU hash eviction 2019-10-14 16:08:14 -07:00
split.py Enable variable size embedding (#25782) 2019-09-09 22:08:32 -07:00
tags.py
uniform_sampling.py