Mirror of https://github.com/saymrwulf/pytorch.git, synced 2026-05-15 21:00:47 +00:00
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/21389

As titled. To do weight re-initialization on evicted rows of the embedding table, we need to pass the evicted hashed values to SparseLookup, the layer model responsible for constructing the embedding table and doing pooling. To pass the evicted values, we adjust the output record of lru_sparse_hash to include them, and add an optional input to all processors that take a sparse segment. For SparseLookup to receive the evicted values, its input record is also adjusted: it can now be of type IdList, IdScoreList, or a struct of feature + evicted values.

Reviewed By: itomatik

Differential Revision: D15590307

fbshipit-source-id: e493881909830d5ca5806a743a2a713198c100c2
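The input-record change described above can be sketched as follows. This is a minimal illustration in plain Python, not caffe2's actual schema machinery: the class names `IdList`, `IdScoreList`, and `FeatureWithEvicted`, the field name `evicted_values`, and the helper `unpack_input_record` are hypothetical stand-ins for the record types the PR describes.

```python
# Hedged sketch of SparseLookup's widened input record: the record may be a
# plain sparse segment (IdList / IdScoreList) or a struct bundling the
# feature with the hashed values evicted from the LRU sparse hash.
from dataclasses import dataclass
from typing import List, Optional, Tuple, Union

@dataclass
class IdList:
    lengths: List[int]   # segment lengths per example
    ids: List[int]       # flattened sparse ids

@dataclass
class IdScoreList:
    lengths: List[int]
    ids: List[int]
    scores: List[float]  # one weight per id

@dataclass
class FeatureWithEvicted:
    feature: Union[IdList, IdScoreList]  # the sparse segment, as before
    evicted_values: List[int]            # hashed values evicted by lru_sparse_hash

def unpack_input_record(
    record,
) -> Tuple[Union[IdList, IdScoreList], Optional[List[int]]]:
    """Accept any of the three record shapes and return
    (sparse_feature, evicted_values_or_None)."""
    if isinstance(record, (IdList, IdScoreList)):
        return record, None          # old-style record: nothing was evicted
    if isinstance(record, FeatureWithEvicted):
        return record.feature, record.evicted_values
    raise TypeError(f"unsupported input record type: {type(record).__name__}")
```

A consumer like SparseLookup could then re-initialize the embedding rows indexed by `evicted_values` when they are present, and fall back to the old behavior when they are not.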
| File |
|---|
| __init__.py |
| adaptive_weight.py |
| add_bias.py |
| arc_cosine_feature_map.py |
| batch_huber_loss.py |
| batch_lr_loss.py |
| batch_mse_loss.py |
| batch_normalization.py |
| batch_sigmoid_cross_entropy_loss.py |
| batch_softmax_loss.py |
| blob_weighted_sum.py |
| bucket_weighted.py |
| build_index.py |
| concat.py |
| constant_weight.py |
| conv.py |
| dropout.py |
| fc.py |
| fc_without_bias.py |
| feature_sparse_to_dense.py |
| functional.py |
| gather_record.py |
| homotopy_weight.py |
| label_smooth.py |
| last_n_window_collector.py |
| layer_normalization.py |
| layers.py |
| margin_rank_loss.py |
| merge_id_lists.py |
| pairwise_similarity.py |
| position_weighted.py |
| random_fourier_features.py |
| reservoir_sampling.py |
| sampling_train.py |
| sampling_trainable_mixin.py |
| select_record_by_context.py |
| semi_random_features.py |
| sparse_feature_hash.py |
| sparse_lookup.py |
| split.py |
| tags.py |
| uniform_sampling.py |