Mirror of https://github.com/saymrwulf/pytorch.git (synced 2026-05-15 21:00:47 +00:00)
Summary: The Feed team uses distributed training and also wants to use transfer learning. Currently, transfer learning is implemented by overwriting the layer parameter's initializer, so the PS builder cannot correctly infer the parameter shape. To fix this, add a 'shape' field to `layer_parameter` and set the shape whenever its initializer is overwritten. We also enforce a check that the parameter shape matches between the original initializer and the loaded blob. (This adds extra cost.)

Differential Revision: D5520541

fbshipit-source-id: 80547dbd328b3f6cbfcea0b2daaf4004703dfe81
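The fix described in the summary can be sketched as follows. This is a hypothetical illustration of the idea, not the actual caffe2 API: `LayerParameter`, `override_initializer`, and `load_blob` are stand-in names. The point is that when transfer learning overwrites an initializer, the shape is recorded explicitly so it no longer has to be inferred, and a loaded blob is validated against it.

```python
import numpy as np

class LayerParameter:
    """Sketch of a layer parameter carrying an explicit 'shape' field."""

    def __init__(self, name, initializer, shape=None):
        self.name = name
        self.initializer = initializer  # callable that produces the blob
        self.shape = shape              # explicit shape, usable by a PS builder

    def override_initializer(self, initializer, shape):
        # Transfer learning overwrites the initializer; without an explicit
        # shape field, the parameter shape could no longer be inferred.
        self.initializer = initializer
        self.shape = tuple(shape)

    def load_blob(self, blob):
        # Enforce the shape check between the declared shape and the loaded
        # blob (the "extra cost" the summary mentions).
        if self.shape is not None and tuple(blob.shape) != self.shape:
            raise ValueError(
                "shape mismatch for %s: expected %s, got %s"
                % (self.name, self.shape, tuple(blob.shape))
            )
        return blob


param = LayerParameter("fc_w", np.zeros, shape=(4, 8))
# Overwrite the initializer (e.g. load pretrained weights) but keep the shape.
param.override_initializer(lambda: np.ones((4, 8)), shape=(4, 8))
loaded = param.load_blob(np.ones((4, 8)))
print(loaded.shape)
```

A mismatched blob (say, shape `(8, 4)`) would raise a `ValueError` instead of silently producing a wrongly shaped parameter.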
| File |
|---|
| __init__.py |
| add_bias.py |
| arc_cosine_feature_map.py |
| batch_distill_lr_loss.py |
| batch_lr_loss.py |
| batch_mse_loss.py |
| batch_normalization.py |
| batch_sigmoid_cross_entropy_loss.py |
| batch_softmax_loss.py |
| build_index.py |
| concat.py |
| dot_product.py |
| dropout.py |
| fc.py |
| fc_without_bias.py |
| feature_sparse_to_dense.py |
| functional.py |
| gather_record.py |
| last_n_window_collector.py |
| layers.py |
| pairwise_dot_product.py |
| position_weighted.py |
| random_fourier_features.py |
| sampling_train.py |
| sampling_trainable_mixin.py |
| select_record_by_context.py |
| semi_random_features.py |
| sparse_feature_hash.py |
| sparse_lookup.py |
| split.py |
| tags.py |
| uniform_sampling.py |