pytorch/caffe2/python/layers
Yan Zhu ac9f0a6884 refactor preproc, support dense in TumHistory layer
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/11131

Reviewed By: xianjiec

Differential Revision: D9358415

fbshipit-source-id: 38bf0e597e22d540d9e985ac8da730f80971d745
2018-09-05 16:10:13 -07:00
__init__.py
adaptive_weight.py
add_bias.py
arc_cosine_feature_map.py
batch_distill_lr_loss.py Co-distillation with different archs and/or feature set (#9793) 2018-07-25 10:10:27 -07:00
batch_lr_loss.py
batch_mse_loss.py
batch_normalization.py
batch_sigmoid_cross_entropy_loss.py
batch_softmax_loss.py
blob_weighted_sum.py
build_index.py
concat.py refactor preproc, support dense in TumHistory layer 2018-09-05 16:10:13 -07:00
constant_weight.py
conv.py
dropout.py
fc.py
fc_without_bias.py
feature_sparse_to_dense.py support generic feature in DPER2 (#10197) 2018-08-04 15:25:13 -07:00
functional.py
gather_record.py
homotopy_weight.py
label_smooth.py
last_n_window_collector.py
layer_normalization.py
layers.py
margin_rank_loss.py
merge_id_lists.py
pairwise_similarity.py move matrix formation for dot products to precompute/request-only (#10531) 2018-08-15 11:02:10 -07:00
position_weighted.py Remove unused code base for distributed training (#10282) 2018-08-16 20:10:17 -07:00
random_fourier_features.py
reservoir_sampling.py
sampling_train.py
sampling_trainable_mixin.py
select_record_by_context.py
semi_random_features.py
sparse_feature_hash.py
sparse_lookup.py Fix bug that always uses the same blob when repeating poolings 2018-07-28 00:09:16 -07:00
split.py
tags.py Remove unused code base for distributed training (#10282) 2018-08-16 20:10:17 -07:00
uniform_sampling.py