pytorch/caffe2/python/layers
Jiyan Yang deadf3ba89 Add assertion to make sure init op is always fp16 compatible in fp16 training
Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/18498

Reviewed By: kennyhorror

Differential Revision: D14626755

fbshipit-source-id: d8a0b3c02920ab3835911a21bf05e8956853fcd7
2019-04-21 23:43:13 -07:00
__init__.py
adaptive_weight.py
add_bias.py
arc_cosine_feature_map.py
batch_distill_lr_loss.py
batch_lr_loss.py try to enable uncertainty for lr loss (#17236) 2019-04-11 07:35:19 -07:00
batch_mse_loss.py
batch_normalization.py
batch_sigmoid_cross_entropy_loss.py
batch_softmax_loss.py
blob_weighted_sum.py
bucket_weighted.py
build_index.py
concat.py
constant_weight.py
conv.py
dropout.py add dropout during eval (#17549) 2019-02-28 23:21:29 -08:00
fc.py
fc_without_bias.py
feature_sparse_to_dense.py Revert D13551909: [fbcode] logdevice for generic feature type 2019-01-25 00:33:06 -08:00
functional.py
gather_record.py
homotopy_weight.py
label_smooth.py
last_n_window_collector.py
layer_normalization.py
layers.py
margin_rank_loss.py
merge_id_lists.py
pairwise_similarity.py
position_weighted.py
random_fourier_features.py
reservoir_sampling.py
sampling_train.py
sampling_trainable_mixin.py
select_record_by_context.py
semi_random_features.py
sparse_feature_hash.py
sparse_lookup.py Add assertion to make sure init op is always fp16 compatible in fp16 training 2019-04-21 23:43:13 -07:00
split.py
tags.py
uniform_sampling.py
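
Each module listed above defines a ModelLayer subclass that LayerModelHelper exposes as a model-building method (fc.py becomes model.FC, dropout.py becomes model.Dropout, and so on), with layers.py and tags.py providing the registration machinery. A minimal sketch of that usage pattern, assuming the caffe2.python API as of this commit; the model name, the input schema, and the exact Dropout keyword arguments are illustrative assumptions rather than taken from this listing:

    import numpy as np

    from caffe2.python import schema
    from caffe2.python.layer_model_helper import LayerModelHelper

    # Registered layer classes become methods on the model helper,
    # e.g. fc.py -> model.FC, dropout.py -> model.Dropout.
    model = LayerModelHelper(
        "example_model",  # illustrative name
        input_feature_schema=schema.Struct(
            ("dense", schema.Scalar((np.float32, (16,)))),
        ),
        trainer_extra_schema=schema.Struct(),
    )

    # Each layer call consumes a schema record and returns the layer's
    # output record, so layers chain naturally.
    dense = model.input_feature_schema.dense
    hidden = model.FC(dense, output_dims=8)    # fc.py
    hidden = model.Dropout(hidden, ratio=0.5)  # dropout.py (ratio kwarg assumed)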