pytorch/caffe2/python/layers
Artem Volkhin 000db87bc7 Half-floats support for the rest of segment ops
Summary:
Previously the fp16 type was supported only in the SparseLengthsSum operator; now it works in all the other segment operators as well.

Reviewed By: dzhulgakov

Differential Revision: D4624312

fbshipit-source-id: c9d72110e3762167270bb088405eaf9c56e88493
2017-02-28 11:19:15 -08:00
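To make the commit's subject concrete: SparseLengthsSum is a segment reduction that gathers rows of a value table by index and sums them per segment. A minimal NumPy sketch of those semantics with half-float (fp16) data is below; the function name and the fp32 accumulation detail are illustrative assumptions, not the Caffe2 implementation.

```python
import numpy as np

def sparse_lengths_sum(data, indices, lengths):
    # Sketch of SparseLengthsSum semantics (hypothetical helper, not Caffe2 code).
    # DATA: 2-D value table, INDICES: row picks, LENGTHS: segment sizes over INDICES.
    out = np.zeros((len(lengths), data.shape[1]), dtype=data.dtype)
    pos = 0
    for i, n in enumerate(lengths):
        # Accumulate in fp32 to limit fp16 rounding error, then cast back.
        rows = data[indices[pos:pos + n]].astype(np.float32)
        out[i] = rows.sum(axis=0).astype(data.dtype)
        pos += n
    return out

data = np.array([[1, 2], [3, 4], [5, 6]], dtype=np.float16)
indices = np.array([0, 2, 1, 1])
lengths = np.array([2, 2])  # two segments: rows [0, 2] and rows [1, 1]
print(sparse_lengths_sum(data, indices, lengths))
# segment 0: row0 + row2 = [6, 8]; segment 1: row1 + row1 = [6, 8]
```

The commit extends this kind of fp16 handling from SparseLengthsSum to the remaining segment operators.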
__init__.py fbsync. TODO: check if build files need update. 2016-11-15 00:00:46 -08:00
batch_lr_loss.py NextScopedBlob with well-defined behavior and respect namescope 2017-02-16 17:16:36 -08:00
concat.py NextScopedBlob with well-defined behavior and respect namescope 2017-02-16 17:16:36 -08:00
dot_product.py NextScopedBlob with well-defined behavior and respect namescope 2017-02-16 17:16:36 -08:00
expand_dims.py NextScopedBlob with well-defined behavior and respect namescope 2017-02-16 17:16:36 -08:00
fc.py model and preprocessor can handle empty dense inputs 2017-02-22 11:19:15 -08:00
functional.py Add a way to describe layers in a more ad-hoc manner. 2017-02-27 23:30:39 -08:00
layers.py Add a way to describe layers in a more ad-hoc manner. 2017-02-27 23:30:39 -08:00
simple_operator_layers.py Fix random issues with some of the layers going missing from the registry. 2017-01-10 15:14:31 -08:00
sparse_lookup.py Half-floats support for the rest of segment ops 2017-02-28 11:19:15 -08:00
sparse_to_dense.py NextScopedBlob with well-defined behavior and respect namescope 2017-02-16 17:16:36 -08:00
split.py NextScopedBlob with well-defined behavior and respect namescope 2017-02-16 17:16:36 -08:00
tags.py fbsync. TODO: check if build files need update. 2016-11-15 00:00:46 -08:00