neuralmonkey.nn.noisy_gru_cell module

class neuralmonkey.nn.noisy_gru_cell.NoisyGRUCell(num_units: int, training) → None

    Bases: tensorflow.python.ops.rnn_cell_impl.RNNCell

    Gated Recurrent Unit cell (cf. http://arxiv.org/abs/1406.1078).

    GRU with noisy activation functions (http://arxiv.org/abs/1603.00391). The Theano code is available at https://github.com/caglar/noisy_units.

    It is based on the TensorFlow implementation of the GRU; only the activation functions are replaced with the noisy ones.

    __init__(num_units: int, training) → None

    output_size
        Integer or TensorShape – size of outputs produced by this cell.

    state_size
        Size(s) of state(s) used by this cell. It can be represented by an Integer, a TensorShape, or a tuple of Integers or TensorShapes.
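
    The cell follows the standard RNNCell interface, so it can be plugged into the usual recurrent wrappers. Below is a minimal usage sketch, assuming TensorFlow 1.x graph mode (the API family that rnn_cell_impl.RNNCell belongs to); the tensor names and shapes are illustrative and not part of the module:

        import tensorflow as tf
        from neuralmonkey.nn.noisy_gru_cell import NoisyGRUCell

        # Embedded input sequences: [batch, max_time, embedding_dim] (illustrative shape).
        inputs = tf.placeholder(tf.float32, shape=[None, 20, 128])
        # Scalar boolean tensor: True during training (noise is sampled),
        # False at runtime (the noise expectation is used instead).
        train_mode = tf.placeholder(tf.bool, shape=[])

        cell = NoisyGRUCell(num_units=256, training=train_mode)
        outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
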
neuralmonkey.nn.noisy_gru_cell.noisy_activation(x, generic, linearized, training, alpha: float = 1.1, c: float = 0.5)

    Apply the noisy activation.

    Implements the noisy activation with half-normal noise for hard-saturation functions. See http://arxiv.org/abs/1603.00391, Algorithm 1.

    Parameters:
        - x – Tensor which is an input to the activation function.
        - generic – The generic formulation of the activation function (denoted as h in the paper).
        - linearized – Linearization of the activation based on the first-order Taylor expansion around zero (denoted as u in the paper).
        - training – A boolean tensor telling whether we are in the training stage (and the noise is sampled) or in runtime, when the expectation is used instead.
        - alpha – Mixing hyper-parameter; the leakage rate from the linearized function to the nonlinear one.
        - c – Standard deviation of the sampled noise.
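
    The parameters above are combined as in Algorithm 1 of the paper. The following is an illustrative NumPy rendering of that algorithm under the half-normal noise scheme, not the module's actual TensorFlow code; the fixed slope p is a simplification (the paper learns it):

        import numpy as np

        def noisy_activation_sketch(x, generic, linearized, training,
                                    alpha=1.1, c=0.5, p=1.0):
            """Illustrative NumPy version of Algorithm 1 (half-normal noise)."""
            h = generic(x)      # saturating nonlinearity, h in the paper
            u = linearized(x)   # first-order Taylor expansion around zero, u in the paper
            delta = h - u       # distance from the linear regime
            # Noise scale grows with the distance from the linear regime.
            sigma = c * (1.0 / (1.0 + np.exp(-p * delta)) - 0.5) ** 2
            # Direction that pushes the noisy output toward the saturated branch.
            d = -np.sign(x) * np.sign(1.0 - alpha)
            if training:
                eps = np.abs(np.random.randn(*np.shape(x)))  # half-normal sample
            else:
                eps = np.sqrt(2.0 / np.pi)                   # E[|N(0, 1)|], used at runtime
            return alpha * h + (1.0 - alpha) * u + d * sigma * eps
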
neuralmonkey.nn.noisy_gru_cell.noisy_sigmoid(x, training)

neuralmonkey.nn.noisy_gru_cell.noisy_tanh(x, training)
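
Both functions take only the input tensor and the training flag, so they presumably bind the generic and linearized arguments of noisy_activation to the hard-saturating sigmoid and tanh and their linearizations around zero from the paper. A plausible sketch, reusing noisy_activation_sketch from above (all names here are illustrative, not the module's own):

    import numpy as np

    # Hard-saturating nonlinearities and their first-order linearizations
    # around zero, as defined in the paper; the names are illustrative.
    hard_sigmoid = lambda x: np.clip(0.25 * x + 0.5, 0.0, 1.0)
    lin_sigmoid = lambda x: 0.25 * x + 0.5
    hard_tanh = lambda x: np.clip(x, -1.0, 1.0)
    lin_tanh = lambda x: x

    def noisy_sigmoid_sketch(x, training):
        return noisy_activation_sketch(x, hard_sigmoid, lin_sigmoid, training)

    def noisy_tanh_sketch(x, training):
        return noisy_activation_sketch(x, hard_tanh, lin_tanh, training)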