neuralmonkey.decoders.classifier module

class neuralmonkey.decoders.classifier.Classifier(name: str, encoders: List[neuralmonkey.model.stateful.Stateful], vocabulary: neuralmonkey.vocabulary.Vocabulary, data_id: str, layers: List[int], activation_fn: Callable[[tensorflow.python.framework.ops.Tensor], tensorflow.python.framework.ops.Tensor] = <function relu>, dropout_keep_prob: float = 0.5, save_checkpoint: Union[str, NoneType] = None, load_checkpoint: Union[str, NoneType] = None, initializers: List[Tuple[str, Callable]] = None) → None

Bases: neuralmonkey.model.model_part.ModelPart

A simple MLP classifier over encoders.

The API pretends it is an RNN decoder that always generates a sequence of length exactly one.

__init__(name: str, encoders: List[neuralmonkey.model.stateful.Stateful], vocabulary: neuralmonkey.vocabulary.Vocabulary, data_id: str, layers: List[int], activation_fn: Callable[[tensorflow.python.framework.ops.Tensor], tensorflow.python.framework.ops.Tensor] = <function relu>, dropout_keep_prob: float = 0.5, save_checkpoint: Union[str, NoneType] = None, load_checkpoint: Union[str, NoneType] = None, initializers: List[Tuple[str, Callable]] = None) → None

Construct a new instance of the sequence classifier.

Parameters:
  • name – Name of the decoder. Should be unique across all Neural Monkey objects.
  • encoders – Input encoders of the decoder.
  • vocabulary – Target vocabulary.
  • data_id – Target data series.
  • layers – List defining the structure of the MLP. INI example: layers=[100,20,5] creates a classifier with hidden layers of sizes 100, 20 and 5, plus one output layer whose size depends on the vocabulary size (see the configuration sketch after this list).
  • activation_fn – Activation function applied to the output of each hidden layer.
  • dropout_keep_prob – Probability of keeping a value during dropout.
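
In practice this decoder is typically set up through a Neural Monkey INI configuration rather than constructed directly. The following is a minimal sketch of such a section; the section name, the referenced encoder and vocabulary sections, and the data series name are all illustrative:

    [classifier]
    class=decoders.classifier.Classifier
    name="classifier"
    encoders=[<encoder>]
    vocabulary=<vocabulary>
    data_id="labels"
    layers=[100,20,5]
    dropout_keep_prob=0.5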
cost
decoded
decoded_logits
decoded_seq
feed_dict(dataset: neuralmonkey.dataset.dataset.Dataset, train: bool = False) → Dict[tensorflow.python.framework.ops.Tensor, Any]
loss_with_decoded_ins
loss_with_gt_ins
runtime_logprobs
runtime_loss
train_loss
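
For illustration, constructing the classifier directly in Python mirrors the signature above. This is a minimal sketch; the encoder and vocabulary objects, and the "labels" series name, are placeholders:

    from neuralmonkey.decoders.classifier import Classifier

    # `encoder` is assumed to be any Stateful model part and `vocab` a
    # Vocabulary holding the class labels (both placeholders).
    classifier = Classifier(
        name="classifier",
        encoders=[encoder],
        vocabulary=vocab,
        data_id="labels",        # target data series with gold labels
        layers=[100, 20, 5],     # three hidden layers of these sizes
        dropout_keep_prob=0.5)   # keep half of the activations in training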