rztdl.dl.components.layers.recurrent package

Submodules

rztdl.dl.components.layers.recurrent.bi_directional_rnn module

@created on: 05/02/20,
@author: Umesh Kumar,
@version: v3.0.0

Description:

Sphinx Documentation Status: Complete

class rztdl.dl.components.layers.recurrent.bi_directional_rnn.BiDirectionalRNN(name: str, forward_cells: typing.List[rztdl.dl.components.layers.recurrent.cells.Cells], backward_cells: typing.List[rztdl.dl.components.layers.recurrent.cells.Cells] = None, inputs: typing.Union[str, tensorflow.python.framework.ops.Tensor] = None, outputs: str = None, return_sequences: bool = False, merge_mode: rztdl.dl.constants.string_constants.MergeMode = <MergeMode.CONCAT: 'concat'>, forward_cell_output: str = None, backward_cell_output: str = None, forward_cell_state: str = None, forward_hidden_state: str = None, backward_cell_state: str = None, backward_hidden_state: str = None)[source]

Bases: tensorflow.python.keras.layers.wrappers.Bidirectional, rztdl.dl.components.layers.layer.Layer

Bi-Directional RNN Layer

Parameters:
  • merge_mode (MergeMode) – Merge mode for the forward and backward cell outputs
  • name (str) – Name of instance
  • forward_cells (List[Cells]) – Forward layer cells
  • backward_cells (Optional[List[Cells]]) – Backward layer cells
  • inputs (Union[str, Tensor, None]) – Input Tensor of the Bi-Directional RNN layer
  • outputs (Optional[str]) – Output tensor
  • return_sequences (bool) – Whether to return the last output in the output sequence, or the full sequence
  • forward_cell_output (Optional[str]) – Forward layer cell output
  • backward_cell_output (Optional[str]) – Backward layer cell output
  • forward_cell_state (Optional[str]) – Forward layer cell state (used only when the last cell is an LSTM)
  • forward_hidden_state (Optional[str]) – Forward layer hidden state
  • backward_cell_state (Optional[str]) – Backward layer cell state (used only when the last cell is an LSTM)
  • backward_hidden_state (Optional[str]) – Backward layer hidden state
classmethod component_blueprint()[source]
create(inputs)[source]
parameter_validation(name, cells, hidden_state, cell_state, merge_mode, cell_output, cell_type, layer_output)[source]
static prepare_bi_rnn_outputs(outputs, cell_state, hidden_state, cell_output, cell_type)[source]
validate(inputs)[source]
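To illustrate what the layer computes (this is a pure-Python sketch, not the rztdl API): a bi-directional layer runs one pass over the sequence forward and one backward, then merges the two per-timestep outputs according to merge_mode. The one-unit toy cell and its weights below are assumptions made for illustration only; the four merge modes mirror the concat/sum/ave/mul choices implied by MergeMode.

```python
import math

def step(x, h):                      # toy one-unit RNN cell (illustrative weights)
    return math.tanh(0.5 * x + 0.5 * h)

def run(seq):                        # unroll over a sequence, collect every hidden state
    h, states = 0.0, []
    for x in seq:
        h = step(x, h)
        states.append(h)
    return states

def bidirectional(seq, merge_mode="concat"):
    fwd = run(seq)                   # forward pass
    bwd = run(seq[::-1])[::-1]       # backward pass, re-aligned to input order
    if merge_mode == "concat":
        return [[f, b] for f, b in zip(fwd, bwd)]
    if merge_mode == "sum":
        return [f + b for f, b in zip(fwd, bwd)]
    if merge_mode == "ave":
        return [(f + b) / 2.0 for f, b in zip(fwd, bwd)]
    if merge_mode == "mul":
        return [f * b for f, b in zip(fwd, bwd)]
    raise ValueError(merge_mode)

seq = [1.0, -1.0, 0.5]
out = bidirectional(seq, "concat")   # one [forward, backward] pair per timestep
```

With "concat" each timestep keeps both directions' outputs; the other modes collapse them to a single value per timestep, which is why concat doubles the output width while sum/ave/mul preserve it.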

rztdl.dl.components.layers.recurrent.cells module

@created on: 24/01/20,
@author: Umesh Kumar,
@version: v3.0.0

Description:

Sphinx Documentation Status: Complete

class rztdl.dl.components.layers.recurrent.cells.Cells[source]

Bases: object

classmethod blueprint()[source]
classmethod blueprint_properties()[source]
class rztdl.dl.components.layers.recurrent.cells.GRUCell(name: str, units: int, activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Tanh object>, recurrent_activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Sigmoid object>, use_bias: bool = True, kernel_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.GlorotUniform object>, recurrent_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Orthogonal object>, bias_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Zeros object>, kernel_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, recurrent_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, bias_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, activity_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, kernel_constraint: rztdl.dl.helpers.constraints.Constraint = None, recurrent_constraint: rztdl.dl.helpers.constraints.Constraint = None, bias_constraint: rztdl.dl.helpers.constraints.Constraint = None, input_dropout: float = 0.0, dropout_rate: float = 0.0, recurrent_dropout: float = 0.0)[source]

Bases: tensorflow.python.keras.layers.recurrent_v2.GRUCell, rztdl.dl.components.layers.recurrent.cells.Cells

GRU Cell for Stacked RNN

Parameters:
  • name (str) – name of component
  • units (int) – Positive integer, dimensionality of the output space.
  • activation (Activation) – Activation function to use (defaults to Tanh). If None is passed, no activation is applied
  • recurrent_activation (Activation) – Activation function to use for the recurrent step.
  • use_bias (bool) – Boolean, whether the layer uses a bias vector
  • kernel_initializer (Initializer) – Initializer for the kernel weights matrix
  • bias_initializer (Initializer) – Initializer for the bias vector
  • recurrent_initializer (Initializer) – Initializer for the Recurrent Weight Matrix
  • dropout_rate (float) – Fraction of the units to drop for the linear transformation of the output.
  • input_dropout (float) – Fraction of the units to drop for the linear transformation of the inputs.
  • recurrent_dropout (float) – Fraction of the units to drop for the linear transformation of the recurrent state.
  • kernel_regularizer (Optional[Regularizer]) – Regularizer function applied to the kernel weights matrix
  • bias_regularizer (Optional[Regularizer]) – Regularizer function applied to the bias vector
  • activity_regularizer (Optional[Regularizer]) – Regularizer function applied to the output of the layer (its “activation”)
  • kernel_constraint (Optional[Constraint]) – Constraint function applied to the kernel matrix
  • bias_constraint (Optional[Constraint]) – Constraint function applied to the bias vector.
  • recurrent_regularizer (Optional[Regularizer]) – Regularizer function applied to the recurrent_kernel weights matrix.
  • recurrent_constraint (Optional[Constraint]) – Constraint function applied to the recurrent_kernel weights matrix
parameter_validation(name, dropout_rate, recurrent_dropout, input_dropout)[source]
class rztdl.dl.components.layers.recurrent.cells.LSTMCell(name: str, units: int, activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Tanh object>, recurrent_activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Sigmoid object>, use_bias: bool = True, kernel_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.GlorotUniform object>, recurrent_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Orthogonal object>, bias_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Zeros object>, kernel_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, recurrent_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, bias_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, activity_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, kernel_constraint: rztdl.dl.helpers.constraints.Constraint = None, recurrent_constraint: rztdl.dl.helpers.constraints.Constraint = None, bias_constraint: rztdl.dl.helpers.constraints.Constraint = None, input_dropout: float = 0.0, dropout_rate: float = 0.0, recurrent_dropout: float = 0.0)[source]

Bases: tensorflow.python.keras.layers.recurrent_v2.LSTMCell, rztdl.dl.components.layers.recurrent.cells.Cells

LSTM Cell for Stacked RNN

Parameters:
  • name (str) – name of component
  • units (int) – Positive integer, dimensionality of the output space.
  • activation (Activation) – Activation function to use (defaults to Tanh). If None is passed, no activation is applied
  • recurrent_activation (Activation) – Activation function to use for the recurrent step.
  • use_bias (bool) – Boolean, whether the layer uses a bias vector
  • kernel_initializer (Initializer) – Initializer for the kernel weights matrix
  • bias_initializer (Initializer) – Initializer for the bias vector
  • recurrent_initializer (Initializer) – Initializer for the Recurrent Weight Matrix
  • dropout_rate (float) – Fraction of the units to drop for the linear transformation of the output.
  • input_dropout (float) – Fraction of the units to drop for the linear transformation of the inputs.
  • recurrent_dropout (float) – Fraction of the units to drop for the linear transformation of the recurrent state.
  • kernel_regularizer (Optional[Regularizer]) – Regularizer function applied to the kernel weights matrix
  • bias_regularizer (Optional[Regularizer]) – Regularizer function applied to the bias vector
  • activity_regularizer (Optional[Regularizer]) – Regularizer function applied to the output of the layer (its “activation”)
  • kernel_constraint (Optional[Constraint]) – Constraint function applied to the kernel matrix
  • bias_constraint (Optional[Constraint]) – Constraint function applied to the bias vector.
  • recurrent_regularizer (Optional[Regularizer]) – Regularizer function applied to the recurrent_kernel weights matrix.
  • recurrent_constraint (Optional[Constraint]) – Constraint function applied to the recurrent_kernel weights matrix
parameter_validation(name, dropout_rate, recurrent_dropout, input_dropout)[source]
class rztdl.dl.components.layers.recurrent.cells.RNNCell(name: str, units: int, activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Tanh object>, use_bias: bool = True, kernel_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.GlorotUniform object>, recurrent_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Orthogonal object>, bias_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Zeros object>, kernel_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, recurrent_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, bias_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, activity_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, kernel_constraint: rztdl.dl.helpers.constraints.Constraint = None, recurrent_constraint: rztdl.dl.helpers.constraints.Constraint = None, bias_constraint: rztdl.dl.helpers.constraints.Constraint = None, dropout_rate: float = 0.0, input_dropout: float = 0.0, recurrent_dropout: float = 0.0)[source]

Bases: tensorflow.python.keras.layers.recurrent.SimpleRNNCell, rztdl.dl.components.layers.recurrent.cells.Cells

Simple RNN Cell for Stacked RNN

Parameters:
  • name (str) – name of component
  • units (int) – Positive integer, dimensionality of the output space.
  • activation (Activation) – Activation function to use (defaults to Tanh). If None is passed, no activation is applied
  • use_bias (bool) – Boolean, whether the layer uses a bias vector
  • kernel_initializer (Initializer) – Initializer for the kernel weights matrix
  • bias_initializer (Initializer) – Initializer for the bias vector
  • recurrent_initializer (Initializer) – Initializer for the Recurrent Weight Matrix
  • dropout_rate (float) – Fraction of the units to drop for the linear transformation of the output.
  • input_dropout (float) – Fraction of the units to drop for the linear transformation of the inputs.
  • recurrent_dropout (float) – Fraction of the units to drop for the linear transformation of the recurrent state.
  • kernel_regularizer (Optional[Regularizer]) – Regularizer function applied to the kernel weights matrix
  • bias_regularizer (Optional[Regularizer]) – Regularizer function applied to the bias vector
  • activity_regularizer (Optional[Regularizer]) – Regularizer function applied to the output of the layer (its “activation”)
  • kernel_constraint (Optional[Constraint]) – Constraint function applied to the kernel matrix
  • bias_constraint (Optional[Constraint]) – Constraint function applied to the bias vector.
  • recurrent_regularizer (Optional[Regularizer]) – Regularizer function applied to the recurrent_kernel weights matrix.
  • recurrent_constraint (Optional[Constraint]) – Constraint function applied to the recurrent_kernel weights matrix
parameter_validation(name, dropout_rate, recurrent_dropout, input_dropout)[source]
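Each of the cell classes above wraps the corresponding Keras cell and computes a single timestep. As a concept sketch (not the rztdl API), the SimpleRNN-style update underlying RNNCell is h_t = tanh(x·W + h_prev·U + b); the weights below are toy values assumed for illustration.

```python
import math

def rnn_cell_step(x, h_prev, W, U, b):
    """One SimpleRNN-style timestep: h_t = tanh(x @ W + h_prev @ U + b)."""
    units = len(b)
    h = []
    for j in range(units):
        s = b[j]
        s += sum(x[i] * W[i][j] for i in range(len(x)))        # input contribution
        s += sum(h_prev[i] * U[i][j] for i in range(len(h_prev)))  # recurrent contribution
        h.append(math.tanh(s))
    return h

# Toy example: 2 input features, 2 units, identity input weights, zero recurrence.
x = [1.0, 0.0]
h0 = [0.0, 0.0]
W = [[1.0, 0.0], [0.0, 1.0]]
U = [[0.0, 0.0], [0.0, 0.0]]
b = [0.0, 0.0]
h1 = rnn_cell_step(x, h0, W, U, b)
```

The kernel_initializer, recurrent_initializer, and bias_initializer parameters above control how W, U, and b are initialized; input_dropout, recurrent_dropout, and dropout_rate drop fractions of the input, recurrent, and output terms of this same update during training.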

rztdl.dl.components.layers.recurrent.gru module

@created on: 12/28/19,
@author: Prathyush SP,
@version: v0.0.1

Description:

Sphinx Documentation Status:

class rztdl.dl.components.layers.recurrent.gru.GRU(name: str, units: int, outputs: str, activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Tanh object>, recurrent_activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.HardSigmoid object>, use_bias: bool = True, kernel_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.GlorotUniform object>, recurrent_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Orthogonal object>, bias_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Zeros object>, kernel_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, recurrent_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, bias_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, activity_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, kernel_constraint: rztdl.dl.helpers.constraints.Constraint = None, recurrent_constraint: rztdl.dl.helpers.constraints.Constraint = None, bias_constraint: rztdl.dl.helpers.constraints.Constraint = None, input_dropout: float = 0.0, dropout_rate: float = 0.0, recurrent_dropout: float = 0.0, return_sequences: bool = False, inputs: typing.Union[str, tensorflow.python.framework.ops.Tensor] = None, hidden_state: str = None)[source]

Bases: tensorflow.python.keras.layers.recurrent_v2.GRU, rztdl.dl.components.layers.layer.Layer

GRU Layer

Parameters:
  • name (str) – name of component
  • units (int) – Positive integer, dimensionality of the output space.
  • activation (Activation) – Activation function to use (defaults to Tanh). If None is passed, no activation is applied
  • recurrent_activation (Activation) – Activation function to use for the recurrent step.
  • use_bias (bool) – Boolean, whether the layer uses a bias vector
  • kernel_initializer (Initializer) – Initializer for the kernel weights matrix
  • recurrent_initializer (Initializer) – Initializer for the Recurrent Weight Matrix
  • bias_initializer (Initializer) – Initializer for the bias vector
  • recurrent_dropout (float) – Fraction of the units to drop for the linear transformation of the recurrent state.
  • kernel_regularizer (Optional[Regularizer]) – Regularizer function applied to the kernel weights matrix
  • bias_regularizer (Optional[Regularizer]) – Regularizer function applied to the bias vector
  • recurrent_regularizer (Optional[Regularizer]) – Regularizer function applied to the recurrent_kernel weights matrix.
  • activity_regularizer (Optional[Regularizer]) – Regularizer function applied to the output of the layer (its “activation”)
  • kernel_constraint (Optional[Constraint]) – Constraint function applied to the kernel matrix
  • bias_constraint (Optional[Constraint]) – Constraint function applied to the bias vector.
  • recurrent_constraint (Optional[Constraint]) – Constraint function applied to the recurrent_kernel weights matrix
  • dropout_rate (float) – Fraction of the units to drop for the linear transformation of the output.
  • input_dropout (float) – Fraction of the units to drop for the linear transformation of the inputs.
  • inputs (Union[str, Tensor, None]) – Input Tensor
  • outputs (str) – Output Tensor
  • return_sequences (bool) – Whether to return the full sequence of outputs or only the last one
  • hidden_state (Optional[str]) – Last Hidden State Output
classmethod component_blueprint()[source]
create(inputs)[source]
parameter_validation(dropout_rate, recurrent_dropout, input_dropout)[source]
validate(inputs)[source]
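As a sketch of the math behind the layer (not the rztdl API), a GRU timestep combines an update gate, a reset gate, and a candidate state. This is one common formulation for a single unit with scalar toy weights; the exact gate ordering and interpolation convention vary between implementations, so treat the weights and layout below as assumptions for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w, u, b):
    """One GRU timestep for a single unit.

    w, u, b each hold (update, reset, candidate) parameters."""
    z = sigmoid(w[0] * x + u[0] * h + b[0])              # update gate
    r = sigmoid(w[1] * x + u[1] * h + b[1])              # reset gate
    h_hat = math.tanh(w[2] * x + u[2] * (r * h) + b[2])  # candidate state
    return z * h + (1.0 - z) * h_hat                     # interpolate old and candidate

h = 0.0
for x in [1.0, 0.5, -0.25]:
    h = gru_step(x, h, w=(1.0, 1.0, 1.0), u=(1.0, 1.0, 1.0), b=(0.0, 0.0, 0.0))
```

The recurrent_activation parameter above corresponds to the sigmoid applied to the gates, while activation corresponds to the tanh applied to the candidate state.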

rztdl.dl.components.layers.recurrent.lstm module

@created on: 12/28/19,
@author: Prathyush SP,
@version: v0.0.1

Description:

Sphinx Documentation Status:

class rztdl.dl.components.layers.recurrent.lstm.LSTM(name: str, units: int, activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Tanh object>, recurrent_activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.HardSigmoid object>, use_bias: bool = True, kernel_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.GlorotUniform object>, recurrent_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Orthogonal object>, bias_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Zeros object>, unit_forget_bias: bool = True, kernel_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, recurrent_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, bias_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, activity_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, kernel_constraint: rztdl.dl.helpers.constraints.Constraint = None, recurrent_constraint: rztdl.dl.helpers.constraints.Constraint = None, bias_constraint: rztdl.dl.helpers.constraints.Constraint = None, dropout_rate: float = 0.0, input_dropout: float = 0.0, recurrent_dropout: float = 0.0, return_sequences: bool = False, hidden_state: str = None, cell_state: str = None, inputs: typing.Union[str, tensorflow.python.framework.ops.Tensor] = None, outputs: str = None)[source]

Bases: tensorflow.python.keras.layers.recurrent_v2.LSTM, rztdl.dl.components.layers.layer.Layer

LSTM Layer

Parameters:
  • name (str) – name of component
  • units (int) – Positive integer, dimensionality of the output space.
  • activation (Activation) – Activation function to use (defaults to Tanh). If None is passed, no activation is applied
  • recurrent_activation (Activation) – Activation function to use for the recurrent step.
  • use_bias (bool) – Boolean, whether the layer uses a bias vector
  • kernel_initializer (Initializer) – Initializer for the kernel weights matrix
  • bias_initializer (Initializer) – Initializer for the bias vector
  • recurrent_initializer (Initializer) – Initializer for the Recurrent Weight Matrix
  • unit_forget_bias (bool) – Boolean (default True). If True, add 1 to the bias of the forget gate at initialization.
  • dropout_rate (float) – Fraction of the units to drop for the linear transformation of the output.
  • input_dropout (float) – Fraction of the units to drop for the linear transformation of the inputs.
  • recurrent_dropout (float) – Fraction of the units to drop for the linear transformation of the recurrent state.
  • kernel_regularizer (Optional[Regularizer]) – Regularizer function applied to the kernel weights matrix
  • bias_regularizer (Optional[Regularizer]) – Regularizer function applied to the bias vector
  • recurrent_regularizer (Optional[Regularizer]) – Regularizer function applied to the recurrent_kernel weights matrix.
  • activity_regularizer (Optional[Regularizer]) – Regularizer function applied to the output of the layer (its “activation”)
  • kernel_constraint (Optional[Constraint]) – Constraint function applied to the kernel matrix
  • bias_constraint (Optional[Constraint]) – Constraint function applied to the bias vector.
  • recurrent_constraint (Optional[Constraint]) – Constraint function applied to the recurrent_kernel weights matrix
  • return_sequences (bool) – Whether to return the full sequence of hidden states or only the last output
  • inputs (Union[str, Tensor, None]) – Input Tensor
  • outputs (Optional[str]) – Output Tensor
  • hidden_state (Optional[str]) – Last hidden state output
  • cell_state (Optional[str]) – Last Cell state output
classmethod component_blueprint()[source]
create(inputs)[source]
parameter_validation(dropout_rate, recurrent_dropout, input_dropout)[source]
validate(inputs)[source]
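To illustrate what the hidden_state and cell_state outputs above correspond to (a pure-Python sketch, not the rztdl API): an LSTM timestep maintains both a cell state c and a hidden state h, gated by forget, input, and output gates. The single-unit weights below are toy values assumed for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM timestep for a single unit.

    W maps gate name -> (input weight, recurrent weight, bias)."""
    def gate(name, fn):
        wx, wh, b = W[name]
        return fn(wx * x + wh * h + b)
    f = gate("forget", sigmoid)          # forget gate
    i = gate("input", sigmoid)           # input gate
    o = gate("output", sigmoid)          # output gate
    g = gate("cell", math.tanh)          # candidate cell update
    c_new = f * c + i * g                # cell state
    h_new = o * math.tanh(c_new)         # hidden state (also the output)
    return h_new, c_new

W = {name: (1.0, 1.0, 0.0) for name in ("forget", "input", "output", "cell")}
h, c = 0.0, 0.0
for x in [1.0, 1.0]:
    h, c = lstm_step(x, h, c, W)
# h is the last hidden state, c the last cell state
```

The unit_forget_bias option in the signature corresponds to adding 1 to the forget gate's bias at initialization, which biases the cell toward remembering early in training.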

rztdl.dl.components.layers.recurrent.rnn module

@created on: 12/28/19,
@author: Prathyush SP,
@version: v0.0.1

Description:

Sphinx Documentation Status:

class rztdl.dl.components.layers.recurrent.rnn.RNN(name: str, units: int, activation: rztdl.dl.helpers.activations.Activation = <rztdl.dl.helpers.activations.Tanh object>, use_bias: bool = True, kernel_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.GlorotUniform object>, recurrent_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Orthogonal object>, bias_initializer: rztdl.dl.helpers.initializers.Initializer = <rztdl.dl.helpers.initializers.Zeros object>, kernel_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, recurrent_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, bias_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, activity_regularizer: rztdl.dl.helpers.regularizers.Regularizer = None, kernel_constraint: rztdl.dl.helpers.constraints.Constraint = None, recurrent_constraint: rztdl.dl.helpers.constraints.Constraint = None, bias_constraint: rztdl.dl.helpers.constraints.Constraint = None, input_dropout: float = 0.0, dropout_rate: float = 0.0, recurrent_dropout: float = 0.0, return_sequences: bool = False, inputs: typing.Union[str, tensorflow.python.framework.ops.Tensor] = None, outputs: str = None, hidden_state: str = None)[source]

Bases: tensorflow.python.keras.layers.recurrent.SimpleRNN, rztdl.dl.components.layers.layer.Layer

RNN Layer

Parameters:
  • name (str) – name of component
  • units (int) – Positive integer, dimensionality of the output space.
  • activation (Activation) – Activation function to use (defaults to Tanh). If None is passed, no activation is applied
  • use_bias (bool) – Boolean, whether the layer uses a bias vector
  • kernel_initializer (Initializer) – Initializer for the kernel weights matrix
  • bias_initializer (Initializer) – Initializer for the bias vector
  • recurrent_initializer (Initializer) – Initializer for the Recurrent Weight Matrix
  • dropout_rate (float) – Fraction of the units to drop for the linear transformation of the output.
  • input_dropout (float) – Fraction of the units to drop for the linear transformation of the inputs.
  • recurrent_dropout (float) – Fraction of the units to drop for the linear transformation of the recurrent state.
  • return_sequences (bool) – Whether to return the last output in the output sequence, or the full sequence.
  • kernel_regularizer (Optional[Regularizer]) – Regularizer function applied to the kernel weights matrix
  • bias_regularizer (Optional[Regularizer]) – Regularizer function applied to the bias vector
  • activity_regularizer (Optional[Regularizer]) – Regularizer function applied to the output of the layer (its “activation”)
  • recurrent_regularizer (Optional[Regularizer]) – Regularizer function applied to the recurrent_kernel weights matrix.
  • kernel_constraint (Optional[Constraint]) – Constraint function applied to the kernel matrix
  • bias_constraint (Optional[Constraint]) – Constraint function applied to the bias vector.
  • recurrent_constraint (Optional[Constraint]) – Constraint function applied to the recurrent_kernel weights matrix
  • inputs (Union[str, Tensor, None]) – Input Tensor
  • outputs (Optional[str]) – Output Tensor
  • hidden_state (Optional[str]) – Last Hidden State Output
classmethod component_blueprint()[source]
create(inputs)[source]
parameter_validation(dropout_rate, recurrent_dropout, input_dropout)[source]
validate(inputs)[source]
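The return_sequences flag above decides whether the layer emits every timestep's hidden state or only the final one. A minimal pure-Python sketch (toy one-unit cell with assumed unit weights, not the rztdl API):

```python
import math

def run_rnn(seq, return_sequences=False):
    """Unroll a one-unit RNN; return every state or only the last one."""
    h, states = 0.0, []
    for x in seq:
        h = math.tanh(x + h)      # toy cell: unit weights, zero bias
        states.append(h)
    return states if return_sequences else states[-1]

seq = [0.5, -0.5, 1.0]
full = run_rnn(seq, return_sequences=True)    # one state per timestep
last = run_rnn(seq)                           # only the final state
```

Returning the full sequence is what lets a downstream recurrent layer consume per-timestep states, which is why stacked recurrent layers typically need return_sequences=True on every layer but the last.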

rztdl.dl.components.layers.recurrent.stacked_rnn module

@created on: 12/28/19,
@author: Prathyush SP,
@version: v0.0.1

Description:

Sphinx Documentation Status:

class rztdl.dl.components.layers.recurrent.stacked_rnn.StackedRNN(name: str, cells: typing.List[rztdl.dl.components.layers.recurrent.cells.Cells], inputs: typing.Union[str, tensorflow.python.framework.ops.Tensor] = None, outputs: str = None, hidden_state: str = None, cell_state: str = None, return_sequences: bool = False)[source]

Bases: tensorflow.python.keras.layers.recurrent.RNN, rztdl.dl.components.layers.layer.Layer

Stacked RNN Layer

Parameters:
  • name (str) – name of component
  • cells (List[Cells]) – Recurrent Cells
  • inputs (Union[str, Tensor, None]) – Input Tensor
  • outputs (Optional[str]) – Output Tensor
  • hidden_state (Optional[str]) – Last hidden state output
  • cell_state (Optional[str]) – Last Cell state output
  • return_sequences (bool) – Whether to return the full sequence of hidden states or only the last output
classmethod component_blueprint()[source]
create(inputs)[source]
parameter_validation(cells, hidden_state, cell_state)[source]
validate(inputs)[source]
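In a stacked recurrent layer, each cell's output at a timestep becomes the input of the next cell in the stack. A pure-Python sketch of one timestep through a stack of toy one-unit cells (weights assumed for illustration, not the rztdl API):

```python
import math

def stacked_step(x, hs, weights):
    """One timestep through a stack of toy one-unit cells.

    hs: previous hidden state per cell; weights: (input, recurrent) per cell.
    Each cell's output is the next cell's input."""
    new_hs = []
    inp = x
    for h, (w, u) in zip(hs, weights):
        h_new = math.tanh(w * inp + u * h)
        new_hs.append(h_new)
        inp = h_new                  # feed upward through the stack
    return new_hs

weights = [(1.0, 0.5), (1.0, 0.5)]   # two stacked cells
hs = [0.0, 0.0]
for x in [1.0, 0.0]:
    hs = stacked_step(x, hs, weights)
# hs[-1] is the top cell's hidden state (the layer's output)
```

This matches the cells parameter above taking a List[Cells]: the list order fixes the stacking order, and the hidden_state/cell_state outputs refer to the last cell in the list.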

Module contents