pyml.neural_network.layer.activation.tanh.Tanh#
- class Tanh[source]#
Bases: _Activation
Tanh activation function.
The tanh function is defined as:
\(\tanh x = \dfrac{\sinh x}{\cosh x} = \dfrac{e^{x}-e^{-x}}{e^{x}+e^{-x}} = \dfrac{e^{2x}-1}{e^{2x}+1} = 1-\dfrac{2}{e^{2x}+1}\).
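The four formulations above are algebraically equivalent; a minimal numpy check (independent of the `Tanh` class itself) confirms they agree with `np.tanh`:

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 5)

# Three equivalent formulations of tanh from the definition above.
a = np.sinh(x) / np.cosh(x)                                  # sinh/cosh form
b = (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))      # exponential form
c = 1 - 2 / (np.exp(2 * x) + 1)                              # logistic-style form

assert np.allclose(a, np.tanh(x))
assert np.allclose(b, np.tanh(x))
assert np.allclose(c, np.tanh(x))
```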
Methods

__init__
backward(dvalues) – Computes the backward step.
forward(inputs) – Computes a forward pass.
predictions() – Converts the calculated output into actual predictions.
set_adjacent_layers(previous_layer, next_layer) – Sets the adjacent layers, which the model needs to iterate through the layers.
- backward(dvalues)[source]#
Computes the backward step.
The derivative of the tanh function is:
\(\dfrac{d}{dx}\tanh x = 1-\tanh^{2}x = \dfrac{1}{\cosh^{2}x}\).
- Return type:
- Parameters:
dvalues (numpy.ndarray) – Derived gradient from the previous layers (reversed order).
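A minimal sketch of what this backward step computes, assuming (as is common for activation layers) that the layer caches its forward output and multiplies the incoming gradient by the local derivative \(1-\tanh^{2}x\); the function name and `output` argument are illustrative, not the class's actual internals:

```python
import numpy as np

def tanh_backward(dvalues, output):
    # d(tanh x)/dx = 1 - tanh^2(x). `output` is the cached tanh of the
    # forward inputs, so the local gradient is simply 1 - output**2,
    # applied element-wise to the upstream gradient `dvalues`.
    return dvalues * (1.0 - output ** 2)

x = np.array([[-1.0, 0.0, 1.0]])
out = np.tanh(x)
grad = tanh_backward(np.ones_like(out), out)
# The derivative peaks at 1 for x = 0 and shrinks toward 0 as |x| grows.
```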
- forward(inputs)[source]#
Computes a forward pass.
- Return type:
- Parameters:
inputs (numpy.ndarray) – Input values from previous neural layer.
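Functionally, the forward pass applies tanh element-wise to the incoming batch. A standalone sketch (the real method presumably also caches the inputs and output on the instance for use in `backward`):

```python
import numpy as np

def tanh_forward(inputs):
    # Element-wise tanh of the previous layer's outputs; values are
    # squashed into the open interval (-1, 1).
    return np.tanh(inputs)

batch = np.array([[0.5, -0.5], [2.0, -2.0]])
out = tanh_forward(batch)
```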
- abstract predictions()#
Converts the calculated output into actual predictions.
- set_adjacent_layers(previous_layer, next_layer)#
Sets the adjacent layers, which the model needs to iterate through the layers.
- Parameters:
previous_layer (_Layer) – Layer that is previous to this layer.
next_layer (_Layer) – Layer that is subsequent to this layer.
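The wiring this method performs amounts to a doubly linked list of layers, so the model can walk forward through `next_layer` references and backward through `previous_layer` references. A minimal sketch with a hypothetical stand-in class (the real `_Layer` base class carries more state):

```python
class DemoLayer:
    """Hypothetical stand-in for _Layer, for illustration only."""

    def __init__(self, name):
        self.name = name
        self.previous_layer = None
        self.next_layer = None

    def set_adjacent_layers(self, previous_layer, next_layer):
        # Store both neighbours so the model can traverse the chain
        # in forward order (next_layer) and reversed order (previous_layer).
        self.previous_layer = previous_layer
        self.next_layer = next_layer

dense = DemoLayer("dense")
act = DemoLayer("tanh")
act.set_adjacent_layers(previous_layer=dense, next_layer=None)
```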