pyml.neural_network.layer.activation

Implementations of various activation functions.

pyml.neural_network.layer.activation.linear

Linear (identity) activation function, typically used as the final component of the network for regression tasks.
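
As a rough sketch of the function this layer computes (plain NumPy, not the actual pyml class, whose interface may differ):

```python
import numpy as np

def linear(x: np.ndarray) -> np.ndarray:
    """Identity activation: returns inputs unchanged, so the
    output range is unrestricted (useful for regression)."""
    return x
```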

pyml.neural_network.layer.activation.relu

ReLU (rectified linear unit) activation function, used mainly in the hidden layers of the network.
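
A minimal NumPy sketch of the standard definition f(x) = max(0, x); the pyml layer presumably wraps this with forward/backward plumbing not shown here:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified Linear Unit: zeroes out negative inputs,
    passes positive inputs through unchanged."""
    return np.maximum(0.0, x)
```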

pyml.neural_network.layer.activation.sigmoid

Sigmoid activation function, typically used as the final component of the network for binary classification problems; alias for the logistic function.
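
A minimal NumPy sketch of the logistic function, assuming the standard definition σ(x) = 1 / (1 + e⁻ˣ), which squashes inputs into (0, 1) so the output can be read as a probability:

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    """Logistic function: maps any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))
```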

pyml.neural_network.layer.activation.softmax

Softmax activation function, typically used as the final component of the network for multiclass classification problems.
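
A minimal NumPy sketch of softmax over the class axis. Subtracting the row maximum before exponentiating is a common numerical-stability trick (it does not change the result); whether pyml does this internally is an assumption:

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Softmax over the last axis: exponentiates and normalizes
    so each row sums to 1, forming a class distribution."""
    shifted = x - np.max(x, axis=-1, keepdims=True)  # stability shift
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)
```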

pyml.neural_network.layer.activation.tanh

Tanh (hyperbolic tangent) activation function, mainly used for classification between two classes.
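
A minimal NumPy sketch of the tanh function, which squashes inputs into (-1, 1); it is a rescaled sigmoid, tanh(x) = 2σ(2x) - 1:

```python
import numpy as np

def tanh(x: np.ndarray) -> np.ndarray:
    """Hyperbolic tangent: maps any real input into (-1, 1),
    centered at zero."""
    return np.tanh(x)
```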