pyml.neural_network.layer.activation
Implementation of various activation functions.
| Class | Description |
| --- | --- |
| `Linear` | Linear activation function, used for regression tasks as the final component of the network. |
| `ReLU` | Activation function used mainly in hidden layers. |
| `Sigmoid` | Activation function used for binary classification problems as the final component of the network; alias for the logistic function. |
| `Softmax` | Activation function used for multiclass classification problems as the final component of the network. |
| `Tanh` | Activation function mainly used for classification between two classes. |
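
A minimal NumPy sketch of the forward computations these classes correspond to; the standalone function names below are illustrative, not pyml's actual layer API.

```python
import numpy as np

def linear(z):
    # Identity: passes pre-activations through unchanged (regression outputs).
    return z

def relu(z):
    # Zeroes out negative pre-activations; common choice for hidden layers.
    return np.maximum(0.0, z)

def sigmoid(z):
    # Logistic function: squashes each value into (0, 1) (binary classification).
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Row-wise normalized exponentials summing to 1 (multiclass classification).
    # Subtracting the row max keeps np.exp numerically stable.
    shifted = z - np.max(z, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

def tanh(z):
    # Hyperbolic tangent: squashes each value into (-1, 1).
    return np.tanh(z)

if __name__ == "__main__":
    z = np.array([[1.0, -2.0, 0.5]])
    print(softmax(z))  # approximately [[0.60, 0.03, 0.37]], rows sum to 1
```

In practice the output activation is paired with a matching loss: `Sigmoid` with binary cross-entropy, `Softmax` with categorical cross-entropy, and `Linear` with a regression loss such as mean squared error.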