pyml.neural_network.layer.activation.relu#
ReLU is an activation function used mainly in the hidden layers of neural networks. It passes positive inputs through unchanged and maps negative inputs to zero: ReLU(x) = max(0, x).

Classes

Rectified linear unit (ReLU activation function)
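As a minimal standalone sketch of the activation this module implements (using NumPy directly, since the `pyml` class interface is not shown here), the element-wise ReLU function can be written as:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: max(0, x), applied element-wise."""
    return np.maximum(0, x)

# Negative values are zeroed; positive values pass through unchanged.
activations = relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0]))
```

In a hidden layer, this would typically be applied to the layer's pre-activation outputs (weights times inputs plus bias) before passing them to the next layer.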