berry.activations

Nonlinear activations applied after each layer.

Nonlinearities give neural networks their expressive power. A stack of purely linear layers collapses to a single linear map, but inserting a nonlinearity after each layer lets a deep network learn highly complex, non-linear functions.

The supported nonlinearities are:

  • relu
  • softplus
  • sigmoid
  • tanh
  • softmax
  • linear
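The standard mathematical definitions of the listed activations can be sketched in NumPy as follows. This is an illustrative sketch, not berry's actual implementation; the function names simply mirror the keys above.

```python
import numpy as np

def relu(x):
    # Zero out negative inputs, pass positives through.
    return np.maximum(0.0, x)

def softplus(x):
    # Smooth approximation of relu: log(1 + e^x).
    return np.log1p(np.exp(x))

def sigmoid(x):
    # Squash inputs into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squash inputs into (-1, 1).
    return np.tanh(x)

def softmax(x):
    # Normalize a vector into a probability distribution.
    e = np.exp(x - np.max(x))  # shift by the max for numerical stability
    return e / e.sum()

def linear(x):
    # Identity: no nonlinearity applied.
    return x
```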

Helper function

braid.berry.activations.get_activation(key)

Helper function to retrieve the appropriate activation function.

Parameters

key : string
    Name of the activation type: "relu", "sigmoid", etc.

Returns

function
    The activation function corresponding to the given key.
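A helper like this is typically a dictionary lookup from key to function. The sketch below is a hypothetical implementation, not berry's actual code; the `_ACTIVATIONS` table and the `ValueError` on an unknown key are assumptions for illustration.

```python
import numpy as np

# Hypothetical registry mapping key strings to activation functions.
_ACTIVATIONS = {
    "relu": lambda x: np.maximum(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh": np.tanh,
    "linear": lambda x: x,
}

def get_activation(key):
    """Return the activation function registered under `key`."""
    try:
        return _ACTIVATIONS[key]
    except KeyError:
        # Assumed error behavior: reject unrecognized keys loudly.
        raise ValueError("Unknown activation: %r" % key)

# Usage: look up an activation by name, then apply it.
f = get_activation("relu")
out = f(np.array([-2.0, 3.0]))
```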