Activation functions
Activation functions are applied within neural network nodes. The following activation functions are supported:
- class nntoolkit.activation_functions.Sigmoid
The sigmoid function \(f(x) = 1/(1+e^{-x})\).
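As an illustration only (a standalone NumPy sketch, not nntoolkit's implementation), the sigmoid can be evaluated element-wise:

import numpy as np

def sigmoid(x):
    # Element-wise sigmoid f(x) = 1 / (1 + exp(-x)).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # approx. [0.119, 0.5, 0.881]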
- class nntoolkit.activation_functions.Softmax
The softmax function \(f(x)_i = e^{x_i} / \sum_{j} e^{x_j}\).
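As an illustration only (again a standalone NumPy sketch, not nntoolkit's implementation), the softmax is usually computed with the inputs shifted by their maximum for numerical stability:

import numpy as np

def softmax(x):
    # Softmax f(x)_i = exp(x_i) / sum_j exp(x_j); shifting by max(x) avoids overflow.
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

print(softmax(np.array([1.0, 2.0, 3.0])))  # approx. [0.090, 0.245, 0.665]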
- nntoolkit.activation_functions.get_activation_function(function_name)
Get an activation function object by its class name.
>>> get_activation_function('Sigmoid')
SigmoidFunction
Parameters:
- function_name (str) – Name of the activation function
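One plausible way such a name-based lookup could work (a hypothetical sketch; nntoolkit's actual implementation may differ) is to resolve the name against the module's attributes:

import inspect

def get_activation_function(function_name):
    # Look the class up by name in this module and return an instance of it.
    import nntoolkit.activation_functions as module
    cls = getattr(module, function_name, None)
    if cls is None or not inspect.isclass(cls):
        raise ValueError("Unknown activation function: %s" % function_name)
    return cls()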
- nntoolkit.activation_functions.get_class(name, config_key, module)
Get a class by its name, given as a string.
Parameters:
- name – name of the class
- config_key – name of the config key (if any) for the path to a plugin
- module – module in which to look for the class
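The lookup itself could be sketched as follows (hypothetical; the config_key handling for plugin paths is omitted and the real behaviour may differ):

import inspect

def get_class(name, config_key, module):
    # Return the class called `name` defined in (or imported into) `module`.
    for member_name, member in inspect.getmembers(module, inspect.isclass):
        if member_name == name:
            return member
    return None  # no class with that name was found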