Enum ActivationFunctionRole
Defines the functional roles of activation functions in neural networks.
public enum ActivationFunctionRole
Fields
Attention = 3
Used for attention mechanisms. Commonly Softmax in Transformers, NTM, etc.
Cell = 4
Used for memory cell state updates. Commonly Tanh in LSTM and memory networks.
Gate = 2
Used for gate mechanisms that control information flow. Commonly Sigmoid in LSTM, GRU, and NTM gates.
Hidden = 0
Used for standard hidden layer activations. Commonly ReLU, Tanh, etc.
Normalization = 5
Used for normalization functions. Used in Layer Normalization, Batch Normalization, etc.
Output = 1
Used for output layer activations. Commonly Softmax (classification), Linear/Identity (regression), Sigmoid (binary classification).
Probability = 6
Used for probability distributions. Commonly Softmax or Sigmoid.
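Assembling the fields above, the full declaration would look like the following sketch. The member names and numeric values are taken directly from the field list; the inline comments paraphrase the descriptions.

public enum ActivationFunctionRole
{
    Hidden = 0,        // standard hidden layer activations (ReLU, Tanh, etc.)
    Output = 1,        // output layer activations (Softmax, Linear/Identity, Sigmoid)
    Gate = 2,          // gating mechanisms controlling information flow (Sigmoid)
    Attention = 3,     // attention mechanisms (Softmax in Transformers, NTM)
    Cell = 4,          // memory cell state updates (Tanh in LSTM)
    Normalization = 5, // normalization functions (Layer/Batch Normalization)
    Probability = 6    // probability distributions (Softmax or Sigmoid)
}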
Remarks
Different parts of neural networks typically require different activation behaviors. This enum categorizes activation functions by their role rather than by their mathematical form.
For Beginners: This enum helps organize the different "jobs" that activation functions perform in neural networks, similar to how different workers in a factory have different specialized roles.
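As an illustration of role-based categorization, a framework could map each role to a conventional default activation. The helper below is a hypothetical sketch, not part of this API; the method name DefaultActivationFor and the returned activation names are illustrative assumptions drawn from the field descriptions above.

// Hypothetical helper (not part of this API): picks a conventional
// default activation for each role, following the field descriptions.
public static string DefaultActivationFor(ActivationFunctionRole role) => role switch
{
    ActivationFunctionRole.Hidden        => "ReLU",      // standard hidden layers
    ActivationFunctionRole.Output        => "Softmax",   // e.g. multi-class classification
    ActivationFunctionRole.Gate          => "Sigmoid",   // LSTM/GRU/NTM gates
    ActivationFunctionRole.Attention     => "Softmax",   // attention weights
    ActivationFunctionRole.Cell          => "Tanh",      // memory cell state updates
    ActivationFunctionRole.Normalization => "LayerNorm", // layer/batch normalization
    ActivationFunctionRole.Probability   => "Softmax",   // probability distributions
    _                                    => "Identity"   // fallback for unknown roles
};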