Enum ActivationFunctionRole

Namespace
AiDotNet.Enums
Assembly
AiDotNet.dll

Defines the functional roles of activation functions in neural networks.

public enum ActivationFunctionRole

Fields

Attention = 3

Used for attention mechanisms.

Commonly Softmax, as in Transformers and Neural Turing Machines (NTMs).

Cell = 4

Used for memory cell state updates.

Commonly Tanh in LSTM and memory networks.

Gate = 2

Used for gate mechanisms that control information flow.

Commonly Sigmoid in LSTM, GRU, and NTM gates.

Hidden = 0

Used for standard hidden layer activations.

Commonly ReLU, Tanh, etc.

Normalization = 5

Used for normalization functions.

Applied in Layer Normalization, Batch Normalization, etc.

Output = 1

Used for output layer activations.

Commonly Softmax (multi-class classification), Linear/Identity (regression), or Sigmoid (binary classification).

Probability = 6

Used for probability distributions.

Commonly Softmax or Sigmoid.

Remarks

Different parts of neural networks typically require different activation behaviors. This enum categorizes activation functions by their role rather than by their mathematical form.

For Beginners: This enum helps organize the different "jobs" that activation functions perform in neural networks, similar to how different workers in a factory have different specialized roles.
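As a minimal sketch of how this categorization can be used, the example below maps each role to an activation commonly used in that position, following the field descriptions above. The enum is re-declared here (with the same values) so the snippet compiles without referencing AiDotNet.dll, and the `DefaultActivationFor` helper is a hypothetical illustration, not part of the AiDotNet API.

```csharp
// Re-declaration of the enum for a standalone sketch; values match the
// ActivationFunctionRole fields documented above.
public enum ActivationFunctionRole
{
    Hidden = 0,
    Output = 1,
    Gate = 2,
    Attention = 3,
    Cell = 4,
    Normalization = 5,
    Probability = 6
}

public static class ActivationDefaults
{
    // Hypothetical helper (not part of the AiDotNet API): returns the name of
    // an activation commonly paired with each role.
    public static string DefaultActivationFor(ActivationFunctionRole role) => role switch
    {
        ActivationFunctionRole.Hidden      => "ReLU",    // standard hidden layers
        ActivationFunctionRole.Output      => "Softmax", // classification output
        ActivationFunctionRole.Gate        => "Sigmoid", // LSTM/GRU/NTM gates
        ActivationFunctionRole.Attention   => "Softmax", // attention weights
        ActivationFunctionRole.Cell        => "Tanh",    // LSTM cell state updates
        ActivationFunctionRole.Probability => "Softmax", // probability distributions
        _                                  => "Identity" // e.g. Normalization
    };
}
```

A layer constructor could accept a role and fall back to such a default when no explicit activation is supplied, which keeps gate, cell, and output activations consistent across recurrent and Transformer-style architectures.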