Interface IBayesianLayer<T>

Namespace: AiDotNet.UncertaintyQuantification.Interfaces
Assembly: AiDotNet.dll

Defines the contract for Bayesian neural network layers that support probabilistic inference.

public interface IBayesianLayer<T>

Type Parameters

T

The numeric type used for computations (e.g., float, double).

Remarks

For Beginners: A Bayesian layer is different from a regular neural network layer because instead of having fixed weights, it has distributions over weights.

Think of regular weights as saying "the connection strength is exactly 2.5", while Bayesian weights say "the connection strength is probably around 2.5, but could be anywhere from 2.0 to 3.0".

This probabilistic approach allows the network to express uncertainty in its predictions, which is crucial for safety-critical applications.
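To make this concrete, here is a minimal sketch of sampling one Bayesian weight, assuming a Gaussian distribution over the weight. The variable names and the Box-Muller sampling step are illustrative, not part of the AiDotNet API.

// Illustrative sketch only: a regular layer stores one number per weight,
// while a Bayesian layer stores a distribution over each weight
// (assumed Gaussian here; these fields are hypothetical).
double weightMean = 2.5;     // "the connection strength is probably around 2.5"
double weightStdDev = 0.25;  // how unsure we are about that value

// Drawing a concrete weight via the reparameterization trick:
// w = mean + stdDev * epsilon, where epsilon ~ N(0, 1).
var random = new Random();
double u1 = 1.0 - random.NextDouble();  // in (0, 1], avoids Log(0)
double u2 = random.NextDouble();
double epsilon = Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Cos(2.0 * Math.PI * u2);
double sampledWeight = weightMean + weightStdDev * epsilon;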

Methods

AddKLDivergenceGradients(T)

Adds the KL divergence gradients (the regularization term) to the layer's accumulated gradients.

void AddKLDivergenceGradients(T klScale)

Parameters

klScale T

Scaling applied to the KL term (e.g., 1/N, where N is the dataset size).
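A hedged sketch of how this might be called inside a training step; the bayesianLayers collection and the dataset size are assumptions supplied by the surrounding training code, not part of this interface.

// Hypothetical mini-batch training step. Scaling the KL term by 1/N
// (N = number of training examples) keeps the regularization strength
// independent of dataset size.
int datasetSize = 60000;
double klScale = 1.0 / datasetSize;
foreach (IBayesianLayer<double> layer in bayesianLayers)
{
    // Folds the KL gradients into whatever data-loss gradients
    // the layer has already accumulated this step.
    layer.AddKLDivergenceGradients(klScale);
}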

GetKLDivergence()

Gets the KL divergence term for variational inference.

T GetKLDivergence()

Returns

T

The KL divergence value.

Remarks

For Beginners: KL divergence measures how different the learned weight distribution is from a simple baseline distribution (called the prior).

This is used during training to prevent the model from becoming too confident or too uncertain. Think of it as a "regularization penalty" that keeps the weight distributions reasonable.
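In variational inference, this KL term enters the training loss (the negative ELBO) alongside the data-fit term. A minimal sketch, assuming dataLoss, datasetSize, and a bayesianLayers collection exist in the surrounding training code:

// Variational loss = data fit + scaled KL penalty, with the KL term
// summed over all Bayesian layers in the network.
double klTotal = 0.0;
foreach (IBayesianLayer<double> layer in bayesianLayers)
{
    klTotal += layer.GetKLDivergence();
}
double totalLoss = dataLoss + (1.0 / datasetSize) * klTotal;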

SampleWeights()

Samples from the weight distribution for stochastic forward passes.

void SampleWeights()

Remarks

For Beginners: This method randomly draws a set of weights from the learned probability distribution. Each call can produce different weights, reflecting the model's uncertainty about what the true weights should be.
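Because each call can yield different weights, repeated forward passes produce a spread of predictions that can be used to estimate uncertainty. A Monte Carlo sketch follows; Forward is a hypothetical prediction method (this interface does not declare one), and layer and input are assumed to come from the surrounding code.

// Monte Carlo uncertainty estimate: resample weights, run several
// forward passes, and measure the spread of the predictions.
int numSamples = 30;
double sum = 0.0, sumSquares = 0.0;
for (int i = 0; i < numSamples; i++)
{
    layer.SampleWeights();              // fresh draw from the weight distribution
    double prediction = Forward(input); // hypothetical forward pass with the sampled weights
    sum += prediction;
    sumSquares += prediction * prediction;
}
double mean = sum / numSamples;
double variance = sumSquares / numSamples - mean * mean; // spread reflects model uncertainty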