
Class MCDropoutLayer<T>

Namespace
AiDotNet.UncertaintyQuantification.Layers
Assembly
AiDotNet.dll

Implements a Monte Carlo Dropout layer for uncertainty estimation in neural networks.

public class MCDropoutLayer<T> : LayerBase<T>, ILayer<T>, IJitCompilable<T>, IDiagnosticsProvider, IWeightLoadable<T>, IDisposable

Type Parameters

T

The numeric type used for computations (e.g., float, double).

Inheritance
LayerBase<T> → MCDropoutLayer<T>

Implements
ILayer<T>, IJitCompilable<T>, IDiagnosticsProvider, IWeightLoadable<T>, IDisposable

Remarks

For Beginners: Monte Carlo Dropout is a simple yet powerful technique for estimating uncertainty.

Unlike regular dropout, which is only active during training, MC Dropout keeps dropout active during prediction as well. By running multiple predictions with different random dropout masks, we get a distribution of predictions. The spread of this distribution tells us how uncertain the model is.

Think of it like asking multiple slightly different versions of the same expert for their opinion. If they all agree, you can be confident. If they disagree widely, there's high uncertainty.

This is particularly useful for:

  • Detecting out-of-distribution samples
  • Active learning (selecting which data to label next)
  • Safety-critical applications (knowing when to defer to a human expert)
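
Example

For instance, the loop below runs the same input through the layer many times with dropout active and summarizes the spread of the outputs. This is a minimal sketch: input is assumed to be a Tensor<double> built by your data pipeline, and ReadPrediction is a hypothetical helper standing in for however your code extracts a scalar prediction from an output tensor.

using System;
using System.Collections.Generic;
using System.Linq;
using AiDotNet.UncertaintyQuantification.Layers;

// A sketch of MC Dropout uncertainty estimation.
var layer = new MCDropoutLayer<double>(dropoutRate: 0.2, mcMode: true);

const int numSamples = 50;
var predictions = new List<double>(numSamples);

for (int i = 0; i < numSamples; i++)
{
    // Each forward pass draws a fresh random dropout mask,
    // so the outputs differ from run to run.
    var output = layer.Forward(input);       // input: Tensor<double> (assumed)
    predictions.Add(ReadPrediction(output)); // ReadPrediction: hypothetical helper
}

// The mean is the point estimate; the spread measures uncertainty.
double mean = predictions.Average();
double stdDev = Math.Sqrt(predictions.Sum(p => (p - mean) * (p - mean)) / numSamples);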

Constructors

MCDropoutLayer(double, bool, int?)

Initializes a new instance of the MCDropoutLayer class.

public MCDropoutLayer(double dropoutRate = 0.5, bool mcMode = false, int? randomSeed = null)

Parameters

dropoutRate double

The probability of dropping out a neuron (between 0 and 1).

mcMode bool

Whether to enable Monte Carlo mode by default.

randomSeed int?

Optional seed for the random number generator. Supplying a value makes the sequence of dropout masks reproducible.

Exceptions

ArgumentException

Thrown when the dropout rate is not between 0 and 1.
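
Example

A few construction sketches using the signature above; the reproducibility note assumes the seed fully determines the dropout masks, which is the usual purpose of a seed parameter.

// Defaults: 50% dropout, MC mode off.
var standard = new MCDropoutLayer<float>();

// 20% dropout, MC mode on from the start, and a fixed seed so the
// sequence of dropout masks is reproducible across runs (assumed).
var reproducible = new MCDropoutLayer<float>(dropoutRate: 0.2, mcMode: true, randomSeed: 42);

// Throws ArgumentException: the dropout rate must be between 0 and 1.
// var invalid = new MCDropoutLayer<float>(dropoutRate: 1.5);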

Properties

MonteCarloMode

Gets or sets whether Monte Carlo mode is enabled (applies dropout during inference).

public bool MonteCarloMode { get; set; }

Property Value

bool

Remarks

For Beginners: When MC mode is on, dropout is applied even during prediction, allowing you to estimate uncertainty by running multiple forward passes.
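
Example

A sketch of switching between deterministic and stochastic inference; input is assumed to be a prepared Tensor<double>.

var layer = new MCDropoutLayer<double>(dropoutRate: 0.1);

// Standard inference: dropout is skipped and the output is deterministic.
layer.MonteCarloMode = false;
var pointEstimate = layer.Forward(input);

// MC inference: dropout stays active, so repeated calls on the same
// input yield different samples from the predictive distribution.
layer.MonteCarloMode = true;
var sample1 = layer.Forward(input);
var sample2 = layer.Forward(input); // differs from sample1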

SupportsJitCompilation

Gets whether this layer supports JIT compilation.

public override bool SupportsJitCompilation { get; }

Property Value

bool

True if the layer can be JIT compiled, false otherwise.

Remarks

This property indicates whether the layer has implemented ExportComputationGraph() and can benefit from JIT compilation. All layers MUST implement this property.

For Beginners: JIT compilation can make inference 5-10x faster by converting the layer's operations into optimized native code.

Layers should return false if they:

  • Have not yet implemented a working ExportComputationGraph()
  • Use dynamic operations that change based on input data
  • Are too simple to benefit from JIT compilation

When false, the layer will use the standard Forward() method instead.
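
Example

A sketch of the typical guard; the JIT pipeline itself is elided because its entry point depends on your setup, and layer and input are assumed to be a prepared LayerBase<double> and tensor.

if (layer.SupportsJitCompilation)
{
    // Export the computation graph and hand it to the JIT pipeline
    // (see ExportComputationGraph below).
}
else
{
    // Fall back to the standard eager path.
    var output = layer.Forward(input);
}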

SupportsTraining

Gets a value indicating whether this layer supports training mode.

public override bool SupportsTraining { get; }

Property Value

bool

Methods

Backward(Tensor<T>)

Performs the backward pass of the MC dropout layer.

public override Tensor<T> Backward(Tensor<T> outputGradient)

Parameters

outputGradient Tensor<T>

The gradient from the next layer.

Returns

Tensor<T>

The gradient to pass to the previous layer.

Clone()

Creates a copy of this layer.

public override LayerBase<T> Clone()

Returns

LayerBase<T>

A new instance of the layer with the same configuration.

Remarks

This method creates a shallow copy of the layer with deep copies of the input/output shapes and activation functions. Derived classes should override this method to properly copy any additional fields they define.

For Beginners: This method creates a duplicate of this layer.

When copying a layer:

  • Basic properties like shapes are duplicated
  • Activation functions are cloned
  • The new layer works independently from the original

This is useful for:

  • Creating similar layers with small variations
  • Implementing complex network architectures with repeated patterns
  • Saving a layer's state before making changes
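
Example

For instance, cloning can snapshot a configured layer before experimenting with its settings. Clone() returns LayerBase<T>, so a cast recovers the concrete type.

var original = new MCDropoutLayer<double>(dropoutRate: 0.3, mcMode: true);
var snapshot = (MCDropoutLayer<double>)original.Clone();

// The copy is independent: changing it leaves the original intact.
snapshot.MonteCarloMode = false;
// original.MonteCarloMode is still true.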

ExportComputationGraph(List<ComputationNode<T>>)

Exports the layer's computation graph for JIT compilation.

public override ComputationNode<T> ExportComputationGraph(List<ComputationNode<T>> inputNodes)

Parameters

inputNodes List<ComputationNode<T>>

List to populate with input computation nodes.

Returns

ComputationNode<T>

The output computation node representing the layer's operation.

Remarks

This method constructs a computation graph representation of the layer's forward pass that can be JIT compiled for faster inference. All layers MUST implement this method to support JIT compilation.

For Beginners: JIT (Just-In-Time) compilation converts the layer's operations into optimized native code for 5-10x faster inference.

To support JIT compilation, a layer must:

  1. Implement this method to export its computation graph
  2. Set SupportsJitCompilation to true
  3. Use ComputationNode and TensorOperations to build the graph

All layers are required to implement this method, even if they set SupportsJitCompilation = false.
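
Example

A minimal sketch of the call pattern, using only the signature documented above; what you do with the returned node depends on your JIT pipeline.

using System.Collections.Generic;

var layer = new MCDropoutLayer<double>(dropoutRate: 0.1);

if (layer.SupportsJitCompilation)
{
    var inputNodes = new List<ComputationNode<double>>();
    ComputationNode<double> output = layer.ExportComputationGraph(inputNodes);
    // inputNodes is now populated with the graph's inputs, and
    // output is the root node representing the layer's forward pass.
}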

Forward(Tensor<T>)

Performs the forward pass of the MC dropout layer.

public override Tensor<T> Forward(Tensor<T> input)

Parameters

input Tensor<T>

The input tensor.

Returns

Tensor<T>

The output tensor with dropout applied if in training or MC mode.

GetParameters()

Gets the trainable parameters (empty for dropout layers).

public override Vector<T> GetParameters()

Returns

Vector<T>

An empty vector, since dropout layers have no trainable parameters.

ResetState()

Resets the internal state of the layer.

public override void ResetState()

SetParameters(Vector<T>)

Sets the trainable parameters (no-op for dropout layers).

public override void SetParameters(Vector<T> parameters)

Parameters

parameters Vector<T>

Ignored; dropout layers have no trainable parameters.

UpdateParameters(T)

Updates the parameters (no-op for dropout layers).

public override void UpdateParameters(T learningRate)

Parameters

learningRate T

The learning rate (ignored; dropout layers have no parameters to update).