Class MCDropoutLayer<T>
- Namespace
- AiDotNet.UncertaintyQuantification.Layers
- Assembly
- AiDotNet.dll
Implements Monte Carlo Dropout layer for uncertainty estimation in neural networks.
public class MCDropoutLayer<T> : LayerBase<T>, ILayer<T>, IJitCompilable<T>, IDiagnosticsProvider, IWeightLoadable<T>, IDisposable
Type Parameters
T
The numeric type used for computations (e.g., float, double).
- Inheritance
- LayerBase<T> → MCDropoutLayer<T>
- Implements
- ILayer<T>, IJitCompilable<T>, IDiagnosticsProvider, IWeightLoadable<T>, IDisposable
Remarks
For Beginners: Monte Carlo Dropout is a simple yet powerful technique for estimating uncertainty.
Unlike regular dropout which is only active during training, MC Dropout keeps dropout active during prediction as well. By running multiple predictions with different random dropout masks, we get a distribution of predictions. The spread of this distribution tells us how uncertain the model is.
Think of it like asking multiple slightly different versions of the same expert for their opinion. If they all agree, you can be confident. If they disagree widely, there's high uncertainty.
This is particularly useful for:
- Detecting out-of-distribution samples
- Active learning (selecting which data to label next)
- Safety-critical applications (knowing when to defer to a human expert)
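The idea above can be sketched with the API documented on this page. The `input` tensor, the scalar indexer on the output, and the surrounding network wiring are illustrative assumptions, not AiDotNet helpers:

```csharp
using System.Collections.Generic;
using System.Linq;
using AiDotNet.UncertaintyQuantification.Layers;

// MCDropoutLayer<T> and Forward are from this page's API;
// "input" and the output indexing are hypothetical.
var dropout = new MCDropoutLayer<double>(dropoutRate: 0.2, mcMode: true);

const int samples = 50;
var predictions = new List<double>();
for (int i = 0; i < samples; i++)
{
    // Each pass draws a fresh random dropout mask, so outputs vary.
    Tensor<double> output = dropout.Forward(input);
    predictions.Add(output[0]);
}

double mean = predictions.Average();
double variance = predictions
    .Sum(p => (p - mean) * (p - mean)) / (samples - 1);
// Small variance: the "experts" agree. Large variance: high uncertainty.
```

The sample mean serves as the prediction and the sample variance as the uncertainty estimate; more passes give a smoother estimate at the cost of more compute.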
Constructors
MCDropoutLayer(double, bool, int?)
Initializes a new instance of the MCDropoutLayer class.
public MCDropoutLayer(double dropoutRate = 0.5, bool mcMode = false, int? randomSeed = null)
Parameters
dropoutRate double
The probability of dropping out a neuron (between 0 and 1).
mcMode bool
Whether to enable Monte Carlo mode by default.
randomSeed int?
Optional seed for the random number generator, for reproducible dropout masks.
Exceptions
- ArgumentException
Thrown when dropout rate is not between 0 and 1.
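A brief usage sketch of the constructor as documented above; treating the seed as yielding reproducible dropout masks is an assumption:

```csharp
using AiDotNet.UncertaintyQuantification.Layers;

// Regular dropout: active only during training.
var standard = new MCDropoutLayer<float>(dropoutRate: 0.5);

// MC mode enabled from the start, with a fixed seed (assumed to make
// the dropout masks reproducible across runs).
var mc = new MCDropoutLayer<float>(dropoutRate: 0.3, mcMode: true, randomSeed: 42);

// A rate outside [0, 1] throws ArgumentException:
// var invalid = new MCDropoutLayer<float>(dropoutRate: 1.5);
```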
Properties
MonteCarloMode
Gets or sets whether Monte Carlo mode is enabled (applies dropout during inference).
public bool MonteCarloMode { get; set; }
Property Value
- bool
Remarks
For Beginners: When MC mode is on, dropout is applied even during prediction, allowing you to estimate uncertainty by running multiple forward passes.
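Toggling the property looks like this; whether a constructed layer starts in MC mode depends on the `mcMode` constructor argument:

```csharp
using AiDotNet.UncertaintyQuantification.Layers;

var layer = new MCDropoutLayer<double>(dropoutRate: 0.1);

// Deterministic prediction: dropout is inactive outside of training.
layer.MonteCarloMode = false;

// Enable MC mode to sample from the predictive distribution;
// run several Forward passes and inspect their spread.
layer.MonteCarloMode = true;
```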
SupportsJitCompilation
Gets whether this layer supports JIT compilation.
public override bool SupportsJitCompilation { get; }
Property Value
- bool
True if the layer can be JIT compiled, false otherwise.
Remarks
This property indicates whether the layer has implemented ExportComputationGraph() and can benefit from JIT compilation. All layers MUST implement this property.
For Beginners: JIT compilation can make inference 5-10x faster by converting the layer's operations into optimized native code.
Layers should return false if they:
- Have not yet implemented a working ExportComputationGraph()
- Use dynamic operations that change based on input data
- Are too simple to benefit from JIT compilation
When false, the layer will use the standard Forward() method instead.
SupportsTraining
Gets a value indicating whether this layer supports training mode.
public override bool SupportsTraining { get; }
Property Value
- bool
Methods
Backward(Tensor<T>)
Performs the backward pass of the MC dropout layer.
public override Tensor<T> Backward(Tensor<T> outputGradient)
Parameters
outputGradient Tensor<T>
The gradient from the next layer.
Returns
- Tensor<T>
The gradient to pass to the previous layer.
Clone()
Creates a copy of this layer.
public override LayerBase<T> Clone()
Returns
- LayerBase<T>
A new instance of the layer with the same configuration.
Remarks
This method creates a shallow copy of the layer with deep copies of the input/output shapes and activation functions. Derived classes should override this method to properly copy any additional fields they define.
For Beginners: This method creates a duplicate of this layer.
When copying a layer:
- Basic properties like shapes are duplicated
- Activation functions are cloned
- The new layer works independently from the original
This is useful for:
- Creating similar layers with small variations
- Implementing complex network architectures with repeated patterns
- Saving a layer's state before making changes
ExportComputationGraph(List<ComputationNode<T>>)
Exports the layer's computation graph for JIT compilation.
public override ComputationNode<T> ExportComputationGraph(List<ComputationNode<T>> inputNodes)
Parameters
inputNodes List<ComputationNode<T>>
The list to populate with input computation nodes.
Returns
- ComputationNode<T>
The output computation node representing the layer's operation.
Remarks
This method constructs a computation graph representation of the layer's forward pass that can be JIT compiled for faster inference. All layers MUST implement this method to support JIT compilation.
For Beginners: JIT (Just-In-Time) compilation converts the layer's operations into optimized native code for 5-10x faster inference.
To support JIT compilation, a layer must:
- Implement this method to export its computation graph
- Set SupportsJitCompilation to true
- Use ComputationNode and TensorOperations to build the graph
All layers are required to implement this method, even if they set SupportsJitCompilation = false.
Forward(Tensor<T>)
Performs the forward pass of the MC dropout layer.
public override Tensor<T> Forward(Tensor<T> input)
Parameters
input Tensor<T>
The input tensor.
Returns
- Tensor<T>
The output tensor with dropout applied if in training or MC mode.
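A sketch of the two behaviors described above; the `input` tensor is assumed, and the pass-through behavior outside training/MC mode is inferred from the return description:

```csharp
using AiDotNet.UncertaintyQuantification.Layers;

var layer = new MCDropoutLayer<double>(dropoutRate: 0.5, mcMode: true);

// With MC mode on, repeated calls on the same input use different
// random masks, so the outputs generally differ.
Tensor<double> first = layer.Forward(input);   // "input" is assumed
Tensor<double> second = layer.Forward(input);

// With MC mode off (and not training), dropout is skipped and the
// input should pass through unchanged.
layer.MonteCarloMode = false;
Tensor<double> passthrough = layer.Forward(input);
```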
GetParameters()
Gets the trainable parameters (empty for dropout layers).
public override Vector<T> GetParameters()
Returns
- Vector<T>
An empty vector, since dropout layers have no trainable parameters.
ResetState()
Resets the internal state of the layer.
public override void ResetState()
SetParameters(Vector<T>)
Sets the trainable parameters (no-op for dropout layers).
public override void SetParameters(Vector<T> parameters)
Parameters
parameters Vector<T>
The parameters to set (ignored; dropout layers have no trainable parameters).
UpdateParameters(T)
Updates the parameters (no-op for dropout layers).
public override void UpdateParameters(T learningRate)
Parameters
learningRate T
The learning rate (unused; dropout layers have no trainable parameters).