Class DenseNetConfiguration
- Namespace
- AiDotNet.Configuration
- Assembly
- AiDotNet.dll
Configuration options for DenseNet neural network architectures.
public class DenseNetConfiguration
- Inheritance
- object → DenseNetConfiguration
Remarks
DenseNet (Densely Connected Convolutional Networks) connects each layer to every other layer in a feed-forward fashion, enabling strong gradient flow and feature reuse.
For Beginners: DenseNet is designed to maximize information flow by connecting each layer directly to all subsequent layers. This configuration lets you choose which variant to use and customize parameters like growth rate and compression factor.
Constructors
DenseNetConfiguration(DenseNetVariant, int, int, int, int, int, double, int[]?)
Initializes a new instance of the DenseNetConfiguration class.
public DenseNetConfiguration(DenseNetVariant variant, int numClasses, int inputHeight = 224, int inputWidth = 224, int inputChannels = 3, int growthRate = 32, double compressionFactor = 0.5, int[]? customBlockLayers = null)
Parameters
- variant (DenseNetVariant): The DenseNet variant to use.
- numClasses (int): The number of output classes for classification.
- inputHeight (int): The height of input images (default: 224).
- inputWidth (int): The width of input images (default: 224).
- inputChannels (int): The number of input channels (default: 3 for RGB).
- growthRate (int): The growth rate (default: 32).
- compressionFactor (double): The compression factor for transition layers (default: 0.5).
- customBlockLayers (int[]?): Custom block layers (required when variant is Custom).
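For example, a configuration for a custom variant on small images might look like this (a minimal sketch; DenseNetVariant.Custom follows the parameter notes above, and all argument values are illustrative):

var config = new DenseNetConfiguration(
    variant: DenseNetVariant.Custom,           // Custom requires customBlockLayers
    numClasses: 10,
    inputHeight: 32,
    inputWidth: 32,
    inputChannels: 3,
    growthRate: 12,
    compressionFactor: 0.5,
    customBlockLayers: new[] { 4, 4, 4, 4 });  // four dense blocks of four layers each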
Properties
CompressionFactor
Gets the compression factor for transition layers.
public double CompressionFactor { get; }
Property Value
- double
Remarks
For Beginners: Compression factor (theta) controls channel reduction at transition layers. A value of 0.5 means halving the channels at each transition, which helps control model size.
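As a worked example: with the default factor of 0.5, a dense block ending in 256 feature maps is reduced to 256 * 0.5 = 128 feature maps by the following transition layer (halving at every transition is the DenseNet-BC convention from the original paper; the exact rounding this library applies to non-even products is not specified here).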
CustomBlockLayers
Gets the custom block layers configuration (only used when Variant is Custom).
public int[]? CustomBlockLayers { get; }
Property Value
- int[]
GrowthRate
Gets the growth rate (k in the paper).
public int GrowthRate { get; }
Property Value
- int
Remarks
For Beginners: The growth rate determines how many new feature maps each layer adds. Typical values are 12, 24, or 32. Higher values increase capacity but also computational cost.
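As a worked example: if a block's input has k0 feature maps and the growth rate is k, layer l inside the block receives k0 + k * (l - 1) feature maps, so with k0 = 64 and the default k = 32 the sixth layer sees 64 + 32 * 5 = 224 inputs (this follows the original paper's formulation; the stem width of 64 is an illustrative assumption).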
InputChannels
Gets the number of input channels.
public int InputChannels { get; }
Property Value
- int
InputHeight
Gets the height of input images in pixels.
public int InputHeight { get; }
Property Value
- int
InputShape
Gets the computed input shape as [channels, height, width].
public int[] InputShape { get; }
Property Value
- int[]
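For example (a sketch assuming CreateDenseNet121 keeps the constructor defaults of 224x224 RGB input):

var config = DenseNetConfiguration.CreateDenseNet121(numClasses: 1000);
int[] shape = config.InputShape;  // [3, 224, 224]: channels, then height, then width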
InputWidth
Gets the width of input images in pixels.
public int InputWidth { get; }
Property Value
- int
NumClasses
Gets the number of output classes for classification.
public int NumClasses { get; }
Property Value
- int
Variant
Gets the DenseNet variant to use.
public DenseNetVariant Variant { get; }
Property Value
- DenseNetVariant
Methods
CreateDenseNet121(int)
Creates a DenseNet-121 configuration (recommended default).
public static DenseNetConfiguration CreateDenseNet121(int numClasses)
Parameters
- numClasses (int): The number of output classes for classification.
Returns
- DenseNetConfiguration
CreateForTesting(int)
Creates a minimal DenseNet configuration optimized for fast test execution.
public static DenseNetConfiguration CreateForTesting(int numClasses)
Parameters
- numClasses (int): The number of output classes.
Returns
- DenseNetConfiguration
A minimal DenseNet configuration for testing.
Remarks
Uses a [2, 2, 2, 2] block configuration with a small growth rate (8) and 32x32 input, yielding 8 dense layers instead of the 58 in DenseNet-121. Construction time is typically under 50 ms.
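A typical test-side usage (a sketch; the commented values follow the remarks above):

var config = DenseNetConfiguration.CreateForTesting(numClasses: 10);
int expected = config.GetExpectedLayerCount();  // cheap to compute; no network is built
int[] blocks = config.GetBlockLayers();         // [2, 2, 2, 2] per the remarks above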
GetBlockLayers()
Gets the number of layers per dense block for this variant.
public int[] GetBlockLayers()
Returns
- int[]
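For the standard variants, the per-block layer counts in the original paper are [6, 12, 24, 16] for DenseNet-121, [6, 12, 32, 32] for DenseNet-169, and [6, 12, 48, 32] for DenseNet-201; which of these this library's DenseNetVariant enum exposes (beyond Custom and the DenseNet-121 factory shown above) is an assumption. For example:

var config = DenseNetConfiguration.CreateDenseNet121(numClasses: 1000);
int[] blocks = config.GetBlockLayers();  // [6, 12, 24, 16] per the original paper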
GetExpectedLayerCount()
Gets the expected total layer count for this configuration without constructing the network.
public int GetExpectedLayerCount()
Returns
- int
The expected number of layers in the network.
Remarks
This is useful for tests that need to compare layer counts without the overhead of actually constructing the networks. Formula: 1 (stem conv) + 1 (stem BN) + 1 (stem pool) + 2 * sum(block_layers) (BN + Conv in each dense layer) + 2 * (num_blocks - 1) (Conv + Pool in each transition layer) + 1 (final BN) + 1 (classifier).
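Applying the formula by hand for DenseNet-121 (a sketch; the [6, 12, 24, 16] block counts come from the original paper):

using System.Linq;

int[] blocks = { 6, 12, 24, 16 };                        // DenseNet-121 dense blocks
int stem = 3;                                            // stem conv + stem BN + stem pool
int dense = 2 * blocks.Sum();                            // BN + Conv per dense layer = 116
int transitions = 2 * (blocks.Length - 1);               // Conv + Pool per transition = 6
int tail = 2;                                            // final BN + classifier
int expectedLayers = stem + dense + transitions + tail;  // 127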
This is useful for tests that need to compare layer counts without the overhead of actually constructing the networks. Formula: 1 (stem conv) + 1 (stem BN) + 1 (stem pool) + sum(block_layers * 2) for BN+Conv in each dense layer + (num_blocks - 1) * 2 for transition layers (Conv + Pool) + 1 (final BN) + 1 (classifier)