Class PoolingOptions

Namespace
AiDotNet.Memory
Assembly
AiDotNet.dll

Configuration options for the tensor pool, which manages memory reuse during neural network operations. The tensor pool helps reduce memory allocations and garbage collection pressure by reusing tensor buffers.

public class PoolingOptions
Inheritance
object → PoolingOptions

Remarks

Tensor pooling is especially beneficial for:

  • Inference operations with consistent input sizes
  • Training loops where tensor shapes are predictable
  • High-throughput scenarios where allocation overhead matters

Example usage:

var options = new PoolingOptions
{
    MaxPoolSizeMB = 512,        // Allow up to 512 MB of pooled tensors
    MaxItemsPerBucket = 20,     // Keep up to 20 tensors per shape bucket
    Enabled = true              // Enable pooling
};
var pool = new TensorPool<float>(options);

Properties

Enabled

Gets or sets a value indicating whether tensor pooling is enabled. When disabled, all tensor operations will allocate new memory instead of reusing pooled buffers.

public bool Enabled { get; set; }

Property Value

bool

true to enable pooling (default); false to disable pooling.

Remarks

Disabling pooling can be useful for:

  • Debugging memory issues
  • Profiling actual memory usage patterns
  • Scenarios where tensor shapes are highly unpredictable
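For example, a minimal sketch (using the TensorPool<float> constructor shown in the class-level example) that turns pooling off while diagnosing memory behavior:

```csharp
// Disable pooling so every tensor operation allocates fresh memory,
// making real allocation patterns visible to a memory profiler.
var debugOptions = new PoolingOptions
{
    Enabled = false
};
var pool = new TensorPool<float>(debugOptions);
```

Re-enable pooling once profiling is complete, since allocation overhead returns as soon as pooling is off.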

MaxElementsToPool

Gets or sets the maximum number of elements a single tensor can have to be eligible for pooling. Tensors larger than this will not be pooled and will be allocated/deallocated normally.

public int MaxElementsToPool { get; set; }

Property Value

int

The maximum element count for poolable tensors. Default is 10,000,000 elements.

Remarks

Very large tensors are typically not good candidates for pooling because:

  • They consume significant memory in the pool
  • They are less likely to be reused, since their large, specific shapes recur less often
  • Allocation overhead is proportionally smaller for large buffers
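As a rough guide (assuming 4-byte float elements), the default cap of 10,000,000 elements corresponds to about 40 MB per tensor. A sketch that lowers the cap so only smaller buffers stay in the pool:

```csharp
// Pool only tensors up to 1,000,000 elements (~4 MB for float32);
// anything larger is allocated and garbage collected normally.
var options = new PoolingOptions
{
    MaxElementsToPool = 1_000_000
};
```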

MaxItemsPerBucket

Gets or sets the maximum number of tensors to keep in each shape bucket. Tensors are grouped by shape, and this limits how many of each shape are retained.

public int MaxItemsPerBucket { get; set; }

Property Value

int

The maximum tensors per bucket. Default is 10.

Remarks

Higher values allow more tensor reuse but consume more memory. Lower values reduce memory usage but may increase allocations.

For batch processing with consistent batch sizes, a value of 5-20 is typically optimal. For varied batch sizes or dynamic shapes, consider a higher value, such as 20-50.
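For instance, a sketch for a batch-inference service with a fixed batch size, following the 5-20 guidance above:

```csharp
// Fixed batch size: a modest per-bucket cap is enough,
// since the same tensor shapes recur on every request.
var options = new PoolingOptions
{
    MaxItemsPerBucket = 16
};
```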

MaxPoolSizeBytes

Gets or sets the maximum memory size of the tensor pool in bytes. When this limit is reached, new tensors will not be pooled and will be garbage collected normally.

public long MaxPoolSizeBytes { get; set; }

Property Value

long

The maximum pool size in bytes. Default is 256 MB (268,435,456 bytes).

Remarks

Consider your application's memory constraints when setting this value. For memory-constrained environments, use a smaller value. For high-performance inference, consider increasing this value.
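A sketch for a memory-constrained environment, capping the pool well below the 256 MB default:

```csharp
// Cap the pool at 64 MB; once reached, new tensors bypass the pool
// and are garbage collected normally.
var options = new PoolingOptions
{
    MaxPoolSizeBytes = 64L * 1024 * 1024  // 67,108,864 bytes
};
```

For high-throughput inference on machines with ample memory, the equivalent MaxPoolSizeMB convenience property (documented below) can be raised instead.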

MaxPoolSizeMB

Gets or sets the maximum memory size of the tensor pool in megabytes. This is a convenience property that converts to/from MaxPoolSizeBytes.

public int MaxPoolSizeMB { get; set; }

Property Value

int

The maximum pool size in MB. Default is 256 MB.

Examples

// Set pool size to 512 MB
options.MaxPoolSizeMB = 512;

UseWeakReferences

Gets or sets a value indicating whether to use weak references for pooled tensors. When enabled, pooled tensors can be garbage collected under memory pressure.

public bool UseWeakReferences { get; set; }

Property Value

bool

true to use weak references; false to use strong references (default).

Remarks

Weak references allow the garbage collector to reclaim pooled tensors if memory is low, which can help prevent OutOfMemoryException in memory-constrained scenarios.

However, weak references may reduce pooling effectiveness since tensors can be unexpectedly collected. For best performance with sufficient memory, keep this disabled.
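A sketch for a memory-constrained service that trades some pooling effectiveness for resilience under memory pressure:

```csharp
// Hold pooled tensors via weak references so the GC can reclaim them
// when memory runs low, rather than pinning them with strong references.
var options = new PoolingOptions
{
    UseWeakReferences = true,
    MaxPoolSizeMB = 128
};
```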