Table of Contents

Namespace AiDotNet.Memory

Classes

InferenceContext<T>

Provides a scoped context for inference operations with automatic tensor pooling and lifecycle management. All tensors rented through this context are tracked and automatically returned to the pool when the context is disposed.
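
A minimal usage sketch, assuming InferenceContext<T> is disposable and exposes a Rent method taking a shape (the Rent signature shown is an assumption, not a documented member):

```csharp
using AiDotNet.Memory;

using (var context = new InferenceContext<float>())
{
    // Hypothetical Rent call: the returned tensor is tracked by the context.
    var activations = context.Rent(new[] { 1, 128 });

    // ... run inference against the rented buffer ...
} // Disposing the context returns every tracked tensor to the pool.
```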

InferenceScope<T>

Provides ambient (thread-local) context support for InferenceContext<T>. Allows code to access the current inference context without threading it through method parameters.
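
A sketch of the ambient-context pattern; Begin is documented under InferenceScopeHandle<T> below, while the Current accessor is an assumption based on the description above:

```csharp
using AiDotNet.Memory;

using (var context = new InferenceContext<float>())
using (InferenceScope<float>.Begin(context))
{
    // Deeply nested code can reach the context without it being passed down.
    var ambient = InferenceScope<float>.Current; // Current is an assumption.
}
// The returned handle restores the previous ambient context on dispose.
```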

PoolStatistics

Provides statistics about the current state of a tensor pool. Use this class to monitor pool usage and tune pooling parameters.
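
A hypothetical monitoring snippet; GetStatistics and the property name shown are illustrative guesses at the API shape, not documented members:

```csharp
using System;
using AiDotNet.Memory;

var pool = new TensorPool<float>();
PoolStatistics stats = pool.GetStatistics(); // GetStatistics is assumed.
Console.WriteLine($"Buffers pooled: {stats.BuffersInPool}"); // assumed property
```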

PooledMemoryOwner<T>

An IMemoryOwner<T> implementation that returns its memory to the pool when disposed.
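
Because it implements the standard System.Buffers.IMemoryOwner<T> interface, the usual consumption pattern applies; how instances are obtained from the pool depends on the pool's API and is not shown here:

```csharp
using System;
using System.Buffers;

// Works for any IMemoryOwner<float>, including PooledMemoryOwner<float>.
static void ClearAndRelease(IMemoryOwner<float> owner)
{
    using (owner) // Dispose returns the memory to its pool.
    {
        Span<float> span = owner.Memory.Span;
        span.Clear();
    }
}
```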

PooledTensor<T>

An RAII wrapper that automatically returns a pooled tensor to its pool when disposed. This class ensures tensors are properly returned to the pool even if an exception occurs.
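
A sketch of the RAII pattern; RentTensor is a hypothetical factory method used only to show how the wrapper would be consumed:

```csharp
using AiDotNet.Memory;

var pool = new TensorPool<float>();
using (PooledTensor<float> pooled = pool.RentTensor(new[] { 64, 64 })) // assumed method
{
    // Work with the tensor here; if an exception is thrown, Dispose still
    // runs and the underlying buffer goes back to the pool.
}
```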

PoolingOptions

Configuration options for the tensor pool, which manages memory reuse during neural network operations. The tensor pool helps reduce memory allocations and garbage collection pressure by reusing tensor buffers.
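
A hypothetical configuration sketch; the property names below are illustrative guesses at typical pooling knobs, not documented members:

```csharp
using AiDotNet.Memory;

var options = new PoolingOptions
{
    MaxBuffersPerBucket = 16,  // illustrative guess, not a documented member
    MaxBufferSize = 1 << 20    // illustrative guess, not a documented member
};
var pool = new TensorPool<float>(options); // constructor shape is assumed
```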

TensorPool<T>

A high-performance, thread-safe memory pool for reusing tensors during neural network operations. Reduces memory allocations and garbage collection pressure by pooling tensor buffers.
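
A minimal rent/return sketch; Rent and Return are assumed method names, inferred from the "rented" wording elsewhere on this page:

```csharp
using AiDotNet.Memory;

var pool = new TensorPool<float>();
var tensor = pool.Rent(new[] { 32, 32 }); // Rent is an assumed method name.
try
{
    // ... use the buffer for a forward pass ...
}
finally
{
    pool.Return(tensor); // Return is assumed; always return, even on exception.
}
```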

Structs

InferenceScopeHandle<T>

A disposable handle that restores the previous inference context when disposed. Returned by InferenceScope<T>.Begin(InferenceContext<T>) to enable proper scope nesting.
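
A nesting sketch showing how disposing each handle restores the context that was ambient when Begin was called; Current is an assumed accessor (see InferenceScope<T> above):

```csharp
using AiDotNet.Memory;

using var outer = new InferenceContext<float>();
using (InferenceScope<float>.Begin(outer))
{
    using var inner = new InferenceContext<float>();
    using (InferenceScope<float>.Begin(inner))
    {
        // The ambient context is `inner` here.
    }
    // Disposing the inner handle restores `outer` as the ambient context.
}
```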