Namespace AiDotNet.NeuralNetworks.Attention
Classes
- FlashAttentionConfig
Configuration options for the Flash Attention algorithm.
- FlashAttentionLayer<T>
A multi-head attention layer using the Flash Attention algorithm for memory-efficient computation.
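To illustrate what "memory-efficient computation" means here, below is a minimal conceptual sketch of the Flash Attention idea (Dao et al., 2022): attention is computed over key/value tiles with an online softmax, so the full N x N score matrix is never materialized. This is a single-head, CPU-side illustration only, not AiDotNet's actual implementation.

```csharp
using System;

public static class FlashAttentionSketch
{
    // Conceptual sketch, not the library's implementation: single-head attention
    // computed over key/value tiles with an online softmax, so the full N x N
    // score matrix is never materialized.
    public static double[][] TiledAttention(double[][] q, double[][] k, double[][] v, int tileSize)
    {
        int n = q.Length, d = q[0].Length;
        var output = new double[n][];

        for (int i = 0; i < n; i++)                           // one query row at a time
        {
            var acc = new double[d];                          // running weighted sum of V rows
            double runningMax = double.NegativeInfinity;      // running max of scores seen so far
            double runningSum = 0.0;                          // running softmax denominator

            for (int start = 0; start < n; start += tileSize) // stream over K/V tiles
            {
                int end = Math.Min(start + tileSize, n);
                for (int j = start; j < end; j++)
                {
                    double score = 0.0;
                    for (int t = 0; t < d; t++) score += q[i][t] * k[j][t];
                    score /= Math.Sqrt(d);

                    // Online softmax: rescale the accumulators whenever a new max appears,
                    // so the result matches the untiled softmax exactly.
                    double newMax = Math.Max(runningMax, score);
                    double rescale = Math.Exp(runningMax - newMax);
                    double weight = Math.Exp(score - newMax);

                    for (int t = 0; t < d; t++) acc[t] = acc[t] * rescale + weight * v[j][t];
                    runningSum = runningSum * rescale + weight;
                    runningMax = newMax;
                }
            }

            output[i] = new double[d];
            for (int t = 0; t < d; t++) output[i][t] = acc[t] / runningSum;
        }

        return output;
    }
}
```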
Enums
- FlashAttentionPrecision
Precision modes for Flash Attention computation.
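A minimal usage sketch tying the three types together. Only the type names FlashAttentionConfig, FlashAttentionLayer&lt;T&gt;, and FlashAttentionPrecision come from this namespace listing; every member name below (BlockSize, Precision, Float16, the constructor parameters, Forward, Tensor&lt;T&gt;) is a hypothetical placeholder and may not match the library's actual API.

```csharp
// Hypothetical usage sketch. Only the type names FlashAttentionConfig,
// FlashAttentionLayer<T>, and FlashAttentionPrecision are documented above;
// all member names below are assumptions made for illustration.
var config = new FlashAttentionConfig
{
    BlockSize = 64,                              // assumed: tile size for the tiled computation
    Precision = FlashAttentionPrecision.Float16  // assumed: one of the precision modes
};

var attention = new FlashAttentionLayer<float>(
    embeddingDimension: 512,  // assumed parameter names
    headCount: 8,
    config: config);

Tensor<float> output = attention.Forward(input); // assumed forward-pass method
```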