Table of Contents

Namespace AiDotNet.JitCompiler.IR.Operations

Classes

AbsOp

Represents element-wise absolute value in the IR.

AddOp

Represents element-wise addition in the IR.

AffineGridOp

Represents affine grid generation for spatial transformer networks in the IR.

ApplyActivationOp

Represents application of an activation function to a tensor in the IR.

AttentionOp

Represents a simplified attention operation for GPU code generation.

AvgPool2DOp

Represents 2D average pooling in the IR.

BackwardOp

Base class for backward (gradient) operations in the IR.

BatchNormOp

Represents batch normalization in the IR.

BentIdentityOp

Represents Bent Identity activation in the IR.

CELUOp

Represents CELU (Continuously Differentiable ELU) activation in the IR.

ComplexMatMulOp

Represents complex matrix multiplication in the IR.

ComplexMultiplyOp

Represents element-wise complex multiplication in the IR.

ConcatOp

Represents concatenation along an axis in the IR.

ConstantOp

Represents a constant tensor in the IR (result of constant folding).

Conv2DOp

Represents 2D convolution in the IR.

ConvTranspose2DOp

Represents transposed 2D convolution in the IR.

CropOp

Represents cropping operation in the IR.

DepthwiseConv2DOp

Represents depthwise 2D convolution in the IR.

DilatedConv2DOp

Represents dilated 2D convolution in the IR.

DivideOp

Represents element-wise division in the IR.

DropoutOp

Represents dropout operation in the IR.

ELUOp

Represents ELU (Exponential Linear Unit) activation in the IR.

ElementwiseMultiplyOp

Represents element-wise multiplication in the IR.

EmbeddingOp

Represents embedding lookup operation in the IR.

ExpOp

Represents element-wise exponential in the IR.

FakeQuantizationOp

Represents a fake quantization operation with Straight-Through Estimator (STE).

FusedAddLayerNormOp

Fused add + layer normalization operation.

FusedAddReLUOp

Represents fused Add + ReLU operation in the IR.

FusedAttentionOp

Fused attention operation (Q*K^T + softmax + matmul V); a step-by-step sketch follows the class list.

FusedBatchNormActivationOp

Fused batch normalization + activation operation.

FusedBiasActivationOp

Fused bias + activation operation.

FusedConvBatchNormActivationOp

Fused Conv2D + BatchNorm + Activation operation.

FusedConvBatchNormOp

Fused convolution + batch normalization operation.

FusedDenseLayerOp

Fused matrix multiply + add + activation (full dense layer).

FusedElementwiseActivationOp

Fused element-wise operation with activation.

FusedElementwiseChainOp

Fused chain of element-wise operations.

FusedGELUOp

Fused GELU activation operation.

FusedLayerNormAddOp

Fused layer normalization + add operation.

FusedLinearActivationOp

Fused linear + activation operation.

FusedLinearOp

Fused linear operation (MatMul + Add bias).

FusedLinearReLUOp

Represents fused Linear + ReLU operation in the IR.

FusedMatMulAddOp

Represents fused MatMul + Add operation in the IR.

FusedMultiHeadAttentionOp

Fused multi-head attention operation.

FusedResidualBlockOp

Fused residual block operation.

FusedSwishOp

Fused Swish/SiLU activation (x * sigmoid(x)); a fusion sketch follows the class list.

GELUOp

Represents GELU (Gaussian Error Linear Unit) activation in the IR.

GRUCellOp

Represents a GRU (Gated Recurrent Unit) cell operation in the IR.

GaussianOp

Represents Gaussian activation in the IR.

GeometricProductOp

Represents the geometric product of two multivectors in the IR.

GradAccumulateOp

Gradient accumulation operation; sums gradients from multiple paths.

GradAddOp

Backward operation for AddOp.

GradAttentionOp

Backward operation for attention (Q*K^T + softmax + matmul V).

GradAvgPool2DOp

Backward operation for AvgPool2DOp.

GradBatchNormOp

Backward operation for BatchNormOp.

GradBentIdentityOp

Backward operation for BentIdentityOp.

GradBroadcastOp

Backward operation for BroadcastOp.

GradCELUOp

Backward operation for CELUOp.

GradConcatOp

Backward operation for ConcatOp.

GradConv2DOp

Backward operation for Conv2DOp.

GradConvTranspose2DOp

Backward operation for ConvTranspose2DOp.

GradCropOp

Backward operation for CropOp.

GradDepthwiseConv2DOp

Backward operation for DepthwiseConv2DOp.

GradDivideOp

Backward operation for DivideOp.

GradDropoutOp

Backward operation for DropoutOp.

GradELUOp

Backward operation for ELUOp.

GradElementwiseMultiplyOp

Backward operation for ElementwiseMultiplyOp.

GradEmbeddingOp

Backward operation for EmbeddingOp.

GradExpOp

Backward operation for ExpOp.

GradGELUOp

Backward operation for GELUOp.

GradGRUCellOp

Backward operation for GRUCellOp.

GradGRUSequenceOp

Backward operation for full GRU sequence.

GradGatherOp

Backward operation for GatherOp.

GradGaussianOp

Backward operation for GaussianOp.

GradGeometricProductOp

Represents the gradient of the geometric product in the IR.

GradHardSigmoidOp

Backward operation for HardSigmoidOp.

GradHardTanhOp

Backward operation for HardTanhOp.

GradISRUOp

Backward operation for ISRUOp.

GradLSTMCellInputOp

Backward operation for LSTMCellOp; computes the gradient with respect to the input.

GradLSTMSequenceOp

Backward operation for full LSTM sequence.

GradLayerNormOp

Backward operation for LayerNormOp.

GradLeakyReLUOp

Backward operation for LeakyReLUOp.

GradLiSHTOp

Backward operation for LiSHTOp.

GradLogOp

Backward operation for LogOp.

GradLogSoftmaxOp

Backward operation for LogSoftmaxOp.

GradMatMulLeftOp

Backward operation for MatMulOp (left input).

GradMatMulRightOp

Backward operation for MatMulOp (right input).

GradMaxPool2DOp

Backward operation for MaxPool2DOp.

GradMeanOp

Backward operation for MeanOp.

GradMishOp

Backward operation for MishOp.

GradMobiusAddOp

Represents the gradient of Möbius addition in the IR.

GradMultiHeadAttentionOp

Backward operation for multi-head attention.

GradOctonionMultiplyOp

Represents the gradient of octonion multiplication in the IR.

GradPReLUOp

Backward operation for PReLUOp.

GradPadOp

Backward operation for PadOp.

GradPoincareExpMapOp

Represents the gradient of the Poincaré exponential map in the IR.

GradPowerOp

Backward operation for PowerOp.

GradRReLUOp

Backward operation for RReLUOp.

GradReLUOp

Backward operation for ReLUOp.

GradReshapeOp

Backward operation for ReshapeOp.

GradSELUOp

Backward operation for SELUOp.

GradScaledTanhOp

Backward operation for ScaledTanhOp.

GradSigmoidOp

Backward operation for SigmoidOp.

GradSliceOp

Backward operation for SliceOp.

GradSoftPlusOp

Backward operation for SoftPlusOp.

GradSoftSignOp

Backward operation for SoftSignOp.

GradSoftmaxOp

Backward operation for SoftmaxOp.

GradSpMMOp

Represents the gradient of sparse matrix-matrix multiplication in the IR.

GradSpMVOp

Represents the gradient of sparse matrix-vector multiplication in the IR.

GradSparsemaxOp

Backward operation for SparsemaxOp.

GradSplitOp

Backward operation for SplitOp.

GradSqrtOp

Backward operation for SqrtOp.

GradSubtractOp

Backward operation for SubtractOp.

GradSumOp

Backward operation for SumOp.

GradSwishOp

Backward operation for SwishOp.

GradTanhOp

Backward operation for TanhOp.

GradThresholdedReLUOp

Backward operation for ThresholdedReLUOp.

GradTransposeOp

Backward operation for TransposeOp.

GradUpsampleOp

Backward operation for UpsampleOp.

GraphConvOp

Represents graph convolution in the IR.

GridSampleOp

Represents grid sampling for spatial transformer networks in the IR.

HardSigmoidOp

Represents Hard Sigmoid activation in the IR.

HardTanhOp

Represents Hard Tanh activation in the IR.

HierarchicalSoftmaxOp

Represents Hierarchical Softmax activation in the IR.

ISRUOp

Represents ISRU (Inverse Square Root Unit) activation in the IR.

LSTMCellOp

Represents an LSTM (Long Short-Term Memory) cell operation in the IR.

LayerNormOp

Represents layer normalization in the IR.

LeakyReLUOp

Represents Leaky ReLU activation in the IR.

LiSHTOp

Represents LiSHT (Linearly Scaled Hyperbolic Tangent) activation in the IR.

LocallyConnectedConv2DOp

Represents locally connected 2D convolution in the IR.

LogOp

Represents element-wise logarithm in the IR.

LogSoftmaxOp

Represents LogSoftmax activation in the IR.

LogSoftminOp

Represents Log Softmin activation in the IR.

MatMulOp

Represents matrix multiplication in the IR.

MaxPool2DOp

Represents 2D max pooling in the IR.

MaxoutOp

Represents Maxout activation in the IR.

MeanOp

Represents mean reduction in the IR.

MishOp

Represents Mish activation in the IR.

MobiusAddOp

Represents Möbius addition in the Poincaré ball model in the IR.

MultiHeadAttentionOp

Represents multi-head attention in the IR.

NegateOp

Represents element-wise negation in the IR.

NormOp

Represents L2 norm operation in the IR.

OctonionMatMulOp

Represents octonion matrix multiplication in the IR.

OctonionMultiplyOp

Represents octonion multiplication in the IR.

PReLUOp

Represents PReLU (Parametric ReLU) activation in the IR.

PadOp

Represents padding operation in the IR.

PixelShuffleOp

Represents pixel shuffle (depth-to-space) operation in the IR.

PoincareExpMapOp

Represents the exponential map in the Poincaré ball model in the IR.

PoincareLogMapOp

Represents the logarithmic map in the Poincaré ball model in the IR.

PowerOp

Represents element-wise power operation in the IR.

RBFKernelOp

Represents RBF (Radial Basis Function) kernel computation in the IR.

RReLUOp

Represents RReLU (Randomized Leaky ReLU) activation in the IR.

ReLUOp

Represents ReLU (Rectified Linear Unit) activation in the IR.

ReduceLogVarianceOp

Represents log variance reduction in the IR.

ReduceMaxOp

Represents max reduction in the IR.

ReduceMeanOp

Represents mean reduction in the IR.

ReshapeOp

Represents reshape operation in the IR.

SELUOp

Represents SELU (Scaled Exponential Linear Unit) activation in the IR.

SQRBFOp

Represents SQRBF (Squared Radial Basis Function) activation in the IR.

ScalarConstantOp

Represents a scalar constant in the IR (single value).

ScaledDotProductAttentionOp

Represents scaled dot-product attention in the IR.

ScaledTanhOp

Represents Scaled Tanh activation in the IR.

SigmoidOp

Represents Sigmoid activation in the IR.

SignOp

Represents Sign activation in the IR.

SliceOp

Represents slice operation in the IR.

SoftKNNOp

Represents a soft K-Nearest Neighbors operation for differentiable instance-based learning.

SoftLocallyWeightedOp

Represents a soft locally-weighted regression operation for differentiable instance-based learning.

SoftPlusOp

Represents SoftPlus activation in the IR.

SoftSignOp

Represents SoftSign activation in the IR.

SoftSplitOp

Represents a soft split operation for differentiable decision trees in the IR.

SoftmaxOp

Represents Softmax activation in the IR.

SoftminOp

Represents Softmin activation in the IR.

SpMMOp

Represents sparse matrix-matrix multiplication in the IR.

SpMVOp

Represents sparse matrix-vector multiplication in the IR.

SparsemaxOp

Represents Sparsemax activation in the IR.

SphericalSoftmaxOp

Represents Spherical Softmax activation in the IR.

SplitOp

Represents split operation in the IR.

SqrtOp

Represents element-wise square root in the IR.

SquareOp

Represents square operation in the IR.

SquashOp

Represents Squash activation in the IR (for Capsule Networks).

SubtractOp

Represents element-wise subtraction in the IR.

SumOp

Represents sum reduction in the IR.

SwishOp

Represents Swish/SiLU activation in the IR.

TanhOp

Represents Tanh activation in the IR.

TaylorSoftmaxOp

Represents Taylor Softmax activation in the IR.

ThresholdedReLUOp

Represents Thresholded ReLU activation in the IR.

TransposeOp

Represents transpose operation in the IR.

UnrolledElementwiseOp

Represents an unrolled element-wise operation.

UnrolledReductionOp

Represents an unrolled reduction operation.

UnrolledSequenceOp

Represents an unrolled sequence of operations.

UpsampleOp

Represents upsampling operation in the IR.

VectorizedBinaryOp

Vectorized binary operation (Add, Subtract, Multiply, Divide).

VectorizedMatMulOp

Vectorized matrix multiplication operation.

VectorizedReductionOp

Vectorized reduction operation (Sum, Mean, Max).

VectorizedUnaryOp

Vectorized unary operation (Negate, Exp, Log, Sqrt, ReLU, etc.).

WedgeProductOp

Represents the wedge (outer) product of two multivectors in the IR.
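
The fused entries above exist so the JIT backend can emit one kernel where the unfused graph would launch several. The step-by-step sketch referenced in the FusedAttentionOp entry follows: a minimal, self-contained C# version of the three unfused stages (Q*K^T, softmax, matmul V). Every type and method in it is illustrative scaffolding, not AiDotNet API.

using System;

static class AttentionSketch
{
    // Unfused scaled dot-product attention, step by step:
    //   scores  = Q * K^T / sqrt(d)   (one matmul kernel)
    //   weights = softmax(scores)     (one softmax kernel)
    //   output  = weights * V         (one matmul kernel)
    // FusedAttentionOp stands for the single kernel that replaces all three.
    static float[,] Attention(float[,] q, float[,] k, float[,] v)
    {
        int n = q.GetLength(0), d = q.GetLength(1);
        var scores = new float[n, n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
            {
                float dot = 0f;
                for (int t = 0; t < d; t++) dot += q[i, t] * k[j, t];  // Q * K^T
                scores[i, j] = dot / MathF.Sqrt(d);
            }

        for (int i = 0; i < n; i++)  // numerically stable row-wise softmax
        {
            float max = float.NegativeInfinity;
            for (int j = 0; j < n; j++) max = MathF.Max(max, scores[i, j]);
            float sum = 0f;
            for (int j = 0; j < n; j++) { scores[i, j] = MathF.Exp(scores[i, j] - max); sum += scores[i, j]; }
            for (int j = 0; j < n; j++) scores[i, j] /= sum;
        }

        var output = new float[n, d];  // weights * V
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int t = 0; t < d; t++)
                    output[i, t] += scores[i, j] * v[j, t];
        return output;
    }

    static void Main()
    {
        var x = new float[,] { { 1f, 0f }, { 0f, 1f } };
        var y = Attention(x, x, x);
        Console.WriteLine($"{y[0, 0]:F3}, {y[0, 1]:F3}");
    }
}

A fused kernel computes the same output while keeping the scores matrix in registers or shared memory instead of materializing both intermediates.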
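FusedSwishOp is the simplest fusion in the list: Swish is x * sigmoid(x), so fusing removes one temporary buffer and one full pass over memory. The sketch referenced in that entry follows, under the same caveat that the helpers are illustrative rather than library API.

using System;

static class SwishFusionSketch
{
    // Unfused: SigmoidOp then ElementwiseMultiplyOp -- two passes over
    // memory plus a temporary buffer for the sigmoid result.
    static float[] SwishUnfused(float[] x)
    {
        var s = new float[x.Length];
        for (int i = 0; i < x.Length; i++)
            s[i] = 1f / (1f + MathF.Exp(-x[i]));
        var y = new float[x.Length];
        for (int i = 0; i < x.Length; i++)
            y[i] = x[i] * s[i];
        return y;
    }

    // Fused: the single pass a FusedSwishOp node lets the backend emit.
    static float[] SwishFused(float[] x)
    {
        var y = new float[x.Length];
        for (int i = 0; i < x.Length; i++)
            y[i] = x[i] / (1f + MathF.Exp(-x[i]));  // x * sigmoid(x)
        return y;
    }

    static void Main()
    {
        var x = new[] { -2f, 0f, 2f };
        Console.WriteLine(string.Join(", ", SwishFused(x)));    // -0.238, 0, 1.762
        Console.WriteLine(string.Join(", ", SwishUnfused(x)));  // identical
    }
}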

Enums

VectorizedBinaryOpType

Types of vectorized binary operations; a dispatch sketch follows this list.

VectorizedReductionType

Types of vectorized reduction operations.

VectorizedUnaryOpType

Types of vectorized unary operations.
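
The three enums suggest that each Vectorized*Op node carries an op-type tag rather than one subclass per operation, so a single SIMD loop can serve every tag. The dispatch sketch referenced in the VectorizedBinaryOpType entry follows, using System.Numerics; the enum below is an illustrative stand-in, since this index only names Add, Subtract, Multiply, and Divide for the binary case.

using System;
using System.Numerics;

// Illustrative stand-in for VectorizedBinaryOpType.
enum BinaryOpType { Add, Subtract, Multiply, Divide }

static class VectorizedDispatchSketch
{
    // One SIMD loop parameterized by the op tag -- the shape of loop a
    // single VectorizedBinaryOp node can cover for all four arithmetic ops.
    static void Apply(BinaryOpType op, float[] a, float[] b, float[] result)
    {
        int i = 0, width = Vector<float>.Count;
        for (; i <= a.Length - width; i += width)
        {
            var va = new Vector<float>(a, i);
            var vb = new Vector<float>(b, i);
            Vector<float> vr = op switch
            {
                BinaryOpType.Add      => va + vb,
                BinaryOpType.Subtract => va - vb,
                BinaryOpType.Multiply => va * vb,
                _                     => va / vb,
            };
            vr.CopyTo(result, i);
        }
        for (; i < a.Length; i++)  // scalar tail for leftover elements
            result[i] = op switch
            {
                BinaryOpType.Add      => a[i] + b[i],
                BinaryOpType.Subtract => a[i] - b[i],
                BinaryOpType.Multiply => a[i] * b[i],
                _                     => a[i] / b[i],
            };
    }

    static void Main()
    {
        var a = new float[] { 1, 2, 3, 4, 5, 6, 7, 8, 9 };
        var b = new float[] { 9, 8, 7, 6, 5, 4, 3, 2, 1 };
        var r = new float[a.Length];
        Apply(BinaryOpType.Add, a, b, r);
        Console.WriteLine(string.Join(", ", r));  // 10 for every element
    }
}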