Class GradBatchNormOp

Namespace
AiDotNet.JitCompiler.IR.Operations
Assembly
AiDotNet.dll

Backward operation for BatchNormOp.

public class GradBatchNormOp : BackwardOp
Inheritance
BackwardOp → GradBatchNormOp

Remarks

Batch normalization has complex gradients because its output depends on the batch mean and variance. This operation computes the gradients with respect to the input, the scale (gamma), and the bias (beta) parameters.
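To make the gradient structure concrete, here is a minimal stand-alone sketch (not the GradBatchNormOp implementation) that applies the standard batch-normalization backward formulas for a single feature channel. The BatchNormBackwardSketch class and its Backward method are hypothetical names used only for illustration.

using System;
using System.Linq;

// Minimal sketch (not the actual GradBatchNormOp code): standard batch-norm
// backward pass for one feature channel, given the upstream gradient gradOutput.
// Forward: xHat = (x - mean) / sqrt(var + eps), y = scale * xHat + bias.
static class BatchNormBackwardSketch
{
    public static (double[] dInput, double dScale, double dBias) Backward(
        double[] input, double[] gradOutput, double scale, double epsilon)
    {
        int n = input.Length;
        double mean = input.Average();
        double variance = input.Select(x => (x - mean) * (x - mean)).Average();
        double invStd = 1.0 / Math.Sqrt(variance + epsilon);

        // Normalized input from the forward pass.
        double[] xHat = input.Select(x => (x - mean) * invStd).ToArray();

        // dBias = sum(dy); dScale = sum(dy * xHat).
        double dBias = gradOutput.Sum();
        double dScale = gradOutput.Zip(xHat, (dy, xh) => dy * xh).Sum();

        // dxHat = dy * scale, then the usual input gradient:
        // dx = invStd / N * (N * dxHat - sum(dxHat) - xHat * sum(dxHat * xHat)).
        double[] dxHat = gradOutput.Select(dy => dy * scale).ToArray();
        double sumDxHat = dxHat.Sum();
        double sumDxHatXHat = dxHat.Zip(xHat, (d, xh) => d * xh).Sum();

        double[] dInput = new double[n];
        for (int i = 0; i < n; i++)
        {
            dInput[i] = invStd / n * (n * dxHat[i] - sumDxHat - xHat[i] * sumDxHatXHat);
        }

        return (dInput, dScale, dBias);
    }
}

Because the mean and variance are computed over the whole batch, the gradient of each input element depends on every other element in the batch, which is why this gradient is more involved than that of an element-wise operation.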

Properties

Epsilon

The small constant added to the batch variance for numerical stability; it should match the value used by the forward BatchNormOp.

public double Epsilon { get; set; }

Property Value

double

InputIndex

The index of the forward operation's input that this gradient corresponds to.

public int InputIndex { get; set; }

Property Value

int

Methods

ToString()

Returns a string representation of this operation for debugging.

public override string ToString()

Returns

string

A string describing this operation.

Remarks

The string format is: "tOutput = OpType(tInput1, tInput2, ...) : Type [Shape]"

For Beginners: This creates a readable description of the operation.

Example outputs:

  • "t2 = Add(t0, t1) : Float32 [3, 4]"
  • "t5 = MatMul(t3, t4) : Float32 [128, 256]"
  • "t8 = ReLU(t7) : Float32 [32, 128]"

This is very helpful for debugging: you can see exactly what each operation does and what tensor shapes flow through the graph.
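As an illustration only, the following sketch produces a string in the documented format; the OpFormatterSketch class and FormatOp helper are hypothetical and not part of AiDotNet.

using System.Linq;

// Hypothetical helper that builds the documented format:
// "tOutput = OpType(tInput1, tInput2, ...) : Type [Shape]".
static class OpFormatterSketch
{
    public static string FormatOp(int outputId, string opType, int[] inputIds, string dataType, int[] shape)
    {
        string inputs = string.Join(", ", inputIds.Select(id => $"t{id}"));
        string dims = string.Join(", ", shape);
        return $"t{outputId} = {opType}({inputs}) : {dataType} [{dims}]";
    }
}

// Example:
// OpFormatterSketch.FormatOp(2, "Add", new[] { 0, 1 }, "Float32", new[] { 3, 4 })
//   returns "t2 = Add(t0, t1) : Float32 [3, 4]"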

Validate()

Validates that this operation is correctly formed.

public override bool Validate()

Returns

bool

True if valid, false otherwise.

Remarks

Basic validation checks that the operation has the required information. Derived classes can override this method to add operation-specific validation.

For Beginners: This checks that the operation makes sense.

Basic checks:

  • Output ID is valid (non-negative)
  • Has the right number of inputs
  • Shapes are compatible

Specific operations add their own checks:

  • MatMul: inner dimensions must match
  • Conv2D: kernel size must be valid
  • Reshape: total elements must be preserved

If validation fails, the operation can't be compiled.
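To illustrate how a derived operation might add its own checks on top of the basic ones, here is a minimal sketch of a matrix-multiply validation; the MatMulOpSketch class and its property names are hypothetical and do not reflect the actual AiDotNet IR types.

using System;

// Hypothetical sketch: a matrix-multiply operation that layers its own check
// (inner dimensions must match) on top of a basic output-ID check.
class MatMulOpSketch
{
    public int OutputId { get; set; }
    public int[] LeftShape { get; set; } = Array.Empty<int>();
    public int[] RightShape { get; set; } = Array.Empty<int>();

    public bool Validate()
    {
        // Basic check: the output ID must be non-negative.
        if (OutputId < 0) return false;

        // Operation-specific check: both operands must be 2-D and the
        // inner dimensions must match (e.g. [m, k] x [k, n]).
        if (LeftShape.Length != 2 || RightShape.Length != 2) return false;
        return LeftShape[1] == RightShape[0];
    }
}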