Class LogSoftmaxOp

Namespace: AiDotNet.JitCompiler.IR.Operations
Assembly: AiDotNet.dll

Represents a LogSoftmax activation in the IR.

public class LogSoftmaxOp : IROp
Inheritance
object → IROp → LogSoftmaxOp

Remarks

Computes LogSoftmax(x) = log(softmax(x)). This formulation is more numerically stable than applying softmax and then taking the log, which matters when the result feeds a cross-entropy loss.
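
For reference, the stable way to evaluate log(softmax(x)) is to subtract the maximum before exponentiating: LogSoftmax(x)_i = (x_i - max(x)) - log(Σ_j exp(x_j - max(x))). The snippet below is a minimal, standalone C# illustration of that formula; it is not part of the AiDotNet API.

    using System;
    using System.Linq;

    class LogSoftmaxDemo
    {
        // Numerically stable log-softmax over a 1-D vector:
        // LogSoftmax(x)_i = (x_i - max(x)) - log(sum_j exp(x_j - max(x)))
        static double[] LogSoftmax(double[] x)
        {
            double max = x.Max();   // shift by the max so Exp never overflows
            double logSumExp = Math.Log(x.Sum(v => Math.Exp(v - max)));
            return x.Select(v => v - max - logSumExp).ToArray();
        }

        static void Main()
        {
            double[] logits = { 2.0, 1.0, 0.1 };
            Console.WriteLine(string.Join(", ",
                LogSoftmax(logits).Select(v => v.ToString("F4"))));
            // prints approximately -0.4170, -1.4170, -2.3170
        }
    }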

Properties

Axis

The axis along which to compute log softmax. Default is -1 (last axis).

public int Axis { get; set; }

Property Value

int
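
A minimal usage sketch follows. Only the Axis property is documented on this page; the parameterless constructor and object-initializer syntax are assumptions about how the op is created.

    // Assumes LogSoftmaxOp has an accessible parameterless constructor (not shown on this page).
    var op = new LogSoftmaxOp
    {
        Axis = -1   // compute log softmax along the last axis (the documented default)
    };
    Console.WriteLine(op);   // uses the ToString() override described below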

Methods

ToString()

Gets a string representation of this operation for debugging.

public override string ToString()

Returns

string

A string describing this operation.

Remarks

The string format is: "tOutput = OpType(tInput1, tInput2, ...) : Type [Shape]"

For Beginners: This creates a readable description of the operation.

Example outputs:

  • "t2 = Add(t0, t1) : Float32 [3, 4]"
  • "t5 = MatMul(t3, t4) : Float32 [128, 256]"
  • "t8 = ReLU(t7) : Float32 [32, 128]"

This is very helpful for debugging: you can see exactly what each operation does and what tensor shapes flow through the graph.

Validate()

Validates that this operation is correctly formed.

public override bool Validate()

Returns

bool

True if valid, false otherwise.

Remarks

Basic validation checks that the operation has the required information. Derived classes can override this method to add operation-specific validation.

For Beginners: This checks that the operation makes sense.

Basic checks:

  • Output ID is valid (non-negative)
  • Has the right number of inputs
  • Shapes are compatible

Specific operations add their own checks:

  • MatMul: inner dimensions must match
  • Conv2D: kernel size must be valid
  • Reshape: total elements must be preserved

If validation fails, the operation can't be compiled.
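
As a sketch of how this might fit into a compilation pipeline, the check below assumes you already hold an op instance (such as the LogSoftmaxOp above); the surrounding error handling is illustrative, not part of the library.

    // Illustrative only: reject a malformed op before trying to compile it.
    if (!op.Validate())
    {
        throw new InvalidOperationException($"Invalid IR operation: {op}");
    }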