Class SoftLocallyWeightedOp
- Namespace
- AiDotNet.JitCompiler.IR.Operations
- Assembly
- AiDotNet.dll
Represents a soft locally-weighted regression operation for differentiable instance-based learning.
public class SoftLocallyWeightedOp : IROp
- Inheritance
- object → IROp → SoftLocallyWeightedOp
- Inherited Members
Remarks
Implements differentiable locally-weighted regression using attention-based weighting. This enables gradient-based optimization and JIT compilation of LOESS/LOWESS-style models.
The operation computes:
distances[i] = ||input - X_train[i]||²
weights = softmax(-distances / bandwidth)
output = Σ weights[i] * y_train[i]
For Beginners: Locally-weighted regression makes predictions by computing a weighted average of nearby training examples, where "nearby" is determined by distance.
This soft version uses attention (softmax) to compute weights, making it fully differentiable:
- Points close to the query get high attention weights
- Points far from the query get low attention weights
- The bandwidth controls how quickly attention drops off with distance
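The following is a minimal, plain-C# sketch of that computation, independent of the JIT compiler and of this class's actual internals; PredictSoftLocallyWeighted, sampleX, and sampleY are illustrative names, and the xTrain/yTrain/bandwidth parameters stand in for X_train, y_train, and the Bandwidth property.

using System;
using System.Linq;

// Three one-dimensional training points and their targets (illustrative data).
var sampleX = new[] { new[] { 0.0 }, new[] { 1.0 }, new[] { 2.0 } };
var sampleY = new[] { 0.0, 1.0, 4.0 };

// Predict at query = 1.5 with a moderately local bandwidth.
Console.WriteLine(PredictSoftLocallyWeighted(new[] { 1.5 }, sampleX, sampleY, bandwidth: 0.5));

static double PredictSoftLocallyWeighted(double[] query, double[][] xTrain, double[] yTrain, double bandwidth)
{
    // distances[i] = ||query - xTrain[i]||^2
    var distances = xTrain
        .Select(x => x.Zip(query, (a, b) => (a - b) * (a - b)).Sum())
        .ToArray();

    // weights = softmax(-distances / bandwidth); subtract the max logit for numerical stability
    var logits = distances.Select(d => -d / bandwidth).ToArray();
    double maxLogit = logits.Max();
    var exps = logits.Select(l => Math.Exp(l - maxLogit)).ToArray();
    double sum = exps.Sum();

    // output = sum over i of weights[i] * yTrain[i]
    return exps.Zip(yTrain, (w, y) => (w / sum) * y).Sum();
}

Because every step (squared distance, softmax, weighted sum) is smooth, gradients can flow from the output back to the query, the training tensors, and the bandwidth, which is what makes the operation usable inside a compiled, differentiable graph.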
Properties
Bandwidth
Gets or sets the bandwidth parameter controlling the locality of the weighting. A smaller bandwidth makes the weighting more local (only nearby points matter); a larger bandwidth spreads influence across more of the training points.
public double Bandwidth { get; set; }
Property Value
- double
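To see the bandwidth's effect in numbers, here is a small stand-alone C# sketch (SoftmaxWeights is an illustrative helper, not part of this class's API) comparing the softmax weights produced by the same squared distances under a small and a large bandwidth.

using System;
using System.Linq;

// Squared distances from a query to three training points (illustrative values).
var distances = new[] { 0.1, 1.0, 4.0 };

// Small bandwidth: almost all weight goes to the closest point.
Console.WriteLine(string.Join(", ", SoftmaxWeights(distances, 0.2).Select(w => w.ToString("F3"))));

// Large bandwidth: the weights flatten toward uniform.
Console.WriteLine(string.Join(", ", SoftmaxWeights(distances, 50.0).Select(w => w.ToString("F3"))));

static double[] SoftmaxWeights(double[] sqDistances, double bandwidth)
{
    // weights = softmax(-sqDistances / bandwidth)
    var exps = sqDistances.Select(d => Math.Exp(-d / bandwidth)).ToArray();
    double sum = exps.Sum();
    return exps.Select(e => e / sum).ToArray();
}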
Methods
ToString()
Gets a string representation of this operation for debugging.
public override string ToString()
Returns
- string
A string describing this operation.
Remarks
The string format is: "tOutput = OpType(tInput1, tInput2, ...) : Type [Shape]"
For Beginners: This creates a readable description of the operation.
Example outputs:
- "t2 = Add(t0, t1) : Float32 [3, 4]"
- "t5 = MatMul(t3, t4) : Float32 [128, 256]"
- "t8 = ReLU(t7) : Float32 [32, 128]"
This is super helpful for debugging - you can see exactly what each operation does and what shape tensors flow through the graph.
Validate()
Validates that this operation is correctly formed.
public override bool Validate()
Returns
- bool
True if valid, false otherwise.
Remarks
Basic validation checks that the operation has required information. Derived classes can override to add operation-specific validation.
For Beginners: This checks that the operation makes sense.
Basic checks:
- Output ID is valid (non-negative)
- Has the right number of inputs
- Shapes are compatible
Specific operations add their own checks:
- MatMul: inner dimensions must match
- Conv2D: kernel size must be valid
- Reshape: total elements must be preserved
If validation fails, the operation can't be compiled.
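As a rough sketch of the override pattern described above: the types below are hypothetical stand-ins (IrOpSketch and SoftLocallyWeightedOpSketch are not AiDotNet types, and the real IROp members may differ); they only illustrate how an operation-specific Validate could layer an input-count check and a bandwidth check on top of the base checks.

using System;

// Hypothetical base class used to make the sketch self-contained.
public abstract class IrOpSketch
{
    public int OutputId { get; set; }
    public int[] InputIds { get; set; } = Array.Empty<int>();

    public virtual bool Validate()
    {
        // Basic checks: valid (non-negative) output id and at least one input.
        return OutputId >= 0 && InputIds.Length > 0;
    }
}

public class SoftLocallyWeightedOpSketch : IrOpSketch
{
    public double Bandwidth { get; set; } = 1.0;

    public override bool Validate()
    {
        // Operation-specific checks on top of the base ones (assumed here:
        // three inputs for query, X_train, and y_train, and a positive bandwidth).
        return base.Validate() && InputIds.Length == 3 && Bandwidth > 0;
    }
}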