Class MetaTrainingResult<T>
Results from a complete meta-training run with history tracking.
public class MetaTrainingResult<T>
Type Parameters
T: The numeric type used for calculations (e.g., float, double, decimal).
- Inheritance: object → MetaTrainingResult<T>
Remarks
This class aggregates metrics across an entire meta-training session, tracking how performance evolves over many meta-iterations. It combines the functionality of what were previously separate "Metrics" and "Metadata" classes into a unified Result pattern consistent with the codebase.
For Beginners: Meta-training is the process of training your model to be good at learning new tasks quickly. This happens over many iterations:
- Sample a batch of tasks
- Adapt to each task (inner loop)
- Update meta-parameters based on how well adaptations worked (outer loop)
- Repeat for many iterations
This result tracks:
- Learning curves: How loss and accuracy change over iterations
- Final performance: The end results after training
- Training time: How long it took
- Convergence: Whether training successfully improved the model
Use this to:
- Monitor training progress
- Diagnose training issues
- Compare different meta-learning configurations
- Report results in papers or documentation
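A typical training run might produce and consume this result roughly as follows. This is a hedged sketch: `metaLearner`, `taskSampler`, and the `MetaTrain` method are illustrative placeholders, not part of this API; only the `MetaTrainingResult<T>` members documented below are real.

```csharp
// Hypothetical trainer call; only MetaTrainingResult<T> members are documented here.
MetaTrainingResult<double> result = metaLearner.MetaTrain(taskSampler, iterations: 5000);

// Monitor progress and report the outcome.
Console.WriteLine(result.GenerateReport());
Console.WriteLine($"Loss: {result.InitialLoss} -> {result.FinalLoss} " +
                  $"over {result.TotalIterations} iterations " +
                  $"in {result.TrainingTime.TotalMinutes:F1} min");
```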
Constructors
MetaTrainingResult(Vector<T>, Vector<T>, TimeSpan, Dictionary<string, T>?)
Initializes a new instance with complete training history.
public MetaTrainingResult(Vector<T> lossHistory, Vector<T> accuracyHistory, TimeSpan trainingTime, Dictionary<string, T>? additionalMetrics = null)
Parameters
- lossHistory (Vector<T>): Meta-loss values from each training iteration.
- accuracyHistory (Vector<T>): Accuracy values from each training iteration.
- trainingTime (TimeSpan): Total time taken for training.
- additionalMetrics (Dictionary<string, T>?): Optional algorithm-specific metrics. Defaults to null.
Remarks
This constructor follows the established pattern of accepting raw data and deriving computed properties from it. The history vectors should contain one value per training iteration, in chronological order.
For Beginners: Call this at the end of training to package all your training history together. The constructor automatically calculates derived metrics like FinalLoss, InitialLoss, etc.
Exceptions
- ArgumentNullException
Thrown when lossHistory or accuracyHistory is null.
- ArgumentException
Thrown when vectors have different lengths or are empty.
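Packaging history at the end of training might look like the sketch below. It assumes `Vector<T>` can be constructed from a `T[]`; that constructor shape is an assumption, not confirmed by this reference.

```csharp
// Collected during the training loop (one entry per meta-iteration).
var losses = new Vector<double>(new[] { 2.5, 1.6, 1.1, 0.8 });
var accuracies = new Vector<double>(new[] { 0.40, 0.55, 0.68, 0.74 });

var result = new MetaTrainingResult<double>(
    losses,
    accuracies,
    trainingTime: TimeSpan.FromMinutes(12),
    additionalMetrics: new Dictionary<string, double> { ["tasks_per_second"] = 41.7 });

// Derived metrics are computed by the constructor:
// result.InitialLoss is the first element (2.5),
// result.FinalLoss is the last element (0.8).
```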
Properties
AccuracyHistory
Gets the accuracy history across all iterations.
public Vector<T> AccuracyHistory { get; }
Property Value
- Vector<T>
A vector where each element is the average accuracy for that iteration. Higher values indicate better meta-learning performance.
AdditionalMetrics
Gets algorithm-specific metrics collected during training.
public Dictionary<string, T> AdditionalMetrics { get; }
Property Value
- Dictionary<string, T>
A dictionary of custom metrics with generic T values.
Remarks
For Production: Common additional metrics include:
- "best_loss": Lowest loss achieved during training
- "best_accuracy": Highest accuracy achieved during training
- "gradient_norm_avg": Average gradient magnitude
- "tasks_per_second": Training throughput
- "convergence_iteration": When loss stabilized
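Since the keys present depend on the algorithm, reading an optional metric defensively might look like this (the "best_loss" key follows the naming examples above and may not exist for every run):

```csharp
// TryGetValue avoids a KeyNotFoundException when the metric was not recorded.
if (result.AdditionalMetrics.TryGetValue("best_loss", out double bestLoss))
{
    Console.WriteLine($"Best loss during training: {bestLoss}");
}
```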
FinalAccuracy
Gets the final accuracy after training.
public T FinalAccuracy { get; }
Property Value
- T
The accuracy from the last iteration, representing final training performance.
FinalLoss
Gets the final meta-loss after training.
public T FinalLoss { get; }
Property Value
- T
The meta-loss from the last iteration, representing final training performance.
InitialAccuracy
Gets the initial accuracy before training.
public T InitialAccuracy { get; }
Property Value
- T
The accuracy from the first iteration, representing baseline performance.
InitialLoss
Gets the initial meta-loss before training.
public T InitialLoss { get; }
Property Value
- T
The meta-loss from the first iteration, representing baseline performance.
LossHistory
Gets the meta-loss history across all iterations.
public Vector<T> LossHistory { get; }
Property Value
- Vector<T>
A vector where each element is the meta-loss for that iteration. Lower values indicate better meta-learning performance.
Remarks
For Production: Use this for:
- Plotting learning curves
- Detecting convergence or divergence
- Implementing early stopping
- Comparing training runs
The meta-loss measures how well the model adapts across tasks in the outer loop. It should generally decrease over training.
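For plotting, the history can be exported to CSV. The sketch below assumes `Vector<T>` exposes a `Length` property and an int indexer; both are assumptions about this API.

```csharp
// Export the learning curve for plotting in an external tool.
// Assumes Vector<T> exposes Length and an int indexer (not confirmed here).
using var writer = new StreamWriter("loss_curve.csv");
writer.WriteLine("iteration,meta_loss");
for (int i = 0; i < result.LossHistory.Length; i++)
{
    writer.WriteLine($"{i},{result.LossHistory[i]}");
}
```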
TotalIterations
Gets the total number of meta-training iterations completed.
public int TotalIterations { get; }
Property Value
- int
The count of outer loop updates performed during training.
Remarks
For Beginners: Each iteration represents one complete cycle: sample tasks → adapt to each → update meta-parameters.
More iterations generally lead to better meta-learning, but with diminishing returns. Typical values: 10,000-60,000 for research, 1,000-10,000 for practice.
TrainingTime
Gets the total time taken for meta-training.
public TimeSpan TrainingTime { get; }
Property Value
- TimeSpan
The elapsed time from start to finish of training.
Methods
CalculateAccuracyImprovement()
Calculates the total improvement in accuracy from start to finish.
public T CalculateAccuracyImprovement()
Returns
- T
The difference between final and initial accuracy (positive means improvement).
CalculateLossImprovement()
Calculates the total improvement in loss from start to finish.
public T CalculateLossImprovement()
Returns
- T
The difference between initial and final loss (positive means improvement).
Remarks
For Beginners: This tells you how much the model improved during training.
- Positive value: Loss decreased (good!)
- Zero: No improvement (needs investigation)
- Negative value: Loss increased (training problem)
For example:
- Initial loss: 2.5
- Final loss: 0.8
- Improvement: 1.7 (68% reduction)
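The percentage reduction in the example above can be computed from the documented members (a sketch assuming T is double):

```csharp
double improvement = result.CalculateLossImprovement();      // initial - final, e.g. 2.5 - 0.8 = 1.7
double percentReduction = improvement / result.InitialLoss;  // 1.7 / 2.5 = 0.68
Console.WriteLine($"Loss reduced by {percentReduction:P0}"); // percent format, e.g. 68%
```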
FindBestAccuracy()
Finds the best (highest) accuracy achieved during training.
public (T BestAccuracy, int Iteration) FindBestAccuracy()
Returns
- (T BestAccuracy, int Iteration)
A tuple containing the best accuracy value and the iteration it occurred at.
FindBestLoss()
Finds the best (lowest) loss achieved during training.
public (T BestLoss, int Iteration) FindBestLoss()
Returns
- (T BestLoss, int Iteration)
A tuple containing the best loss value and the iteration it occurred at.
Remarks
For Production: The best loss might occur before the final iteration, especially if:
- The learning rate is too high (oscillation)
- Training ran too long (overfitting to training tasks)
- Early stopping or learning rate decay is needed
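One way to detect this situation from the documented members (a hedged sketch; the message wording is illustrative):

```csharp
// If the best loss came before the last iteration, later training may have hurt.
var (bestLoss, bestIteration) = result.FindBestLoss();
if (bestIteration < result.TotalIterations - 1)
{
    Console.WriteLine($"Best loss {bestLoss} occurred at iteration {bestIteration}; " +
                      "consider early stopping or learning rate decay.");
}
```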
GenerateReport()
Generates a comprehensive training report.
public string GenerateReport()
Returns
- string
A formatted string summarizing training results.
Remarks
For Beginners: This creates a human-readable summary of your training run that you can print, log, or include in documentation.
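A minimal usage sketch; the log file path is illustrative:

```csharp
string report = result.GenerateReport();
Console.WriteLine(report);                       // print to console
File.AppendAllText("training_log.txt", report);  // or persist alongside other runs
```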
HasConverged(int, double)
Checks if training converged based on loss stabilization.
public bool HasConverged(int windowSize = 100, double varianceThreshold = 0.001)
Parameters
windowSizeintNumber of recent iterations to analyze (default: 100).
varianceThresholddoubleMaximum variance to consider converged (default: 0.001).
Returns
- bool
True if loss variance in recent window is below threshold.
Remarks
For Production: Use this to:
- Implement automatic early stopping
- Validate training completion
- Diagnose non-convergent runs
Convergence means the loss has stabilized and further training is unlikely to help.
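Deciding whether to keep training might look like this sketch (the window size and threshold shown are illustrative values, not recommendations):

```csharp
// A larger window and tighter threshold than the defaults, for a noisy loss signal.
if (!result.HasConverged(windowSize: 200, varianceThreshold: 0.0005))
{
    Console.WriteLine("Loss has not stabilized; consider more iterations " +
                      "or a smaller meta learning rate.");
}
```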