Namespace AiDotNet.KnowledgeDistillation
Classes
- CheckpointMetadata: Metadata about a saved checkpoint.
- DistillationCheckpointConfig: Configuration for distillation checkpoint management.
- DistillationCheckpointManager<T>: Manages checkpointing during knowledge distillation training.
- DistillationForwardResult<T>: Encapsulates the result of a forward pass during knowledge distillation training.
- DistillationStrategyBase<T>: Abstract base class for knowledge distillation strategies. Provides common functionality for computing losses and gradients in student-teacher training.
- DistillationStrategyFactory<T>: Factory for creating distillation strategies from enums and configurations.
- DistillationStrategyFactory<T>.StrategyBuilder: Fluent builder for configuring distillation strategies with custom parameters.
- FeatureDistillationStrategy<T>: Implements feature-based knowledge distillation (FitNets), where the student learns to match the teacher's intermediate layer representations.
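The core of FitNets-style feature distillation is a regression loss between student and teacher hidden activations. The sketch below is conceptual Python, not AiDotNet's API; `feature_distillation_loss` is a hypothetical helper name, and it assumes both feature vectors already share a dimensionality (in practice a learned regressor projects the student's features when sizes differ).

```python
# Conceptual sketch of a FitNets-style feature-matching loss (not AiDotNet's API).
# The student is penalized for deviating from the teacher's intermediate activations.

def feature_distillation_loss(student_features, teacher_features):
    """Mean squared error between student and teacher hidden activations.

    Assumes equal dimensionality; real implementations insert a learned
    regressor layer when the student's hidden size differs from the teacher's.
    """
    assert len(student_features) == len(teacher_features)
    n = len(student_features)
    return sum((s - t) ** 2 for s, t in zip(student_features, teacher_features)) / n

loss = feature_distillation_loss([0.5, 1.0, -0.5], [1.0, 1.0, 0.0])
```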
- IntermediateActivations<T>: Stores intermediate layer activations collected during a forward pass.
- KnowledgeDistillationTrainerBase<T, TInput, TOutput>: Abstract base class for all knowledge distillation trainers. Provides common functionality for training loops, data shuffling, validation, and evaluation.
- KnowledgeDistillationTrainer<T>: Standard knowledge distillation trainer that uses a fixed teacher model to train a student.
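Standard (Hinton-style) distillation trains the student on a mix of the teacher's temperature-softened output distribution and the usual hard-label loss. The following is an illustrative Python sketch of that objective, not AiDotNet's implementation; `distillation_loss`, `alpha`, and `temperature` here are assumed names for the common formulation.

```python
import math

# Conceptual sketch of Hinton-style knowledge distillation (not AiDotNet's API).
# Teacher logits are softened with a temperature T; the student minimizes a
# weighted sum of the soft cross-entropy and the hard-label cross-entropy.

def softmax(logits, temperature=1.0):
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=2.0, alpha=0.5):
    """alpha weights the soft (teacher) term against the hard-label term."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # Soft cross-entropy against the softened teacher distribution, scaled by
    # T^2 so soft-target gradients stay comparable in magnitude to hard ones.
    soft = -sum(t * math.log(s) for t, s in zip(p_teacher, p_student)) * temperature ** 2
    hard = -math.log(softmax(student_logits)[hard_label])
    return alpha * soft + (1 - alpha) * hard

loss = distillation_loss([2.0, 0.5, 0.1], [3.0, 1.0, 0.2], hard_label=0)
```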
- SelfDistillationTrainer<T>: Implements self-distillation, where a model acts as its own teacher to improve calibration and generalization.
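In self-distillation, a frozen snapshot of the model serves as the teacher for the next round of training, with targets blended between the snapshot's predictions and the true labels. The toy sketch below illustrates the loop on a one-parameter model; `ScalarModel`, `self_distill`, and the blending weight `alpha` are invented for illustration and are not part of AiDotNet.

```python
import copy

# Conceptual sketch of self-distillation (not AiDotNet's API): each round,
# the current model is snapshotted as a frozen teacher, and the live model
# is updated toward a blend of the teacher's output and the true label.

class ScalarModel:
    """Toy model: predicts w * x; stands in for a real network."""
    def __init__(self, w=0.0):
        self.w = w
    def predict(self, x):
        return self.w * x

def self_distill(model, data, rounds=3, alpha=0.5, lr=0.1):
    for _ in range(rounds):
        teacher = copy.deepcopy(model)            # frozen snapshot is the teacher
        for x, y in data:
            target = alpha * teacher.predict(x) + (1 - alpha) * y
            error = model.predict(x) - target     # squared-error gradient step
            model.w -= lr * error * x
    return model
```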
- TeacherModelBase<TInput, TOutput, T>: Abstract base class for teacher models used in knowledge distillation. Provides common functionality and utilities for teacher model implementations.
- TeacherModelFactory<T>: Factory for creating teacher models from enums and configurations.
- TeacherModelWrapper<T>: Wraps an existing trained IFullModel to act as a teacher for knowledge distillation.
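The wrapper pattern here is an adapter: an already-trained model is exposed for inference only, so the distillation loop cannot accidentally update the teacher. A minimal Python sketch of the idea, assuming only that the wrapped model has a `predict` method (the class names below are hypothetical, not AiDotNet's):

```python
# Conceptual sketch of wrapping a trained model as a frozen teacher
# (illustrative only; AiDotNet's TeacherModelWrapper<T> wraps an IFullModel).

class TeacherWrapper:
    """Exposes a trained model's predictions while forbidding further training."""
    def __init__(self, model):
        self._model = model
    def predict(self, x):
        return self._model.predict(x)  # delegate inference to the wrapped model
    def train(self, *args, **kwargs):
        raise RuntimeError("teacher is frozen; it only provides predictions")

class ConstantModel:
    """Stand-in for a trained model."""
    def predict(self, x):
        return 42

teacher = TeacherWrapper(ConstantModel())
```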