Namespace AiDotNet.LossFunctions
Classes
- BinaryCrossEntropyLoss<T>
Implements the Binary Cross Entropy loss function for binary classification problems.
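For reference, binary cross entropy over predicted probabilities p and labels y ∈ {0, 1} averages −[y·log(p) + (1−y)·log(1−p)], with predictions clipped away from 0 and 1 for numerical stability. A minimal pure-Python sketch of that formula (illustrative only; the function name and clipping constant are not AiDotNet's API):

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean of -[y*log(p) + (1-y)*log(1-p)], with clipping for stability."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)  # keep log() finite
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)
```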
- CTCLoss<T>
Implements the Connectionist Temporal Classification (CTC) loss function for sequence-to-sequence learning.
- CTCLossAdapter<T>
Provides an adapter for using CTCLoss within the LossFunctionBase framework.
- CategoricalCrossEntropyLoss<T>
Implements the Categorical Cross Entropy loss function for multi-class classification.
- CharbonnierLoss<T>
Implements the Charbonnier loss function, a smooth approximation of L1 loss.
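The Charbonnier loss is sqrt(diff² + ε²): quadratic near zero, approximately |diff| for large errors. A short sketch of the formula under that definition (function name and default ε are illustrative, not AiDotNet's API):

```python
import math

def charbonnier(y_true, y_pred, epsilon=1e-3):
    """Mean of sqrt(diff^2 + eps^2): smooth near zero, ~|diff| for large errors."""
    return sum(math.sqrt((t - p) ** 2 + epsilon ** 2)
               for t, p in zip(y_true, y_pred)) / len(y_true)
```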
- ContrastiveLoss<T>
Implements the Contrastive Loss function for learning similarity metrics.
- CosineSimilarityLoss<T>
Implements the Cosine Similarity Loss between two vectors.
- CrossEntropyLoss<T>
Implements the Cross Entropy loss function for multi-class classification problems.
- DiceLoss<T>
Implements the Dice loss function, commonly used for image segmentation tasks.
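The Dice loss is commonly defined as one minus the Dice coefficient, 1 − 2|X∩Y| / (|X| + |Y|), usually with a smoothing term for empty masks. A generic sketch of that definition (not AiDotNet's signature):

```python
def dice_loss(y_true, y_pred, smooth=1.0):
    """1 - Dice coefficient; `smooth` avoids division by zero on empty masks."""
    intersection = sum(t * p for t, p in zip(y_true, y_pred))
    total = sum(y_true) + sum(y_pred)
    return 1.0 - (2.0 * intersection + smooth) / (total + smooth)
```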
- ElasticNetLoss<T>
Implements the Elastic Net Loss function, which combines Mean Squared Error with L1 and L2 regularization.
- ExponentialLoss<T>
Implements the Exponential Loss function, commonly used in boosting algorithms.
- FocalLoss<T>
Implements the Focal Loss function, which gives more weight to hard-to-classify examples.
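Focal loss scales cross entropy by (1 − p_t)^γ, where p_t is the predicted probability of the true class, so confident (easy) examples contribute little. A sketch of the standard binary form from the Focal Loss paper (parameter names are illustrative, not AiDotNet's API):

```python
import math

def focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-12):
    """Binary focal loss: down-weights easy examples via the (1 - p_t)^gamma factor."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)
        p_t = p if y == 1 else 1.0 - p          # probability of the true class
        a_t = alpha if y == 1 else 1.0 - alpha  # per-class weighting
        total += -a_t * (1.0 - p_t) ** gamma * math.log(p_t)
    return total / len(y_true)
```

With γ = 0 and α = 1 this reduces to plain binary cross entropy.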
- HingeLoss<T>
Implements the Hinge loss function commonly used in support vector machines.
- HuberLoss<T>
Implements the Huber loss function, which combines properties of both MSE and MAE.
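Huber loss is quadratic for errors within a threshold δ and linear beyond it, which keeps MSE's smooth gradient near zero while limiting the influence of outliers. A sketch of the standard definition (not AiDotNet's signature):

```python
def huber_loss(y_true, y_pred, delta=1.0):
    """Quadratic for |error| <= delta, linear beyond: MSE smoothness, MAE robustness."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        err = abs(t - p)
        if err <= delta:
            total += 0.5 * err ** 2
        else:
            total += delta * (err - 0.5 * delta)
    return total / len(y_true)
```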
- JaccardLoss<T>
Implements the Jaccard loss function, commonly used for measuring dissimilarity between sets.
- KullbackLeiblerDivergence<T>
Implements the Kullback-Leibler Divergence, a measure of how one probability distribution differs from another.
- LogCoshLoss<T>
Implements the Log-Cosh loss function, a smooth loss that behaves like Mean Squared Error for small errors and like Mean Absolute Error for large errors.
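Log-Cosh averages log(cosh(error)), which is approximately error²/2 near zero and approximately |error| − log 2 for large errors. A direct sketch of the formula (illustrative; large errors can overflow cosh in a naive implementation like this one):

```python
import math

def log_cosh_loss(y_true, y_pred):
    """Mean of log(cosh(error)): ~error^2/2 for small errors, ~|error|-log(2) for large."""
    return sum(math.log(math.cosh(t - p)) for t, p in zip(y_true, y_pred)) / len(y_true)
```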
- LossFunctionBase<T>
Base class for loss function implementations.
- MarginLoss<T>
Implements the Margin loss function, specifically designed for Capsule Networks.
- MeanAbsoluteErrorLoss<T>
Implements the Mean Absolute Error (MAE) loss function.
- MeanBiasErrorLoss<T>
Implements the Mean Bias Error (MBE) loss function.
- MeanSquaredErrorLoss<T>
Implements the Mean Squared Error (MSE) loss function.
- ModifiedHuberLoss<T>
Implements the Modified Huber Loss function, a smoother version of the hinge loss.
- NoiseContrastiveEstimationLoss<T>
Implements the Noise Contrastive Estimation (NCE) loss function for efficient training with large output spaces.
- OrdinalRegressionLoss<T>
Implements the Ordinal Regression Loss function for predicting ordered categories.
- PerceptualLoss<T>
Implements the Perceptual Loss function for comparing high-level features of images.
- PoissonLoss<T>
Implements the Poisson loss function for count data modeling.
- QuantileLoss<T>
Implements the Quantile loss function for quantile regression.
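Quantile (pinball) loss at quantile q penalizes under-prediction by a factor of q and over-prediction by (1 − q); minimizing it yields the q-th conditional quantile rather than the mean. A sketch of the standard formula (not AiDotNet's signature):

```python
def quantile_loss(y_true, y_pred, q=0.5):
    """Pinball loss: penalizes under-prediction by q, over-prediction by (1 - q)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        err = t - p
        total += q * err if err >= 0 else (q - 1.0) * err
    return total / len(y_true)
```

At q = 0.5 this is half of the mean absolute error.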
- QuantumLoss<T>
Represents a quantum-specific loss function for quantum neural networks.
- RealESRGANLoss<T>
Combined loss function for Real-ESRGAN super-resolution training.
- RootMeanSquaredErrorLoss<T>
Implements the Root Mean Squared Error (RMSE) loss function.
- RotationPredictionLoss<T>
Self-supervised loss function based on rotation prediction for images.
- ScaleInvariantDepthLoss<T>
Scale-invariant depth loss function for depth estimation training.
- SparseCategoricalCrossEntropyLoss<T>
Implements the Sparse Categorical Cross Entropy loss function for multi-class classification with integer labels.
- SquaredHingeLoss<T>
Implements the Squared Hinge Loss function for binary classification problems.
- TripletLoss<T>
Implements the Triplet Loss function for learning similarity embeddings.
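Triplet loss pulls an anchor embedding toward a positive example and pushes it from a negative one: max(0, d(a, p) − d(a, n) + margin). A sketch using Euclidean distance (function name and margin default are illustrative, not AiDotNet's API):

```python
import math

def triplet_loss(anchor, positive, negative, margin=1.0):
    """max(0, d(a, p) - d(a, n) + margin) with Euclidean distances."""
    d_pos = math.dist(anchor, positive)  # anchor-to-positive distance
    d_neg = math.dist(anchor, negative)  # anchor-to-negative distance
    return max(0.0, d_pos - d_neg + margin)
```

The loss is zero once the negative is at least `margin` farther from the anchor than the positive.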
- WassersteinLoss<T>
Implements the Wasserstein loss function used in Wasserstein Generative Adversarial Networks (WGAN).
- WeightedCrossEntropyLoss<T>
Implements the Weighted Cross Entropy loss function for classification problems with uneven class importance.