Class TBATSModel<T>

Namespace
AiDotNet.TimeSeries
Assembly
AiDotNet.dll

Implements the TBATS (Trigonometric seasonality, Box-Cox transformation, ARMA errors, Trend, and Seasonal components) model for complex time series forecasting with multiple seasonal patterns.

public class TBATSModel<T> : TimeSeriesModelBase<T>, ITimeSeriesModel<T>, IFullModel<T, Matrix<T>, Vector<T>>, IModel<Matrix<T>, Vector<T>, ModelMetadata<T>>, IModelSerializer, ICheckpointableModel, IParameterizable<T, Matrix<T>, Vector<T>>, IFeatureAware, IFeatureImportance<T>, ICloneable<IFullModel<T, Matrix<T>, Vector<T>>>, IGradientComputable<T, Matrix<T>, Vector<T>>, IJitCompilable<T>

Type Parameters

T

The numeric data type used for calculations (e.g., float, double).

Inheritance
TimeSeriesModelBase<T>
TBATSModel<T>
Implements
ITimeSeriesModel<T>
IFullModel<T, Matrix<T>, Vector<T>>
IModel<Matrix<T>, Vector<T>, ModelMetadata<T>>
IModelSerializer
ICheckpointableModel
IParameterizable<T, Matrix<T>, Vector<T>>
IFeatureAware
IFeatureImportance<T>
ICloneable<IFullModel<T, Matrix<T>, Vector<T>>>
IGradientComputable<T, Matrix<T>, Vector<T>>
IJitCompilable<T>

Remarks

The TBATS model is an advanced exponential smoothing method that can handle multiple seasonal patterns of different lengths. It uses trigonometric functions to model seasonality, Box-Cox transformations to handle non-linearity, and ARMA processes to model residual correlations.

For Beginners: TBATS is like a Swiss Army knife for time series forecasting. It can handle complex data with:

  • Multiple seasonal patterns (e.g., daily, weekly, and yearly patterns all at once)
  • Non-linear growth (using Box-Cox transformations)
  • Autocorrelated errors (using ARMA models)

For example, if you're analyzing hourly electricity demand, TBATS can simultaneously model:

  • Daily patterns (people use more electricity during the day than at night)
  • Weekly patterns (usage differs on weekdays versus weekends)
  • Yearly patterns (more electricity is used for heating in winter or cooling in summer)

This makes TBATS particularly useful for complex forecasting problems where simpler methods fail.

Constructors

TBATSModel(TBATSModelOptions<T>?)

Initializes a new instance of the TBATSModel class with optional configuration options.

public TBATSModel(TBATSModelOptions<T>? options = null)

Parameters

options TBATSModelOptions<T>

The configuration options for the TBATS model. If null, default options are used.

Remarks

For Beginners: When you create a TBATS model, you can customize it with various options:

  • Seasonal periods: The lengths of different seasonal patterns (e.g., 7 for weekly, 12 for monthly)
  • ARMA order: How many past observations and errors to consider for the error model
  • Box-Cox lambda: A parameter that controls the non-linear transformation (0 = log transform)
  • Max iterations: How long the model should try to improve its estimates
  • Tolerance: When to stop training (when improvements become smaller than this value)

The constructor initializes all the components that will be estimated during training.
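
A minimal construction sketch follows. Only the constructor itself is documented on this page; the option property names used below (SeasonalPeriods, BoxCoxLambda, MaxIterations, Tolerance) are guesses based on the descriptions above and should be verified against TBATSModelOptions<T>.

using AiDotNet.TimeSeries; // namespace of the options type may differ -- verify

// Property names are assumed from the option descriptions above.
var options = new TBATSModelOptions<double>
{
    SeasonalPeriods = new[] { 24, 168 }, // daily and weekly cycles in hourly data (assumed name)
    BoxCoxLambda = 0.0,                  // 0 = log transform (assumed name)
    MaxIterations = 100,                 // cap on training iterations (assumed name)
    Tolerance = 1e-6                     // stop when improvements fall below this (assumed name)
};

var model = new TBATSModel<double>(options);

// Passing no argument (or null) uses the default configuration.
var defaultModel = new TBATSModel<double>();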

Properties

SupportsJitCompilation

Gets whether this model supports JIT compilation.

public override bool SupportsJitCompilation { get; }

Property Value

bool

Returns true when the model has been trained and has valid components. The TBATS model can be represented as a computation graph using differentiable approximations for the Box-Cox transformation and the state-space representation.

Remarks

For Beginners: JIT compilation converts the model's calculations into optimized native code for faster inference. TBATS achieves this by:

  • Using differentiable approximations for the Box-Cox transformation
  • Representing seasonal components as lookup tables with gather operations
  • Expressing ARMA effects as linear combinations
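
A short guard sketch using only members documented on this page (the JIT pipeline itself is outside the scope of this class):

// SupportsJitCompilation is false until the model has been trained
// and its components are valid.
if (model.SupportsJitCompilation)
{
    var inputNodes = new List<ComputationNode<double>>();
    ComputationNode<double> forecastNode = model.ExportComputationGraph(inputNodes);
    // Hand forecastNode and inputNodes to the JIT compilation pipeline.
}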

Methods

CreateInstance()

Creates a new instance of the TBATS model with the same options.

protected override IFullModel<T, Matrix<T>, Vector<T>> CreateInstance()

Returns

IFullModel<T, Matrix<T>, Vector<T>>

A new TBATS model instance with the same configuration.

Remarks

This method creates a new instance of the TBATS model with the same configuration options as the current instance. The new instance is not trained and will need to be trained on data.

For Beginners: This method creates a fresh copy of your model with the same settings but no training.

It's useful when you want to:

  • Create multiple models with the same configuration
  • Train models on different subsets of data
  • Create ensemble models (combining multiple models)
  • Compare training results with identical starting points

Think of it like copying a recipe to share with a friend. They get the same instructions but will need to do their own cooking (training) to create the dish.
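
Because CreateInstance is protected, it is not called directly from application code; the public cloning surface (the class implements ICloneable<IFullModel<T, Matrix<T>, Vector<T>>>) is the usual route to a copy. A rough sketch, assuming that interface exposes a Clone() method:

// Assumption: ICloneable<IFullModel<...>> exposes a Clone() method -- verify the actual member.
IFullModel<double, Matrix<double>, Vector<double>> copy = trainedModel.Clone();

// CreateInstance itself always returns an untrained model with the same options;
// whether Clone() also copies learned parameters depends on the library's cloning semantics.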

DeserializeCore(BinaryReader)

Deserializes the model's core parameters from a binary reader.

protected override void DeserializeCore(BinaryReader reader)

Parameters

reader BinaryReader

The binary reader to read from.

Remarks

For Beginners: Deserialization is the process of loading a previously saved model from disk. This method reads the model's parameters from a file and reconstructs the model exactly as it was when it was saved.

This allows you to:

  • Load a previously trained model without retraining
  • Make predictions with consistent results
  • Continue analysis from where you left off

It's like saving your work in a document and opening it later to continue editing.
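
DeserializeCore is protected and is invoked through the public serialization surface (the class implements IModelSerializer). A load sketch, assuming that interface exposes a Deserialize(byte[]) method; verify the exact member against IModelSerializer:

using System.IO;

// Assumed API: IModelSerializer.Deserialize(byte[]) -- verify against the interface.
byte[] saved = File.ReadAllBytes("tbats-model.bin");

var model = new TBATSModel<double>();
model.Deserialize(saved); // internally routes to DeserializeCore(BinaryReader)

// The model can now make predictions without retraining.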

EvaluateModel(Matrix<T>, Vector<T>)

Evaluates the model's performance on test data.

public override Dictionary<string, T> EvaluateModel(Matrix<T> xTest, Vector<T> yTest)

Parameters

xTest Matrix<T>

The input features matrix for testing.

yTest Vector<T>

The actual target values for testing.

Returns

Dictionary<string, T>

A dictionary containing evaluation metrics.

Remarks

For Beginners: This method tests how well the model performs by comparing its predictions to actual values. It calculates several error metrics:

  • MSE (Mean Squared Error): Average of squared differences between predictions and actual values
  • RMSE (Root Mean Squared Error): Square root of MSE, in the same units as the original data
  • MAE (Mean Absolute Error): Average of absolute differences, less sensitive to outliers than MSE
  • MAPE (Mean Absolute Percentage Error): Average percentage difference, useful for comparing across different scales

Lower values for these metrics indicate better model performance. They help you understand how accurate your forecasts are likely to be and compare different models or parameter settings.
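
A usage sketch; the dictionary key names ("MSE", "RMSE", "MAE", "MAPE") are assumed from the list above and should be checked against the actual return value:

// model is a trained TBATSModel<double>; xTest/yTest hold the held-out data.
Dictionary<string, double> metrics = model.EvaluateModel(xTest, yTest);

foreach (var metric in metrics)
{
    Console.WriteLine($"{metric.Key}: {metric.Value}");
}

// Assumed key name, based on the description above:
// double rmse = metrics["RMSE"];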

ExportComputationGraph(List<ComputationNode<T>>)

Exports the TBATS model as a computation graph for JIT compilation.

public override ComputationNode<T> ExportComputationGraph(List<ComputationNode<T>> inputNodes)

Parameters

inputNodes List<ComputationNode<T>>

A list to which input nodes will be added.

Returns

ComputationNode<T>

The output computation node representing the forecast.

Remarks

The computation graph represents the TBATS prediction formula: prediction = (level + trend) * seasonal[0] * seasonal[1] * ... + ARMA effects

For Beginners: This converts the TBATS model into a computation graph. The graph represents:

  1. Base value: level + trend
  2. Seasonal adjustments: multiply by each seasonal component
  3. ARMA corrections: add autoregressive effects

Expected speedup: 2-4x for inference after JIT compilation.
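
A sketch of exporting the graph (the namespace that defines ComputationNode<T> is not shown on this page):

// model is a trained TBATSModel<double>.
var inputNodes = new List<ComputationNode<double>>();
ComputationNode<double> forecastNode = model.ExportComputationGraph(inputNodes);

// inputNodes now contains the graph inputs added by the model;
// forecastNode is the output encoding (level + trend) * seasonal components + ARMA effects.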

GetModelMetadata()

Gets metadata about the model, including its type, configuration, and learned parameters.

public override ModelMetadata<T> GetModelMetadata()

Returns

ModelMetadata<T>

A ModelMetadata<T> object containing information about the model.

Remarks

This method returns detailed metadata about the model, including its type, configuration options, and information about the learned components. This metadata can be used for model selection, comparison, and documentation.

For Beginners: This method provides a summary of your model's configuration and what it has learned.

It includes information like:

  • The type of model (TBATS)
  • The configuration settings (seasonal periods, ARMA order, etc.)
  • Details about the learned components (level, trend, seasonal patterns)
  • Performance statistics

This metadata is useful for:

  • Comparing different models
  • Documenting your analysis
  • Understanding what the model has learned
  • Sharing model information with others

Think of it like getting a detailed report card for your model.
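
A minimal sketch; the individual properties of ModelMetadata<T> are documented on its own page, not here:

// model is a trained TBATSModel<double>.
ModelMetadata<double> metadata = model.GetModelMetadata();

// Inspect the returned object (or see the ModelMetadata<T> documentation)
// for the configuration, component, and performance details described above.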

Predict(Matrix<T>)

Generates forecasts using the trained TBATS model.

public override Vector<T> Predict(Matrix<T> input)

Parameters

input Matrix<T>

The input matrix specifying the forecast horizon.

Returns

Vector<T>

A vector of forecasted values.

Remarks

For Beginners: This method predicts future values based on the patterns learned during training. For each future time point, it:

  1. Starts with the current level (base value)
  2. Adds the trend component (growth or decline)
  3. Multiplies by each seasonal component (for daily, weekly, yearly patterns, etc.)
  4. Adds the ARMA effects (patterns in the errors)

The result is a forecast that captures multiple seasonal patterns and trends. For example, if forecasting retail sales, it might predict higher values during weekends and holiday seasons while still capturing the overall growth trend.
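
A forecasting sketch. How the input matrix encodes the horizon (assumed here to be one row per future step) and the Matrix<double> constructor used are assumptions to verify against the Matrix<T> documentation:

// Assumption: one row per future time step; verify how the horizon is encoded.
int horizon = 24;
var input = new Matrix<double>(horizon, 1);    // constructor signature assumed

Vector<double> forecast = model.Predict(input);

for (int i = 0; i < forecast.Length; i++)      // Length property and indexer assumed
{
    Console.WriteLine($"t+{i + 1}: {forecast[i]}");
}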

PredictSingle(Vector<T>)

Predicts a single value for the given input vector.

public override T PredictSingle(Vector<T> input)

Parameters

input Vector<T>

The input vector containing features for prediction.

Returns

T

The predicted value.

Remarks

For Beginners: This method predicts a single future value based on the model's learned patterns.

In TBATS models, we generally don't use external features (like temperature or day of week) because predictions are based on the time series patterns themselves.

This method is required by the framework's interface, and it works by:

  1. Taking the input vector (which might represent time or other factors)
  2. Creating a simplified prediction request
  3. Getting the predicted value from the model

For example, if you want to predict tomorrow's sales, this method would give you a single number representing the expected sales value.
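
A single-step sketch; the Vector<double> constructor and the meaning of the input vector's contents are assumptions:

// Assumption: Vector<double> can be built from a double[]; the single entry here
// is a placeholder feature, since TBATS predictions come from the series itself.
var input = new Vector<double>(new double[] { 1.0 });

double nextValue = model.PredictSingle(input);
Console.WriteLine($"Next predicted value: {nextValue}");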

Reset()

Resets the model to its initial state.

public override void Reset()

Remarks

This method clears all learned parameters and returns the model to its initial state, as if it had just been created with the same options but not yet trained.

For Beginners: Resetting a model is like erasing what it has learned while keeping its configuration.

This is useful when you want to:

  • Retrain the model on different data
  • Try different initial values
  • Compare training results with the same configuration
  • Start fresh after experimenting

It's similar to keeping your recipe (the configuration) but throwing away the dish you've already cooked (the learned parameters) to start cooking again from scratch.
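
A minimal sketch:

// Keep the configuration, discard everything learned.
model.Reset();

// The model must be retrained before it can make predictions again.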

SerializeCore(BinaryWriter)

Serializes the model's core parameters to a binary writer.

protected override void SerializeCore(BinaryWriter writer)

Parameters

writer BinaryWriter

The binary writer to write to.

Remarks

For Beginners: Serialization is the process of converting the model's state into a format that can be saved to disk. This allows you to save a trained model and load it later without having to retrain it.

This method saves:

  • The level and trend components
  • All seasonal components
  • The ARMA coefficients
  • The Box-Cox transformation parameter
  • All configuration options

After serializing, you can store the model and later deserialize it to make predictions or continue analysis without repeating the training process.
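
SerializeCore is protected and is reached through the public serialization surface (IModelSerializer). A save sketch, mirroring the load sketch under DeserializeCore and assuming a Serialize() method that returns a byte[]; verify against the interface:

using System.IO;

// Assumed API: IModelSerializer.Serialize() returning byte[] -- verify against the interface.
byte[] payload = trainedModel.Serialize(); // internally routes to SerializeCore(BinaryWriter)
File.WriteAllBytes("tbats-model.bin", payload);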

TrainCore(Matrix<T>, Vector<T>)

Performs the core training logic for the TBATS model.

protected override void TrainCore(Matrix<T> x, Vector<T> y)

Parameters

x Matrix<T>

The input features matrix.

y Vector<T>

The time series data to model.

Remarks

For Beginners: This method implements the core training algorithm for the TBATS model. It handles the actual mathematical operations that help the model learn patterns from your time series data.

While the model primarily uses the time series values themselves (y) to learn patterns, this method takes both an input matrix (x) and a target vector (y) to maintain consistency with other models in the framework.

Think of this as the "engine" of the training process that coordinates all the individual learning steps like initializing components, updating coefficients, and checking for convergence.
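
TrainCore is protected; training is normally started through the public training entry point inherited from the base class (assumed here to be named Train, per the usual Core-method pattern; verify against TimeSeriesModelBase<T>). A sketch:

// Assumptions: a public Train(Matrix<T>, Vector<T>) wrapper exists on the base class,
// and Matrix<double>/Vector<double> can be built as shown -- verify both.
double[] observed = LoadHourlyDemand();                 // hypothetical data loader
var y = new Vector<double>(observed);                   // the time series itself
var x = new Matrix<double>(observed.Length, 1);         // placeholder features kept for API consistency

var model = new TBATSModel<double>();
model.Train(x, y);                                      // internally calls TrainCore(x, y)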