Interface ICertifiedDefense<T, TInput, TOutput>

Namespace
AiDotNet.Interfaces
Assembly
AiDotNet.dll

Defines the contract for certified defense mechanisms that provide provable robustness guarantees.

public interface ICertifiedDefense<T, TInput, TOutput> : IModelSerializer

Type Parameters

T

The numeric data type used for calculations (e.g., float, double).

TInput

The input data type for the model (e.g., Vector<T>, string).

TOutput

The output data type for the model (e.g., Vector<T>, int).

Remarks

Certified defenses provide mathematical guarantees that a model's predictions won't change within a specified perturbation radius. This is unlike heuristic defenses, which make attacks harder in practice but offer no proof of robustness.

For Beginners: Think of certified defenses as "guaranteed protection" for your model. While regular defenses make models harder to fool, certified defenses can mathematically prove that no attack within certain limits can trick the model.

Common certified defense methods include:

  • Randomized Smoothing: Uses random noise to create certified predictions
  • Interval Bound Propagation: Tracks ranges of possible values through the network
  • CROWN: Computes certified bounds for neural network outputs

Why certified defenses matter:

  • They provide provable security guarantees
  • They're essential for safety-critical applications
  • They help meet regulatory requirements
  • They give confidence bounds for predictions
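The first method listed above, randomized smoothing, can be sketched in a few lines. This is an illustrative Python sketch of the idea (AiDotNet is a C# library; this code does not use its API), simplified from the approach of Cohen et al.: classify many noisy copies of the input, take a majority vote, and convert the top-class frequency into a certified L2 radius.

```python
import random
from collections import Counter
from statistics import NormalDist

def certify_by_smoothing(classify, x, sigma=0.25, n=1000, seed=0):
    """Simplified randomized-smoothing certification sketch.

    `classify` is any function mapping a feature list to a class label.
    We classify n Gaussian-noised copies of x, take a majority vote, and
    turn the top-class frequency p into a certified radius sigma * Phi^-1(p).
    """
    rng = random.Random(seed)
    votes = Counter()
    for _ in range(n):
        noisy = [xi + rng.gauss(0.0, sigma) for xi in x]
        votes[classify(noisy)] += 1
    top_class, top_count = votes.most_common(1)[0]
    # A rigorous implementation uses a confidence lower bound on p
    # (e.g. Clopper-Pearson); here we just clamp away from 1.0.
    p = min(top_count / n, 1.0 - 1e-6)
    if p <= 0.5:
        return None, 0.0  # abstain: no class wins a clear majority
    radius = sigma * NormalDist().inv_cdf(p)
    return top_class, radius
```

For example, with a toy classifier that returns the sign of the first feature, an input far from the decision boundary yields a large certified radius, while an input near the boundary yields a small one or an abstention.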

Methods

CertifyBatch(TInput[], IFullModel<T, TInput, TOutput>)

Computes certified predictions for a batch of inputs.

CertifiedPrediction<T>[] CertifyBatch(TInput[] inputs, IFullModel<T, TInput, TOutput> model)

Parameters

inputs TInput[]

The batch of inputs to certify.

model IFullModel<T, TInput, TOutput>

The model to certify.

Returns

CertifiedPrediction<T>[]

Batch of certified prediction results.

Remarks

For Beginners: This works like CertifyPrediction but processes multiple inputs at once, which is usually faster than certifying them one by one.

CertifyPrediction(TInput, IFullModel<T, TInput, TOutput>)

Computes a certified prediction with robustness guarantees.

CertifiedPrediction<T> CertifyPrediction(TInput input, IFullModel<T, TInput, TOutput> model)

Parameters

input TInput

The input to make a certified prediction for.

model IFullModel<T, TInput, TOutput>

The model to certify.

Returns

CertifiedPrediction<T>

Certified prediction result with robustness radius.

Remarks

This method provides a prediction that is guaranteed to be correct for all inputs within a specified perturbation radius.

For Beginners: This is like making a prediction with a "warranty". The method tells you: "I predict this class, and I guarantee that even if someone changes the input slightly (within specified limits), my prediction won't change."

The process typically involves:

  1. Analyzing the input and model
  2. Computing bounds on what the model could output
  3. If all possible outputs within the bounds agree, the prediction is certified
  4. Returning both the prediction and the guaranteed radius of correctness
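The four steps above can be sketched with interval bound propagation, the second method from the Remarks. This is an illustrative Python sketch (not the AiDotNet implementation) for a tiny two-layer ReLU network: it bounds every logit over an L-infinity ball around the input, then certifies the prediction when the predicted class's lower bound exceeds every other class's upper bound.

```python
def interval_affine(l, u, W, b):
    """Propagate the box [l, u] through y = W x + b with interval
    arithmetic: each bound picks the worst-case end of every input
    interval depending on the sign of the weight."""
    out_l, out_u = [], []
    for row, bi in zip(W, b):
        out_l.append(bi + sum(w * (li if w >= 0 else ui)
                              for w, li, ui in zip(row, l, u)))
        out_u.append(bi + sum(w * (ui if w >= 0 else li)
                              for w, li, ui in zip(row, l, u)))
    return out_l, out_u

def certify_ibp(x, eps, W1, b1, W2, b2):
    """Steps 1-4 for a 2-layer ReLU net under an L-inf ball of radius
    eps: analyze the input, bound the logits, check agreement, return
    the prediction plus whether it is certified."""
    # Nominal prediction on the clean input (an interval of width zero).
    cl, _ = interval_affine(x, x, W1, b1)
    h = [max(0.0, v) for v in cl]
    logits, _ = interval_affine(h, h, W2, b2)
    pred = max(range(len(logits)), key=logits.__getitem__)
    # Bound the logits over the whole perturbation ball.
    l, u = interval_affine([xi - eps for xi in x],
                           [xi + eps for xi in x], W1, b1)
    l = [max(0.0, v) for v in l]  # ReLU is monotone, so bounds pass through
    u = [max(0.0, v) for v in u]
    l, u = interval_affine(l, u, W2, b2)
    certified = all(l[pred] > u[j] for j in range(len(l)) if j != pred)
    return pred, certified
```

Note how certification fails gracefully: with a large eps the bounds overlap, so the method returns the prediction but reports it as uncertified, which corresponds to a zero robustness radius at that perturbation size.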

ComputeCertifiedRadius(TInput, IFullModel<T, TInput, TOutput>)

Computes the maximum perturbation radius that can be certified for an input.

T ComputeCertifiedRadius(TInput input, IFullModel<T, TInput, TOutput> model)

Parameters

input TInput

The input to analyze.

model IFullModel<T, TInput, TOutput>

The model being certified.

Returns

T

The maximum certified robustness radius.

Remarks

For Beginners: This calculates how much an input can be changed while still guaranteeing the model's prediction stays the same. It's like measuring the "safety zone" around your input.

For example, if the radius is 0.1, any perturbation of size up to 0.1 (measured in whatever norm the defense certifies, such as L2 or L-infinity) is guaranteed not to change the model's answer.
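One common way to realize a method like this is to search for the largest radius that a certification check will accept. The sketch below is hypothetical Python (it assumes an `is_certified` oracle, which stands in for any certification routine): since certification at a radius implies certification at every smaller radius, the boundary can be found by bracketing and binary search.

```python
def max_certified_radius(is_certified, hi=1.0, tol=1e-4):
    """Largest radius (to within `tol`) for which `is_certified(r)` holds.

    Assumes monotonicity: if radius r is certified, so is any r' < r.
    `is_certified` is a placeholder for a real certification check.
    """
    if not is_certified(0.0):
        return 0.0  # even the clean input cannot be certified
    lo = 0.0
    while is_certified(hi):  # grow the bracket until certification fails
        lo, hi = hi, hi * 2
        if hi > 1e6:
            return lo  # effectively unbounded; stop growing
    while hi - lo > tol:  # binary search inside the bracket [lo, hi]
        mid = (lo + hi) / 2
        if is_certified(mid):
            lo = mid
        else:
            hi = mid
    return lo
```

For randomized smoothing the radius has a closed form instead, but a search like this is a generic fallback whenever only a yes/no certification check is available.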

EvaluateCertifiedAccuracy(TInput[], TOutput[], IFullModel<T, TInput, TOutput>, T)

Evaluates certified accuracy on a dataset.

CertifiedAccuracyMetrics<T> EvaluateCertifiedAccuracy(TInput[] testData, TOutput[] labels, IFullModel<T, TInput, TOutput> model, T radius)

Parameters

testData TInput[]

The test data to evaluate on.

labels TOutput[]

The true labels.

model IFullModel<T, TInput, TOutput>

The model to evaluate.

radius T

The perturbation radius to certify.

Returns

CertifiedAccuracyMetrics<T>

Certified accuracy metrics.

Remarks

For Beginners: This measures what percentage of predictions can be certified as robust. Higher certified accuracy means more predictions have guaranteed robustness.
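The metric itself is simple to state: a test point counts toward certified accuracy only if the prediction is correct *and* its certified radius covers the requested radius. This illustrative Python sketch (the lists stand in for results a batch-certification call would produce; it does not use the AiDotNet API) makes that concrete:

```python
def certified_accuracy(preds, radii, labels, radius):
    """Fraction of test points that are both correctly predicted and
    certified at the requested radius.

    `preds` and `radii` stand in for per-input certification results;
    `labels` are the ground-truth outputs.
    """
    assert len(preds) == len(radii) == len(labels)
    certified_correct = sum(
        1 for p, r, y in zip(preds, radii, labels)
        if p == y and r >= radius
    )
    return certified_correct / len(labels)
```

Certified accuracy is therefore always less than or equal to ordinary (clean) accuracy, and it decreases as the requested radius grows.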

GetOptions()

Gets the configuration options for the certified defense.

CertifiedDefenseOptions<T> GetOptions()

Returns

CertifiedDefenseOptions<T>

The configuration options for the certified defense.

Remarks

For Beginners: These are the settings that control how certification works, like the number of samples to use or the tightness of bounds.

Reset()

Resets the certified defense state.

void Reset()