Class OnnxExporter
Exports AiDotNet models to the ONNX format.
public static class OnnxExporter
Inheritance
object → OnnxExporter
Remarks
This exporter supports sequential models with common layer types:
- Dense/Linear layers (exported as MatMul + Add; see the sketch after this list)
- Activation functions (ReLU, Sigmoid, Tanh, etc.)
- Dropout (exported as Identity in inference mode)
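To make the Dense/Linear bullet concrete, a Dense layer computing y = xW + b is lowered to two ONNX nodes. The sketch below uses a hypothetical OnnxNode record purely to illustrate the decomposition; it is not this library's internal representation:

// Hypothetical node type, used here only to illustrate the decomposition.
record OnnxNode(string OpType, string[] Inputs, string[] Outputs);

// Dense layer y = x * W + b becomes MatMul followed by Add.
var matMul = new OnnxNode("MatMul", new[] { "x", "W" }, new[] { "xW" });
var add    = new OnnxNode("Add",    new[] { "xW", "b" }, new[] { "y" });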
Limitations: This is a proof-of-concept implementation that works with specific, known model structures (sequential models with supported layers). For production use, consider using framework-native export tools.
For Beginners: Use this class to convert your trained AiDotNet models to ONNX format for deployment:
var model = /* your trained model */;
OnnxExporter.Export(model, "model.onnx");
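To sanity-check the exported file, you can load it back with ONNX Runtime (the Microsoft.ML.OnnxRuntime NuGet package, a separate dependency). The input name "input", the shape, and the float element type below are assumptions about the exported graph; check session.InputMetadata for the actual values:

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

using var session = new InferenceSession("model.onnx");

// Print the graph's actual input names and shapes before feeding data.
foreach (var kvp in session.InputMetadata)
    Console.WriteLine($"{kvp.Key}: [{string.Join(", ", kvp.Value.Dimensions)}]");

// Assumed input name and shape; replace with the values printed above.
var tensor = new DenseTensor<float>(new float[4], new[] { 1, 4 });
var inputs = new List<NamedOnnxValue> { NamedOnnxValue.CreateFromTensor("input", tensor) };

using var results = session.Run(inputs);
float[] output = results.First().AsEnumerable<float>().ToArray();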
Methods
ExportToBytes<T, TInput, TOutput>(IFullModel<T, TInput, TOutput>, int[]?)
Exports a model to ONNX format and returns the bytes.
public static byte[] ExportToBytes<T, TInput, TOutput>(IFullModel<T, TInput, TOutput> model, int[]? inputShape = null)
Parameters
model IFullModel<T, TInput, TOutput>
The model to export.
inputShape int[]?
Optional input shape.
Returns
- byte[]
The ONNX model as a byte array.
Type Parameters
T
The numeric type of the model.
TInput
The input type of the model.
TOutput
The output type of the model.
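ExportToBytes is useful when the model should go somewhere other than the local file system. A minimal sketch, assuming a trained model variable and placeholder input shape values:

using System.IO;

// Export to an in-memory byte array, then write it to any destination
// (here a file stream, but a network or blob stream works the same way).
// The input shape values are placeholders.
byte[] onnxBytes = OnnxExporter.ExportToBytes(model, new[] { 1, 10 });

using Stream destination = File.Create("exported-model.onnx");
destination.Write(onnxBytes, 0, onnxBytes.Length);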
Export<T, TInput, TOutput>(IFullModel<T, TInput, TOutput>, string, int[]?)
Exports a model to ONNX format.
public static void Export<T, TInput, TOutput>(IFullModel<T, TInput, TOutput> model, string outputPath, int[]? inputShape = null)
Parameters
model IFullModel<T, TInput, TOutput>
The model to export.
outputPath string
The path to write the ONNX file to.
inputShape int[]?
Optional input shape. If not provided, the exporter will try to infer it from the model.
Type Parameters
T
The numeric type of the model.
TInput
The input type of the model.
TOutput
The output type of the model.
Exceptions
- ArgumentNullException
If model or outputPath is null.
- NotSupportedException
If the model contains unsupported layer types.
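A usage sketch showing Export with an explicit input shape and the documented exceptions handled; the shape values are placeholders for your model's actual input dimensions:

using System;

try
{
    // Explicit input shape: batch size 1, 10 features (placeholder values).
    OnnxExporter.Export(model, "model.onnx", new[] { 1, 10 });
}
catch (NotSupportedException ex)
{
    // The model contains a layer type this exporter does not handle.
    Console.WriteLine($"Export failed: {ex.Message}");
}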