Class GraphNeuralOperator<T>

Namespace
AiDotNet.PhysicsInformed.NeuralOperators
Assembly
AiDotNet.dll

Implements Graph Neural Operators for learning operators on graph-structured data.

public class GraphNeuralOperator<T> : NeuralNetworkBase<T>, INeuralNetworkModel<T>, INeuralNetwork<T>, IFullModel<T, Tensor<T>, Tensor<T>>, IModel<Tensor<T>, Tensor<T>, ModelMetadata<T>>, IModelSerializer, ICheckpointableModel, IParameterizable<T, Tensor<T>, Tensor<T>>, IFeatureAware, IFeatureImportance<T>, ICloneable<IFullModel<T, Tensor<T>, Tensor<T>>>, IGradientComputable<T, Tensor<T>, Tensor<T>>, IJitCompilable<T>, IInterpretableModel<T>, IInputGradientComputable<T>, IDisposable

Type Parameters

T

The numeric type used for calculations.

Inheritance
NeuralNetworkBase<T> → GraphNeuralOperator<T>
Implements
IFullModel<T, Tensor<T>, Tensor<T>>
IModel<Tensor<T>, Tensor<T>, ModelMetadata<T>>
IParameterizable<T, Tensor<T>, Tensor<T>>
ICloneable<IFullModel<T, Tensor<T>, Tensor<T>>>
IGradientComputable<T, Tensor<T>, Tensor<T>>

Remarks

For Beginners: Graph Neural Operators extend neural operators to irregular, graph-structured domains.

Why Graphs? Many physical systems are naturally represented as graphs:

  • Molecular structures (atoms = nodes, bonds = edges)
  • Mesh-based simulations (mesh points = nodes, connectivity = edges)
  • Traffic networks (intersections = nodes, roads = edges)
  • Social networks, power grids, etc.

Regular operators (FNO, DeepONet) work on:

  • Structured grids (images, regular spatial domains)
  • Euclidean spaces

Graph operators work on:

  • Irregular geometries
  • Non-Euclidean spaces
  • Variable-size domains

Key Idea - Message Passing: Information propagates through the graph in repeated rounds:

  1. Each node has features (e.g., temperature, velocity)
  2. Nodes send messages to neighbors
  3. Nodes aggregate messages and update their features
  4. Repeat for multiple layers
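
The four steps above can be sketched as a single aggregation-and-update round. This is a simplified illustration with plain arrays, not the class's actual implementation; real graph layers use learned weight matrices and nonlinear activations rather than the fixed 0.5/0.5 mixing shown here.

```csharp
// One round of message passing: each node sums its neighbors' features
// (weighted by the adjacency matrix) and blends the result with its own state.
static double[][] MessagePassingStep(double[][] features, double[][] adjacency)
{
    int n = features.Length, d = features[0].Length;
    var updated = new double[n][];
    for (int i = 0; i < n; i++)
    {
        updated[i] = new double[d];
        for (int j = 0; j < n; j++)              // gather messages from neighbors
            if (adjacency[i][j] != 0)
                for (int k = 0; k < d; k++)
                    updated[i][k] += adjacency[i][j] * features[j][k];
        for (int k = 0; k < d; k++)              // combine with the node's own features
            updated[i][k] = 0.5 * features[i][k] + 0.5 * updated[i][k];
    }
    return updated;
}
```

Stacking several such rounds lets information travel further: after L layers, each node has seen its L-hop neighborhood.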

Applications:

  • Molecular dynamics (predict molecular properties)
  • Computational fluid dynamics (irregular meshes)
  • Material science (crystal structures)
  • Climate modeling (irregular Earth grids)
  • Particle systems

Constructors

GraphNeuralOperator(NeuralNetworkArchitecture<T>, int, int, IGradientBasedOptimizer<T, Tensor<T>, Tensor<T>>?, int, bool)

public GraphNeuralOperator(NeuralNetworkArchitecture<T> architecture, int numLayers = 4, int hiddenDim = 64, IGradientBasedOptimizer<T, Tensor<T>, Tensor<T>>? optimizer = null, int inputDim = 0, bool normalizeAdjacency = true)

Parameters

architecture NeuralNetworkArchitecture<T>

The network architecture configuration.

numLayers int

The number of message-passing layers. Default is 4.

hiddenDim int

The hidden feature dimension used by the graph layers. Default is 64.

optimizer IGradientBasedOptimizer<T, Tensor<T>, Tensor<T>>

The gradient-based optimizer to use, or null for the default.

inputDim int

The input feature dimension per node.

normalizeAdjacency bool

Whether to normalize the adjacency matrix before message passing. Default is true.
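
A construction sketch using the documented signature and defaults. How `architecture` is built depends on your application; the comment about the null optimizer is an assumption, not documented behavior.

```csharp
// `architecture` is assumed to be an already-configured NeuralNetworkArchitecture<double>.
var gno = new GraphNeuralOperator<double>(
    architecture,
    numLayers: 4,              // message-passing rounds
    hiddenDim: 64,             // hidden feature width per node
    optimizer: null,           // assumed: null falls back to a default optimizer
    inputDim: 0,
    normalizeAdjacency: true); // normalize the adjacency matrix before use
```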

Properties

ParameterCount

Gets the total number of parameters across graph layers.

public override int ParameterCount { get; }

Property Value

int

SupportsJitCompilation

Gets whether this model currently supports JIT compilation.

public override bool SupportsJitCompilation { get; }

Property Value

bool

True if the model can be JIT compiled, false otherwise.

Remarks

Some models may not support JIT compilation due to:

  • Dynamic graph structure (changes based on input)
  • Lack of a computation graph representation
  • Use of operations not yet supported by the JIT compiler

For Beginners: This tells you whether this specific model can benefit from JIT compilation.

Models return false if they:

  • Use layer-based architecture without graph export (e.g., current neural networks)
  • Have control flow that changes based on input data
  • Use operations the JIT compiler doesn't understand yet

In these cases, the model will still work normally, just without JIT acceleration.

SupportsTraining

Indicates whether this network supports training (learning from data).

public override bool SupportsTraining { get; }

Property Value

bool

Remarks

For Beginners: Not all neural networks can learn. Some are designed only for making predictions with pre-set parameters. This property tells you if the network can learn from data.

Methods

Backpropagate(Tensor<T>)

Performs backpropagation to compute gradients for network parameters.

public override Tensor<T> Backpropagate(Tensor<T> outputGradients)

Parameters

outputGradients Tensor<T>

The gradients of the loss with respect to the network outputs.

Returns

Tensor<T>

The gradients of the loss with respect to the network inputs.

Remarks

For Beginners: Backpropagation is how neural networks learn. After making a prediction, the network calculates how wrong it was (the error). Then it works backward through the layers to figure out how each parameter contributed to that error. This method handles that backward flow of information.

The "gradients" are numbers that tell us how to adjust each parameter to reduce the error.

API Change Note: The signature changed from Vector<T> to Tensor<T> to support multi-dimensional gradients. This is a breaking change. If you need backward compatibility, consider adding an overload that accepts Vector<T> and converts it internally to Tensor<T>.
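
One possible shape for such a compatibility overload is sketched below. The `Tensor<T>.FromVector` conversion shown is an assumption about the library's API; the actual conversion helper may differ.

```csharp
// Hypothetical backward-compatibility overload -- the conversion call is assumed.
public Tensor<T> Backpropagate(Vector<T> outputGradients)
{
    var asTensor = Tensor<T>.FromVector(outputGradients); // assumed helper
    return Backpropagate(asTensor);
}
```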

Exceptions

InvalidOperationException

Thrown when the network is not in training mode or doesn't support training.

CreateNewInstance()

Creates a new instance with the same configuration.

protected override IFullModel<T, Tensor<T>, Tensor<T>> CreateNewInstance()

Returns

IFullModel<T, Tensor<T>, Tensor<T>>

New graph operator instance.

DeserializeNetworkSpecificData(BinaryReader)

Deserializes graph operator-specific data.

protected override void DeserializeNetworkSpecificData(BinaryReader reader)

Parameters

reader BinaryReader

Binary reader.

Forward(Tensor<T>)

Forward pass using an identity adjacency matrix.

public Tensor<T> Forward(Tensor<T> input)

Parameters

input Tensor<T>

Node feature tensor.

Returns

Tensor<T>

Updated node features.

Forward(Tensor<T>, Tensor<T>)

Forward pass through the graph neural operator with an explicit adjacency matrix.

public Tensor<T> Forward(Tensor<T> nodeFeatures, Tensor<T> adjacencyMatrix)

Parameters

nodeFeatures Tensor<T>

Node feature tensor.

adjacencyMatrix Tensor<T>

Graph adjacency matrix.

Returns

Tensor<T>

Updated node features.

Forward(T[,], T[,])

Forward pass through the graph neural operator.

public T[,] Forward(T[,] nodeFeatures, T[,] adjacencyMatrix)

Parameters

nodeFeatures T[,]

Features for each node.

adjacencyMatrix T[,]

Graph adjacency matrix.

Returns

T[,]

Updated node features.
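
A usage sketch for the T[,] overload on a small graph. `gno` is assumed to be a constructed GraphNeuralOperator<double> whose input dimension matches the feature width.

```csharp
// Forward pass on a 3-node path graph (0 - 1 - 2) with 2-dimensional node features.
var nodeFeatures = new double[,]
{
    { 1.0, 0.0 },   // node 0
    { 0.0, 1.0 },   // node 1
    { 1.0, 1.0 },   // node 2
};
var adjacency = new double[,]
{
    { 0, 1, 0 },    // node 0 connects to node 1
    { 1, 0, 1 },    // node 1 connects to nodes 0 and 2
    { 0, 1, 0 },    // node 2 connects to node 1
};
double[,] updated = gno.Forward(nodeFeatures, adjacency); // one row of output features per node
```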

GetGradients()

Gets the gradients from all layers in the neural network.

public override Vector<T> GetGradients()

Returns

Vector<T>

A vector containing all gradients from all layers concatenated together.

Remarks

This method collects the gradients from every layer in the network and combines them into a single vector. This is useful for optimization algorithms that need access to all gradients at once.

For Beginners: During training, each layer calculates how its parameters should change (the gradients). This method gathers all those gradients from every layer and puts them into one long list.

Think of it like:

  • Each layer has notes about how to improve (gradients)
  • This method collects all those notes into one document
  • The optimizer can then use this document to update the entire network

This is essential for the learning process, as it tells the optimizer how to adjust all the network's parameters to improve performance.
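
A sketch of a manual gradient-descent step built from the flattened accessors documented on this page. The indexer and Length member on Vector<T> are assumptions about that type's API; a real optimizer would normally perform this update for you.

```csharp
// Manual parameter update: parameters -= learningRate * gradients.
// Assumes a backward pass has already populated the layer gradients.
var parameters = gno.GetParameters();
var gradients  = gno.GetGradients();
double learningRate = 0.001;
for (int i = 0; i < parameters.Length; i++)   // Length/indexer are assumed members
    parameters[i] -= learningRate * gradients[i];
gno.UpdateParameters(parameters);             // push the adjusted values back into the layers
```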

GetModelMetadata()

Gets metadata about the graph neural operator.

public override ModelMetadata<T> GetModelMetadata()

Returns

ModelMetadata<T>

Model metadata.

GetParameters()

Gets the operator parameters as a flattened vector.

public override Vector<T> GetParameters()

Returns

Vector<T>

A flattened vector containing all operator parameters.

InitializeLayers()

Initializes the layers of the neural network based on the architecture.

protected override void InitializeLayers()

Remarks

For Beginners: This method sets up all the layers in your neural network according to the architecture you've defined. It's like assembling the parts of your network before you can use it.

Predict(Tensor<T>)

Makes a prediction using the graph neural operator.

public override Tensor<T> Predict(Tensor<T> input)

Parameters

input Tensor<T>

Input tensor containing node features (and optional adjacency).

Returns

Tensor<T>

Predicted node features.

SerializeNetworkSpecificData(BinaryWriter)

Serializes graph operator-specific data.

protected override void SerializeNetworkSpecificData(BinaryWriter writer)

Parameters

writer BinaryWriter

Binary writer.

Train(Tensor<T>, Tensor<T>)

Performs a basic supervised training step using MSE loss.

public override void Train(Tensor<T> input, Tensor<T> expectedOutput)

Parameters

input Tensor<T>

Training input tensor.

expectedOutput Tensor<T>

Expected output tensor.

TrainOnGraph(T[,], T[,], T[,], int, double, bool)

Trains the graph neural operator on a single graph.

public TrainingHistory<T> TrainOnGraph(T[,] nodeFeatures, T[,] adjacencyMatrix, T[,] targetValues, int epochs = 200, double learningRate = 0.001, bool verbose = true)

Parameters

nodeFeatures T[,]

Node feature matrix.

adjacencyMatrix T[,]

Adjacency matrix.

targetValues T[,]

Target node features.

epochs int

Number of training epochs.

learningRate double

Learning rate.

verbose bool

Whether to print progress.

Returns

TrainingHistory<T>

Training history.
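
A training sketch using the documented signature and defaults. The three arrays are assumed to have one row per node, with `targets` shaped like the desired output features; what TrainingHistory<T> exposes (e.g. per-epoch loss) is library-specific.

```csharp
// Train on a single graph with the documented default settings.
var history = gno.TrainOnGraph(
    nodeFeatures,          // T[,]: one row of input features per node
    adjacency,             // T[,]: graph adjacency matrix
    targets,               // T[,]: one row of target features per node
    epochs: 200,
    learningRate: 0.001,
    verbose: true);        // print progress during training
```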

UpdateParameters(Vector<T>)

Updates the operator parameters from a flattened vector.

public override void UpdateParameters(Vector<T> parameters)

Parameters

parameters Vector<T>

Parameter vector.