Class GraphNeuralOperator<T>
Namespace: AiDotNet.PhysicsInformed.NeuralOperators
Assembly: AiDotNet.dll
Implements Graph Neural Operators for learning operators on graph-structured data.
public class GraphNeuralOperator<T> : NeuralNetworkBase<T>, INeuralNetworkModel<T>, INeuralNetwork<T>, IFullModel<T, Tensor<T>, Tensor<T>>, IModel<Tensor<T>, Tensor<T>, ModelMetadata<T>>, IModelSerializer, ICheckpointableModel, IParameterizable<T, Tensor<T>, Tensor<T>>, IFeatureAware, IFeatureImportance<T>, ICloneable<IFullModel<T, Tensor<T>, Tensor<T>>>, IGradientComputable<T, Tensor<T>, Tensor<T>>, IJitCompilable<T>, IInterpretableModel<T>, IInputGradientComputable<T>, IDisposable
Type Parameters
T
The numeric type used for calculations.
Inheritance
NeuralNetworkBase<T> → GraphNeuralOperator<T>
Remarks
For Beginners: Graph Neural Operators extend neural operators to irregular, graph-structured domains.
Why Graphs? Many physical systems are naturally represented as graphs:
- Molecular structures (atoms = nodes, bonds = edges)
- Mesh-based simulations (mesh points = nodes, connectivity = edges)
- Traffic networks (intersections = nodes, roads = edges)
- Social networks, power grids, etc.
Regular operators (FNO, DeepONet) work on:
- Structured grids (images, regular spatial domains)
- Euclidean spaces
Graph operators work on:
- Irregular geometries
- Non-Euclidean spaces
- Variable-size domains
Key Idea - Message Passing: Information propagates through the graph via message passing:
- Each node has features (e.g., temperature, velocity)
- Nodes send messages to neighbors
- Nodes aggregate messages and update their features
- Repeat for multiple layers
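The message-passing steps above can be sketched in plain C#. This is a toy illustration only, independent of the library's internals; real graph layers also apply learned weight matrices and nonlinearities at each step.

```csharp
// Toy message-passing step: each node averages its neighbors' features
// and mixes the result with its own features.
static double[][] MessagePass(double[][] features, bool[][] adjacency)
{
    int n = features.Length, d = features[0].Length;
    var updated = new double[n][];
    for (int i = 0; i < n; i++)
    {
        var aggregated = new double[d];
        int degree = 0;
        for (int j = 0; j < n; j++)
        {
            if (!adjacency[i][j]) continue;   // j is a neighbor of i
            degree++;
            for (int k = 0; k < d; k++) aggregated[k] += features[j][k];
        }
        updated[i] = new double[d];
        for (int k = 0; k < d; k++)
        {
            double neighborMean = degree > 0 ? aggregated[k] / degree : 0.0;
            updated[i][k] = 0.5 * features[i][k] + 0.5 * neighborMean;
        }
    }
    return updated;
}
```

Stacking several such steps lets information from distant nodes reach each node, which is why the operator uses multiple layers.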
Applications:
- Molecular dynamics (predict molecular properties)
- Computational fluid dynamics (irregular meshes)
- Material science (crystal structures)
- Climate modeling (irregular Earth grids)
- Particle systems
Constructors
GraphNeuralOperator(NeuralNetworkArchitecture<T>, int, int, IGradientBasedOptimizer<T, Tensor<T>, Tensor<T>>?, int, bool)
public GraphNeuralOperator(NeuralNetworkArchitecture<T> architecture, int numLayers = 4, int hiddenDim = 64, IGradientBasedOptimizer<T, Tensor<T>, Tensor<T>>? optimizer = null, int inputDim = 0, bool normalizeAdjacency = true)
Parameters
architecture NeuralNetworkArchitecture<T>
numLayers int
hiddenDim int
optimizer IGradientBasedOptimizer<T, Tensor<T>, Tensor<T>>
inputDim int
normalizeAdjacency bool
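A minimal construction sketch. The constructor signature comes from this page; the NeuralNetworkArchitecture<T> setup is an assumption, so consult that type's own documentation for the actual configuration API.

```csharp
// Hypothetical architecture setup (details assumed):
var architecture = new NeuralNetworkArchitecture<double>(/* ... */);

// Constructor signature as documented above; defaults shown explicitly.
var gno = new GraphNeuralOperator<double>(
    architecture,
    numLayers: 4,             // number of message-passing layers
    hiddenDim: 64,            // hidden feature dimension
    optimizer: null,          // null falls back to the default optimizer
    inputDim: 0,
    normalizeAdjacency: true);
```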
Properties
ParameterCount
Gets the total number of parameters across graph layers.
public override int ParameterCount { get; }
Property Value
- int
SupportsJitCompilation
Gets whether this model currently supports JIT compilation.
public override bool SupportsJitCompilation { get; }
Property Value
- bool
True if the model can be JIT compiled, false otherwise.
Remarks
Some models may not support JIT compilation due to:
- Dynamic graph structure (changes based on input)
- Lack of a computation graph representation
- Use of operations not yet supported by the JIT compiler
For Beginners: This tells you whether this specific model can benefit from JIT compilation.
Models return false if they:
- Use layer-based architecture without graph export (e.g., current neural networks)
- Have control flow that changes based on input data
- Use operations the JIT compiler doesn't understand yet
In these cases, the model will still work normally, just without JIT acceleration.
SupportsTraining
Indicates whether this network supports training (learning from data).
public override bool SupportsTraining { get; }
Property Value
- bool
Remarks
For Beginners: Not all neural networks can learn. Some are designed only for making predictions with pre-set parameters. This property tells you if the network can learn from data.
Methods
Backpropagate(Tensor<T>)
Performs backpropagation to compute gradients for network parameters.
public override Tensor<T> Backpropagate(Tensor<T> outputGradients)
Parameters
outputGradients Tensor<T>
The gradients of the loss with respect to the network outputs.
Returns
- Tensor<T>
The gradients of the loss with respect to the network inputs.
Remarks
For Beginners: Backpropagation is how neural networks learn. After making a prediction, the network calculates how wrong it was (the error). Then it works backward through the layers to figure out how each parameter contributed to that error. This method handles that backward flow of information.
The "gradients" are numbers that tell us how to adjust each parameter to reduce the error.
API Change Note: The signature changed from Vector<T> to Tensor<T> to support multi-dimensional gradients. This is a breaking change. If you need backward compatibility, consider adding an overload that accepts Vector<T> and converts it internally to Tensor<T>.
Exceptions
- InvalidOperationException
Thrown when the network is not in training mode or doesn't support training.
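A sketch of where Backpropagate fits in a manual training step, assuming a constructed GraphNeuralOperator<double> instance `gno` in training mode. The ComputeMseGradient helper is hypothetical; Train(...) normally performs all of this internally.

```csharp
// Forward pass on a node-feature tensor (identity adjacency overload).
Tensor<double> prediction = gno.Forward(nodeFeatureTensor);

// Gradient of MSE loss w.r.t. the outputs: 2 * (prediction - target) / N.
// ComputeMseGradient is a hypothetical helper, not part of the documented API.
Tensor<double> outputGradients = ComputeMseGradient(prediction, target);

// Backward pass: populates per-layer gradients and returns input gradients.
Tensor<double> inputGradients = gno.Backpropagate(outputGradients);
```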
CreateNewInstance()
Creates a new instance with the same configuration.
protected override IFullModel<T, Tensor<T>, Tensor<T>> CreateNewInstance()
Returns
- IFullModel<T, Tensor<T>, Tensor<T>>
New graph operator instance.
DeserializeNetworkSpecificData(BinaryReader)
Deserializes graph operator-specific data.
protected override void DeserializeNetworkSpecificData(BinaryReader reader)
Parameters
reader BinaryReader
Binary reader.
Forward(Tensor<T>)
Forward pass using an identity adjacency matrix.
public Tensor<T> Forward(Tensor<T> input)
Parameters
input Tensor<T>
Node feature tensor.
Returns
- Tensor<T>
Updated node features.
Forward(Tensor<T>, Tensor<T>)
Forward pass through the graph neural operator with an explicit adjacency matrix.
public Tensor<T> Forward(Tensor<T> nodeFeatures, Tensor<T> adjacencyMatrix)
Parameters
nodeFeatures Tensor<T>
Node feature tensor.
adjacencyMatrix Tensor<T>
Graph adjacency matrix.
Returns
- Tensor<T>
Updated node features.
Forward(T[,], T[,])
Forward pass through the graph neural operator.
public T[,] Forward(T[,] nodeFeatures, T[,] adjacencyMatrix)
Parameters
nodeFeatures T[,]
Features for each node.
adjacencyMatrix T[,]
Graph adjacency matrix.
Returns
- T[,]
Updated node features.
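A usage sketch for the T[,] overload, assuming a constructed GraphNeuralOperator<double> instance `gno`. Only the Forward signature documented above is taken from this page; the toy graph data is illustrative.

```csharp
// Toy graph: 3 nodes in a path (0 - 1 - 2), 2 features per node.
double[,] nodeFeatures =
{
    { 1.0, 0.0 },
    { 0.0, 1.0 },
    { 1.0, 1.0 },
};
double[,] adjacency =
{
    { 0, 1, 0 },
    { 1, 0, 1 },
    { 0, 1, 0 },
};

// One forward pass; the result has one row of updated features per node.
double[,] updated = gno.Forward(nodeFeatures, adjacency);
```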
GetGradients()
Gets the gradients from all layers in the neural network.
public override Vector<T> GetGradients()
Returns
- Vector<T>
A vector containing all gradients from all layers concatenated together.
Remarks
This method collects the gradients from every layer in the network and combines them into a single vector. This is useful for optimization algorithms that need access to all gradients at once.
For Beginners: During training, each layer calculates how its parameters should change (the gradients). This method gathers all those gradients from every layer and puts them into one long list.
Think of it like:
- Each layer has notes about how to improve (gradients)
- This method collects all those notes into one document
- The optimizer can then use this document to update the entire network
This is essential for the learning process, as it tells the optimizer how to adjust all the network's parameters to improve performance.
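A sketch of how collected gradients feed a plain gradient-descent update, assuming a constructed instance `gno` after a backward pass. Indexing and Length on Vector<T> are assumptions about the API; in practice you would hand the gradients to an IGradientBasedOptimizer instead.

```csharp
// Gather all parameters and their gradients as flat vectors.
Vector<double> parameters = gno.GetParameters();
Vector<double> gradients = gno.GetGradients();

// Plain gradient-descent step (Vector<T> indexer/Length assumed).
double learningRate = 0.001;
for (int i = 0; i < parameters.Length; i++)
    parameters[i] -= learningRate * gradients[i];

// Write the updated parameters back into the network.
gno.UpdateParameters(parameters);
```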
GetModelMetadata()
Gets metadata about the graph neural operator.
public override ModelMetadata<T> GetModelMetadata()
Returns
- ModelMetadata<T>
Model metadata.
GetParameters()
Gets the operator parameters as a flattened vector.
public override Vector<T> GetParameters()
Returns
- Vector<T>
A flattened vector containing all operator parameters.
InitializeLayers()
Initializes the layers of the neural network based on the architecture.
protected override void InitializeLayers()
Remarks
For Beginners: This method sets up all the layers in your neural network according to the architecture you've defined. It's like assembling the parts of your network before you can use it.
Predict(Tensor<T>)
Makes a prediction using the graph neural operator.
public override Tensor<T> Predict(Tensor<T> input)
Parameters
input Tensor<T>
Input tensor containing node features (and optional adjacency).
Returns
- Tensor<T>
Predicted node features.
SerializeNetworkSpecificData(BinaryWriter)
Serializes graph operator-specific data.
protected override void SerializeNetworkSpecificData(BinaryWriter writer)
Parameters
writer BinaryWriter
Binary writer.
Train(Tensor<T>, Tensor<T>)
Performs a basic supervised training step using MSE loss.
public override void Train(Tensor<T> input, Tensor<T> expectedOutput)
Parameters
input Tensor<T>
Training input tensor.
expectedOutput Tensor<T>
Expected output tensor.
TrainOnGraph(T[,], T[,], T[,], int, double, bool)
Trains the graph neural operator on a single graph.
public TrainingHistory<T> TrainOnGraph(T[,] nodeFeatures, T[,] adjacencyMatrix, T[,] targetValues, int epochs = 200, double learningRate = 0.001, bool verbose = true)
Parameters
nodeFeatures T[,]
Node feature matrix.
adjacencyMatrix T[,]
Adjacency matrix.
targetValues T[,]
Target node features.
epochs int
Number of training epochs.
learningRate double
Learning rate.
verbose bool
Whether to print progress.
Returns
- TrainingHistory<T>
Training history.
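A usage sketch for training on a single graph, assuming a constructed instance `gno` and arrays shaped [numNodes, numFeatures] for features/targets and [numNodes, numNodes] for the adjacency matrix. The call matches the signature documented above; the data itself is illustrative.

```csharp
// Train on one small graph with the documented defaults made explicit.
TrainingHistory<double> history = gno.TrainOnGraph(
    nodeFeatures,      // [numNodes, numFeatures]
    adjacency,         // [numNodes, numNodes]
    targetValues,      // [numNodes, numFeatures]
    epochs: 200,
    learningRate: 0.001,
    verbose: true);    // prints progress during training
```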
UpdateParameters(Vector<T>)
Updates the operator parameters from a flattened vector.
public override void UpdateParameters(Vector<T> parameters)
Parameters
parametersVector<T>Parameter vector.