Enum GpuOptimizerType

Namespace
AiDotNet.Interfaces
Assembly
AiDotNet.dll

Enumerates the types of GPU-optimized optimizers available.

public enum GpuOptimizerType

Fields

Adagrad = 4

Adagrad optimizer with accumulated squared gradients.

Adam = 1

Adam optimizer with adaptive learning rates.

AdamW = 2

AdamW optimizer with decoupled weight decay.

Lamb = 7

LAMB (Layer-wise Adaptive Moments) optimizer, designed for large-batch training.

Lars = 6

LARS (Layer-wise Adaptive Rate Scaling) optimizer.

Nag = 5

Nesterov Accelerated Gradient, which evaluates the gradient at a look-ahead position.

RmsProp = 3

RMSprop optimizer with moving average of squared gradients.

Sgd = 0

Stochastic gradient descent (SGD) with optional momentum.
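
Example

A minimal sketch of selecting behavior by optimizer type. The enum definition below mirrors the documented values; the `Describe` helper is hypothetical and not part of AiDotNet.

```csharp
using System;

// Reproduced from the documented values above.
public enum GpuOptimizerType
{
    Sgd = 0, Adam = 1, AdamW = 2, RmsProp = 3,
    Adagrad = 4, Nag = 5, Lars = 6, Lamb = 7
}

public static class GpuOptimizerTypeExample
{
    // Hypothetical helper: map each optimizer type to a short description.
    public static string Describe(GpuOptimizerType type) => type switch
    {
        GpuOptimizerType.Sgd     => "SGD with optional momentum",
        GpuOptimizerType.Adam    => "Adam (adaptive learning rates)",
        GpuOptimizerType.AdamW   => "AdamW (decoupled weight decay)",
        GpuOptimizerType.RmsProp => "RMSprop (moving average of squared gradients)",
        GpuOptimizerType.Adagrad => "Adagrad (accumulated squared gradients)",
        GpuOptimizerType.Nag     => "Nesterov Accelerated Gradient",
        GpuOptimizerType.Lars    => "LARS (layer-wise adaptive rate scaling)",
        GpuOptimizerType.Lamb    => "LAMB (layer-wise adaptive moments)",
        _ => throw new ArgumentOutOfRangeException(nameof(type))
    };

    public static void Main()
    {
        Console.WriteLine(Describe(GpuOptimizerType.AdamW));
    }
}
```

An exhaustive switch expression like this is the idiomatic way to dispatch on an enum in C#; the `_` arm guards against values outside the documented range.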