Class AdamOptimizer
- Namespace: NeuralNetworks.Optimizers
- Assembly: NeuralNetworks.dll
Implements the Adam optimizer for neural network training. Adam combines momentum (an exponentially decaying average of past gradients) with per-parameter adaptive learning rates (derived from an average of past squared gradients); see the update-rule sketch below.
public class AdamOptimizer : Optimizer
- Inheritance: Optimizer → AdamOptimizer
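For orientation, here is a minimal, self-contained sketch of the standard Adam update rule that the Update overloads apply. It works on flat arrays with the conventional defaults and deliberately avoids this library's Layer and LearningRate types; it is an illustration of the algorithm, not this class's actual implementation.

```csharp
using System;

static class AdamSketch
{
    // One Adam step over a flat parameter vector, updating param in place.
    // m and v carry the running first- and second-moment estimates across
    // calls; t is the 1-based step count used for bias correction.
    public static void Step(float[] param, float[] grad, float[] m, float[] v, int t,
                            float lr = 0.001f, float beta1 = 0.9f,
                            float beta2 = 0.999f, float eps = 1e-8f)
    {
        for (int i = 0; i < param.Length; i++)
        {
            m[i] = beta1 * m[i] + (1f - beta1) * grad[i];           // momentum estimate
            v[i] = beta2 * v[i] + (1f - beta2) * grad[i] * grad[i]; // squared-gradient estimate
            float mHat = m[i] / (1f - MathF.Pow(beta1, t));         // bias-corrected moments
            float vHat = v[i] / (1f - MathF.Pow(beta2, t));
            param[i] -= lr * mHat / (MathF.Sqrt(vHat) + eps);       // per-parameter step
        }
    }
}
```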
Constructors
AdamOptimizer(LearningRate, float, float, float)
public AdamOptimizer(LearningRate learningRate, float beta1 = 0.9, float beta2 = 0.999, float eps = 1E-08)
Parameters
- learningRate (LearningRate): The learning rate used to scale each update step.
- beta1 (float): Exponential decay rate for the first-moment (momentum) estimates. Default: 0.9.
- beta2 (float): Exponential decay rate for the second-moment (squared-gradient) estimates. Default: 0.999.
- eps (float): Small constant added to the denominator to avoid division by zero. Default: 1E-08.
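A usage sketch for the constructor. It assumes LearningRate can be constructed from a constant float value; the actual LearningRate API is not documented on this page, so adjust accordingly.

```csharp
// Hypothetical: assumes a LearningRate(float) constructor, which this page
// does not document.
var optimizer = new AdamOptimizer(new LearningRate(0.001f));

// The same, with the default hyperparameters spelled out explicitly.
var explicitOptimizer = new AdamOptimizer(new LearningRate(0.001f),
                                          beta1: 0.9f, beta2: 0.999f, eps: 1e-8f);
```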
Methods
ToString()
Returns a string that represents the current object.
public override string ToString()
Returns
- string
A string that represents the current object.
Update(Layer?, float[,,,], float[,,,])
public override void Update(Layer? layer, float[,,,] param, float[,,,] paramGradient)
Parameters
- layer (Layer?): The layer whose parameters are being updated; may be null.
- param (float[,,,]): The 4-D parameter tensor (e.g. convolution kernels) to update in place.
- paramGradient (float[,,,]): The gradient of the loss with respect to param; must match the shape of param.
Update(Layer?, float[,], float[,])
public override void Update(Layer? layer, float[,] param, float[,] paramGradient)
Parameters
- layer (Layer?): The layer whose parameters are being updated; may be null.
- param (float[,]): The 2-D parameter matrix (e.g. dense-layer weights) to update in place.
- paramGradient (float[,]): The gradient of the loss with respect to param; must match the shape of param.
Update(Layer?, float[], float[])
public override void Update(Layer? layer, float[] param, float[] paramGradient)
Parameters
- layer (Layer?): The layer whose parameters are being updated; may be null.
- param (float[]): The 1-D parameter vector (e.g. biases) to update in place.
- paramGradient (float[]): The gradient of the loss with respect to param; must match the length of param.
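A usage sketch for the 1-D overload, applying one update step to a bias vector. Passing null for layer is an assumption based on the Layer? annotation rather than documented behavior.

```csharp
var optimizer = new AdamOptimizer(new LearningRate(0.001f)); // LearningRate ctor assumed, as above

float[] bias = { 0.5f, -0.3f, 0.1f };
float[] biasGrad = { 0.02f, -0.01f, 0.03f };

// param and paramGradient have matching shapes; the update is applied to
// bias in place.
optimizer.Update(null, bias, biasGrad);
```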