Class AdamOptimizer

Namespace: NeuralNetworks.Optimizers
Assembly: NeuralNetworks.dll

Implements the Adam optimizer for neural network training. Adam combines momentum (first-moment estimates of the gradients) with per-parameter adaptive learning rates (derived from second-moment estimates).

public class AdamOptimizer : Optimizer
Inheritance
object → Optimizer → AdamOptimizer

Constructors

AdamOptimizer(LearningRate, float, float, float)

public AdamOptimizer(LearningRate learningRate, float beta1 = 0.9, float beta2 = 0.999, float eps = 1E-08)

Parameters

learningRate LearningRate

The learning rate used to scale each update step.

beta1 float

The exponential decay rate for the first-moment (mean) estimates. Defaults to 0.9.

beta2 float

The exponential decay rate for the second-moment (uncentered variance) estimates. Defaults to 0.999.

eps float

A small constant added to the denominator for numerical stability. Defaults to 1E-08.
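
The defaults above match the hyperparameters commonly used for Adam. As a quick standalone illustration (not the library's code) of what the decay rates imply: the raw moment estimates are divided by bias-correction factors 1 − βᵗ, which start far below 1 and approach 1 as the step count grows, so early steps are scaled up substantially:

```csharp
using System;

class BiasCorrectionDemo
{
    static void Main()
    {
        const float beta1 = 0.9f, beta2 = 0.999f; // the constructor defaults

        // The correction factor 1 - beta^t is tiny at t = 1 and tends
        // toward 1 as training proceeds.
        foreach (int t in new[] { 1, 10, 100 })
            Console.WriteLine($"t={t,3}: 1-beta1^t={1 - MathF.Pow(beta1, t):F4}, " +
                              $"1-beta2^t={1 - MathF.Pow(beta2, t):F4}");
    }
}
```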

Methods

ToString()

Returns a string that represents the current object.

public override string ToString()

Returns

string

A string that represents the current object.

Update(object, Span<float>, ReadOnlySpan<float>)

Updates the specified parameters in place using the Adam optimization algorithm and the provided gradients.

protected override void Update(object paramsKey, Span<float> paramsToUpdate, ReadOnlySpan<float> paramGradients)

Parameters

paramsKey object

An object that uniquely identifies the parameter set to update. Used to maintain optimizer state for each parameter group.

paramsToUpdate Span<float>

A span containing the parameter values to be updated. The values are modified in place based on the computed Adam update.

paramGradients ReadOnlySpan<float>

A read-only span containing the gradients corresponding to each parameter in paramsToUpdate. Must have the same length as paramsToUpdate.

Remarks

This method applies the Adam update rule to the parameters, maintaining per-parameter first and second moment estimates across calls. That state is tracked per paramsKey, so each distinct key gets its own moment buffers. paramsToUpdate and paramGradients must have the same length; in debug builds a mismatch triggers an assertion failure. The update is performed in place, modifying the values in paramsToUpdate directly.
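
The behavior described above can be sketched in standalone form. The following is an illustrative implementation of the standard Adam update rule with per-key state, not the library's actual code; all names here are hypothetical:

```csharp
using System;
using System.Collections.Generic;

// Standalone sketch of a per-key Adam update (illustrative, not the library's code).
class AdamSketch
{
    readonly float lr, beta1, beta2, eps;
    // Per-key state: first moment m, second moment v, and step count t.
    readonly Dictionary<object, (float[] m, float[] v, int t)> state = new();

    public AdamSketch(float lr, float beta1 = 0.9f, float beta2 = 0.999f, float eps = 1e-8f)
        => (this.lr, this.beta1, this.beta2, this.eps) = (lr, beta1, beta2, eps);

    public void Update(object key, Span<float> p, ReadOnlySpan<float> g)
    {
        if (p.Length != g.Length)
            throw new ArgumentException("parameter and gradient spans must match");

        if (!state.TryGetValue(key, out var s))
            s = (new float[p.Length], new float[p.Length], 0); // fresh state for a new key
        s.t++;

        for (int i = 0; i < p.Length; i++)
        {
            s.m[i] = beta1 * s.m[i] + (1 - beta1) * g[i];         // first moment
            s.v[i] = beta2 * s.v[i] + (1 - beta2) * g[i] * g[i];  // second moment
            float mHat = s.m[i] / (1 - MathF.Pow(beta1, s.t));    // bias correction
            float vHat = s.v[i] / (1 - MathF.Pow(beta2, s.t));
            p[i] -= lr * mHat / (MathF.Sqrt(vHat) + eps);         // in-place step
        }
        state[key] = s; // persist the incremented step count
    }

    static void Main()
    {
        var opt = new AdamSketch(lr: 0.1f);
        var p = new float[] { 1.0f };
        opt.Update("weights", p, new float[] { 1.0f });
        // On the first step the bias-corrected update is ~lr * sign(gradient).
        Console.WriteLine(p[0]);
    }
}
```

A useful sanity check: on the very first call the bias corrections cancel the (1 − β) factors exactly, so each parameter moves by approximately lr in the direction opposite its gradient's sign.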