Class AdamOptimizer

Namespace: NeuralNetworks.Optimizers
Assembly: NeuralNetworks.dll

Implements the Adam optimizer for neural network training. Adam combines momentum (an exponentially decaying average of past gradients) with a per-parameter adaptive learning rate (derived from an average of past squared gradients).

public class AdamOptimizer : Optimizer
Inheritance
object → Optimizer → AdamOptimizer
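
Remarks

Adam keeps an exponentially decaying average of past gradients (the first moment, weighted by beta1) and of past squared gradients (the second moment, weighted by beta2), and scales each parameter's step by bias-corrected versions of these estimates. The snippet below is a minimal sketch of the standard per-element Adam rule, not AdamOptimizer's actual implementation; the names g, m, v, t, lr, and theta are illustrative.

// Standard Adam update for a single scalar parameter theta.
// m and v start at 0; t is the 1-based step count; g is the current gradient.
m = beta1 * m + (1 - beta1) * g;                // first-moment estimate
v = beta2 * v + (1 - beta2) * g * g;            // second-moment estimate
float mHat = m / (1 - MathF.Pow(beta1, t));     // bias-corrected first moment
float vHat = v / (1 - MathF.Pow(beta2, t));     // bias-corrected second moment
theta -= lr * mHat / (MathF.Sqrt(vHat) + eps);  // eps guards against division by zero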

Constructors

AdamOptimizer(LearningRate, float, float, float)

public AdamOptimizer(LearningRate learningRate, float beta1 = 0.9, float beta2 = 0.999, float eps = 1E-08)

Parameters

learningRate LearningRate
The learning rate used to scale each update step.

beta1 float
Exponential decay rate for the first-moment (gradient mean) estimates. Defaults to 0.9.

beta2 float
Exponential decay rate for the second-moment (squared-gradient) estimates. Defaults to 0.999.

eps float
Small constant added to the denominator for numerical stability. Defaults to 1E-08.
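
Examples

A construction sketch, assuming LearningRate exposes a constructor taking a single float (that signature is not documented on this page):

// Assumed LearningRate(float) constructor; the defaults match the Adam paper.
var optimizer = new AdamOptimizer(new LearningRate(0.001f));

// Overriding the moment decay rates and epsilon explicitly.
var custom = new AdamOptimizer(new LearningRate(0.001f), beta1: 0.9f, beta2: 0.999f, eps: 1e-8f);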

Methods

ToString()

Returns a string that represents the current object.

public override string ToString()

Returns

string

A string that represents the current object.

Update(Layer?, float[,,,], float[,,,])

Applies one Adam update step to a four-dimensional parameter tensor in place, using the corresponding gradient tensor.

public override void Update(Layer? layer, float[,,,] param, float[,,,] paramGradient)

Parameters

layer Layer?
The layer that owns the parameters, or null.
param float[,,,]
The parameter tensor to update in place.
paramGradient float[,,,]
The gradient of the loss with respect to param.

Update(Layer?, float[,], float[,])

Applies one Adam update step to a two-dimensional parameter matrix in place, using the corresponding gradient matrix.

public override void Update(Layer? layer, float[,] param, float[,] paramGradient)

Parameters

layer Layer?
The layer that owns the parameters, or null.
param float[,]
The parameter matrix to update in place.
paramGradient float[,]
The gradient of the loss with respect to param.

Update(Layer?, float[], float[])

Applies one Adam update step to a one-dimensional parameter vector in place, using the corresponding gradient vector.

public override void Update(Layer? layer, float[] param, float[] paramGradient)

Parameters

layer Layer?
The layer that owns the parameters, or null.
param float[]
The parameter vector to update in place.
paramGradient float[]
The gradient of the loss with respect to param.
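
Examples

A usage sketch for the one-dimensional overload, reusing the optimizer from the constructor example above. Passing null for layer is type-valid per the Layer? signature, but whether the implementation accepts it (for example, when keying its per-parameter state) is an assumption:

// Manually apply one update step to a small bias vector.
float[] bias = { 0.10f, -0.20f };
float[] biasGrad = { 0.01f, 0.03f }; // gradient of the loss w.r.t. bias
optimizer.Update(null, bias, biasGrad); // bias is modified in place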