Class GradientDescentMomentumOptimizer

Namespace: NeuralNetworks.Optimizers
Assembly: NeuralNetworks.dll

Implements the classic stochastic gradient descent (SGD) optimizer with momentum for neural network training. Updates parameters by applying momentum to the previous update and subtracting the gradient scaled by the specified learning rate.

public class GradientDescentMomentumOptimizer : Optimizer
Inheritance
object → Optimizer → GradientDescentMomentumOptimizer

Remarks

This optimizer supports parameter updates for 1D, 2D, and 4D float arrays. The momentum term helps accelerate gradients in the relevant direction and dampens oscillations.
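The rule can be sketched in a few lines. The following is a minimal illustration of momentum SGD as described above, not the class's actual implementation; the method name MomentumStep and the caller-managed velocity buffer v are assumptions:

```csharp
// Minimal sketch of the momentum SGD rule described above; the method name
// and caller-managed velocity buffer v are assumptions for illustration.
static void MomentumStep(float[] param, float[] grad, float[] v, float lr, float mu)
{
    for (int i = 0; i < param.Length; i++)
    {
        v[i] = mu * v[i] - lr * grad[i]; // momentum on the previous update, minus the scaled gradient
        param[i] += v[i];                // apply the accumulated step in place
    }
}
```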

Constructors

GradientDescentMomentumOptimizer(LearningRate, float)

Initializes a new instance of the GradientDescentMomentumOptimizer class with the specified learning rate and momentum coefficient.

public GradientDescentMomentumOptimizer(LearningRate learningRate, float momentum)

Parameters

learningRate LearningRate
The learning rate used to scale gradient updates.
momentum float
The momentum coefficient applied to the previous update.

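Examples

A hedged construction example; the LearningRate constructor shown here is an assumption, so consult that type's documentation for its actual API:

```csharp
using NeuralNetworks.Optimizers;

// Assumed: LearningRate accepts a fixed initial rate in its constructor.
var optimizer = new GradientDescentMomentumOptimizer(
    learningRate: new LearningRate(0.01f),
    momentum: 0.9f); // a commonly used momentum coefficient
```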

Methods

ToString()

Returns a string that represents the current object.

public override string ToString()

Returns

string

A string that represents the current object.

Update(Layer?, float[,,,], float[,,,])

Updates a 4D parameter array in place by applying momentum to the previous update and subtracting the gradient scaled by the learning rate.

public override void Update(Layer? layer, float[,,,] param, float[,,,] paramGradient)

Parameters

layer Layer
The layer whose parameters are being updated; may be null.
param float[,,,]
The parameter values to update in place.
paramGradient float[,,,]
The gradient corresponding to param.
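Examples

A usage sketch for this overload, reusing the optimizer from the constructor example above; the tensor shape and the null layer argument are illustrative assumptions to verify against the library:

```csharp
// Illustrative call on a 4D parameter tensor (shape chosen arbitrarily,
// e.g. filters x channels x height x width). Passing null for layer follows
// the nullable signature; whether that is valid at runtime is an assumption.
var kernel   = new float[16, 3, 3, 3];
var gradient = new float[16, 3, 3, 3]; // must match the parameter shape
optimizer.Update(null, kernel, gradient); // updates kernel in place
```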

Update(Layer?, float[,], float[,])

Updates a 2D parameter array in place by applying momentum to the previous update and subtracting the gradient scaled by the learning rate.

public override void Update(Layer? layer, float[,] param, float[,] paramGradient)

Parameters

layer Layer
The layer whose parameters are being updated; may be null.
param float[,]
The parameter values to update in place.
paramGradient float[,]
The gradient corresponding to param.
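Examples

A sketch of calling this overload once per training step on a dense layer's weight matrix, again reusing optimizer from the constructor example; the shapes and the assumption that the momentum state persists across calls are illustrative:

```csharp
// Illustrative per-step update of a 2D weight matrix. In real code,
// backpropagation would populate grads each step; it is left zeroed here.
var weights = new float[784, 10];
var grads   = new float[784, 10];
for (int step = 0; step < 100; step++)
{
    // ... forward pass and backpropagation fill grads here ...
    optimizer.Update(null, weights, grads); // assumed: momentum state persists across calls
}
```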

Update(Layer?, float[], float[])

Updates a 1D parameter array in place by applying momentum to the previous update and subtracting the gradient scaled by the learning rate.

public override void Update(Layer? layer, float[] param, float[] paramGradient)

Parameters

layer Layer
The layer whose parameters are being updated; may be null.
param float[]
The parameter values to update in place.
paramGradient float[]
The gradient corresponding to param.
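Examples

To make the momentum effect concrete, the update rule sketched in the Remarks can be traced by hand for a single parameter with a constant gradient of 1.0, learning rate 0.1, and momentum 0.9 (this traces the arithmetic directly rather than calling the class):

```csharp
// Hand-traced momentum steps: v = mu * v - lr * g; p += v
// (matches the sketch in the class Remarks; not the class's own code).
float p = 0f, v = 0f;
const float lr = 0.1f, mu = 0.9f, g = 1.0f; // constant gradient for clarity

v = mu * v - lr * g; p += v; // step 1: v = -0.10, p = -0.10
v = mu * v - lr * g; p += v; // step 2: v = -0.19, p = -0.29 (larger step: momentum accumulated)
```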