Table of Contents

Namespace NeuralNetworks.Optimizers

Classes

AdamOptimizer

Implements the Adam optimizer for neural network training. Adam combines momentum and adaptive learning rates for each parameter.
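As a rough illustration of that update rule, here is a minimal, self-contained sketch of an Adam step. The class, field, and method names below (AdamSketch, Step, _beta1, and so on) are illustrative assumptions, not the documented AdamOptimizer API.

```csharp
using System;

// Hypothetical sketch of the Adam update rule; names are illustrative only.
public sealed class AdamSketch
{
    private readonly double _learningRate, _beta1, _beta2, _epsilon;
    private readonly double[] _m, _v;   // first and second moment estimates
    private int _t;                     // time step used for bias correction

    public AdamSketch(int parameterCount, double learningRate = 0.001,
                      double beta1 = 0.9, double beta2 = 0.999, double epsilon = 1e-8)
    {
        _learningRate = learningRate;
        _beta1 = beta1;
        _beta2 = beta2;
        _epsilon = epsilon;
        _m = new double[parameterCount];
        _v = new double[parameterCount];
    }

    public void Step(double[] parameters, double[] gradients)
    {
        _t++;
        for (int i = 0; i < parameters.Length; i++)
        {
            // Momentum (first moment) and adaptive scale (second moment).
            _m[i] = _beta1 * _m[i] + (1 - _beta1) * gradients[i];
            _v[i] = _beta2 * _v[i] + (1 - _beta2) * gradients[i] * gradients[i];

            // Bias-corrected estimates.
            double mHat = _m[i] / (1 - Math.Pow(_beta1, _t));
            double vHat = _v[i] / (1 - Math.Pow(_beta2, _t));

            // Per-parameter adaptive update.
            parameters[i] -= _learningRate * mHat / (Math.Sqrt(vHat) + _epsilon);
        }
    }
}
```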

GradientDescentMomentumOptimizer

Implements the classic "Stochastic" Gradient Descent (SGD) optimizer with momentum for neural network training. Updates parameters by applying momentum to the previous update and subtracting the scaled gradient using a specified learning rate.

GradientDescentOptimizer

Implements the classic "Stochastic" Gradient Descent (SGD) optimizer for neural network training. Updates parameters by subtracting the scaled gradient using a specified learning rate.

Optimizer

Base class for a neural network optimizer.
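One plausible shape for such a base class, shown only as a sketch under the assumption that concrete optimizers expose a per-step update method; the actual Optimizer abstraction and its members may differ.

```csharp
// Hypothetical sketch of an optimizer base class; not the documented Optimizer API.
public abstract class OptimizerSketch
{
    // Applies one parameter update, given the current gradients.
    public abstract void Step(double[] parameters, double[] gradients);
}
```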