Adam optimizer

The Adam optimizer is a popular algorithm for stochastic gradient-based optimization in machine learning. The name stands for Adaptive Moment Estimation; it combines ideas from two other optimizers, AdaGrad and RMSProp, keeping an exponentially decaying average of past gradients (first moment) and of past squared gradients (second moment) to give each parameter its own adaptive step size.
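
To make the update concrete, here is a minimal sketch of a single Adam step in NumPy. The function name adam_step and the hyperparameter defaults follow the commonly published values (beta1 = 0.9, beta2 = 0.999, eps = 1e-8), but everything here is illustrative rather than code from these notes.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v are the running first/second moment
    estimates, and t is the 1-based step counter used for bias correction."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: decaying mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: decaying mean of squared gradients
    m_hat = m / (1 - beta1 ** t)              # correct the zero-initialization bias
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return param, m, v

# Illustrative usage: minimize f(w) = w**2 starting from w = 5.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
print(w)  # close to 0
```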

(figure: Adam compared to alternative optimizers)

Gradient descent
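
Since the figure did not survive, a minimal sketch of vanilla gradient descent on a toy objective may help; the function f(x) = (x - 3)**2, the starting point, and the learning rate are all illustrative choices, not from these notes.

```python
def grad(x):
    return 2.0 * (x - 3.0)  # derivative of f(x) = (x - 3)**2

x = 0.0    # starting point
lr = 0.1   # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)  # move against the gradient

print(x)  # approaches the minimizer x = 3
```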

VAE


A variational autoencoder (VAE) consists of an encoder, a decoder, and a loss function. The encoder is a neural network that maps an input to the parameters of a distribution over a latent variable; the decoder is another neural network that maps a latent sample back to a reconstruction of the input.
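
Here is a minimal sketch of that encoder/decoder/loss structure in PyTorch, assuming simple MLP networks, a Gaussian latent with a standard-normal prior, and MNIST-sized binary inputs; the class name, layer sizes, and loss choices are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Encoder maps x to (mu, log_var) of a Gaussian over latent z;
    decoder maps a sampled z back to a reconstruction of x."""
    def __init__(self, in_dim=784, hidden=400, latent=20):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, latent)
        self.log_var = nn.Linear(hidden, latent)
        self.dec1 = nn.Linear(latent, hidden)
        self.dec2 = nn.Linear(hidden, in_dim)

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, log_var = self.mu(h), self.log_var(h)
        std = torch.exp(0.5 * log_var)
        z = mu + std * torch.randn_like(std)   # reparameterization trick
        recon = torch.sigmoid(self.dec2(F.relu(self.dec1(z))))
        return recon, mu, log_var

def vae_loss(recon, x, mu, log_var):
    # reconstruction term plus KL divergence to the standard-normal prior
    bce = F.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
    return bce + kld
```

The loss is the negative evidence lower bound: the reconstruction term rewards faithful decoding, while the KL term keeps the encoder's latent distribution close to the prior.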

U-Net

ResNet-18