A2 Notes
Adam opt
The Adam optimizer is a popular algorithm for stochastic gradient-based optimization in machine learning. The name stands for Adaptive Moment Estimation: it keeps running estimates of the first and second moments of each parameter's gradient, combining ideas from two earlier algorithms, AdaGrad and RMSProp.
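A minimal NumPy sketch of one Adam update step (the function name `adam_step` is illustrative, not from any library; the hyperparameter defaults are the values suggested in the original Adam paper):

```python
import numpy as np

def adam_step(params, grads, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # First moment: exponentially decaying average of gradients (momentum-like).
    m = beta1 * m + (1 - beta1) * grads
    # Second moment: exponentially decaying average of squared gradients (RMSProp-like).
    v = beta2 * v + (1 - beta2) * grads ** 2
    # Bias correction: both averages start at zero, so early steps are rescaled.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Per-parameter update: large recent gradients shrink the effective step size.
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

# Illustrative use: minimize f(x) = x^2 (gradient 2x) starting from x = 3.
x, m, v = np.array([3.0]), np.zeros(1), np.zeros(1)
for t in range(1, 101):
    x, m, v = adam_step(x, 2 * x, m, v, t)
```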
Adam compared to alternatives
GD (gradient descent): the plain update rule; one global learning rate applied to every parameter, with none of Adam's moment estimates or per-parameter adaptation (see the sketch below).
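For contrast with Adam, a vanilla gradient descent step under the same assumptions (`gd_step` is an illustrative name):

```python
import numpy as np

def gd_step(params, grads, lr=0.1):
    # Vanilla gradient descent: one global learning rate for all parameters,
    # no moment estimates and no adaptive scaling.
    return params - lr * grads

# Illustrative use: one step on f(x) = x^2, whose gradient is 2x.
x = np.array([3.0])
x = gd_step(x, 2 * x)  # x moves toward the minimum at 0
```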
VAE
A variational autoencoder (VAE) consists of an encoder, a decoder, and a loss function. The encoder is a neural network that maps an input to a latent representation; the decoder is another neural network that reconstructs the input from that representation.
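A minimal VAE sketch, assuming PyTorch (the note names no framework); the layer sizes default to MNIST-style 784-dimensional inputs and are purely illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, input_dim=784, hidden_dim=400, latent_dim=20):
        super().__init__()
        # Encoder: maps the input to the mean and log-variance of a latent Gaussian.
        self.enc = nn.Linear(input_dim, hidden_dim)
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.logvar = nn.Linear(hidden_dim, latent_dim)
        # Decoder: maps a latent sample back to input space.
        self.dec1 = nn.Linear(latent_dim, hidden_dim)
        self.dec2 = nn.Linear(hidden_dim, input_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # Reparameterization trick: z = mu + sigma * eps, so gradients
        # flow through mu and logvar despite the random sampling.
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)
        return mu + std * eps

    def decode(self, z):
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Loss = reconstruction term + KL divergence to the standard normal prior.
    recon_loss = F.binary_cross_entropy(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl
```

The loss is the two-part objective the note alludes to: the reconstruction term rewards the decoder for reproducing the input, and the KL term keeps the encoder's latent distribution close to a standard normal prior.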