
Material 5: Optimization for Deep Learning


This slide deck covers a review of the backpropagation algorithm, gradient descent, the basic algorithm stochastic gradient descent (SGD), gradient descent with momentum, root mean squared propagation (RMSProp), and adaptive moment estimation (Adam).
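As a quick reference for the optimizers listed above, below is a minimal NumPy sketch of their update rules applied to a toy quadratic loss. The hyperparameter values (lr, beta, beta1, beta2, eps) and the toy loss are illustrative assumptions, not taken from the slides.

# Minimal sketch of the update rules for SGD, momentum, RMSProp, and Adam,
# applied to the toy loss f(w) = 0.5 * ||w||^2 (assumed for illustration).
import numpy as np

def grad(w):
    # Gradient of f(w) = 0.5 * ||w||^2 is simply w.
    return w

def sgd(w, lr=0.1, steps=100):
    # Plain (stochastic) gradient descent: w <- w - lr * g
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def momentum(w, lr=0.1, beta=0.9, steps=100):
    # Gradient descent with momentum: v <- beta * v + g; w <- w - lr * v
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + grad(w)
        w = w - lr * v
    return w

def rmsprop(w, lr=0.01, beta=0.9, eps=1e-8, steps=100):
    # RMSProp: running average of squared gradients scales the step size.
    s = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w)
        s = beta * s + (1 - beta) * g**2
        w = w - lr * g / (np.sqrt(s) + eps)
    return w

def adam(w, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8, steps=100):
    # Adam: momentum-style first moment plus RMSProp-style second moment,
    # both bias-corrected before the update.
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

if __name__ == "__main__":
    w0 = np.array([5.0, -3.0])
    for name, opt in [("SGD", sgd), ("Momentum", momentum),
                      ("RMSProp", rmsprop), ("Adam", adam)]:
        print(name, opt(w0.copy()))

All four optimizers should drive the toy parameters toward zero; the differences lie in how each one scales and smooths the gradient step.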

Click the 5_Optimization for Deep Learning.pdf link to view the file.