
Alec Radford's animations for optimization algorithms

Alec Radford has created some great animations comparing optimization algorithms SGD, Momentum, NAG, Adagrad, Adadelta, RMSprop (unfortunately no Adam) on low-dimensional problems. Also check out his presentation on RNNs.

"Noisy moons: This is logistic regression on noisy moons dataset from sklearn which shows the smoothing effects of momentum based techniques (which also results in over shooting and correction). The error surface is visualized as an average over the whole dataset empirically, but the trajectories show the dynamics of minibatches on noisy data. The bottom chart is an accuracy plot."
[Animation: noisy moons]
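
To make the momentum dynamics concrete, here is a minimal sketch (not Radford's code; the minibatch size, learning rate lr, and momentum coefficient beta are illustrative assumptions) of minibatch logistic regression on sklearn's noisy moons with a classical momentum update:

```python
import numpy as np
from sklearn.datasets import make_moons

X, y = make_moons(n_samples=500, noise=0.3, random_state=0)
w, b = np.zeros(2), 0.0
vw, vb = np.zeros(2), 0.0                    # momentum buffers
lr, beta = 0.1, 0.9                          # illustrative hyperparameters

for step in range(1000):
    idx = np.random.randint(0, len(X), 32)   # a noisy minibatch
    xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-(xb @ w + b)))  # sigmoid probabilities
    gw = xb.T @ (p - yb) / len(xb)           # log-loss gradient wrt w
    gb = np.mean(p - yb)                     # log-loss gradient wrt b
    vw = beta * vw + gw                      # the buffer smooths noisy
    vb = beta * vb + gb                      # gradients across batches...
    w -= lr * vw                             # ...but can also overshoot
    b -= lr * vb                             # and then correct
```

Plain SGD is the beta = 0 special case, which is what makes the smoothing (and the overshoot-and-correct behavior) in the animation attributable to the velocity buffer.
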
"Beale‘s function: Due to the large initial gradient, velocity based techniques shoot off and bounce around - adagrad almost goes unstable for the same reason. Algos that scale gradients/step sizes like adadelta and RMSProp proceed more like accelerated SGD and handle large gradients with more stability."
[Animation: Beale's function]
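
For reference, Beale's function is f(x, y) = (1.5 - x + xy)^2 + (2.25 - x + xy^2)^2 + (2.625 - x + xy^3)^2, with its minimum at (3, 0.5). Below is a minimal sketch of the RMSprop update on it; the starting point and hyperparameters are illustrative assumptions, not values from the animation:

```python
import numpy as np

i = np.array([1.0, 2.0, 3.0])
c = np.array([1.5, 2.25, 2.625])

def beale_grad(x, y):
    r = c - x + x * y ** i                        # the three residual terms
    return np.array([np.sum(2 * r * (y ** i - 1)),
                     np.sum(2 * r * x * i * y ** (i - 1))])

theta = np.array([1.0, 1.5])                      # illustrative steep start
cache = np.zeros(2)
lr, decay, eps = 0.01, 0.9, 1e-8

for step in range(2000):
    g = beale_grad(*theta)
    cache = decay * cache + (1 - decay) * g ** 2  # running average of g^2
    theta -= lr * g / (np.sqrt(cache) + eps)      # per-coordinate scaling
print(theta)                                      # should head toward (3, 0.5)
```

Dividing by the running root-mean-square caps every step at roughly lr per coordinate, which is why RMSProp and adadelta stay stable where raw-velocity methods get launched by the large initial gradient.
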
"Long valley: Algos without scaling based on gradient information really struggle to break symmetry here - SGD gets no where and Nesterov Accelerated Gradient / Momentum exhibits oscillations until they build up velocity in the optimization direction. Algos that scale step size based on the gradient quickly break symmetry and begin descent."
[Animation: long valley]
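
The symmetry-breaking effect is easy to reproduce with Adagrad. The quadratic below is a hypothetical stand-in for the long valley (not the surface from the animation), with the x-direction nearly flat; lr is illustrative:

```python
import numpy as np

def grad(theta):                        # f(x, y) = 0.01 * x**2 + 2 * y**2
    x, y = theta
    return np.array([0.02 * x, 4.0 * y])

theta = np.array([-4.0, 0.001])         # start deep in the flat direction
hist = np.zeros(2)                      # accumulated squared gradients
lr, eps = 0.2, 1e-8

for step in range(1000):
    g = grad(theta)
    hist += g ** 2                      # grows monotonically per coordinate
    theta -= lr * g / (np.sqrt(hist) + eps)
```

Because each coordinate is divided by the root of its own gradient history, the nearly flat x-direction still gets a step of roughly lr even though its raw gradient is tiny, whereas plain SGD would inch along at lr * 0.02 * x.
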
"Saddle point: Behavior around a saddle point. NAG/Momentum again like to explore around, almost taking a different path. Adadelta/Adagrad/RMSProp proceed like accelerated SGD."
[Animation: saddle point]
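
A minimal sketch of the NAG update near a saddle, using f(x, y) = x^2 - y^2 as a hypothetical stand-in for the surface in the animation (lr and mu are illustrative):

```python
import numpy as np

def grad(theta):                    # f(x, y) = x**2 - y**2
    x, y = theta
    return np.array([2.0 * x, -2.0 * y])

theta = np.array([-1.0, 1e-4])      # starts almost exactly on the ridge
v = np.zeros(2)
lr, mu = 0.05, 0.9

for step in range(100):
    g = grad(theta + mu * v)        # gradient at the look-ahead point
    v = mu * v - lr * g             # velocity corrected before the move
    theta += v                      # x decays; y builds velocity and
                                    # escapes along the descent direction
```

The look-ahead evaluation is what distinguishes NAG from classical momentum: the velocity is corrected by the gradient at the point it is about to reach, which damps the wandering somewhat, though both still "explore around" compared to the gradient-scaled methods.
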
 
from: http://www.denizyuret.com/2015/03/alec-radfords-animations-for.html

Original post: http://www.cnblogs.com/GarfieldEr007/p/5328618.html
