
December 19, 2017. Course 2 (Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization), Week 2 (Optimization algorithms), 2. Programming assignment: Optimization



Optimization

Welcome to the optimization programming assignment of the hyperparameter tuning specialization. There are many different optimization algorithms you could use to reach the minimal cost, just as there are many different paths down a hill to its lowest point.
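As a baseline for the methods in this assignment, plain gradient descent simply steps each parameter against its gradient. A minimal sketch in NumPy, assuming parameters and gradients live in dicts with hypothetical keys like `"W1"`/`"dW1"` (not the assignment's graded signature):

```python
import numpy as np

def gradient_descent_step(params, grads, learning_rate=0.01):
    """One plain gradient-descent update: theta = theta - lr * d_theta.
    `params` and `grads` are dicts keyed like {"W1": ..., "b1": ...} /
    {"dW1": ..., "db1": ...} (illustrative names)."""
    return {k: params[k] - learning_rate * grads["d" + k] for k in params}

# Tiny usage example on made-up values
params = {"W1": np.array([[1.0, 2.0]]), "b1": np.array([0.5])}
grads = {"dW1": np.array([[0.1, 0.2]]), "db1": np.array([0.05])}
new = gradient_descent_step(params, grads, learning_rate=0.1)
```

With a learning rate of 0.1, each entry moves by 0.1 times its gradient, so `W1` becomes `[[0.99, 1.98]]`.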

[Figure: different optimization paths down a cost surface toward its minimum]

By completing this assignment you will:

- Understand the intuition behind Adam and RMSProp

- Recognize the importance of mini-batch gradient descent

- Learn the effects of momentum on the overall performance of your model
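The momentum and Adam updates named in the goals above can be sketched as follows. This is a minimal single-parameter sketch, not the assignment's graded functions; the names and hyperparameter defaults are illustrative:

```python
import numpy as np

def momentum_step(w, dw, v, lr=0.01, beta=0.9):
    """Momentum: keep an exponentially weighted average v of past
    gradients and step along it instead of the raw gradient."""
    v = beta * v + (1 - beta) * dw
    return w - lr * v, v

def adam_step(w, dw, v, s, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: a momentum-style first moment plus an RMSProp-style second
    moment, both bias-corrected by the step count t (starting at 1)."""
    v = beta1 * v + (1 - beta1) * dw          # first moment (mean of grads)
    s = beta2 * s + (1 - beta2) * dw ** 2     # second moment (mean of squared grads)
    v_hat = v / (1 - beta1 ** t)              # bias correction for early steps
    s_hat = s / (1 - beta2 ** t)
    return w - lr * v_hat / (np.sqrt(s_hat) + eps), v, s

# Usage: one momentum step, then a few Adam steps on f(w) = w^2
w_m, v_m = momentum_step(np.array([1.0]), np.array([0.5]), np.zeros(1),
                         lr=0.1, beta=0.9)
w, v, s = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 4):
    dw = 2 * w                                # gradient of w^2
    w, v, s = adam_step(w, dw, v, s, t, lr=0.1)
```

Note how Adam's bias-corrected step size is roughly the learning rate itself early on, regardless of the gradient's scale, which is part of why it tends to work well with little tuning.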

This assignment prepares you well for the upcoming assignment. Take your time to complete it and make sure you get the expected outputs when working through the different exercises. In some code blocks, you will find a "#GRADED FUNCTION: functionName" comment. Please do not modify it. After you are done, submit your work and check your results. You need to score 80% to pass. Good luck :) !
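The mini-batch idea from the goals above amounts to shuffling the training examples and slicing them into fixed-size chunks. A sketch under the course's columns-as-examples convention (X of shape (n_x, m), Y of shape (1, m)); the helper name and seeding here are assumptions, not the graded interface:

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the columns of (X, Y) together, then slice into batches
    of `batch_size` examples; the last batch may be smaller."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]                      # number of examples
    perm = rng.permutation(m)           # same shuffle for X and Y
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    return [(X_shuf[:, k:k + batch_size], Y_shuf[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

# 10 examples in batches of 4 -> batch sizes 4, 4, 2
batches = random_mini_batches(np.arange(20).reshape(2, 10),
                              np.arange(10).reshape(1, 10), batch_size=4)
```

Every example lands in exactly one batch per epoch; only the order changes between epochs.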

-----------------------------------------------------------------------------------------------------------------

 


Original article: http://www.cnblogs.com/hezhiyao/p/8064667.html
