
Advanced Optimization



Note: [7:35 - '100' should be 100 instead. The value provided should be an integer and not a character string.]

"Conjugate gradient", "BFGS", and "L-BFGS" are more sophisticated, faster ways to optimize θ that can be used instead of gradient descent. We suggest that you should not write these more sophisticated algorithms yourself (unless you are an expert in numerical computing) but use the libraries instead, as they‘re already tested and highly optimized. Octave provides them.

We first need to provide a function that evaluates the following two functions for a given input value θ:

J(θ)
∂/∂θ_j J(θ)

We can write a single function that returns both of these:

function [jVal, gradient] = costFunction(theta)
  jVal = [...code to compute J(theta)...];
  gradient = [...code to compute derivative of J(theta)...];
end
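For instance, here is a minimal sketch of our own (not taken verbatim from the notes) using the toy objective J(θ) = (θ_1 - 5)^2 + (θ_2 - 5)^2, whose minimum is at θ_1 = θ_2 = 5:

function [jVal, gradient] = costFunction(theta)
  % Toy objective (illustrative only): J(theta) = (theta(1)-5)^2 + (theta(2)-5)^2
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
  % Partial derivatives of J with respect to theta(1) and theta(2)
  gradient = zeros(2, 1);
  gradient(1) = 2 * (theta(1) - 5);
  gradient(2) = 2 * (theta(2) - 5);
end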

Then we can use Octave's "fminunc()" optimization algorithm along with the "optimset()" function, which creates an object containing the options we want to send to "fminunc()". (Note: the value for MaxIter should be an integer, not a character string - errata in the video at 7:30.)

options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2,1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);

We give the function "fminunc()" our cost function, our initial vector of theta values, and the "options" object that we created beforehand.
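As a rough check, assuming the toy costFunction sketched above, the returned values can be inspected directly; in Octave's fminunc, an exit flag of 1 indicates the algorithm converged to a solution:

optTheta      % optimized parameters; roughly [5; 5] for the toy objective
functionVal   % cost J(optTheta); close to 0 for the toy objective
exitFlag      % 1 means fminunc converged to a solution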


Original source: http://www.cnblogs.com/ne-zha/p/7383091.html
