Optimization using Matlab

The organization is influenced by Van Loan, Introduction to Scientific Computing, Prentice Hall, 2000. The goal is a Matlab script that combines Steepest Descent and Newton's method.

Codes needed


Problem dependent: Objective function, gradient, Hessian

function z=objf(X)
%  Call: z=objf(X)
%  Input: X, a column n-vector
%  Output: z, the value of the objective function at X, a scalar
x=X(1);y=X(2);
z=100*(y-x^2)^2 + (1-x)^2;  % Rosenbrock's banana function.
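The banana function has its global minimum at (1,1), where its value is 0, which gives a quick sanity check (a usage sketch, not part of the handout):

objf([1; 1])      % 0, the global minimum
objf([-1.2; 1])   % 24.2 at the classic starting point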


function v=objg(X)
%  Call: v=objg(X)
%  Input: X, a column n-vector
%  Output: v, the gradient of the objective function at X, a column n-vector
x=X(1);y=X(2);
v=[-400*(y-x^2)*x-2+2*x;  % Gradient of Rosenbrock's function
   200*y-200*x^2];


function [gr,H]=gradhess(X)
%  Call: [gr,H]=gradhess(X)
%  Input: X, a column n-vector
%  Output: gr, the gradient at X; H, the Hessian at X
x=X(1);y=X(2);
gr=[-400*(y-x^2)*x-2+2*x;
    200*y-200*x^2];

H=[1200*x^2-400*y+2, -400*x;   % Hessian of Rosenbrock's function
   -400*x,            200];
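Hand-coded derivatives are easy to get wrong, so it is worth comparing them against central differences before use. A verification sketch (the test point and step size h are assumptions, not from the handout):

X0 = [-1.2; 1];  h = 1e-6;  n = length(X0);
[g,H] = gradhess(X0);
gfd = zeros(n,1);  Hfd = zeros(n,n);
for k = 1:n
  e = zeros(n,1); e(k) = 1;
  gfd(k)   = (objf(X0+h*e) - objf(X0-h*e)) / (2*h);  % FD gradient component
  Hfd(:,k) = (objg(X0+h*e) - objg(X0-h*e)) / (2*h);  % FD Hessian column
end
norm(gfd - g), norm(Hfd - H)   % both errors should be of order h^2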

Note: It's very convenient to generate these derivative expressions with Maple.
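If Maple is not at hand, Matlab's Symbolic Math Toolbox can produce the same expressions. A minimal sketch, assuming the toolbox is installed:

syms x y
f = 100*(y - x^2)^2 + (1 - x)^2;
g = gradient(f, [x; y])    % symbolic gradient of the banana function
H = hessian(f, [x, y])     % symbolic Hessian
% matlabFunction(g) would convert the symbolic result into a numeric function.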

General optimization steps

Steepest descent step

function [xnew,fnew,gnew] = SDaskel(xc,fc,gc,Lmax,auto)
% [xnew,fnew,gnew] = SDaskel(xc,fc,gc,Lmax,auto)
% Generates a steepest descent step.
%
%  Arguments:
%    input:
%     xc     a column n-vector, an approximate minimizer
%     fc     f(xc)
%     gc     g(xc)
%   Lmax     max step length
%   auto     0 = manual line search (plot f along the search line),
%            1 = automatic line search by interval halving
%    output:
%   xnew     a column n-vector, an improved minimizer
%   fnew     f(xnew)
%   gnew     g(xnew)
%

nplotvals = 20;

% Get the Steepest Descent Search Direction
sc = -gc;
% Line Search
% Manual search:
if auto==0
  % Sample across [0,L]
  L=Lmax;
  lambdavals = linspace(0,L,nplotvals);
  fvals = zeros(1,nplotvals);
  for k=1:nplotvals
    fvals(k) = objf(xc+lambdavals(k)*sc);
  end
  figure
  plot(lambdavals,fvals)

else
  % Automatic line search by interval halving:
  % halve L (at most 10 times) until xc+L*sc is strictly better than xc.
  L = Lmax;
  Halvings = 0;
  fL = objf(xc+L*sc);
  while (fL>=fc) && (Halvings<=10)
    L = L/2;
    Halvings = Halvings+1;
    fL = objf(xc+L*sc);
   end  
% Sample across [0,L]
   lambdavals = linspace(0,L,nplotvals);
   fvals = zeros(1,nplotvals);
   for k=1:nplotvals
      fvals(k) = objf(xc+lambdavals(k)*sc);
   end
end

% Take the best sampled point along the search direction.
[fnew,i] = min(fvals);
xnew = xc + lambdavals(i)*sc;
gnew = objg(xnew);
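A minimal driver sketch for the steepest descent step alone; the starting point is the classic one, and the tolerance and iteration cap are assumptions:

xc = [-1.2; 1];                          % classic Rosenbrock start
fc = objf(xc);  gc = objg(xc);
for iter = 1:200                         % cap: SD is slow on the banana
  if norm(gc) <= 1e-3, break; end        % tolerance is an assumption
  [xc,fc,gc] = SDaskel(xc,fc,gc,1,1);    % Lmax=1, automatic line search
end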

Newton step

function xuusi=Nminstep(X)
%  Call: xuusi=Nminstep(X)
%  Takes one Newton step towards a minimizer:
%  solves H*h = -g at X and returns the updated point.
[g,H]=gradhess(X);
h=-H\g;      % Newton direction
xuusi=X+h;
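Finally, a sketch of the combined script mentioned at the top: steepest descent makes reliable progress far from the minimizer, and Newton's method takes over near it for fast local convergence. The switching tolerance and step counts are assumptions:

X = [-1.2; 1];
f = objf(X);  g = objg(X);
while norm(g) > 1e-2                % global phase: steepest descent
  [X,f,g] = SDaskel(X,f,g,1,1);
end
for k = 1:10                        % local phase: Newton steps
  X = Nminstep(X);
  if norm(objg(X)) < 1e-10, break; end
end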