Housekeeping
close all; clear; clc;
Setup
x0 = [1.5; 3.0]; % Initial Guess
H0 = eye(2); % initial approximation of the inverse Hessian
iter = 0; % iteration counter
itermax = 1000; % maximum number of allowable iterations
tol = 1e-3; % tolerance for stopping iterations
xresults = [x0]; % array for holding x iterations
Functions for Evaluation
Before we begin the DFP search, we have to ready a few functions.
First, we generate a symbolic expression for our objective so that we can use MATLAB's Symbolic Math Toolbox to derive a symbolic gradient.
Second, we define an ordinary function handle that evaluates the Griewank function given a vector of x values.
Third, we substitute numeric values into the symbolic gradient to evaluate the gradient at a point.
syms x y % symbolic variables for x(1) and x(2)
griewanksym = 1 + (1/4000)*x^2 + (1/4000)*y^2 - cos(x)*cos((sqrt(2)/2)*y); % symbolic expression for the Griewank function (diff requires a symbolic expression, not a function handle)
gradientsym = [diff(griewanksym,x); diff(griewanksym,y)]; % symbolic expression for the Griewank gradient
griewank = @(x) 1 + (1/4000)*x(1)^2 + (1/4000)*x(2)^2 - cos(x(1))*cos((sqrt(2)/2)*x(2)); % regular function for evaluating griewank function
zresults = [griewank(x0)]; % array for holding z iterations
grad = double(subs(gradientsym, [x y], [x0(1) x0(2)])); % initial gradient evaluation
d = -H0*grad; % initial search direction; with H0 = eye(2) this is the steepest-descent direction
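As an aside, repeatedly evaluating the gradient with double(subs(...)) is slow, because the symbolic expression is re-substituted on every iteration. A minimal sketch of an alternative is to compile the symbolic gradient into a numeric handle once with matlabFunction (gradfun is a name introduced here for illustration, not part of the original script):
gradfun = matlabFunction(gradientsym, 'Vars', [x y]); % compile the symbolic gradient once
grad = gradfun(x0(1), x0(2)); % numeric gradient, equivalent to double(subs(...)) above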
Primary Loop
The general algorithm of the DFP (Davidon-Fletcher-Powell) quasi-Newton method is as follows:
1. Calculate the target function's gradient at the current point. If the magnitude of the gradient is below the tolerance, stop.
2. Compute the search direction d = -H*grad, where H is the current approximation of the inverse Hessian (initially the identity, so the first step is a steepest-descent step).
3. Step along d until the function value starts to increase (bracketing), giving an interval that contains a minimizer along that line.
4. Apply golden-section search to the bracketed interval to locate the line minimum; this becomes the new point.
5. Update H with the DFP rank-two update shown below, and return to step 1.
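The inverse-Hessian update computed in the loop below (Hpart1 and Hpart2) is the standard DFP rank-two update:

$$H_{k+1} = H_k + \frac{\Delta x\,\Delta x^{\top}}{\Delta g^{\top}\Delta x} - \frac{(H_k\,\Delta g)(H_k\,\Delta g)^{\top}}{\Delta g^{\top} H_k\,\Delta g}, \qquad \Delta x = x_{k+1}-x_k, \quad \Delta g = \nabla f(x_{k+1}) - \nabla f(x_k)$$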
while (norm(grad,2) >= tol) && iter<itermax % Loop until the gradient is small enough or we've exceeded our iterations
d = -H0*grad; % calculating new direction vector
b = bracket(x0, d, griewank, 0.01); % bracket a minimizer along d (see the sketch after the loop)
xnew = goldensearch(b(:,1), b(:,2), griewank, 3); % locate the line minimum with golden-section search
gradnew = double(subs(gradientsym, [x y], [xnew(1) xnew(2)])); % calculating new gradient value
dx = xnew - x0; % step taken in x (Delta x)
dg = gradnew - grad; % change in the gradient (Delta g)
x0 = xnew; % accept the new point
grad = gradnew; % accept the new gradient
Hpart1 = (dx*dx')/(dg'*dx); % first DFP correction term
Hpart2 = ((H0*dg)*(H0*dg)')/(dg'*H0*dg); % second DFP correction term
H0 = H0 + Hpart1 - Hpart2; % DFP update of the inverse-Hessian approximation
xresults = [xresults x0]; % storing the results
zresults = [zresults griewank(x0)]; % storing the results
iter = iter + 1;
end
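The helper functions bracket and goldensearch are called above but their definitions are not part of this script. A minimal sketch of what bracket might look like, assuming it expands a step along d until the function value rises and returns the two bracketing points as columns (a hypothetical reconstruction, not the original implementation):
function b = bracket(x0, d, f, step)
% Forward-bracketing sketch (hypothetical): expand along d until f rises.
aprev = x0;                  % point before the current best
a = x0; fa = f(a);           % current best point and its value
t = step;                    % current step length
c = x0 + t*d; fc = f(c);     % trial point one step ahead
while fc < fa                % keep going while f still decreases
    aprev = a;               % remember the previous point
    a = c; fa = fc;          % advance the current best
    t = 2*t;                 % double the step length
    c = c + t*d; fc = f(c);  % take the next, larger step
end
b = [aprev c];               % columns b(:,1), b(:,2) bracket a minimizer
end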
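Similarly, a sketch of goldensearch, assuming its fourth argument is the number of interval-reduction iterations (the original's meaning is not shown here):
function xmin = goldensearch(a, b, f, n)
% Golden-section search sketch (hypothetical) on the segment from a to b.
tau = (sqrt(5) - 1)/2;           % inverse golden ratio, ~0.618
x1 = b - tau*(b - a);            % interior point closer to a
x2 = a + tau*(b - a);            % interior point closer to b
f1 = f(x1); f2 = f(x2);
for k = 1:n                      % shrink the interval n times
    if f1 > f2                   % minimizer lies in [x1, b]
        a = x1;
        x1 = x2; f1 = f2;        % reuse the surviving interior point
        x2 = a + tau*(b - a);
        f2 = f(x2);
    else                         % minimizer lies in [a, x2]
        b = x2;
        x2 = x1; f2 = f1;        % reuse the surviving interior point
        x1 = b - tau*(b - a);
        f1 = f(x1);
    end
end
xmin = (a + b)/2;                % midpoint of the final interval
end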
Plotting Results
X = xresults(1,:); Y = xresults(2,:); Z = zresults; % separating variables for plotting
[XX,YY] = meshgrid(linspace(0,6,20),linspace(0,9,20)); % creating mesh for contour plot
ZZ = 1 + (1/4000).*XX.*XX + (1/4000).*YY.*YY - cos(XX).*cos((sqrt(2)/2).*YY); % calculating z-values for contour plot
f = figure(); f.Position = [0 0 800 800]; hold on;
contourf(XX,YY,ZZ, 50, 'LineColor', 'none'); axis square; % filled contour plot of the target function
%s1 = plot(X,Y, '.', 'Color', 'k', 'MarkerSize', 20); axis square; % plotting markers
s2 = plot(X,Y, 'Color', 'k', 'LineWidth', 2); axis square; % joining markers
title('DFP [1.5; 3.0]'); xlabel('x-axis'); ylabel('y-axis'); % axis labels
hold off