
Comparison of Multivariate Optimization Methods

Author: Maplesoft (Engineering software solutions from Maplesoft)

This worksheet demonstrates the use of Maple to compare methods for the unconstrained nonlinear minimization of a multivariable function. Seven methods for minimizing an objective function f(x1, x2, ..., xn) of n variables are analyzed:

1) minimum search by coordinate descent and conjugate-direction descent; 2) Powell's method; 3) the modified Hooke-Jeeves method; 4) the Nelder-Mead simplex method; 5) a quasi-gradient method; 6) random directions search; 7) simulated annealing. All of these are direct search methods; that is, they do not require the objective function f(x1, x2, ..., xn) to be differentiable or even continuous. The efficiency of Maple's Optimization package is compared against these implementations. The optimization methods are compared on a set of 21 test functions.
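As an illustration of the kind of comparison the worksheet carries out, the following minimal sketch (not taken from the worksheet itself; the procedure name CoordinateSearch, the Rosenbrock test function, and all parameter values are illustrative assumptions) implements a simple coordinate direct search and then runs Maple's built-in Optimization package on the same problem as a baseline.

# Rosenbrock test function: global minimum 0 at (1, 1)
rosen := (x1, x2) -> (1 - x1)^2 + 100*(x2 - x1^2)^2:

# Bare-bones coordinate direct search (illustrative sketch):
# probe +/- h along each axis, accept improving moves,
# and halve the step when a full sweep brings no improvement.
CoordinateSearch := proc(f, x0::list, h0::numeric, tol::numeric)
    local x, h, n, i, fx, trial, improved;
    x := x0;  h := h0;  n := nops(x);  fx := f(op(x));
    while h > tol do
        improved := false;
        for i from 1 to n do
            trial := subsop(i = x[i] + h, x);
            if f(op(trial)) >= fx then trial := subsop(i = x[i] - h, x); end if;
            if f(op(trial)) < fx then
                x := trial;  fx := f(op(x));  improved := true;
            end if;
        end do;
        if not improved then h := h/2; end if;
    end do;
    return [fx, x];
end proc:

# Direct search from a standard starting point
CoordinateSearch(rosen, [-1.2, 1.0], 0.5, 1.0e-4);

# Baseline for comparison: Maple's Optimization package on the same problem
Optimization:-Minimize(rosen(x1, x2), initialpoint = {x1 = -1.2, x2 = 1.0});

A full comparison of the kind described above would time each method and count objective-function evaluations across all 21 test functions; this snippet only shows the shape of one such experiment.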

Application Details

Publish Date: September 15, 2009
Created In: Maple 13
Language: English

More Like This

Portfolio Optimization under Nonconvex Transaction Costs with the Global Optimization Toolbox

Polynomial multiplication using the fast Fourier transform

Lie algebra package

Fitting vapor pressure data with regression

Grobner Basis Package, original version

Terminal velocity of falling particles

Steady state material balances on a separation train

Binary batch distillation

ASCII padding

Diffusion and reaction in a 1-D slab

Reaction equilibrium for multiple gas phase reactions

Iterative solution to changing problem