HESSIAN - Maple Help


codegen

  

HESSIAN

  

compute the Hessian matrix of a Maple procedure

 

Calling Sequence

HESSIAN(F)

HESSIAN(F, X)

HESSIAN(F, X, ...)

Parameters

F - Maple procedure

X - list of symbols (parameters of F)

Description

• 

The first argument F is a Maple procedure that computes a function of x1, x2, ..., xn.  The HESSIAN command outputs a new procedure H which, when executed at given values of x1, x2, ..., xn, returns the matrix of second partial derivatives of F w.r.t. x1, ..., xn at those values.  For example, given

F := proc(x, y) local t; t := exp(-x); y*t + t end proc

  

The output of H := HESSIAN(F); is the procedure

H := proc(x, y) local grd1, grd2, df, grd, df1, t, dfr0;

    t := exp(-x);

    df1 := y + 1;

    grd1 := - t*df1;

    grd2 := t;

    df := array(1 .. 4);

    dfr0 := array(1 .. 4);

    df[3] := 1;

    df[2] := - df[3]*t;

    df[1] := - df[3]*df1;

    dfr0[4] := 1;

    dfr0[1] := dfr0[4];

    grd := array(1 .. 2, 1 .. 2);

    grd[1, 1] := - df[1]*exp(-x);

    grd[1, 2] := df[2];

    grd[2, 1] := - dfr0[1]*exp(-x);

    grd[2, 2] := 0;

    return grd

end proc

  

The procedure H can be further optimized with optimize(H). When H is called with the inputs 1.0, 1.0, it returns the matrix

[  0.7357588824  -0.3678794412 ]
[ -0.3678794412   0            ]

• 

The code in H is constructed by applying the GRADIENT command to F twice.  The GRADIENT command uses automatic differentiation, which often leads to a more efficient computation than the symbolic differentiation you would obtain from linalg[hessian].  See codegen[GRADIENT] for further details on automatic differentiation.  The remaining arguments to HESSIAN are optional; they are described below.
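For comparison, the symbolic Hessian of the same example function can be computed directly with linalg[hessian] (a sketch; the expression below is the body of the F defined above):

```
with(linalg):
# symbolic 2 x 2 Hessian of y*exp(-x) + exp(-x) w.r.t. x and y
hessian(y*exp(-x) + exp(-x), [x, y]);
```

Unlike HESSIAN, this returns a matrix of symbolic expressions rather than a procedure that evaluates the entries numerically.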

• 

By default, HESSIAN computes the partial derivatives of F w.r.t. all the parameters present in F.  The optional argument X, a list of symbols, may be used to specify which parameters to take the derivative w.r.t.
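For instance, the F defined above can be differentiated w.r.t. x alone (a sketch; the resulting procedure returns a 1 x 1 matrix containing only the second derivative w.r.t. x):

```
F := proc(x, y) local t; t := exp(-x); y*t + t end proc:
Hx := HESSIAN(F, [x]);
```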

• 

Two algorithms are supported, the so-called forward and reverse modes. By default, HESSIAN tries to use the reverse mode since it usually leads to a more efficient code.  If it is unable to use the reverse mode, the forward mode is used.  The user may specify which algorithm is to be used by giving the optional argument mode=forward or mode=reverse.
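For example, either mode can be requested explicitly (a sketch using the F defined above):

```
Hf := HESSIAN(F, mode=forward);   # force the forward mode
Hr := HESSIAN(F, mode=reverse);   # force the reverse mode
```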

• 

The matrix of partial derivatives is, by default, returned as an array. The optional argument result_type=list, result_type=array, or result_type=seq specifies that the matrix of derivatives returned by H is to be a Maple list, array, or sequence, respectively.
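For example (a sketch using the F defined above):

```
Hl := HESSIAN(F, result_type=list);   # the returned procedure yields a Maple list
Hs := HESSIAN(F, result_type=seq);    # the returned procedure yields a sequence
```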

• 

The command with(codegen,HESSIAN) allows the use of the abbreviated form of this command.
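For example, the short and long forms of the call compute the same result:

```
with(codegen, HESSIAN):        # enable the abbreviated form
H := HESSIAN(F);               # short form
H := codegen[HESSIAN](F);      # long form, usable without with()
```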

Examples

with(codegen):

F := proc(x,y) local t; t := x*y; x+t-y*t; end proc;

F := proc(x, y) local t; t := y*x; x + t - y*t end proc

(1)

F(x, y)

-y^2*x + y*x + x

(2)

H := HESSIAN(F)

H := proc(x, y)
local df, df1, dfr0, grd, grd1, grd2, t1;
    df1 := -y + 1;
    t1 := df1;
    grd1 := t1*y + 1;
    grd2 := t1*x - y*x;
    df := array(1 .. 4);
    dfr0 := array(1 .. 4);
    df[3] := 1;
    df[2] := df[3]*y;
    df[1] := df[2];
    dfr0[4] := 1;
    dfr0[2] := dfr0[4]*x;
    dfr0[1] := dfr0[2];
    grd := array(1 .. 2, 1 .. 2);
    grd[1, 1] := 0;
    grd[1, 2] := t1*df[3] - df[1];
    grd[2, 1] := dfr0[4]*t1 - y;
    grd[2, 2] := -dfr0[4]*x - dfr0[1];
    return grd
end proc

(3)

optimize(H)

proc(x, y)
local df1, grd;
    df1 := -y + 1;
    grd := array(1 .. 2, 1 .. 2);
    grd[1, 1] := 0;
    grd[1, 2] := df1 - y;
    grd[2, 1] := grd[1, 2];
    grd[2, 2] := -2*x;
    grd
end proc

(4)

H1 := H(x, y)

H1 := grd

(5)

eval(H1)

[  0         -2*y + 1 ]
[ -2*y + 1   -2*x     ]

(6)

HESSIAN(F, mode=forward, result_type=seq)

proc(x, y)
local dt1, t1;
    dt1 := array(1 .. 2);
    dt1[1] := 0;
    dt1[2] := -1;
    t1 := -y + 1;
    return y*dt1[1], y*dt1[2] + t1, x*dt1[1] + t1 - y, x*dt1[2] - x
end proc

(7)

In this example we compute the Hessian of the torus procedure w.r.t. all four of its parameters.  Since the torus program returns a vector of three values, the result gains an extra dimension of size 3: one Hessian per component.

torus  := proc(phi,omega,R,r) local x,y,z;
     x := cos(phi)*(R+r*cos(omega));
     y := sin(phi)*(R+r*cos(omega));
     z := r*sin(omega);
     [x,y,z]
end proc:

torus := optimize(torus)

torus := proc(phi, omega, R, r)
local t1, t2, t4, t5, t6;
    t1 := cos(phi);
    t2 := cos(omega);
    t4 := r*t2 + R;
    t5 := sin(phi);
    t6 := sin(omega);
    [t4*t1, t5*t4, t6*r]
end proc

(8)

optimize(HESSIAN(torus, result_type=list))

proc(phi, omega, R, r)
local df, df2, dfr0, dfr3, dfr4, t1, t2, t30, t4, t42, t5, t6, t7, t9;
    t1 := cos(phi);
    t2 := cos(omega);
    t4 := t2*r + R;
    t5 := sin(phi);
    t6 := sin(omega);
    df2 := t1*r;
    df := array(1 .. 8);
    dfr0 := array(1 .. 8);
    dfr3 := array(1 .. 8);
    dfr4 := array(1 .. 8);
    df[5] := -t4;
    df[4] := -t5;
    df[3] := df[4];
    t7 := df[3];
    df[2] := r*t7;
    dfr0[7] := -t6;
    dfr0[6] := -df2;
    dfr0[1] := dfr0[7]*r;
    dfr3[2] := df2;
    dfr4[8] := dfr0[7];
    dfr4[6] := -t5*r;
    t9 := dfr4[8];
    dfr4[5] := r*t9;
    t30 := t2*t1;
    t42 := 0, 0, 0, 0;
    [t1*df[5], -t6*df[2], df[3], t2*t7, -t5*dfr0[1], t2*dfr0[6], 0, t1*dfr4[8],
     -t5, 0, 0, 0, -t5*t2, -t6*t1, 0, 0],
    [-t4*t5, -t6*dfr3[2], t1, t30, t1*dfr4[5], t2*dfr4[6], 0, t5*t9,
     t1, 0, 0, 0, t30, -t6*t5, 0, 0],
    [t42, 0, -t6*r, 0, t2, t42, 0, t2, 0, 0]
end proc

(9)

See Also

codegen[cost]

codegen[GRADIENT]

codegen[optimize]