octave-maintainers

Re: Contribution to the optimization toolbox


From: Michael Creel
Subject: Re: Contribution to the optimization toolbox
Date: Tue, 8 Sep 2009 16:45:51 +0200

On Tue, Sep 8, 2009 at 4:20 PM, Jaroslav Hajek <address@hidden> wrote:
>
> Looking at the problem from a slightly different perspective, it now
> seems to me that taking the dogleg TR step selection method from
> fsolve may not have been as good an idea as I initially thought. For
> fsolve, it shines because fsolve updates the model after each trial
> step, even an unsuccessful one. Uhm. Maybe a line search technique
> would be better. At least that would allow simply employing all three
> updating techniques in fminunc: BFGS, factored dual BFGS, and LBFGS.
>
> Btw., when I noticed that the Rosenbrock function is actually a sum
> of squares, just for fun I tried rewriting the problem as a nonlinear
> least-squares problem and feeding it to fsolve. Doing that, fsolve
> literally crushed the problem, converging in 7 iterations (8 function
> evaluations and one Jacobian evaluation). But of course that is a
> different story; it exploits the structure of the problem.

Well, that's evidence that fsolve works well :-).
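To illustrate the rewriting described above (a hedged sketch, not the actual code from the thread): the Rosenbrock function 100*(y - x^2)^2 + (1 - x)^2 is the sum of squares of the residuals r1 = 10*(y - x^2) and r2 = 1 - x, so its minimizer is the root of the square residual system r(x, y) = 0. A plain Newton iteration on that 2x2 system (the essence of what fsolve exploits) converges in a handful of steps from the classic starting point (-1.2, 1):

```python
def rosenbrock_residuals(x, y):
    # Rosenbrock 100*(y - x^2)^2 + (1 - x)^2 written as r1^2 + r2^2
    return 10.0 * (y - x * x), 1.0 - x

def rosenbrock_jacobian(x, y):
    # Jacobian of (r1, r2) with respect to (x, y)
    return ((-20.0 * x, 10.0),
            (-1.0, 0.0))

def newton_solve(x, y, tol=1e-12, max_iter=50):
    """Newton's method on the square residual system r(x, y) = 0."""
    for _ in range(max_iter):
        r1, r2 = rosenbrock_residuals(x, y)
        if abs(r1) < tol and abs(r2) < tol:
            break
        (a, b), (c, d) = rosenbrock_jacobian(x, y)
        det = a * d - b * c
        # solve J * (dx, dy) = (-r1, -r2) by Cramer's rule
        dx = (-r1 * d + b * r2) / det
        dy = (-a * r2 + c * r1) / det
        x, y = x + dx, y + dy
    return x, y

print(newton_solve(-1.2, 1.0))  # converges to the minimizer (1, 1)
```

Because r2 is linear in x and, once x is fixed, r1 is linear in y, Newton lands on the solution almost immediately here; a generic minimizer sees only the ill-conditioned banana valley.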

>
> Do you have any references for the line search used in bfgsmin?

I could come up with some; for example, it must be in Gill, Murray and
Wright. It's just a Newton step by default, falling back to bisection
if the Newton step fails. There are a lot of references for those
methods. I worked out the details of the implementation myself, by
experimentation. The Newton step is just based on a Taylor series
approximation about a step of 1, and bisection starts at 1 and halves
the step until an improvement is found, then keeps going until the
benefits from continuing get small. I put bounds on the Newton step to
avoid flying off into uncharted territory.
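One possible reading of the bisection part, as a hypothetical Python sketch (the real bfgsmin is written in Octave/C++, and the names and tolerances below are assumptions, not its actual code): phi(a) is the objective along the search direction, the step starts at 1 and is halved until it improves on phi(0), then halved further while the extra benefit stays non-negligible.

```python
def bisection_step(phi, min_step=1e-10, gain_tol=1e-4):
    """Hypothetical bisection line search reconstructed from the
    description in this thread, NOT the actual bfgsmin routine.
    phi(a) is the objective evaluated at step length a."""
    phi0 = phi(0.0)
    a = 1.0
    # halve until the step first improves on the current point
    while phi(a) >= phi0 and a > min_step:
        a /= 2.0
    # keep halving while the benefit of continuing is non-negligible
    while a > min_step:
        if phi(a / 2.0) < phi(a) - gain_tol * (abs(phi0) + 1.0):
            a /= 2.0
        else:
            break
    return a

# e.g. a quadratic with its minimum at a = 0.3
a = bisection_step(lambda a: (a - 0.3) ** 2)
```

The `min_step` floor plays the same safeguarding role as the bounds mentioned above for the Newton step: it keeps the search from shrinking (or flying off) without limit when no acceptable step exists.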

Michael

