octave-maintainers

Re: Suggestion - derivatives


From: Michael Creel
Subject: Re: Suggestion - derivatives
Date: Tue, 12 Oct 2010 09:29:12 +0200

On Tue, Oct 12, 2010 at 9:12 AM, GFotios <address@hidden> wrote:
>
> On Oct 12, 2010, at 8:47 AM, Michael Creel wrote:
>
>> On Mon, Oct 11, 2010 at 11:43 PM, Fotios Kasolis
>> <address@hidden> wrote:
>>>
>>> I suggest including in Octave a function that calculates first
>>> derivatives/the Jacobian matrix, since optimization problems are becoming
>>> increasingly important. I am working on that, and I have a weakly
>>> "optimized" finite difference routine using arbitrary stencils:
>>>
>>> n = 9;                     # number of stencil points (odd)
>>> h = 10^(-16/(n - 1));      # initial spacing
>>> A = ones (n);              # singular seed, so the loop runs at least once
>>> ## fun (a vectorized function handle) and x0 (the point) are assumed given.
>>> while (rcond (A) < eps)
>>>   x = x0 + [ -(n - 1)*h/2 : h : (n - 1)*h/2 ];
>>>   f = fun (x);
>>>   A_ = repmat (x(:), 1, n);
>>>   n_ = repmat (n - 1:-1:0, n, 1);
>>>   A = A_.^n_;              # Vandermonde matrix of the stencil points
>>>   h = 2*h;                 # widen the stencil while A is ill conditioned
>>> endwhile
>>> p = mldivide (A, f(:));    # coefficients of the degree n-1 interpolant
>>> dfdx = polyval (polyder (p), x0);
>>>
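A minimal sketch of driving the routine above, condensed into a self-contained
script; the cubic test function and the point 0.5 are arbitrary illustrative
choices, not part of the proposal (a cubic is reproduced exactly by the
degree n-1 fit, so the error should be near roundoff):

  ## Condensed, runnable version of the stencil code above with a test input.
  fun = @(x) x.^3;              # arbitrary vectorized test function
  x0  = 0.5;                    # arbitrary evaluation point
  n = 9;  h = 10^(-16/(n - 1));  A = ones (n);
  while (rcond (A) < eps)
    x = x0 + ((0:n-1) - (n - 1)/2)*h;
    f = fun (x);
    A = repmat (x(:), 1, n) .^ repmat (n - 1:-1:0, n, 1);
    h = 2*h;                    # widen the stencil while A is ill conditioned
  endwhile
  p = (A \ f(:)).';             # interpolant coefficients, highest power first
  dfdx = polyval (polyder (p), x0);
  abs (dfdx - 3*x0^2)           # should be very small for this cubic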
>>> I will also include the complex variable step derivative
>>>
>>> imag (fun(x0 + eps*1i)) / eps
>>>
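A rough, self-contained sketch of that one-liner in action; the test function,
the point, and the step size below are arbitrary choices for illustration:

  ## Complex-step derivative versus a forward difference: the step can be
  ## taken absurdly small because no subtraction of nearby values occurs.
  fun = @(x) exp (x) ./ (cos (x).^3 + sin (x).^3);  # arbitrary smooth test case
  x0  = 1.5;
  h   = 1e-100;                                     # far below sqrt (eps)
  d_cs = imag (fun (x0 + 1i*h)) / h;                # complex-step derivative
  d_fd = (fun (x0 + 1e-8) - fun (x0)) / 1e-8;       # forward difference, O(h)
  ## d_cs matches the analytic derivative to machine precision, while d_fd
  ## loses roughly half the significant digits to cancellation.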
>>> and finally, for analytic functions, I'll also implement the Cauchy
>>> integration + fft method which I posted earlier, and which provides higher
>>> derivatives as well. Is there any other method that someone thinks is of
>>> interest and importance (other than the adjoint)? What do you think about
>>> this? Is this something that should be included in Octave or not?
>>>
>>> /Fotios
>>>
>>>
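A rough sketch of the Cauchy integration + fft idea mentioned above (this is
only a toy illustration with an arbitrary test function, radius, and sample
count, not the code that was posted earlier):

  ## Sample an analytic function on a circle around x0, FFT the samples, and
  ## rescale the Fourier/Taylor coefficients into derivatives at x0.
  fun = @(z) exp (z) .* sin (z);   # arbitrary analytic test function
  x0  = 0.3;                       # expansion point
  r   = 0.1;                       # circle radius, kept inside the region
                                   # where fun is analytic
  N   = 32;                        # number of samples on the circle
  theta = 2*pi*(0:N-1)/N;
  c = fft (fun (x0 + r*exp (1i*theta))) / N;  # trapezoidal rule == FFT here
  k = (0:N-1).';
  d = real (factorial (k) .* c(:) ./ r.^k);   # d(k+1) ~ k-th derivative at x0
  ## d(2) approximates f'(x0), d(3) approximates f''(x0), and the accuracy
  ## degrades as the order k grows.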
>>
>> Hi Fotios,
>> There are first and second derivative functions in Octave Forge,
>> numgradient and numhessian, which are in the optimization package. You
>> might like to check your code against them for speed and accuracy.
>> Best,
>> Michael
>
>
> Thanks for mentioning that, Michael. So here are the advantages of my
> suggestions (I expect someone else to mention the drawbacks ;D).
>
> As far as I remember, those functions use only 3-point central differences,
> which is good but in some cases not good enough, since that is a second-order
> accurate approximation, ~O(h^2). The first algorithm above uses central
> difference n-point stencils, with n = 2*k + 1 for k >= 1, which remains an
> approximation but a much better one, ~O(h^(n-1)). Of course this comes at a
> computational cost (although the code above is relatively fast since it is
> vectorized). Regarding the complex step method: when applicable, it is exact
> to machine epsilon, because there is no subtractive cancellation, so the step
> can be chosen very small, for example eps. The computational price to pay is
> complex arithmetic, i.e. two doubles per value. The Cauchy + fft method
> (described by Fornberg in "Numerical differentiation of analytic functions")
> is amazing since it gives not only first and second derivatives but higher
> orders as well; it is simply a numerical approximation of the Taylor series
> using the trapezoidal rule on a circle (of course the higher derivatives
> become less and less accurate!).
>
> /Fotios
>
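To make the accuracy claims above concrete, here is a small, self-contained
comparison (the test function and the range of step sizes are arbitrary
illustrative choices):

  ## Error of a 3-point central difference versus the complex step as h shrinks.
  fun  = @(x) exp (x) .* sin (x);
  dfun = @(x) exp (x) .* (sin (x) + cos (x));   # analytic derivative
  x0 = 0.7;
  for h = 10.^(-2:-2:-12)
    e_cd = abs ((fun (x0 + h) - fun (x0 - h)) / (2*h) - dfun (x0));
    e_cs = abs (imag (fun (x0 + 1i*h)) / h - dfun (x0));
    printf ("h = %6.0e   central %9.1e   complex step %9.1e\n", h, e_cd, e_cs);
  endfor
  ## The central-difference error shrinks like h^2 at first, then grows again
  ## once subtractive cancellation takes over; the complex-step error keeps
  ## shrinking and bottoms out at roundoff level.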

This sounds very nice. Accurate and fast first and second derivatives
using an .m file implementation sounds like an improvement to me. I'm
looking forward to trying it out.
Thanks, M.


