Re: Contribution to the optimization toolbox
From: Michael Creel
Subject: Re: Contribution to the optimization toolbox
Date: Tue, 8 Sep 2009 11:10:54 +0200
On Mon, Sep 7, 2009 at 6:42 PM, Jaroslav Hajek <address@hidden> wrote:
> On Mon, Sep 7, 2009 at 5:36 PM, Michael Creel<address@hidden> wrote:
>> On Mon, Sep 7, 2009 at 3:28 PM, John W. Eaton <address@hidden> wrote:
>>> On 7-Sep-2009, Michael Creel wrote:
>>>
>>> | I'm trying to
>>> | compile a checkout of the development version to see if I can
>>> | replicate your results, but it crashes with
>>> | In file included from ./txt-eng-ft.h:28,
>>> | from ./gl-render.h:43,
>>> | from ./DLD-FUNCTIONS/fltk_backend.cc:60:
>>> | /usr/include/ft2build.h:56:38: error: freetype/config/ftheader.h: No
>>> | such file or directory
>>>
>>> I'd say fix this problem first. Why is freetype/config/ftheader.h
>>> missing? Note that Octave is not including this file directly, it is
>>> included from /usr/include/ft2build.h, so I think there is some
>>> problem with your installation of FreeType. Look at the
>>> /usr/include/ft2build.h file to see if there are some clues there.
>>>
>>> jwe
>>>
>>
>> Hmm, the fltk stuff is pretty new for me. I'm running Kubuntu 9.04,
>> and freetype, etc., come from the provided packages. My first problem
>> is that I don't have a clear idea of what libraries (or more usefully
>> for me, Debian package names) are required. Perhaps I'm missing a
>> needed package. I can compile 3.2.x release candidates without
>> problems, including with the fltk backend. There doesn't seem to be a
>> switch to disable compilation of the fltk backend in the development
>> version. Is there some way to do that?
>> Michael
>>
>>
>
> First, make sure you have all the -devel packages installed; there is
> sometimes a missing dependency between those.
> Can you locate the header <somepath>/freetype/config/ftheader.h? If
> yes, just add -I<somepath> to CXXFLAGS.
>
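Jaroslav's two suggestions above can be sketched in shell form. The paths below (a `demo/` tree with the header installed under a `freetype2/` prefix) are hypothetical, chosen only to mirror the situation Michael describes next; on a real system the header usually lives somewhere under `/usr/include`:

```shell
# Hypothetical layout: ftheader.h is installed under a freetype2/ prefix,
# so <freetype/config/ftheader.h> does not resolve from the include root.
mkdir -p demo/usr/include/freetype2/freetype/config
touch demo/usr/include/freetype2/freetype/config/ftheader.h

# Option 1: symlink the directory into the place the compiler expects
ln -s freetype2/freetype demo/usr/include/freetype
test -f demo/usr/include/freetype/config/ftheader.h && echo "header found"

# Option 2 (alternative): point the compiler at the real location instead,
# e.g. ./configure CXXFLAGS="-I/usr/include/freetype2"
```

Either route makes the `#include <freetype/config/ftheader.h>` line in `ft2build.h` resolve; the symlink changes the filesystem once, while the `-I` flag must be kept in the build configuration.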
John and Jaroslav - thanks for the tips. I just had to make a symlink to put a directory where it was expected to be, and the development version now builds and runs fine for me. With it, I ran the little test script included below and got these results:
octave:3> compare
Results using analytic gradient
ans =
0.018180 0.999000 0.093864 0.997000
Results using numeric gradient
ans =
0.025299 0.999000 0.061082 1.000000
The recent change in the initial trust region setting has helped
fminunc. In my results, bfgsmin is still faster than fminunc, and
there is no difference for me in the performance of fminunc using
Octave 3.2.2 versus the checkout of development sources. Perhaps I am
using GradObj incorrectly, because for fminunc the numeric gradient is
faster than the analytic gradient.
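On the GradObj point: in Octave, optimset only builds and returns an options struct, so calling it without capturing and passing on the result has no effect on fminunc. A minimal sketch of the intended usage, reusing the "objective" function and starting point x from the script below:

```octave
## optimset returns a struct; it must be passed to fminunc explicitly,
## otherwise the "GradObj" setting is silently ignored.
opts = optimset ("GradObj", "on");
[theta, obj_value] = fminunc ("objective", x, opts);
```

With the struct passed in, fminunc calls the objective with two output arguments and uses the returned analytic gradient instead of computing a numeric one.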
At any rate, I don't make too much of these test results: it's only one test function, and there are all sorts of reasons our results can vary (architecture, compilation switches, etc.). This exercise does show how having a body of tests could be useful for comparing methods and for checking for regressions.
Cheers, Michael
################## Last version of test code #######################
1;
# In the calls to the minimization functions, use "objective" if you
# want the analytic gradient; otherwise, use "objective2".
# example obj. fn.: with gradient
function [obj_value, g] = objective(x)
[obj_value, g] = rosenbrock(x);
endfunction
# example obj. fn.: no gradient
function obj_value = objective2(x)
obj_value = rosenbrock(x);
endfunction
dim = 5;
replications = 1000;
results = zeros(replications,4);
ub = 2;
lb = 0;
control = {100,0}; # set the second number to 1, 2, or 3 for increasing
                   # levels of bfgsmin verbosity
# optimset returns an options struct; it must be passed to fminunc,
# otherwise the "GradObj" setting has no effect
opts = optimset("GradObj", "on");
for i = 1:replications
  x = (ub-lb).*rand(dim,1) + lb;
  tic;
  [theta, obj_value, convergence] = bfgsmin("objective", {x}, control);
  results(i,1) = toc;
  results(i,2) = norm(theta - ones(dim,1)) < 1e-5;
  tic;
  [theta, obj_value] = fminunc("objective", x, opts);
  results(i,3) = toc;
  results(i,4) = norm(theta - ones(dim,1)) < 1e-5;
endfor
printf("Results using analytic gradient\n");
mean(results)
opts = optimset("GradObj", "off");
for i = 1:replications
  x = (ub-lb).*rand(dim,1) + lb;
  tic;
  [theta, obj_value, convergence] = bfgsmin("objective2", {x}, control);
  results(i,1) = toc;
  results(i,2) = norm(theta - ones(dim,1)) < 1e-5;
  tic;
  [theta, obj_value] = fminunc("objective2", x, opts);
  results(i,3) = toc;
  results(i,4) = norm(theta - ones(dim,1)) < 1e-5;
endfor
printf("Results using numeric gradient\n");
mean(results)