lilypond-user
Re: Speed tips, again, for extremely large scores?


From: Trevor Daniels
Subject: Re: Speed tips, again, for extremely large scores?
Date: Mon, 1 Feb 2010 12:37:42 -0000


Mats Bengtsson wrote Monday, February 01, 2010 11:52 AM

Martin Tarenskeen wrote:

On Mon, 1 Feb 2010, Michael Kappler wrote:

I'm also still very interested in whether there are possibilities to increase LilyPond's performance further. My machine is very slow, though, and I cannot speak for many people when raising performance issues.

Would it be an idea to create a "LilyPond Benchmark" webpage, small, interesting, and useless? ;-) This webpage would show a list with the following info:
1. Hardware / Processor type
2. Platform / OS
3. LilyPond version
4. The benchmark result: I suggest giving the time needed to "make" the (at least on this mailing list) already famous "Reubke Psalm 94" score.
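The first three fields could be collected automatically. A minimal sketch, assuming a POSIX shell (this script and its output format are my own illustration, not something from the thread; the lilypond call is guarded since it may not be installed):

```shell
#!/bin/sh
# Hedged sketch: print the fields a benchmark entry would list.
benchmark_info() {
    echo "1. Processor: $(uname -m)"
    echo "2. OS:        $(uname -s)"
    if command -v lilypond >/dev/null 2>&1; then
        echo "3. LilyPond:  $(lilypond --version | head -n1)"
    else
        echo "3. LilyPond:  (not found on this machine)"
    fi
    # 4. The result itself would come from timing the compile, e.g.:
    #    time lilypond psalm94.ly
}
benchmark_info
```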

It would be both interesting and a useful check on
whether code additions to new releases have had an
effect on processing speed, although for this we
would have to establish one or maybe several standard
configurations so that the tests are directly comparable.

Don't forget that RAM size is a major factor when it comes to processing time.

Probably the major factor, along with CPU speed,
unless you already have enough RAM.

There are several other parameters that affect
the speed but are harder to determine.  These
vary from one computer model to another, so the
results might still be quite variable.  They include:

 Processor cache size and speed
 Memory transfer rates
 Hard disk seek/search times
   (these affect paging rates if RAM is insufficient)
 Whether the page file is fragmented (ditto)

Also, the effect of other processes would need to be
excluded by doing the test on a newly booted, clean
system, running the test several times, and retaining
just the fastest time.
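The run-several-times-keep-the-fastest step could itself be scripted. A minimal sketch, assuming GNU date (the %N nanoseconds format) and a POSIX shell; the function name and the placeholder score name are my own, and on a real system the final call would be something like "best_of 3 lilypond psalm94.ly":

```shell
#!/bin/sh
# Hedged sketch: run a command N times and keep the fastest
# wall-clock time, as suggested above.
best_of() {
    runs=$1; shift
    best=""
    i=0
    while [ "$i" -lt "$runs" ]; do
        start=$(date +%s%N)               # nanoseconds (GNU date)
        "$@" >/dev/null 2>&1
        end=$(date +%s%N)
        elapsed=$(( (end - start) / 1000000 ))   # milliseconds
        if [ -z "$best" ] || [ "$elapsed" -lt "$best" ]; then
            best=$elapsed
        fi
        i=$((i + 1))
    done
    echo "fastest of $runs runs: ${best} ms"
}

# Placeholder command; replace with e.g. lilypond psalm94.ly
best_of 3 true
```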

Trevor





