Benchmarks

Last updated 26 July 1999.

We have graphed the performance of several popular implementations on the benchmarks that come with Gambit-C and on several other benchmarks that we have written or obtained from other sources. Please keep in mind that the results of benchmarking should not be taken too seriously.

To avoid comparing apples with oranges, we have separated our results into timings for several different categories of systems, so that, for example, interpreters are not compared directly with compilers.

The benchmarked systems:

All of our timings represent the CPU time for a single run on a Sun Ultra 1 with no other users. (Eventually we will report the average of several runs, and real times for the garbage collection benchmarks, but the CPU times have been very repeatable.) When a few timings are missing for a system, this usually means that we have not been able to get the system to run those particular benchmarks; in some cases the problem is caused by non-portable code within a benchmark. When all of the timings are missing for a system, this usually means that the system cannot reasonably be configured to fit the category of systems being benchmarked. For example, gcc cannot be configured to generate safe code, to perform generic arithmetic, or to act as an interpreter.
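To make the measurement procedure concrete, here is a minimal sketch of how a single timing of this sort could be taken. The cpu-time procedure is Gambit's (it returns the process's cumulative CPU time in seconds); the fib benchmark and the time-once harness are stand-ins for illustration, not the harness we actually use.

    ;; A stand-in benchmark, purely for illustration.
    (define (fib n)
      (if (< n 2) n (+ (fib (- n 1)) (fib (- n 2)))))

    (define (run-fib) (fib 25))

    ;; Time one run of a thunk using Gambit's cpu-time, which returns
    ;; user+system CPU seconds for the current process.
    (define (time-once thunk)
      (let* ((start (cpu-time))
             (result (thunk))
             (stop (cpu-time)))
        (display "cpu time in seconds: ")
        (display (- stop start))
        (newline)
        result))

    (time-once run-fib)

Other systems provide similar timing facilities under different names, so the same idea carries over even though the code above is Gambit-specific.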

The numbers shown are the user CPU times from these runs. The bar graphs show relative performance; longer bars are better.
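The exact scaling used for the bar graphs is not described here; one natural convention, assumed purely for illustration in the sketch below, is to divide the fastest system's time by each system's time, so that the fastest system scores 1.0 and slower systems get proportionally shorter bars.

    ;; Assumed normalization, for illustration only: each system's bar
    ;; is the fastest time divided by that system's time, so the
    ;; fastest system gets 1.0 and slower systems get smaller values.
    (define (relative-performances times)   ; times in CPU seconds
      (let ((fastest (apply min times)))
        (map (lambda (t) (/ fastest t)) times)))

    ;; Example: (relative-performances '(2.0 4.0 8.0))  =>  (1. .5 .25)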