[plug] Error in Time() command
daniel at rimspace.net
Mon May 24 08:56:31 WST 2010
"tenzero at iinet.net.au" <tenzero at iinet.net.au> writes:
> I'm seeking a preferably citeable reference to the amount of error in the
> returned result from a Time() command. I want to be able to quote the level
> of error in timing the execution speed of my project.
You mean time(1), the command line utility, as you did when you mentioned this?
I doubt anyone has bothered with a scholarly citation, in part because the
answer is necessarily going to depend on the software and hardware
configuration of the system.
Given that time(1) uses the rusage results returned by wait4(2), and those are
derived from the system clock, which in turn references whatever hardware clock
the kernel selected based on the availability and reliability of those clocks...
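To make the mechanism concrete, here is a rough sketch of what time(1) does
under the hood, written in Python rather than C for brevity: fork a child,
reap it with wait4, and read the kernel's rusage accounting. The child's
busy-loop workload is an arbitrary stand-in, not anything from the original
question.

```python
import os

# Fork a child that burns a little user-mode CPU, then collect its
# resource usage via wait4 -- the same interface time(1) relies on.
pid = os.fork()
if pid == 0:
    # Child: arbitrary CPU-bound work so ru_utime has something to count.
    total = sum(i * i for i in range(200000))
    os._exit(0)

# Parent: wait4 returns the child's pid, exit status, and rusage struct.
_, status, rusage = os.wait4(pid, 0)
print("user time:   %.6f s" % rusage.ru_utime)
print("system time: %.6f s" % rusage.ru_stime)
```

Note that ru_utime and ru_stime come back at whatever granularity the
kernel's process accounting provides, which is exactly why the error bound
depends on the kernel and hardware configuration.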
So, the short answer is: it varies. Generally, though, you should be able to
work it out by inspecting the kernel code for your particular system.
I very much suspect, however, that you are asking the wrong question: you want
to prove something, settled on timing as the way to prove it, looked at
time(1), and are now asking about the tool rather than the goal.
If you go back and tell us what your actual goal is, we should be able to give
you a better answer that will help you get useful information.
Starting with "run the process multiple times, because cache effects and
memory pressure from other applications *will* change the results."
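A minimal sketch of that advice: time the command several times and look at
the spread rather than a single run. The helper name and the choice of
/bin/true as the command under test are mine, purely for illustration.

```python
import statistics
import subprocess
import time

def benchmark(cmd, runs=5):
    """Run cmd several times and return per-run wall-clock seconds.

    Repeated runs let cache effects and scheduling noise show up as
    spread in the samples instead of silently biasing one measurement.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
        samples.append(time.perf_counter() - start)
    return samples

samples = benchmark(["true"], runs=5)
print("min %.4fs  median %.4fs  max %.4fs"
      % (min(samples), statistics.median(samples), max(samples)))
```

Reporting the minimum and the median (rather than a single number) makes it
obvious when other applications on the box are perturbing the results.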
✣ Daniel Pittman ✉ daniel at rimspace.net ☎ +61 401 155 707
♽ made with 100 percent post-consumer electrons