From: Jonathan Wakely
Subject: Re: What does "Execution timeout is: $test_timeout" mean?
Date: Thu, 3 Dec 2020 13:00:11 +0000
On 02/12/20 21:48 -0600, Jacob Bachmeyer wrote:
> Jonathan Wakely wrote:
>> Hi, [...] If I set test_timeout in ~/.dejagnurc or site.exp then I do
>> indeed see the value change. But it doesn't do anything. I can set
>> test_timeout to 2 seconds, and tests that sleep for 20 seconds or more
>> will still PASS and not time out (as long as $tool_timeout is large
>> enough).
>
> Evidently, the GCC testsuite is overriding that setting.
Yes, GCC replaces the standard_wait proc with one that uses its own value:

https://gcc.gnu.org/git/?p=gcc.git;a=blob;f=gcc/testsuite/lib/timeout.exp;h=856c2e3184122cbd1128116703fef5480ec8805e;hb=HEAD

And so now I see what's happening. DejaGnu prints the value of $test_timeout and calls remote_wait with that timeout, which calls standard_wait with that timeout, but GCC replaces that proc with one that uses a different timeout. I don't think DejaGnu can really know about that. Maybe what GCC should really do is set $test_timeout instead, which is possible now that proc unix_load allows it to be overridden.
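To make the distinction concrete, here is a simplified Tcl sketch of the two approaches (this is not the actual GCC timeout.exp code; the saved-proc name and gcc_wait_timeout variable are hypothetical):

```tcl
# Override approach (what GCC effectively does): replace DejaGnu's
# standard_wait with a version that ignores the timeout it was passed
# and substitutes GCC's own value.
rename standard_wait gcc_saved_standard_wait
proc standard_wait { dest timeout } {
    global gcc_wait_timeout           ;# hypothetical name for GCC's value
    return [gcc_saved_standard_wait $dest $gcc_wait_timeout]
}

# Alternative approach suggested above: set the knob DejaGnu already
# reads, so the value it logs matches the value it actually uses.
set test_timeout 300
```

The second form is why the unix_load change matters: once $test_timeout is honored there, a testsuite can adjust the timeout without shadowing DejaGnu's own procs.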
>> The docs added by that patch say that $test_timeout is "the amount of
>> time (in seconds) to wait for a remote test to complete", which doesn't
>> match my observations. Is it because I'm testing locally, not remotely?
>> If that value only affects remote tests, why does DejaGnu print
>> "Execution timeout is: $test_timeout" to the logs unconditionally? It's
>> unhelpful (i.e. downright confusing) to log it if it's not actually
>> relevant.
>>
>> DejaGnu *does* know about the "real" timeout, because the remote_exec
>> proc in remote.exp has already handled it and logged it via this
>> command:
>>
>>   verbose -log "Executing on $hostname: $program $pargs $inp $outp (timeout = $timeout)" 2
>>
>> I see that in my logs, and the $timeout value is the one I've set via
>> $tool_timeout, and it actually works. If I set it to a low value, my
>> tests time out and FAIL. That timeout is the total time for both
>> compiling and running the test.
>>
>> When I use -v -v -v with dejagnu-1.6.1 and set GCC's tool_timeout=30
>> and test_timeout=2, I see the output in the attached extract from the
>> log. The lines beginning with ================ are printed by the test
>> itself, showing that it sleeps for 20 seconds. Despite DejaGnu telling
>> me the execution timeout is 2, the test passes. Either the execution
>> timeout is not used, or the timeout is not working.
>
> DejaGnu seems to only implement one timeout at a time, so the use of
> GCC's tool_timeout (probably) overrides DejaGnu's own test_timeout. The
> underlying Expect works this way, and DejaGnu does not currently select
> the minimum of multiple timeouts. Also, DejaGnu's timeouts are meant to
> catch hung tests, so they are reset whenever output arrives from the
> test case. (That is probably not part of this issue, but it is
> important to remember, especially since there are also places in
> DejaGnu where some number of timeouts are permitted to occur before an
> error is actually reported.)
I didn't know that, thanks.
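For anyone else following along, the reset behavior Jacob describes can be sketched with plain Expect (the spawned program name is made up; the point is that -timeout bounds the gap between chunks of output, not the total runtime):

```tcl
# A test that prints something every few seconds will never trip this
# 10-second timeout, even if it runs for minutes, because exp_continue
# restarts the expect command with the timer reset.
spawn ./some-test             ;# hypothetical test program
expect {
    -timeout 10
    -re {.+} { exp_continue } ;# output arrived: the clock starts over
    timeout  { puts "no output for 10 seconds; test looks hung" }
    eof      { puts "test finished" }
}
```

So a timeout of 2 only guarantees a failure for a test that stays silent for 2 seconds, not for one that takes longer than 2 seconds overall.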
>> Could unix.exp stop logging that line unconditionally if it's not true?
>> Is it supposed to be true? Is the timeout failing to fire? Is it not
>> relevant for local tests? I'm very confused.
>
> I am also confused, so I have added this to my list of future issues to
> address once I learn enough to understand (and document!) DejaGnu's
> remote testing code.