bug-bash

Re: When a hashed pathname is deleted, search PATH


From: Linda Walsh
Subject: Re: When a hashed pathname is deleted, search PATH
Date: Tue, 18 Mar 2014 21:11:07 -0700
User-agent: Thunderbird



Mike Frysinger wrote:
On Tue 18 Mar 2014 01:04:03 Linda Walsh wrote:
Chet Ramey wrote:
Because the execution fails in a child process.  You'd be able to fix it
for that process, but would do nothing about the contents of the parent
shell's hash table.

The way the option works now is to check the hash lookups and delete
anything that is no longer an executable file, then redo the lookup and
hash the new value.
----
Wouldn't bash notice that the child exited in <.1 seconds (or is it even less)?

as soon as you talk about trying to time something, you're obviously looking at it wrong. having a system that only works when the cpu/disk is fast and idle is a waste of time and bad for everyone.
---
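
For anyone following along, here is a sketch of the behavior Chet describes,
using a made-up `greet` script in two throwaway directories (the names are
only illustrative, and the exact error text varies by bash version; the
failure shown is the behavior current as of this thread):

   $ mkdir -p /tmp/a /tmp/b
   $ printf '#!/bin/sh\necho hi from a\n' > /tmp/a/greet
   $ printf '#!/bin/sh\necho hi from b\n' > /tmp/b/greet
   $ chmod +x /tmp/a/greet /tmp/b/greet
   $ PATH=/tmp/a:/tmp/b:$PATH
   $ greet                 # first use: PATH search finds /tmp/a/greet and hashes it
   hi from a
   $ rm /tmp/a/greet       # the hashed pathname is deleted out from under bash
   $ greet                 # without checkhash, the stale hash entry is still used
   bash: /tmp/a/greet: No such file or directory
   $ shopt -s checkhash
   $ greet                 # entry is rechecked, dropped, and PATH is searched again
   hi from b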

Um... this is a User Interface involving humans, and you are looking
for something that needs to be 100%?  If this were a reactor-control
program, that would be one thing, but in deciding what solution to implement
to save some small lookup time, or whether to throw it away, a 90% solution is
probably fine.  It's called a heuristic.  AI machines use them.
Thinking people use them.  Why should bash be different?

Fixing it isn't a 0%-or-100% proposition, but a combination of actual cost
(measurable impact), user perception, and programmer cost to implement
something that works for most people.

As you drive up the 'perfection rate' or 'uptime' by another 9 (i.e. 90%
to 99%, or 99% to 99.9%), the costs usually go up exponentially.
If it costs 1 day to implement a 90% algorithm, a 99% algorithm could
easily be a 1-3 month project, depending on how you measure;
99.9% could involve a year or more.

In security, even though systems were certified to lower levels of
assurance, almost no computer system was ever validated to the A1 level
(mathematical proof).  The cost was too prohibitive.

Same goes for software design and fixes.  I'm more often one to rat-hole
on small stuff while missing big stuff, but even I can see that a
less than "perfect" solution would achieve most of the original
design goals, presuming they are still necessary on today's machines.

If you have a machine that can't do a single path lookup in <.1 seconds,
then walking a PATH env var to do multiple path lookups is gonna hurt
that many times more.  If your system is so slow that everything is bad,
then having hashing turned on at all seems a rather unimportant issue.
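
If anyone wants to put a rough number on that lookup cost, something like
the following will do (completely unscientific; the loop count and the
choice of `date` are arbitrary):

   $ set -h; hash -r       # hashing on (the default), starting from an empty table
   $ time for i in {1..500}; do date >/dev/null; done
   $ set +h                # hashing off: every invocation walks PATH again
   $ time for i in {1..500}; do date >/dev/null; done
   $ set -h; hash          # restore the default; 'hash' lists hit counts per command

On any recent machine the two times are likely indistinguishable, since the
fork/exec of `date` dwarfs the handful of filesystem checks a PATH walk
costs, which is rather the point.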


