Re: Restricted storage


From: Jonathan S. Shapiro
Subject: Re: Restricted storage
Date: Thu, 01 Jun 2006 10:28:12 -0400

On Thu, 2006-06-01 at 12:30 +0200, Marcus Brinkmann wrote:
> At Thu, 01 Jun 2006 05:21:21 -0400,
> "Jonathan S. Shapiro" <address@hidden> wrote:
> > 
> > On Thu, 2006-06-01 at 10:20 +0200, Bas Wijnen wrote:
> > > On Wed, May 31, 2006 at 08:23:53PM -0400, Jonathan S. Shapiro wrote:
> > > > Indeed. And while we are about it: where do you propose to store keys
> > > > that are used for group signatures?
> > > 
> > > In some place that cannot be destroyed by any of the members of the
> > > group, but only by the group administrators.  That is, in a special
> > > user account created specially for that group.
> > 
> > Ah. So you propose that the computational "right of assembly" should be
> > present only with the consent of the system administrator?
> 
> Can you please define what you mean by "computational 'right of
> assembly'"?  The term is entirely void of meaning to me.

If the group keys are to be stored in a specially created account, then
the system administrator's permission is required in order to form a
private group. This seems contrary to freedom.

The primary role of the administrator in this scenario is a policy
decision: how much system resource to allocate -- specifically, storage.
The actual setup of the account is done entirely by software.

Ironically, it would be very, very easy to do this without an
administrator being involved if opaque storage is present and
verifiable. Each participant can "donate" some amount of storage as a
condition of joining the group.

However, this only works if the participants can verify that the storage
is opaque. See below.
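
To make the donation idea concrete, here is a minimal sketch of what
the joining step might look like. Every name in it (cap_t,
bank_create_subbank, bank_verify_opaque, group_join) is a hypothetical
placeholder, not an existing Hurd or Coyotos interface; the only point
is the ordering: verify opacity first, hand over the storage second.

/* Hypothetical sketch only: these interfaces do not exist; they stand
   in for whatever space-bank and group-manager capabilities a real
   system would provide. */
typedef unsigned long cap_t;

cap_t bank_create_subbank(cap_t parent_bank, unsigned long pages);
int   bank_verify_opaque(cap_t bank);               /* nonzero = opaque */
int   group_join(cap_t group_mgr, cap_t donation);  /* 0 = success      */

int join_group(cap_t my_bank, cap_t group_mgr)
{
    /* Carve out the storage this member is willing to donate. */
    cap_t donation = bank_create_subbank(my_bank, 64 /* pages */);

    /* The step everything hinges on: each participant must be able to
       check, for itself, that the donated storage is opaque, so that
       neither the donor nor any other member can later read the key
       material allocated from it. */
    if (!bank_verify_opaque(donation))
        return -1;

    /* Hand the opaque sub-bank to the group manager as the price of
       admission; the shared key object is allocated from the pooled
       donations. */
    return group_join(group_mgr, donation);
}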


> > > > The objects holding such keys must be shared, and all parties need
> > > > to be able to verify the storage safety and the identity (in the
> > > > sense of "what binary is executing here") of the key management
> > > > object.
> > > 
> > > Yes.  They can do that socially.
> > 
> > No. The entire point of the need to verify is that you *can't* do that
> > socially, because you are forming a collaboration in which the parties
> > do not have absolute trust in each other. Where absolute trust exists,
> > no verification is necessary.
> >
> > I will note only that absolute trust has never been observed in the
> > wild, and people have been looking for it since (at least) the beginning
> > of recorded history. And I don't just mean computationally.
> 
> I don't know what "absolute trust" means.  But people trust other
> people in ways that are much, much more important to them than
> managing secret keys *all* the time.  Every day, they put their very
> lives into the hands of dozens of strangers.  Just count the number
> of cars that pass you by, and remember that it's just a flick of the
> wrist for the driver to kill you.  "Trust" is nothing special; it's
> just the personal belief in the correctness of something.
> 
> Two observations: First, it's totally ubiquitous, all over the place;
> somebody who does not trust anybody at all would be pathological and
> would have no chance of surviving in a society with other human
> beings.  Second, there is not a bit of a difference between the two
> systems in that regard.  The difference is not whether you have to
> exercise trust, but toward which agents you have to exercise it.  You
> happen to trust the "trusted computing" component manufacturer, and
> you happen to have a deep distrust of basically everybody else.
> Well, for me, it is
> the other way round, for the reasons explained in my posting
> "ownership and contracts".
> 
> It's a simple error of logic to attribute "more trust", in general, to
> the one system than to the other.  "Trust" is a personal conviction,
> and cannot be attributed to an object without a subject.

That soliloquy is not relevant to the point that I was making. Yes,
trust always has a subject. I did not say otherwise. Yes, trust is
always conditional, in the sense that it is predicated on certain
assumptions.

But the factor that you seem to be ignoring is the importance of
*confidence*. Every one of those preconditions is a potential weakness
in the trust relationship. Every precondition that can be verified
strengthens the trust relationship. We (humans) do not engage in risky
collaborations based on trust. We engage in risky collaborations based
on an assessment of benefit and risk (ideally: an informed assessment).

The ability to verify that the preconditions on which a trust decision
is based actually hold is very important to forming these relationships.

In my lab, we have found that the term "trust" is almost always a bad
term for useful discussion. It appeals to all sorts of social
assumptions that are misleading in computational contexts. We try very
hard to replace every use of the word "trust" with "depends on".

In this use case, the users of the group key *depend on* opaque storage
to avoid private key disclosure. The relevant questions now are:

  1. Is this dependency satisfied?
  2. How do the participants know?
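
A sketch, from a single participant's point of view, of how those two
questions might be answered mechanically rather than socially. As with
the earlier sketch, every name here (verify_image, bank_verify_opaque,
send_key_share, the agreed binary hash) is an assumed placeholder, not
a real EROS, Coyotos, or Hurd call.

/* Hypothetical placeholders -- not real system interfaces. */
typedef unsigned long cap_t;
typedef struct { unsigned char bytes[32]; } hash_t;

int verify_image(cap_t process, const hash_t *expected);  /* identity */
int bank_verify_opaque(cap_t bank);                       /* storage  */
int send_key_share(cap_t key_mgr, const void *share, unsigned len);

int contribute_share(cap_t key_mgr, cap_t key_mgr_bank,
                     const hash_t *agreed_binary,
                     const void *share, unsigned len)
{
    /* Question 2 first: how do the participants know?  By checking for
       themselves, rather than taking the other members' word for it. */

    /* Is the key-management object executing the binary the group
       agreed on ("what binary is executing here")? */
    if (!verify_image(key_mgr, agreed_binary))
        return -1;

    /* Question 1: is the opacity dependency actually satisfied, so
       that no individual member can read the private key back out of
       the storage it lives in? */
    if (!bank_verify_opaque(key_mgr_bank))
        return -1;

    /* Both dependencies verified; only now is the share released. */
    return send_key_share(key_mgr, share, len);
}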


shap




