l4-hurd

Re: Restricted storage


From: Jonathan S. Shapiro
Subject: Re: Restricted storage
Date: Wed, 07 Jun 2006 18:17:22 -0400

On Wed, 2006-06-07 at 23:42 +0200, Michal Suchanek wrote:
> On 6/6/06, Jonathan S. Shapiro <address@hidden> wrote:
> > Actually, this need not be true. It is possible (on top of TC) to
> > construct a keying system in which the administrator does not manage the
> > keys -- or at least: can manage them only in "opaque" form in a way that
> > does not permit them to be used or inspected.
> 
> oh, so you argue that if TC is reliable it can make TC reliable?

Hmm. I was not arguing that, but I think it is true. Your original
comment concerned key generation on a host being vulnerable to its
administrator.

However, it does indeed appear to be true that a single, TC-capable
system would be capable of key generation that its administrator could
not inspect, provided that all of the software was suitably assured and
trustworthy. I had not considered this.

> Would  they use chips from another vendor for that, or their own chips?

It doesn't really matter. In either case the bootstrap TC system needs
to be hand-assured.

> > > Even if you verify some chips, there is no guarantee that they will not
> > > - start producing a new revision
> > > - give away keys to sign something else than the chips
> >
> > There is no "guarantee". However, the financial incentives *not* to do
> > this are *extremely* powerful.
> >
> > One of the recurring problems with security schemes in general is
> > incentives. In practice, they often rely on some party to preserve some
> > property or secret, but in reality it is not financially in the
> > interests of that party to actually preserve it. At best, people get
> > lazy about such commitments. At worst, they break them explicitly.
> >
> > One of the things about TC that is good (from an engineering
> > perspective) is that the financial incentives of the TC chip vendors
> > align with the protection that the TC vendors must preserve.
> 
> I would think the same applies to CAs.

Unfortunately this is not quite the case. The difference is this:

A TC system generates a number. This number has no social significance
beyond being random and non-colliding. There are no "political" or
"human factors" that would lead us to prefer any one number over any
other number.

The purpose of a CA is to establish a human-verified association between
a random number and some real-world property such as identity, email
address, or some such. These real-world identifiers have social and
human significance, and because of this they must be checked by a
central authority hierarchy. There are very few reasons to cheat about
the production of the keys themselves. There are *many* reasons why a CA
might choose to be more or less reliable about the association between
the key and the real-world identity.

> > I'm not saying "TC is good" here. I'm simply saying that this particular
> > aspect of TC was engineered well and realistically.
> >
> > > Plus there is the problem of signing all those chips. How would a US
> > > chip manufacturer manage that? Will they have the chips signed in
> > > Taiwan and China, or will they first get all the zillions of chips
> > > transported to the US and sign them there?
> >
> > The chips are not signed, so this is not an issue.
> 
> How do you tell they are the genuine chips then?

I may have gotten myself confused here. There have been a couple of
proposed designs and I don't remember which one ended up being
implemented in TPM.

In one design, there is no per-chip information. All copies of a TPM
chip contain the same master key. When the chip is initialized, a random
key is generated and the master key is used to *sign* it. The proof that
you have a valid TC chip lies in the demonstration that it is able to
generate a validly signed working key.
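A toy sketch of this first design may help (Python; HMAC stands in for
the asymmetric signature scheme a real TPM would use, and all names
here are hypothetical illustrations, not the actual TPM interfaces):

```python
import hmac, hashlib, secrets

# In this design every chip ships with the SAME master key baked in.
MASTER_KEY = secrets.token_bytes(32)

def initialize_chip():
    """On first use, generate a random working key and sign it
    with the shared master key."""
    working_key = secrets.token_bytes(32)
    signature = hmac.new(MASTER_KEY, working_key, hashlib.sha256).digest()
    return working_key, signature

def looks_genuine(working_key, signature):
    """A verifier checks that the working key carries a valid
    master-key signature -- the proof the chip is a real TC chip."""
    expected = hmac.new(MASTER_KEY, working_key, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

wk, sig = initialize_chip()
assert looks_genuine(wk, sig)
assert not looks_genuine(secrets.token_bytes(32), sig)  # forged key fails
```

Note that with a symmetric MAC the verifier would also need the master
secret; a real deployment uses public-key signatures so that only the
chips hold signing material. The structure of the check is the same.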

In the second design, each chip is assigned a distinct master key by the
chip vendor, which is signed by a vendor master key. By adding one
additional step to the signature chain it is no longer necessary to have
the same key inside every TPM chip from that manufacturer (the
manufacturer can rotate the key at some point, but that is a side
issue). In this second design, each chip is, in effect, individually
signed.
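The second design can be sketched the same way: the only change is one
extra link in the signature chain (again Python with HMAC as a
stand-in, and all names hypothetical):

```python
import hmac, hashlib, secrets

def sign(key, data):
    return hmac.new(key, data, hashlib.sha256).digest()

def check(key, data, sig):
    return hmac.compare_digest(sign(key, data), sig)

vendor_master = secrets.token_bytes(32)

# At manufacture: each chip gets its OWN distinct key, certified
# by the vendor master key.
chip_key = secrets.token_bytes(32)
chip_cert = sign(vendor_master, chip_key)

# At initialization: the chip signs its freshly generated working key.
working_key = secrets.token_bytes(32)
working_cert = sign(chip_key, working_key)

# Verification walks the two-link chain back to the vendor.
assert check(vendor_master, chip_key, chip_cert)
assert check(chip_key, working_key, working_cert)
```

The extra link is what lets the vendor rotate or revoke per-chip keys
without every chip sharing a single secret.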

I do not know which design was actually adopted. I *suspect* that it was
the first design because of the simpler process of manufacturing, but
this is simply a guess.

> > This is not entirely true. If a single TC chip vendor is compromised,
> > then the chips supplied by that vendor "die" but chips supplied by other
> > vendors remain just as "safe" as they were before.
> >
> > In the eyes of the user, this is no worse than having a shipment of
> > motherboards all of which [go] bad.
> 
> Given that there are about half a dozen chip vendors, compromising one
> of them would have much greater impact. Moreover, the capacitors only
> stopped working, and a few boards at a time. Compromising the chips
> would completely break the security of a large number of systems at
> once. Even systems that do not use the chips directly but rely
> (relied) on them to attest to remote parties.

Yes. I think you are right, but we should distinguish two cases here:

  1. The ability to do attestation in this scenario is lost.
  2. The ability to recover locally-stored encrypted information is
     retained.

What is disrupted is commerce, not local enforcement. I agree that this
is more disruptive than the motherboard case, but I think that the
impact on a given user depends very much on how much electronic commerce
of this type they do.
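The distinction between the two cases can be made concrete with a toy
sketch of sealing (Python; the XOR keystream cipher and all names are
illustrative only, not what a real TPM does internally):

```python
import hmac, hashlib, secrets

chip_storage_key = secrets.token_bytes(32)  # never leaves the chip

def keystream(key, nonce, length):
    """Derive a keystream of the requested length from key and nonce."""
    out, counter = b"", 0
    while len(out) < length:
        block = nonce + counter.to_bytes(4, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def seal(data):
    nonce = secrets.token_bytes(16)
    ks = keystream(chip_storage_key, nonce, len(data))
    return nonce, bytes(a ^ b for a, b in zip(data, ks))

def unseal(nonce, blob):
    ks = keystream(chip_storage_key, nonce, len(blob))
    return bytes(a ^ b for a, b in zip(blob, ks))

n, blob = seal(b"locally sealed records")
assert unseal(n, blob) == b"locally sealed records"
# No vendor key or certificate appears anywhere above: a compromised
# vendor breaks attestation (case 1), not recovery of locally sealed
# data (case 2).
```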

shap




