l4-hurd

Re: Restricted storage


From: Michal Suchanek
Subject: Re: Restricted storage
Date: Thu, 8 Jun 2006 14:10:08 +0200

On 6/8/06, Jonathan S. Shapiro <address@hidden> wrote:
> On Wed, 2006-06-07 at 23:42 +0200, Michal Suchanek wrote:
> > On 6/6/06, Jonathan S. Shapiro <address@hidden> wrote:
>
> > > > Even if you verify some chips, there is no guarantee that they will not
> > > > - start producing a new revision
> > > > - give away keys to sign something else than the chips
> > >
> > > There is no "guarantee". However, the financial incentives *not* to do
> > > this are *extremely* powerful.
> > >
> > > One of the recurring problems with security schemes in general is
> > > incentives. In practice, they often rely on some party to preserve some
> > > property or secret, but in reality it is not financially in the
> > > interests of that party to actually preserve it. At best, people get
> > > lazy about such commitments. At worst, they break them explicitly.
> > >
> > > One of the things about TC that is good (from an engineering
> > > perspective) is that the financial incentives of the TC chip vendors
> > > align with the protection that the TC vendors must preserve.
> >
> > I would think the same applies to CAs.
>
> Unfortunately this is not quite the case. The difference is this:
>
> A TC system generates a number. This number has no social significance
> beyond being random and non-colliding. There are no "political" or
> "human factors" that would lead us to prefer any one number over any
> other number.
>
> The purpose of a CA is to establish a human-verified association between
> a random number and some real-world property such as identity, email
> address, or some such. These real-world identifiers have social and
> human significance, and because of this they must be checked by a
> central authority hierarchy. There are very few reasons to cheat about
> the production of the keys themselves. There are *many* reasons why a CA
> might choose to be more or less reliable about the association between
> the key and the real-world identity.

It's pretty much the same in both cases.
The CA gives out keys that are themselves just numbers but attest that
you are a certain entity and that this is backed by the CA.
The TC manufacturer puts keys in chips that are just numbers, but you
can use them to attest that you are running certain software and that
this is backed by the manufacturer.
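
To make the parallel concrete, here is a minimal sketch in Python
(hypothetical names, using the third-party "cryptography" package;
neither a real CA nor a real TPM works exactly like this):

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    def pub_bytes(priv):
        # Raw public-key bytes of an Ed25519 private key.
        return priv.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw
        )

    ca_key = Ed25519PrivateKey.generate()      # held by the CA
    vendor_key = Ed25519PrivateKey.generate()  # held by the TC vendor

    user_key = Ed25519PrivateKey.generate()
    chip_key = Ed25519PrivateKey.generate()

    # The CA signs a binding between a key and a real-world identity.
    ca_cert = ca_key.sign(b"entity=example.org;key=" + pub_bytes(user_key))

    # The vendor signs a binding between a key and "genuine TC chip".
    chip_cert = vendor_key.sign(b"genuine-chip;key=" + pub_bytes(chip_key))

In both cases the certificate is itself just a number; its value comes
entirely from what the signer is believed to check before producing it.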

If the CA leaks some keys (gives them out to somebody other than the
entity named on the key), people would lose faith in the backing
provided by the CA. Since selling keys is their only business, one
would think this hurts the CA. Apparently the CAs do not care enough.

I do not see why this would be different with TC chip manufacturers
and their suppliers.



> > > I'm not saying "TC is good" here. I'm simply saying that this particular
> > > aspect of TC was engineered well and realistically.
> > >
> > > > Plus there is the problem of signing all those chips. How would a US
> > > > chip manufacturer manage that? Will they have the chips signed in
> > > > Taiwan and China, or will they first get all the zillions of chips
> > > > transported to the US and sign them there?
> > >
> > > The chips are not signed, so this is not an issue.
> >
> > How do you tell they are the genuine chips then?
>
> I may have gotten myself confused here. There have been a couple of
> proposed designs and I don't remember which one ended up being
> implemented in TPM.
>
> In one design, there is no per-chip information. All copies of a TPM
> chip contain the same master key. When the chip is initialized, a random
> key is generated and the master key is used to *sign* it. The proof that
> you have a valid TC chip lies in the demonstration that it is able to
> generate a validly signed working key.
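
If I read that first design correctly, it amounts to something like the
following (a sketch only, with the same hypothetical Python setup as
above; the real TPM key types and formats will differ):

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    def pub_bytes(priv):
        return priv.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw
        )

    # The *same* master key is embedded in every chip of this design.
    master = Ed25519PrivateKey.generate()

    def initialize_chip():
        # At initialization the chip generates a fresh working key and
        # endorses it with the shared master key.
        working = Ed25519PrivateKey.generate()
        endorsement = master.sign(pub_bytes(working))
        return working, endorsement

    working, endorsement = initialize_chip()

    # A verifier who knows the master *public* key accepts the working
    # key only if the endorsement checks out (verify() raises otherwise).
    master.public_key().verify(endorsement, pub_bytes(working))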

> In the second design, each chip is assigned a distinct master key by the
> chip vendor, which is signed by a vendor master key. By adding one
> additional step to the signature chain it is no longer necessary to have
> the same key inside every TPM chip from that manufacturer (the
> manufacturer can rotate the key at some point, but that is a side
> issue). In this second design, each chip is, in effect, individually
> signed.
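
The second design would then add one link to the chain, roughly (same
caveats and hypothetical names as the sketches above):

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    def pub_bytes(priv):
        return priv.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw
        )

    vendor = Ed25519PrivateKey.generate()  # one key per vendor, rotatable

    def manufacture_chip():
        # Each chip gets its own master key, certified by the vendor.
        chip_master = Ed25519PrivateKey.generate()
        chip_cert = vendor.sign(pub_bytes(chip_master))
        return chip_master, chip_cert

    chip_master, chip_cert = manufacture_chip()

    # Initialization endorses a working key with the per-chip master key.
    working = Ed25519PrivateKey.generate()
    endorsement = chip_master.sign(pub_bytes(working))

    # Verification walks the chain: vendor -> chip master -> working key.
    vendor.public_key().verify(chip_cert, pub_bytes(chip_master))
    chip_master.public_key().verify(endorsement, pub_bytes(working))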

> I do not know which design was actually adopted. I *suspect* that it was
> the first design because of the simpler process of manufacturing, but
> this is simply a guess.

Either way the chip is signed by the manufacturer's master key. Whether
the keys are the same or different, they must be embedded in the chip
somewhere by somebody.


> > > This is not entirely true. If a single TC chip vendor is compromised,
> > > then the chips supplied by that vendor "die" but chips supplied by other
> > > vendors remain just as "safe" as they were before.
> > >
> > > In the eyes of the user, this is no worse than having a shipment of
> > > motherboards all of which [go] bad.
> >
> > Given that there are about half a dozen chip vendors, compromising one
> > of them would have a much greater impact. Moreover, the capacitors only
> > stopped working, and only a few boards at a time. Compromising the chips
> > would completely break the security of a large number of systems at
> > once. Even systems that do not use the chips directly but rely
> > (relied) on them to attest to remote parties.
>
> Yes. I think you are right, but we should distinguish two cases here:
>
>   1. The ability to do attestation in this scenario is lost.
>   2. The ability to recover locally-stored encrypted information is
>      retained.
>
> What is disrupted is commerce, not local enforcement. I agree that this

What do you mean by commerce? You wanted to rely on attestation for
any and all security. By removing the attestation, the OS is no longer
secure.

> is more disruptive than the motherboard case, but I think that the
> impact on a given user depends very much on how much electronic commerce
> of this type they do.

But you wanted to rely on TC (attestation), rather than on the
administrator, for local enforcement. So when the TC is broken, local
enforcement no longer exists.

Thanks

Michal



