l4-hurd


From: Marcus Brinkmann
Subject: Re: Potential use case for opaque space bank: domain factored network stack
Date: Mon, 08 Jan 2007 02:24:04 +0100
User-agent: Wanderlust/2.14.0 (Africa) SEMI/1.14.6 (Maruoka) FLIM/1.14.7 (Sanjō) APEL/10.6 Emacs/21.4 (i486-pc-linux-gnu) MULE/5.0 (SAKAKI)

At Sun, 07 Jan 2007 14:13:14 -0500,
"Jonathan S. Shapiro" <address@hidden> wrote:
> 
> On Sun, 2007-01-07 at 03:02 +0100, Pierre THIERRY wrote:
> > Scribit Marcus Brinkmann dies 07/01/2007 hora 02:35:
> > > The right question to ask is if there is a significant difference in
> > > the harm that can result from such an arrangement if I am out of
> > > control compared to when I retain control.  I believe that to be the
> > > case.
> 
> PLEASE be SPECIFIC! Control over *what*!?

No need to shout.  Control over inspection and modification of the
memory resource, in the context of the above quote.
 
> > I may not remember well the past debate, but did you already give facts
> > supporting that belief? I'm not sure that opaque memory as we discussed
> > it now can do any harm per se.
> 
> So far as I can tell, there is harm in both cases. The harm of the
> "opaque memory" design is the possibility of non-transitive copy and
> non-inspectable code.
> 
> The immediate harms of the "transparent memory" proposal are (1) a
> requirement for hierarchy, which has been found in other systems to
> inhibit or defeat the Principle of Least Authority (POLA). POLA has been
> found to be a very effective architectural tool in achieving robust
> system designs.

Do you have a reference that I can check?  I would like to see if
those lessons apply to my proposal at all, and if the word hierarchy
means the same in that context as here.  I note that the EROS space
bank is hierarchical as well, and it does not inhibit POLA either.
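To make concrete what is at stake in "control over inspection" under a hierarchical bank, here is a toy model. This is not the EROS or Hurd space bank interface; the class names, the `opaque` flag, and the ancestor-walk rule are all illustrative assumptions, sketching only the one property under debate: whether an ancestor bank can inspect the pages allocated from a descendant.

```python
# Toy sketch (NOT the EROS/Hurd API): hierarchical space banks where a
# child bank may be created "opaque", blocking inspection by ancestors.

class SpaceBank:
    def __init__(self, parent=None, opaque=False):
        self.parent = parent    # enclosing bank, or None for the root
        self.opaque = opaque    # True: ancestors cannot inspect our pages
        self.pages = []         # pages allocated directly from this bank

    def create_child(self, opaque=False):
        return SpaceBank(parent=self, opaque=opaque)

    def allocate(self, data):
        page = {"data": data}
        self.pages.append(page)
        return page

    def inspect(self, bank):
        """Return `bank`'s pages if we are `bank` itself or an ancestor
        of it, and no opaque bank lies on the path in between."""
        if bank is self:
            return list(bank.pages)  # a bank can always see its own pages
        node = bank
        while node is not None and node is not self:
            if node.opaque:
                raise PermissionError("opaque bank blocks inspection")
            node = node.parent
        if node is None:
            raise PermissionError("not an ancestor of this bank")
        return list(bank.pages)


root = SpaceBank()
transparent = root.create_child()
hidden = root.create_child(opaque=True)
transparent.allocate("user data")
hidden.allocate("secret")

root.inspect(transparent)   # succeeds: path is fully transparent
# root.inspect(hidden)      # would raise PermissionError
```

In the "transparent memory" position, `opaque=True` simply does not exist: whoever supplies the resource retains the authority to inspect it. The "opaque memory" position adds exactly that flag, and the argument is over whether the policies it enables are worth the control it removes from the resource provider.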

I also want to point out that most systems deployed today do not
implement POLA, so the harm, if it exists at all, is at most
opportunistic.

> (2) The apparent need to introduce a significantly more
> complicated and more vulnerable resource management model (partitioned
> resource types, a.k.a. network space banks) to handle as a special case
> those places where opaque memory was desirable.

First of all, there is no need at all, apparent or not.  I have said
before that I have no principled objections to mechanisms, only to
policy.  Second, the discussion of that proposal is ongoing, given
that I only posted it yesterday and you seem to have partly
misunderstood it.

> The larger harm of the "transparent memory" proposal is that we do not
> (yet) have any comprehensive description of an overall system design
> based on this model, and we certainly have no security design for it
> (yet).

Let's cut to the chase.  The issue is not the lack of formal
descriptions and models.  The trade-offs are clear enough.  The real
issue is that my proposal makes it impossible or hard to express and
implement certain security policies compared to EROS.

Whereas you consider these excluded security policies valuable, I
consider them a threat.  The underlying conflict is not a technical
one, but a political one.  We seem to have very different views on
social, political and economic virtues, strategies and assumptions,
which, given our different personal life experiences, may not be very
surprising.

The realization that these values influence the technical design
parameters of a security system is also not surprising, given that
security design is about the formulation and implementation of
security policies, which are mechanisms for exercising control (in
this particular case: control over the flow of information).  That
control directly translates to social, political and/or economic
power.

Research into the interrelationship of IT security and social,
political and economic issues is a relatively new discipline, but it
is happening (I posted a link to Ross Anderson's work in this thread).
With the increasing impact of IT security on people's lives, this will
only gain momentum.  As a corollary, it becomes less viable (and
attractive, IMO) to argue about security mechanisms based on their
technical merits alone, as if they existed in isolation.

I am as lured as anybody by an elegant mathematical argument and a
conclusive formal design, and will rip my left arm out to achieve that
(figuratively).  But, as is certainly true for any of us, my heart
bleeds for the people, not for technology.  If that means I have to
work harder, because the requirements in line with my personal social,
political and economic goals require a more complicated or less
elegant system design, then I know where my priorities are.

These personal goals can be very different from one person to another,
and for very good reasons.  As we are a bunch of particularly friendly
and smart people here, I am strongly convinced that this is not a
major obstacle to mutual understanding and cooperation.  We all have a
lot to gain, both technically and personally, from such interactions.
I do believe, however, that one critical step in reaching that mutual
understanding is to understand the political, social and economic
virtues, strategies and assumptions that motivate the technical
aspects of the design, and to take them into account in the
evaluation.

> I completely support Marcus in his view that the "transparent memory"
> proposal is worth exploring, but in my opinion it would be irresponsible
> to design this assumption into a widely deployed system until its
> implications are more fully understood.  My concern is that I do not see
> the necessary design work occurring that would determine that. This may
> be simply because that discussion is not occurring here.

Jonathan, I couldn't have said it any better, but the same applies to
the system design you propose, should it be widely deployed, and to
its social, political and economic implications as well as its
technical ones.

Fairness requires that I apply this requirement equally to the system
I am proposing, and I am willing to do that.  However, please note
that virtually all systems widely deployed today do have "transparent
memory"; do you know of any exceptions?  I am not saying that this
relieves one of the obligation to reevaluate this decision, but it is
not as if I am proposing anything new, quite the opposite.  My
position on this is decidedly conservative.

Thanks,
Marcus





