"J. Noel Chiappa" wrote:

>     > From: Ed Gerck <[EMAIL PROTECTED]>
>
>     > maybe this is what the market wants -- a multiple-protocol Internet,
>     > where tools for IPv4/IPv6 interoperation will be needed ... and valued.
>
> This relates to an approach that seems more fruitful, to me - let's try and
> figure out things that sidestep this incredibly divisive, upsetting and
> fundamentally unproductive argument, and try and find useful things we can do
> to make things work better.

I suggest we first revisit the concept of collaboration itself.  IMO, collaboration
can no longer be understood as similar agents doing similar things at the same
time, but as different agents doing different things at different times, toward the
same objective.  Then we can build protocols that support this notion of
collaboration, where diversity is not ironed out by simplifying hypotheses but is
actually *valued* and used in interoperation.

>     > Which can, undoubtably, be put in a sound theoretical framework for
>     > NATs, in network topology. NATs do not have to be a hack.
>
> Well, the fundamental architectural premise of NAT's *as we know them today* -
> that there are no globally unique names at the internetwork level - is one
> which is inherently problematic (long architectural rant explaining why
> omitted).

That fundamental premise is trivially true (so, no need for a rant ;-) ). However,
this is not what I was referring to; I think we are talking about something
even more fundamental.  A topology is, simply put, a division of space.

In these terms, data is no longer an absolute quantity.  Indeed, when thinking
about data in communication processes (networks), it has so far seemed
possible and undisputed to regard data as "information in numerical form
that can be digitally transmitted or processed", whose total quantity is
preserved when a system is divided into sub-systems or when different data
from different sources are compared. Actually, this picture is wrong to a large
extent, and NATs are the living proof of it -- there are natural laws in
cyberspace, too.

The very concept of data thus needs to be revisited. Suppose we define data as the
*difference* D2 - D1 that can be measured between two states of data systems.
Then it can be shown that this difference can be measured by means of a
communication process only if 1 and 2 are two states of the same closed system.
When they are not, NATs are a solution that creates a third system, a common
reference between 1 and 2 -- which can be conceptual or physical or both, but is
needed. In this formalism, a numerical value for data can be defined even though
1 and 2 may belong to different systems, or even though the data systems may be
open -- the only restriction is to have a common reference.
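
To make the third-system idea concrete, here is a minimal sketch in Python,
with hypothetical names and mappings (the 192.0.2.x addresses are just
documentation address space): two systems use the same private value, and it
only becomes comparable once both are mapped through a shared reference
table, much as a NAT maps private addresses into a common space.

    # Hypothetical sketch: two "closed systems" with overlapping private
    # addresses.  The value "10.0.0.5" means something different in each
    # system, so the difference between them is not measurable directly.
    system_a = {"host": "10.0.0.5"}
    system_b = {"host": "10.0.0.5"}

    # The NAT-like third system: an arbitrary but *common* reference that
    # both sides agree to map into.  Only after this mapping can the two
    # values be compared at all.
    common_reference = {
        ("A", "10.0.0.5"): "192.0.2.10",   # assumed mapping for system A
        ("B", "10.0.0.5"): "192.0.2.20",   # assumed mapping for system B
    }

    def measure(system_name, private_value):
        # "Measurement" is only defined relative to the common reference.
        return common_reference[(system_name, private_value)]

    print(measure("A", system_a["host"]))   # 192.0.2.10
    print(measure("B", system_b["host"]))   # 192.0.2.20
    # Identical private values map to distinct, comparable points in the
    # shared space -- the comparison exists only because the reference does.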

This is the mind-picture we need to overcome, IMO -- that data is absolute. It is
not, and this implies that we need to find "data laws" in order to describe
exchanges of data, much in the same way we needed to develop the laws of
thermodynamics in order to describe exchanges of energy (itself not an absolute
concept, either).

> So I don't think that the classic NAT model is a good idea, long-term.

I suggest we don't yet have a "NAT model" in an engineering sense, where
a model fits into a larger model and so on. All we have is a "NAT hack".
And I agree that the NAT hack is not a good idea, even mid-term.

> However, I think it's a bit of a logical fault to think that the only options
> are i) IPv6 and ii) NAT's.

Yes, especially NATs as they are -- born somewhat out of need, not so
much out of design.

>     > NATs ... seem to have been discovered before being modeled, that is
>     > all.
>
> Umm, not quite, IIRC. Papers by Paul Tsuchiya and Van Jacobsen discussed the
> concept a long time before any were commercially available.

They discussed the concept, much as one may argue that telegraph systems did
when they needed to define telegraph codes at each station, so that the
different "John Smith"s could each get their proper messages even though
they all "shared" the same name.

That is not what I meant. What I meant is an ab initio model of data in
network systems, where NATs are one instance of a third system that is
*needed* in order to provide a common but quite arbitrary reference for
"measuring" data between different systems, without requiring any
change to them.  In such a formalism, there are data levels NATs can handle
and others they cannot, try as one may -- which needs to be recognized and
provided for, in each case, by yet other objects.
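
As a hedged illustration of that level distinction (hypothetical names and
addresses again, in Python): a NAT-style rewrite handles the address at the
header level, but an address embedded in the application payload escapes the
translation and would need one of those "other objects" to handle it.

    # Hypothetical sketch of the "level" limitation: a NAT-style rewrite
    # that only touches the header leaves a private address embedded in the
    # payload untouched, so the other side receives a reference it cannot
    # resolve.
    nat_table = {"10.0.0.5": "192.0.2.10"}   # assumed private -> public mapping

    packet = {
        "src": "10.0.0.5",                       # header level: NAT can handle this
        "payload": "connect back to 10.0.0.5",   # application level: it cannot
    }

    def nat_rewrite(pkt):
        # Rewrites only the header field it knows about.
        return {**pkt, "src": nat_table.get(pkt["src"], pkt["src"])}

    translated = nat_rewrite(packet)
    print(translated["src"])       # 192.0.2.10 -- translated
    print(translated["payload"])   # still mentions 10.0.0.5 -- needs yet
                                   # another object to be provided for this case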

Cheers,

Ed Gerck
