<html>
<body>
At 14:04 16/09/2013, Suresh Ramasubramanian wrote:<br>
<blockquote type=cite class=cite cite="">As the ciphers in question are
potentially used worldwide.. I would encourage anybody in the caucus who
has specific skills in this area to submit their
feedback</blockquote><br>
Dear Suresh,<br><br>
You want personal responses. Here is mine, for the record
(<a href="http://iucg.org/wiki/20130917_-_Requested_comment_on_igcaucus" eudora="autourl">
http://iucg.org/wiki/20130917_-_Requested_comment_on_igcaucus</a>). What
follows, I must stress, is my own personal theory (in Greek: a commented
observation) and refers only to my experience, reflection, and
action.<br><br>
The response to your request does not lie in cryptographic details, and it
is already under way. I am sorry to return repeatedly to the same RFC 6852
topic, but this response is that RFC. The RFC 6852 process results from
the long and convincing work of Russ Housley (then the IETF Chair and now
the IAB Chair), one of the Internet's leading independent network-security
experts. <br><br>
<br>
<b>The cryptographic misunderstanding<br><br>
</b>You call for feedback from people with the necessary expertise. The
minimum expertise we all now share is that cryptography has fully scaled
to 4D, with a time dimension. The “normal security
<i>practitioners</i>” (we all are) have learned from the NSA that there is
evidence that “normal” algorithmic protection extends to the scientific,
technical, political, and other fields of the prior standardization, R&D,
and deployment processes.<br><br>
There are no “NSA stooges”: there are dedicated patriots, employees,
activists, and volunteers from every walk of life, with national,
corporate, ethical, and civic concerns, who have accumulated expertise at
many of the complexity layers involved and share similar yet possibly
opposing precautionary duties. They are engaged in national, business,
civil-rights, and ethitechnical global counterwars aiming at small
normative victories which may prevent major economic defeats,
civilizational conflicts, or global catastrophes. RFC 4646 and RFC 5895,
in my own case, belong to this kind of victory, in the area of practical
multicultural support, and responded to my French constitutional
precautionary duty.<br><br>
The French Constitution states that man has a growing influence on the
conditions of his own life and evolution; this makes the preservation of
the [digital] environment equal to the other fundamental interests of the
Nation. As a consequence, everyone has the right to live in a balanced
environment of every kind that is respectful of health, and the duty to
take part in the preservation and improvement of this healthy environment
in all its forms. “Health is a state of complete physical, mental, and
social wellbeing and not merely the absence of disease or infirmity”
(Constitution of the World Health Organization, July 22, 1946): it
necessarily includes physical integrity, active cultural respect, and the
protection of personal privacy.<br><br>
<br>
<b>Technical separation from USG influence?<br><br>
</b>Through the OpenStand RFC 6852, ISOC, IEEE, and W3C (including the
IAB and IETF) have disengaged themselves from the direct USG affiliation
that they had in fact documented and called for in RFC 3869. Their
“modern” OpenStand standardization paradigm has various direct
"nethical" (to use this portmanteau word for “ethics in network
design, use, and management”) impacts: <br><br>
1. As a result, IEEE, W3C, ISOC (and its IETF and IAB affiliates), as
well as the other OpenStand signatories, are no longer striving for any
of humanity's "common good" (who could legitimately decide it for
them, now that USG funding, deemed technologically “neutral”, no longer
does?), nor to support the “core values” stated in RFC 3935, which
were those of a liberal America (the English language, however, seems to
remain the technical lingua franca). <br><br>
RFC 3935, which documents the IETF mission, stated: “The Internet isn't
value-neutral, and neither is the IETF. We want the Internet to be useful
for communities that share our commitment to openness and fairness. We
embrace technical concepts such as decentralized control, edge-user
empowerment and sharing of resources, because those concepts resonate
with the core values of the IETF community. These concepts have little to
do with the technology that's possible, and much to do with the
technology that we choose to create.”<br><br>
Faced with the influence of the different stakeholders (starting with
governmental biases), this has been replaced by transparent (?)
merchantability.<br><br>
2. As a consequence, internet engineering and manufacturing have now
reached a commercially mature level at which they are influenced only by
the commercial results obtained from their clients' "global communities".
<br><br>
3. Governments together form one of these global communities. Within that
community, the USG is a leader as well as a dominant direct client for
the industry. This is something we already know, just as we know we have
to live with the influence of the GAC at ICANN, or in other industrial
sectors, such as aeronautics. <br><br>
4. Civil society is also a global community, gathering much smaller but
far more numerous customers. It must use its weight to make technologists
listen to its needs.<br><br>
<br>
<b>How should we behave?<br><br>
</b>There are two complementary ways for the Civil Society to address
this new situation:<br><br>
1. as citizens, its members should watch and take part in their
national legislative processes, calling for a clear separation between
cyber-civil and cyber-war specific issues (cf. “NATO's Tallinn
Manual”, documenting the international laws of cyberwar: the US/Russia
agreement over Syria is inspired by some of its thinking).<br><br>
2. as “intelligent use” customers with a coherent and concerted
purchasing strategy:<br><br>
2.1. Issuing pertinent technical specifications that they can defend in
the OpenStand standardization process through a respected and loyal
appeal procedure:<br><br>
- this is what I am trying to ascertain right now, and what the IESG and
IAB are actually trying to disregard, leading me to appeal to ISOC to
test their willingness to consider us as partnering stakeholders instead
of obedient consumers.<br>
<br>
- this is the rightful, proper appeal procedure that UK cryptographers
(<a href="http://threatpost.com/uk-cryptographers-call-for-outing-of-deliberately-weakened-protocols-products/102301">
http://threatpost.com/uk-cryptographers-call-for-outing-of-deliberately-weakened-protocols-products/102301</a>)
should be in a position to engage.<br><br>
2.2. Showing their own ability (like the other stakeholder groups, and in
particular governments) to call upon their own grassroots solutions
should they not trust the commercial, public, or international technical
ones proposed to them. Our negotiation/influence power is an enhanced
P2P vision of network use.<br><br>
<br>
<b>Let’s get real<br><br>
</b>In this process, we must accept that the internet's current
architectural concepts and logic, now forty or more years old – in spite
of the time, effort, and investment that many people have spent on
enhancing their surety, security, neutrality, and privacy protection –
have led to the vulnerable solutions of today. We therefore have to be
prepared to advocate, explore, research, document, develop, test,
validate, and deploy entirely new approaches. As a first move, OSI level
six, the presentation layer, is of the essence.<br><br>
However, this is not enough. As Karl has mentioned, open logic and
source-code access can no longer ensure our security by themselves. Our
security must first, and increasingly, be buried deep in the principles
of the chosen architecture and in the way of thinking, designing, and
writing its protocols and programs. I suppose that the datagram is going
to remain the core, but nothing guarantees that we will not need
byte-oriented “virtual datagrams”, as was initially the case in the first
technology of the InterNet (international
network).<br><br>
The true problem that we face is the "singularity" that Raymond
Kurzweil of Google and several others forecast for 2019/2020 (computers
becoming “more clever” than humans) – because this means that we can no
longer trust the technology not to develop faster than we
expect. <br><br>
NB. This is true in both of the visions that we started debating in the
early 1980s at Tymshare (then the international network leader):
computers can surpass human intelligence either by <u>augmenting</u> its
thinking quality (Doug Engelbart’s “Augment” division) or by
<u>extending</u> its processing capacity and quantity (my “Tymnet
Extended Services” department). <br><br>
<br>
<b>The consequence we must accept<br><br>
</b>This is why an architectonic debate is necessary, to discuss what
lies below the architectural basis: the esthetic of what humanity wants
to achieve, and the agoric (from the agora, the place where all ideas
and goods were exchanged) extended form of thinking (i.e. a generalized
multidimensional logic) that we will need. <br><br>
Trust was the basis of our civilization; now that it has gone out of
control, we MUST base our civilization on mutual distrust. The only way
we can do this is by finding and developing an extended way of thinking
that makes it possible to demonstrate to oneself the dependability of the
choices one makes. We must scale up Aristotle’s logic, which based new
trust on the dialectic comparison of two already trusted premises. This
is because we cannot control all the mutual influences, lobbies, biases,
agendas, etc. of a permanent polylectic process. The NSA is just a
telling, notorious example of our most common daily reality.<br><br>
How can we make this possible? Actually, it is simple enough to
understand: <br><br>
- just as <b>encryption protection</b> is based upon the idea that, given
the existing state of the technology, breaking it will take longer than
the time the data needs to stay protected, <br><br>
- <b>dependability</b> means that the effort to discover how to
circumvent the coherence of my thinking must take longer than the time I
need to pursue that thinking along that coherence. This is, in fact,
scientific thinking: scientists are under the pressure of new scientific
discoveries, and digital users are under the pressure of new intellition
(cf. below) discoveries.<br><br>
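The parallel drawn above rests on a time comparison: protection holds as long as the effort needed to break it exceeds the period for which protection is required. The following is a minimal sketch of that work-factor idea; the guess rate and lifetimes are hypothetical numbers chosen for illustration, not claims about any real adversary.<br><br>

```python
# Sketch of the "work factor" idea: a key is adequate if the expected
# exhaustive-search time exceeds the period the data must stay protected.
# The guess rate (1e12 guesses/s) is a hypothetical figure.

def brute_force_years(key_bits: int, guesses_per_second: float) -> float:
    """Expected years to search half the key space at the given guess rate."""
    seconds = (2 ** (key_bits - 1)) / guesses_per_second
    return seconds / (365.25 * 24 * 3600)

def is_adequate(key_bits: int, guesses_per_second: float,
                protection_years: float) -> bool:
    """True if the expected search time exceeds the required lifetime."""
    return brute_force_years(key_bits, guesses_per_second) > protection_years

# A 56-bit key fails for a 10-year secret against this hypothetical rate,
# while a 128-bit key comfortably exceeds it.
print(is_adequate(56, 1e12, 10))   # False
print(is_adequate(128, 1e12, 10))  # True
```

The same inequality, applied to the coherence of one's thinking rather than to a key space, is what the dependability bullet above describes.<br><br>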
<br>
<b>Dependability is not easy to ensure<br><br>
</b>However, if discovering a way to circumvent the coherence that I have
decided to adopt is possible, nothing guarantees that smarter and
smarter machines cannot outsmart me faster than I wish. There is,
therefore, only one way to protect myself: to target an indefinite
coherence, i.e. for my own coherent way of seeing reality to always be
smarter than any NSA hacker's way of seeing that reality.
<br><br>
Such a target was not possible for encryption, because encryption is
based upon a logic hierarchy (i.e. an “if – then” dialectic). It is
possible in the coherence case, because coherence is a meshed network of
polylectic (if – if – if …. then if ...) intricate loops. This is
agorics: an n-body, genuinely new way of thinking and demonstrating that
we have to discover, which generalizes one-body cybernetics and two-body
logic. The human brain can consider and design it, but 135 years ago
Poincaré showed (with deterministic chaos, relativity, quantum physics,
unification, etc. as a result) that it cannot perform it alone: agoric
thinking needs to be facilitated. Logic could be ported to mathematics
and is, as a result, limited by its incompleteness; we now have to see
how agorics can behave in extending the neuronal process that we are used
to with the Turing machine process and its physical (communication) and
intellectual (intellition) networking. <br><br>
However, we have to understand that with agorics there are no longer
“reasonable conclusions” as there are in logic. In agorics, there are
reflective dynamic emergences. <br><br>
Here, we have to think farther than the programmed logic that would lead
us to <u>truth</u> if we could trust all of it. To that end, we have to
harness the unlimited probabilistic quantum coherence of <u>life</u> and
natural complexity. There is no longer a trusted third party; there is
only the coherent dynamic of the universe, which we can only predict
through a limited set of information and disinformation that we have to
deconstruct and rebuild by applying our intelligence to its different
components. This is the process that I name with the portmanteau word
“intellition” (intelligence applied to the information and disinformation
gathered through communications in order to make coherence emerge). This
is also why we need Shannon’s information theory to be completed by a
communication theory extending up to intercomprehension (it is only at
that layer that we can check, by the resulting coherence, that the
communication process has not been tampered with), and by an intellition
theory, which is already at work in big-data mining and the omnipresent
algorithmic governance and its social-engineering implications.<br><br>
<br>
<b>A general process<br><br>
</b>This certainly still calls for a few more geniuses to help us make
it, but we have already done some homework since Plato and
Aristotle, through the Middle Ages, Vinci, Pascal, Descartes, Newton,
Leibniz, Kant, Frege, Poincaré, Einstein, Russell, Bohr, Gödel,
Wittgenstein, Turing, Wiener, Couffignal, Shannon, Thom, Pouzin,
Engelbart, Cerf, Kahn, Chaitin, Per Bak, Connes, Ulanowicz, etc., in
order to creep out of Plato’s Cave by using our intelligence, as he and
Aristotle predicted we would once we could get rid of the Greek gods and
clarify what science was supposed to explain... until probability, chaos,
entropy, and negentropy came in.<br><br>
What is exciting is that this process is most probably coherent with the
general process that shapes and maintains reality through the natural
self-organization of universal, historical, technical, constitutional,
societal, political, human, etc. and natural architectonical complexity.
This is why everything is intricate: when we want to protect our privacy
from governments and marketers, governments are concerned by the
architectonics of cyber-sovereignty and defense, majors are concerned by
IPR laws, and so on.
<br><br>
<br>
<b>How to move on?<br><br>
</b>Until Snowden, all of this was deemed fairy tales by the civil
public, and I was a lunatic troll when trying to moderate the USG- and
industry-biased influence. I therefore needed to proceed along the
then-existing rules. This is why I created the IUCG@IETF
(<a href="http://iutf.org/wiki">http://iutf.org/wiki</a>) site and
mailing list as a Civil Society gateway to the standardization
process.<br><br>
It is now available to whoever wants to use it to debate these serious
issues and to consider the intelligent use of internet technology (its
use, not necessarily a change to it) in a more person-oriented technical
context. To help this process, I have created the
<a href="http://openarch.net">http://openarch.net</a> IUCG
portal.<br><br>
My personal priority, once the ISOC RFC 6852 appeal is completed, will be
to try to exemplify this “Iuse” through an architectural set of
specifications and applied examples showing how best to intelligently use
and propose the existing internet technology lower layers, even if this
may complete and compete with the sole way in which the US government and
industry (statUS-quo) wished us to use the TCP/IP world and people’s
infrastructure.<br>
<br>
jfc<br><br>
<br><br>
</body>
</html>