[governance] FoE rights
Meryem Marzouki
marzouki at ras.eu.org
Wed Apr 5 12:57:29 EDT 2006
Hi Milton and all,
Back to substantive issues! Late, as usual, but in any case I'm
afraid editorial changes (even major ones) wouldn't be enough for me
to entirely agree...
I understand your point about 'ethics', though I think it would be
clearer to talk about 'corporate social responsibility' (or 'societal'
responsibility, if you prefer, but I personally find this
social/societal distinction rather artificial. I understand, however,
where it comes from). At the very least, it would be helpful to specify
an 'ethical obligation for companies' to comply with FoE rights. The
text of your proposal
refers to "ethical interactions with the governments of countries
that heavily regulate and censor content".
-> This point would indeed be easily addressed by minor editorial
changes.
But, while we agree on the intentions and the objectives, there are
more contentious points with this proposal.
The text starts with the following question: "Are the Internet
filtering and censorship practices of states compatible with Article
XIX of the UN Declaration on Human Rights?". I'm afraid the answer,
when the question is asked in such an arena, is likely to be another
question: "Is an unfiltered and uncensored Internet compatible with
Article XXIX of the UN Declaration on Human Rights?"
Article 29 says:
"(1) Everyone has duties to the community in which alone the free and
full development of his personality is possible.
(2) In the exercise of his rights and freedoms, everyone shall be
subject only to such limitations as are determined by law solely for
the purpose of securing due recognition and respect for the rights
and freedoms of others and of meeting the just requirements of
morality, public order and the general welfare in a democratic society.
(3) These rights and freedoms may in no case be exercised contrary to
the purposes and principles of the United Nations."
I don't think I need to remind anyone that the provisions of Article
29 are reaffirmed in the Geneva Declaration of Principles (para 5).
In addition, in case you haven't noticed, among the 11 post-WSIS
action lines, there is no "human rights" action line, but rather
"Ethical dimensions of the Information Society" (C10), and among the
more than 15 UN commissions or agencies respectively in charge of
their implementation and follow-up, only one is strangely missing:
the Office of the High Commissioner for Human Rights (OHCHR)! Even
the ICAO is referred to, but not the OHCHR.
All this to explain how weak FoE supporters may be in an arena like
the IGF, as we were in WSIS as a whole. This is nothing new: the UN
Human Rights Commission has witnessed the consequences of this
situation for years, and it is not guaranteed that the new UN HR
Council will avoid it.
On FoE, the balance of powers is simply not in our favor, and this
issue is only a question of balance of powers, not of exploring new
ways of doing/managing/"enhancing" something. You may tell me that
all issues in the IGF are subject to a balance of powers. That's not
exactly true or, more exactly, for all other issues the balance of
powers is not so severely unequal. For most of them, one may find
allies because there are many different things at stake, and it is
not so obvious which side anyone will be on. If you're not convinced,
simply look at the 'diversity' of opinions, even within CS alone, on
any other issue.
-> This second point makes it really dangerous to address this issue
in the IGF framework/arena. It's likely to backfire, and severely.
Under section "Why it is important", the text of the proposal states:
"Content regulation, filtering and censorship are issues that do not
fall within the scope of any existing international body, but cut
across many of them; e.g., UNESCO, ICANN, ITU and WIPO."
That's not true, not at all. What is true is that many international
bodies make decisions that may lead to content regulation, filtering
and censorship (you yourself wrote on ICANN's UDRP rules and their
impact on FoE; no need to mention WIPO, etc.). It is also true that
regional institutions, such as, for Europe, the EU and the CoE, have
made or still try to make decisions directly recommending content
regulation and filtering leading to censorship, sometimes even
provided for by law (e.g. the EC Directive on e-commerce with its
notice and take-down procedure).
But what the proposal states is not true, in that content regulation,
filtering and censorship are not issues dealt with at the
international level, especially not as governance issues.
They are issues relevant to civil and penal law, fundamentally as a
matter of national legislation - or, in the EU case, at the
supra-national level, but still, the EU is a regional entity with its
own institutions and legislation in many sectors. Even the CoE
cybercrime Convention - which was adopted in a more or less coherent
regional intergovernmental entity, mostly because of the European
Convention on HR _and_ the existence of the European Court of HR
enforcing the ECHR - doesn't deal with content regulation and
filtering. It rather criminalizes some very specific offences
(child porn, intellectual property infringements, and racism and
xenophobia through an additional protocol), the rest of the
cybercrime Convention mostly addressing procedural cooperation.
The need to deal with these issues at the national level (or within a
very coherent regional entity) comes from differences in cultures and
contexts. What is illegal - or simply 'harmful' - in a given country
is not in another. This has long been an argument against filters
for, e.g., parental control. Even people who favor the use of
filters, e.g. in France, find them inadequate because they are mainly
designed by US companies, with US cultural criteria.
-> In summary, my third point is thus that the issues of content
regulation, filtering and censorship are not a matter of governance,
but a matter of national legislation/regulation. And it is at this
level that FoE rights, which are universal, should be raised as the
main argument against such censorship.
If you open any way to address content regulation, filtering and
censorship at the international level, then this would lead to the
definition of a set of 'content seen as inappropriate by all', i.e.
FoE would be reduced to the least common denominator. Again, the
initial good intentions carry a risk of backfiring.
My final remark deals with the workability of the proposal. The
objective is, in the end, that companies would agree to act against
the will of the national governments of countries whose markets they
want to enter.
Given, on the one hand, that commercial companies are driven only by
their profits (I don't know if anyone here really trusts the 'civic
behavior of companies'; in any case I don't, by any means) and, when
this may seriously impact their profits, by their image; and given,
on the other hand, that these companies are in this case de facto
monopolies, so they can afford a bad image (which they may in
addition sometimes balance with some charity actions); incentives to
make them stop selling filtering and censorship tools - or doing the
filtering and censorship themselves - can hardly be identified.
What remains is constraint by law. And this constraint can only come
from the law of their country of origin/establishment, the law they
have to respect. For most if not all of them, that means US
legislation. RSF has recently made a lot of noise on this issue in
the US. There has been some backing from some members of Congress.
I'm not sure things are advancing in the US (you can probably brief
me on that), but I really doubt that, with the huge Chinese market at
stake (to speak only of China), this noise will go further than that.
In my first comment on your proposal, I mentioned that I had myself
recommended this kind of corporate social responsibility guidelines
in order to fight racism and xenophobia (I didn't say Holocaust
denial, because I think that, unlike racism and xenophobia, it is not
a matter for the courts, but for history). But that situation is
rather different, and more workable, provided of course that
political will exists. In the case of racism, the companies concerned
- ISPs, portals, etc. - are US-based, and they want to operate,
directly or through affiliates, in countries where racist and
xenophobic speech is forbidden by law. And there, the governments of
these countries can play with both incentives and constraints, coming
from their side.
-> This fourth point shows, I hope, that the proposal is not only
dangerous, since it may backfire, but also rather useless, in that it
is largely unworkable.
I hope this message - rather long, I'm sorry - has helped to clarify
my comments on your proposal. I remain interested in further arguments.
Best,
Meryem
On 2 Apr 2006, at 19:39, Milton Mueller wrote:
>>>> Meryem Marzouki <marzouki at ras.eu.org> 4/1/2006 4:45:22 PM >>>
>
>> I do apologize. It's not that easy to have a full time job, and
>> run (many) volunteer activities besides that.
>
> I fully understand. I am in the same condition! No need to
> apologize, just expressing my disappointment that we didn't benefit
> from your wisdom sooner.
>
> Based on your argument below, I am not convinced we are raising the
> dangers you claim, but it is not too late to modify the theme
> proposal in the next two days if you have specific changes to propose.
>
> Let me explain why I am not convinced. You say that we base our
> claim on "ethics" and not "rights." This constitutes a
> misunderstanding of what we are doing, I think. Or maybe we didn't
> formulate it clearly.
>
> We assume that individuals have rights to FoE, and wish to assert
> those globally by confronting the problem of multinational ISPs who
> cooperate with states who violate those rights. Recognizing that
> neither IGF nor probably anyone else can overcome national
> sovereignty of the local law, we ask: how can we define principles
> for interaction with these states that a) put pressure on the
> repressive states to recognize rights to FoE, and b) encourage ISPs
> to minimize the damage to FoE; c) encourage ISPs to recognize an
> ethical obligation to do so.
>
> So it is the ethical obligation to recognize FoE rights, not ethics
> in a broad sense that would encompass any kind of claim to regulate
> content regardless of rights, that we are interested in.
>
> Let me know if that addresses any of your concerns. And I welcome
> simple editorial changes that might clarify things or avoid
> problems. Even major editorial changes, if you are willing to do that.
_______________________________________________
governance mailing list
governance at lists.cpsr.org
https://ssl.cpsr.org/mailman/listinfo/governance