[governance] net neutrality
Janna Anderson
andersj at elon.edu
Sun Jan 23 07:38:04 EST 2011
Perhaps Sir Tim or at least the leadership of the Web Foundation would be
interested in helping to lead on this in Nairobi.
Long Live the Web: A Call for Continued Open Standards and Neutrality
The Web is critical not merely to the digital revolution but to our
continued prosperity, and even our liberty. Like democracy itself, it needs
defending
By Tim Berners-Lee
http://www.scientificamerican.com/article.cfm?id=long-live-the-web
Monday, November 22, 2010
The World Wide Web went live, on my physical desktop in Geneva, Switzerland
<http://www.cern.ch/> , in December 1990. It consisted of one Web site
<http://en.wikipedia.org/wiki/Website> and one browser
<http://en.wikipedia.org/wiki/Web_browser> , which happened to be on the
same computer <http://en.wikipedia.org/wiki/NeXT_Computer> . The simple
setup demonstrated a profound concept: that any person could share
information with anyone else, anywhere. In this spirit, the Web spread
quickly from the grassroots up. Today, at its 20th anniversary, the Web is
thoroughly integrated into our daily lives. We take it for granted,
expecting it to "be there" at any instant, like electricity.
The Web evolved into a powerful, ubiquitous tool because it was built on
egalitarian principles and because thousands of individuals, universities
and companies have worked, both independently and together as part of the
World Wide Web Consortium <http://www.w3.org/> , to expand its capabilities
based on those principles.
The Web as we know it
<http://cacm.acm.org/magazines/2008/7/5366-web-science/fulltext> , however,
is being threatened in different ways. Some of its most successful
inhabitants have begun to chip away at its principles. Large
social-networking sites are walling off information posted by their users
from the rest of the Web. Wireless Internet providers are being tempted to
slow traffic to sites with which they have not made deals.
Governments, totalitarian and democratic alike, are monitoring people's online
habits, endangering important human rights.
If we, the Web's users, allow these and other trends to proceed unchecked,
the Web could be broken into fragmented islands. We could lose the freedom
to connect with whichever Web sites we want. The ill effects could extend to
smartphones and pads, which are also portals to the extensive information
that the Web provides.
Why should you care? Because the Web is yours. It is a public resource on
which you, your business, your community and your government depend. The Web
is also vital to democracy, a communications channel that makes possible a
continuous worldwide conversation. The Web is now more critical to free
speech than any other medium. It brings principles established in the U.S.
Constitution <http://www.archives.gov/exhibits/charters/constitution.html> ,
the British Magna Carta <http://www.bl.uk/treasures/magnacarta/index.html>
and other important documents into the network age: freedom from being
snooped on, filtered, censored and disconnected.
Yet people seem to think the Web is some sort of piece of nature, and if it
starts to wither, well, that's just one of those unfortunate things we can't
help. Not so. We create the Web, by designing computer protocols and
software; this process is completely under our control. We choose what
properties we want it to have and not have. It is by no means finished (and
it's certainly not dead). If we want to track what government is doing, see
what companies are doing, understand the true state of the planet, find a
cure for Alzheimer's disease, not to mention easily share our photos with
our friends, we the public, the scientific community and the press must make
sure the Web's principles remain intact, not just to preserve what we have
gained but to benefit from the great advances that are still to come.
Universality Is the Foundation
Several principles are key to assuring that
the Web becomes ever more valuable. The primary design principle
<http://www.w3.org/DesignIssues> underlying the Web's usefulness and growth
is universality. When you make a link, you can link to anything. That means
people must be able to put anything on the Web, no matter what computer they
have, software they use or human language they speak and regardless of
whether they have a wired or wireless Internet connection. The Web should be
usable by people with disabilities <http://www.w3.org/WAI/> . It must work
with any form of information, be it a document or a point of data, and
information of any quality, from a silly tweet to a scholarly paper. And it
should be accessible from any kind of hardware that can connect to the
Internet: stationary or mobile, small screen or large.
These characteristics can seem obvious, self-maintaining or just
unimportant, but they are why the next blockbuster Web site or the new
homepage for your kid's local soccer team will just appear on the Web
without any difficulty. Universality is a big demand, for any system.
Decentralization is another important design feature. You do not have to get
approval from any central authority to add a page or make a link. All you
have to do is use three simple, standard protocols: write a page in the HTML
<http://www.w3.org/html> (hypertext markup language) format, name it with
the URI <http://www.w3.org/Addressing/> naming convention, and serve it up
on the Internet using HTTP (hypertext transfer protocol). Decentralization
has made widespread innovation possible and will continue to do so in the
future.
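The three standards named above can be seen working together in a few lines of code. This is a toy sketch, not anything from the article: it uses Python's standard library to serve a one-line HTML page over HTTP at a URI, with the page content invented and the port chosen by the operating system.

```python
# HTML + URI + HTTP in miniature, using only the Python standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

# HTML: a document containing a link, the Web's basic building block.
PAGE = b"<html><body><a href='http://example.org/'>a link</a></body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # HTTP: answer any GET request with the HTML document.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the example quiet

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: let the OS pick one
threading.Thread(target=server.serve_forever, daemon=True).start()

# URI: the page now has a uniform name that anyone could link to.
uri = f"http://127.0.0.1:{server.server_port}/index.html"
html = urllib.request.urlopen(uri).read()
print(html.decode())
server.shutdown()
```

No central authority was asked for permission at any step; the page exists on the network the moment the three protocols are used.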
The URI is the key to universality. (I originally called the naming scheme
URI, for universal resource identifier; it has come to be known as URL, for
uniform resource locator.) The URI allows you to follow any link, regardless
of the content it leads to or who publishes that content. Links turn the
Web's content into something of greater value: an interconnected information
space.
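The uniformity behind this is visible in the structure of the URI itself: any link, whoever publishes it and whatever it points to, decomposes into the same standard parts. A small sketch using Python's standard `urllib.parse`; the URL here is invented purely for illustration.

```python
from urllib.parse import urlparse

# Every URI breaks down into the same components, regardless of content
# or publisher (this example URL is made up).
u = urlparse("http://www.example.org/papers/2010/web.html#section2")
print(u.scheme)    # http
print(u.netloc)    # www.example.org
print(u.path)      # /papers/2010/web.html
print(u.fragment)  # section2
```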
Several threats to the Web¹s universality have arisen recently. Cable
television companies that sell Internet connectivity are considering whether
to limit their Internet users to downloading only the company's mix of
entertainment. Social-networking sites present a different kind of problem.
Facebook, LinkedIn, Friendster and others typically provide value by
capturing information as you enter it: your birthday, your e-mail address,
your likes, and links indicating who is friends with whom and who is in
which photograph. The sites assemble these bits of data into brilliant
databases and reuse the information to provide value-added service, but only
within their sites. Once you enter your data into one of these services, you
cannot easily use them on another site. Each site is a silo, walled off from
the others. Yes, your site's pages are on the Web, but your data are not.
You can access a Web page about a list of people you have created in one
site, but you cannot send that list, or items from it, to another site.
The isolation occurs because each piece of information does not have a URI.
Connections among data exist only within a site. So the more you enter, the
more you become locked in. Your social-networking site becomes a central
platform, a closed silo of content, and one that does not give you full
control over your information in it. The more this kind of architecture
gains widespread use, the more the Web becomes fragmented, and the less we
enjoy a single, universal information space.
A related danger is that one social-networking site, or one search engine or
one browser, gets so big that it becomes a monopoly, which tends to limit
innovation. As has been the case since the Web began, continued grassroots
innovation may be the best check and balance against any one company or
government that tries to undermine universality. GnuSocial
<http://www.gnu.org/software/social/> and Diaspora
<http://www.joindiaspora.com/> are projects on the Web that allow anyone to
create their own social network from their own server, connecting to anyone
on any other site. The Status.net <http://status.net/> project, which runs
sites such as identi.ca <http://identi.ca/> , allows you to operate your own
Twitter <http://twitter.com/> -like network without the Twitter-like
centralization.
Open Standards Drive Innovation
Allowing any site to link to any other site
is necessary but not sufficient for a robust Web. The basic Web technologies
that individuals and companies need to develop powerful services must be
available for free, with no royalties. Amazon.com, for example, grew into a
huge online bookstore, then music store, then store for all kinds of goods
because it had open, free access to the technical standards on which the Web
operates. Amazon, like any other Web user, could use HTML, URI and HTTP
without asking anyone's permission and without having to pay. It could also
use improvements to those standards developed by the World Wide Web
Consortium <http://www.w3.org/> , allowing customers to fill out a virtual
order form, pay online, rate the goods they had purchased, and so on.
By "open standards" I mean standards that can have any committed expert
involved in the design, that have been widely reviewed as acceptable, that
are available for free on the Web, and that are royalty-free (no need to
pay) for developers and users. Open, royalty-free standards that are easy to
use create the diverse richness of Web sites, from the big names such as
Amazon <http://www.amazon.com/> , Craigslist <http://www.craigslist.org/>
and Wikipedia <http://www.wikipedia.org/> to obscure blogs written by adult
hobbyists and to homegrown videos posted by teenagers.
Openness also means you can build your own Web site or company without
anyone's approval. When the Web began, I did not have to obtain permission
or pay royalties to use the Internet's own open standards, such as the
well-known transmission control protocol (TCP
<http://en.wikipedia.org/wiki/Transmission_Control_Protocol> ) and Internet
protocol (IP <http://en.wikipedia.org/wiki/Internet_Protocol> ). Similarly,
the Web Consortium's royalty-free patent policy
<http://www.w3.org/Consortium/Patent-Policy-20040205/> says that the
companies, universities and individuals who contribute to the development of
a standard must agree they will not charge royalties to anyone who may use
the standard.
Open, royalty-free standards do not mean that a company or individual cannot
devise a blog or photo-sharing program and charge you to use it. They can.
And you might want to pay for it if you think it is "better" than others.
The point is that open standards allow for many options, free and not.
Indeed, many companies spend money to develop extraordinary applications
precisely because they are confident the applications will work for anyone,
regardless of the computer hardware, operating system or Internet service
provider (ISP <http://en.wikipedia.org/wiki/Internet_service_provider> )
they are using, all made possible by the Web's open standards. The same
confidence encourages scientists to spend thousands of hours devising
incredible databases that can share information about proteins, say, in
hopes of curing disease. The confidence encourages governments such as those
of the U.S. <http://www.data.gov/> and the U.K. <http://data.gov.uk/> to
put more and more data online so citizens can inspect them, making
government increasingly transparent. Open standards also foster
serendipitous creation: someone may use them in ways no one imagined. We
discover that on the Web every day.
In contrast, not using open standards creates closed worlds. Apple's iTunes
system, for example, identifies songs and videos using URIs that are open.
But instead of "http:" the addresses begin with "itunes:," which is
proprietary. You can access an "itunes:" link only using Apple's proprietary
iTunes program. You can't make a link to any information in the iTunes
world, a song or information about a band. You can't send that link to
someone else to see. You are no longer on the Web. The iTunes world is
centralized and walled off. You are trapped in a single store, rather than
being on the open marketplace. For all the store's wonderful features, its
evolution is limited to what one company thinks up.
Other companies are also creating closed worlds. The tendency for magazines,
for example, to produce smartphone "apps" rather than Web apps is
disturbing, because that material is off the Web. You can't bookmark it or
e-mail a link to a page within it. You can't tweet it. It is better to build
a Web app that will also run on smartphone browsers, and the techniques for
doing so are getting better all the time.
Some people may think that closed worlds are just fine. The worlds are easy
to use and may seem to give those people what they want. But as we saw in
the 1990s with the America Online dial-up information system that gave you a
restricted subset of the Web, these closed, "walled gardens," no matter how
pleasing, can never compete in diversity, richness and innovation with the
mad, throbbing Web market outside their gates. If a walled garden has too
tight a hold on a market, however, it can delay that outside growth.
Keep the Web separate from the Internet
Keeping the Web universal and
keeping its standards open help people invent new services. But a third
principle, the separation of layers, partitions the design of the Web from
that of the Internet.
This separation is fundamental. The Web is an application that runs on the
Internet <http://en.wikipedia.org/wiki/Internet> , which is an electronic
network that transmits packets of information among millions of computers
according to a few open protocols. An analogy is that the Web is like a
household appliance that runs on the electricity network. A refrigerator or
printer can function as long as it uses a few standard protocols: in the
U.S., things like operating at 120 volts and 60 hertz. Similarly, any
application, among them the Web, e-mail or instant messaging, can run on the
Internet as long as it uses a few standard Internet protocols, such as TCP
and IP.
Manufacturers can improve refrigerators and printers without altering how
electricity functions, and utility companies can improve the electrical
network without altering how appliances function. The two layers of
technology work together but can advance independently. The same is true for
the Web and the Internet. The separation of layers is crucial for
innovation. In 1990 the Web rolled out over the Internet without any changes
to the Internet itself, as have all improvements since. And in that time,
Internet connections have sped up from 300 bits per second to 300 million
bits per second (Mbps) without the Web having to be redesigned to take
advantage of the upgrades.
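The layering can be made concrete in code. In this minimal sketch, a loopback TCP connection stands in for the Internet layer and a hand-written HTTP exchange stands in for the Web layer; the tiny server, the port choice and the canned response are all assumptions of the example, not anything from the article.

```python
import socket
import threading

# The two layers in miniature: TCP carries opaque bytes (the Internet's
# job); HTTP is just a convention about what those bytes mean (the Web's).

def tiny_server(listener):
    conn, _ = listener.accept()
    request = conn.recv(1024)          # raw bytes off the TCP layer
    assert request.startswith(b"GET")  # the Web layer lives in those bytes
    conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))        # port 0: let the OS choose
listener.listen(1)
threading.Thread(target=tiny_server, args=(listener,), daemon=True).start()

client = socket.create_connection(listener.getsockname())
client.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
reply = b""
while chunk := client.recv(4096):
    reply += chunk
client.close()
print(reply.decode().splitlines()[0])  # the HTTP status line
```

Nothing in the TCP layer needed to know HTTP existed, which is exactly why the Web could roll out over the Internet, and speed up with it, without either layer being redesigned.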
Electronic Human Rights
Although Internet and Web designs are separate, a
Web user is also an Internet user and therefore relies on an Internet that
is free from interference. In the early Web days it was too technically
difficult for a company or country to manipulate the Internet to interfere
with an individual Web user. Technology for interference has become more
powerful, however. In 2007 BitTorrent, a company whose "peer-to-peer"
network protocol allows people to share music, video and other files
directly over the Internet, complained to the Federal Communications
Commission that the ISP giant Comcast was blocking or slowing traffic to
subscribers who were using the BitTorrent application. The FCC told Comcast
to stop the practice, but in April 2010 a federal court ruled
<http://www.nytimes.com/2010/04/07/technology/07net.html?_r=1&pagewanted=2>
the FCC could not require Comcast to do so. A good ISP will often manage
traffic so that when bandwidth is short, less crucial traffic is dropped, in
a transparent way, so users are aware of it. An important line exists
between that action and using the same power to discriminate.
This distinction highlights the principle of net neutrality. Net neutrality
<http://en.wikipedia.org/wiki/Network_neutrality> maintains that if I have
paid for an Internet connection at a certain quality, say, 300 Mbps, and you
have paid for that quality, then our communications should take place at
that quality. Protecting this concept would prevent a big ISP from sending
you video from a media company it may own at 300 Mbps but sending video from
a competing media company at a slower rate. That amounts to commercial
discrimination. Other complications could arise. What if your ISP made it
easier for you to connect to a particular online shoe store and harder to
reach others? That would be powerful control. What if the ISP made it
difficult for you to go to Web sites about certain political parties, or
religions, or sites about evolution?
Unfortunately, in August, Google and Verizon for some reason suggested that
net neutrality should not apply to mobile phone-based connections. Many
people in rural areas from Utah to Uganda have access to the Internet only
via mobile phones; exempting wireless from net neutrality would leave these
users open to discrimination of service. It is also bizarre to imagine that
my fundamental right to access the information source of my choice should
apply when I am on my WiFi-connected computer at home but not when I use my
cell phone.
A neutral communications medium is the basis of a fair, competitive market
economy, of democracy, and of science. Debate has risen again in the past
year about whether government legislation is needed to protect net
neutrality. It is. Although the Internet and Web generally thrive on lack of
regulation, some basic values have to be legally preserved.
No Snooping
Other threats to the Web result from meddling with the Internet,
including snooping <http://www.w3.org/DesignIssues/NoSnooping.html> . In
2008 one company, Phorm, devised a way for an ISP to peek inside the packets
of information it was sending. The ISP could determine every URI that any
customer was browsing. The ISP could then create a profile of the sites the
user went to in order to produce targeted advertising.
Accessing the information within an Internet packet is equivalent to
wiretapping a phone or opening postal mail. The URIs that people use reveal
a good deal about them. A company that bought URI profiles of job applicants
could use them to discriminate in hiring people with certain political
views, for example. Life insurance companies could discriminate against
people who have looked up cardiac symptoms on the Web. Predators could use
the profiles to stalk individuals. We would all use the Web very differently
if we knew that our clicks could be monitored and the data shared with third
parties.
Free speech should be protected, too. The Web should be like a white sheet
of paper: ready to be written on, with no control over what is written.
Earlier this year Google accused the Chinese government of hacking into its
databases to retrieve the e-mails of dissidents. The alleged break-ins
occurred after Google resisted the government's demand that the company
censor certain documents on its Chinese-language search engine.
Totalitarian governments aren't the only ones violating the network rights
of their citizens. In France a law created in 2009, named Hadopi
<http://en.wikipedia.org/wiki/HADOPI_law> , allowed a new agency by the same
name to disconnect a household from the Internet for a year if someone in
the household was alleged by a media company to have ripped off music or
video. After much opposition, in October the Constitutional Council of
France required a judge to review a case before access was revoked, but if
approved, the household could be disconnected without due process. In the
U.K., the Digital Economy Act
<http://www.legislation.gov.uk/ukpga/2010/24/contents> , hastily passed in
April, allows the government to order an ISP to terminate the Internet
connection of anyone who appears on a list of individuals suspected of
copyright infringement. In September the U.S. Senate introduced the
Combating Online Infringement and Counterfeits Act
<http://www.govtrack.us/congress/bill.xpd?bill=s111-3804> , which would
allow the government to create a blacklist of Web sites
<http://www.eff.org/deeplinks/2010/09/censorship-internet-takes-center-stage
-online> , hosted on or off U.S. soil, that are accused of infringement and to
pressure or require all ISPs to block access to those sites.
In these cases, no due process of law protects people before they are
disconnected or their sites are blocked. Given the many ways the Web is
crucial to our lives and our work, disconnection is a form of deprivation of
liberty. Looking back to the Magna Carta, we should perhaps now affirm: "No
person or organization shall be deprived of the ability to connect to others
without due process of law and the presumption of innocence."
When your network rights are violated, public outcry is crucial. Citizens
worldwide objected to China's demands on Google, so much so that Secretary
of State Hillary Clinton said the U.S. government supported Google's
defiance and that Internet freedom, and with it Web freedom, should become a
formal plank in American foreign policy
<http://www.state.gov/secretary/rm/2010/01/135519.htm> . In October, Finland
made broadband access, at 1 Mbps, a legal right
<http://www.guardian.co.uk/technology/2009/oct/14/finland-broadband> for
all its citizens.
Linking to the Future
As long as the Web's basic principles are upheld, its
ongoing evolution is not in the hands of any one person or
organization, neither mine nor anyone else's. If we can preserve the
principles, the Web promises some fantastic future capabilities.
For example, the latest version of HTML, called HTML5
<http://dev.w3.org/html5/spec/Overview.html> , is not just a markup language
but a computing platform that will make Web apps even more powerful than
they are now. The proliferation of smartphones will make the Web even more
central to our lives. Wireless access will be a particular boon to
developing countries, where many people do not have connectivity by wire or
cable but do have it wirelessly. Much more needs to be done, of course,
including accessibility for people with disabilities and devising pages that
work well on all screens <http://www.w3.org/Mobile/> , from huge 3-D
displays that cover a wall to wristwatch-size windows.
A great example of future promise, which leverages the strengths of all the
principles, is linked data <http://www.w3.org/DesignIssues/LinkedData.html>
. Today¹s Web is quite effective at helping people publish and discover
documents, but our computer programs cannot read or manipulate the actual
data within those documents. As this problem is solved, the Web will become
much more useful, because data about nearly every aspect of our lives are
being created at an astonishing rate. Locked within all these data is
knowledge about how to cure diseases, foster business value and govern our
world more effectively.
Scientists are actually at the forefront of some of the largest efforts to
put linked data on the Web. Researchers, for example, are realizing that in
many cases no single lab or online data repository is sufficient to discover
new drugs. The information necessary to understand the complex interactions
between diseases, biological processes in the human body, and the vast array
of chemical agents is spread across the world in a myriad of databases,
spreadsheets and documents.
One success relates to drug discovery to combat Alzheimer's disease
<http://www.nytimes.com/2010/08/13/health/research/13alzheimer.html> . A
number of corporate and government research labs dropped their usual refusal
to open their data and created the Alzheimer¹s Disease Neuroimaging
Initiative. They posted a massive amount of patient information and brain
scans as linked data, which they have dipped into many times to advance
their research. In a demonstration I witnessed, a scientist asked the
question, "What proteins are involved in signal transduction and are related
to pyramidal neurons?" When put into Google, the question got 233,000
hits, and not one single answer. Put into the linked databases world,
however, it returned a small number of specific proteins that have those
properties.
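The difference between a keyword search and a query over linked data can be sketched in a few lines. The toy triple store below is illustrative only: the URIs and proteins are invented, and real linked-data systems use RDF graphs and SPARQL queries rather than Python sets.

```python
# Linked data in miniature: facts as (subject, predicate, object) triples
# whose names are URIs, so data from different sources can be joined.
# All identifiers here are made up for illustration.
triples = [
    ("ex:protein/P1", "ex:involvedIn", "ex:process/signal-transduction"),
    ("ex:protein/P1", "ex:relatedTo",  "ex:cell/pyramidal-neuron"),
    ("ex:protein/P2", "ex:involvedIn", "ex:process/signal-transduction"),
    ("ex:protein/P3", "ex:relatedTo",  "ex:cell/pyramidal-neuron"),
]

def subjects_with(pred, obj):
    """All subjects that have the given (predicate, object) fact."""
    return {s for s, p, o in triples if p == pred and o == obj}

# "Which proteins are involved in signal transduction AND related to
# pyramidal neurons?" An intersection over facts, not a keyword match.
answer = (subjects_with("ex:involvedIn", "ex:process/signal-transduction")
          & subjects_with("ex:relatedTo", "ex:cell/pyramidal-neuron"))
print(sorted(answer))
```

Because the query operates on structured facts rather than document text, it returns exactly the entities with both properties instead of a pile of pages that merely mention the words.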
The investment and finance sectors can benefit from linked data, too. Profit
is generated, in large part, from finding patterns in an increasingly
diverse set of information sources. Data are all over our personal lives as
well. When you go onto your social-networking site and indicate that a
newcomer is your friend, that establishes a relationship. And that
relationship is data.
Linked data raise certain issues that we will have to confront. For example,
new data-integration capabilities could pose privacy challenges that are
hardly addressed by today¹s privacy laws. We should examine legal, cultural
and technical options that will preserve privacy without stifling beneficial
data-sharing capabilities.
Now is an exciting time. Web developers, companies, governments and citizens
should work together openly and cooperatively, as we have done thus far, to
preserve the Web¹s fundamental principles, as well as those of the Internet,
ensuring that the technological protocols and social conventions we set up
respect basic human values. The goal of the Web is to serve humanity. We
build it now so that those who come to it later will be able to create
things that we cannot ourselves imagine.
On 1/23/11 1:34 AM, "parminder" <parminder at itforchange.net> wrote:
> Read below an article that got published on NN in the UK today.
>
> I do not think we, as a premier global CS group, can afford to *not* do
> something about this issue. So many times a discussion on NN on this list has
> run into this wall: 'it is a very complex issue with many sides to it'. So
> what? I don't think this is a good enough reason for abdication. One often hears
> excuses like: with voice and video dominating the internet today, NN is a
> meaningless concept. Not so at all. We can have specific provisions whereby
> specific applications can have different treatment while being
> content-provider neutral, this latter being the key issue. Norway's NN
> guidelines have often been mentioned in discussions here earlier. These
> guidelines allow space to manage issues related to voice and video applications.
> Is there any reason why Norway's guidelines cannot be used globally, and why
> should IGC not be forcefully pushing for them? I fear that if there is not soon
> a basic global consensus on NN guidelines, even countries like Norway may
> not be able to preserve NN, such is the global nature of the Internet and its
> basic architectural principles.
>
> What I am arguing for is that we should not only propose NN as a plenary topic
> but absolutely put our foot down that it must be accepted as a plenary topic,
> or else we find the whole exercise meaningless and may not even want to
> participate.... I mean the kind of warnings we issue about Ms-ism. Parminder
>
> The end of the net as we know it
> Posted on 21 Jan 2011 at 13:34
>
> ISPs are threatening to cripple websites that don't pay them first. Barry
> Collins fears a disastrous end to net neutrality
>
> You flip open your laptop, click on the BBC iPlayer bookmark and press Play on
> the latest episode of QI. But instead of that tedious, plinky-plonky theme
> tune droning out of your laptop's speakers, you're left staring at the
> whirring, circular icon as the video buffers and buffers and buffers...
>
> That's odd. Not only have you got a new 40Mbits/sec fibre broadband
> connection, but you were watching a Full HD video on Sky Player just moments
> ago. There's nothing wrong with your connection; it must be iPlayer. So you
> head to Twitter to find out if anyone else is having problems streaming
> Stephen Fry et al. The message that appears on your screen leaves you looking
> more startled than Bill Bailey. "This service isn't supported on your
> broadband service. Click here to visit our social-networking partner,
> Facebook."
>
> The free, unrestricted internet as we know it is under threat. Britain's
> leading ISPs are attempting to construct a two-tier internet, where websites
> and services that are willing to pay are thrust into the "fast lane", while
> those that don't are left fighting for scraps of bandwidth or even blocked
> outright. They're not so much ripping up the cherished notion of net
> neutrality as pouring petrol over the pieces and lighting the match. The only
> question is: can they get away with it?
>
> No such thing as net neutrality
>
> It's worth pointing out that the concept of net neutrality, ISPs treating
> different types of internet traffic or content equally, is already a busted
> flush. "Net neutrality? We don't have it today," argues Andrew Heaney,
> executive director of strategy and regulation at TalkTalk, Britain's second
> biggest ISP.
>
> "We have an unbelievably good, differentiated network at all levels, with huge
> levels of widespread discrimination of traffic types. [Some consumers] buy
> high speed, some buy low speed; some buy a lot of capacity, some buy less;
> some buy unshaped traffic, some buy shaped.
> "So the suggestion that 'oh dear, it is terrible, we might move to a
> two-tiered internet in the future'... well, let's get real, we have a very
> multifaceted and multitiered internet today," Heaney said.
>
> Indeed, the major ISPs claim it would be "unthinkable" to return to an
> internet where every packet of data was given equal weight. "Yes, the internet
> of 30 years ago was one in which all data, all the bits and the packets were
> treated in the same way as they passed through the network," said Simon
> Milner, BT's director of group industry policy. "That was an internet that
> wasn't about the internet that we have today: it wasn't about speech, it
> wasn't about video, and it certainly wasn't about television.
>
> "Twenty years ago, the computer scientists realised that applications would
> grab as much bandwidth as they needed, and therefore some tools were needed to
> make this network work more effectively, and that's why traffic management
> techniques and guaranteed quality of service were developed in the 1990s, and
> then deep-packet inspection came along roughly ten years ago," he added.
> "These techniques and equipment are essential for the development of the
> internet we see today."
>
> It's interesting to note that some smaller (and, yes, more expensive) ISPs
> such as Zen Internet don't employ any traffic shaping across their network,
> and Zen has won the PC Pro Best Broadband ISP award
> <http://www.pcpro.co.uk/html/awards-2010/> for the past seven years.
>
> Even today's traffic management methods can cause huge problems for certain
> websites and services. Peer-to-peer services are a common victim of ISPs'
> traffic management policies, often being deprioritised to a snail's pace
> during peak hours. While the intended target may be the bandwidth hogs using
> BitTorrent clients to download illicit copies of the latest movie releases,
> legitimate applications can also fall victim to such blunderbuss filtering.
>
> "Peer-to-peer applications are very wide ranging," said Jean-Jacques Sahel,
> director of government and regulatory affairs at VoIP service Skype. "They go
> from the lovely peer-to-peer file-sharing applications that were referred to
> in the Digital Economy Act, all the way to things such as the BBC iPlayer
> [which used to run on P2P software] or Skype. So what does that mean? If I
> manage my traffic from a technical perspective, knowing that Skype actually
> doesn't eat up much bandwidth at all, why should it be deprioritised because
> it's peer-to-peer?"
>
> Nowhere has the effect of draconian traffic management been felt more vividly
> than on the mobile internet. Websites and services blocked at the whim of the
> network, video so compressed it looks like an Al-Qaeda propaganda tape, and
> varying charges for different types of data are already commonplace.
>
> Skype is outlawed by a number of British mobile networks fearful of losing
> phone call revenue; O2 bans iPhone owners from watching the BBC iPlayer over a
> 3G connection; and almost all networks outlaw tethering a mobile phone to a
> laptop or tablet on standard “unlimited data” contracts.
>
> Jim Killock, executive director of the Open Rights Group, has this chilling
> warning for fixed-line broadband users: “Look at the mobile market, think if
> that is how you want your internet and your devices to work in the future,
> because that’s where things are leading.”
>
> Video blockers
>
> Until now, fixed-line ISPs have largely resisted the more drastic blocking
> measures chosen by the mobile operators. But if there’s one area in which ISPs
> are gagging to rip up what’s left of the cherished concept of net neutrality,
> it’s video.
>
> Streaming video recently overtook peer-to-peer to become the largest single
> category of internet traffic, according to Cisco’s Visual Networking Index.
> It’s the chief reason why the amount of data used by the average internet
> connection has shot up by 31% over the past year, to a once unthinkable 14.9GB
> a month.
>
> Managing video traffic is unquestionably a major headache for ISPs and
> broadcasters alike. ISPs are introducing ever tighter traffic management
> policies to make sure networks don’t collapse under the weight of
> video-on-demand during peak hours. Meanwhile, broadcasters such as the BBC and
> Channel 4 pay content delivery networks (CDNs) such as Akamai millions of
> pounds every year to distribute their video across the network and closer to
> the consumer; this helps avoid bandwidth bottlenecks when tens of thousands of
> people attempt to stream The Apprentice at the same time.
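[Editor's note: the CDN arrangement described above, replicating content to edge servers and steering each viewer to the nearest copy, can be sketched as follows. Node names and distances are invented for illustration; this is not how Akamai's actual request routing works in detail.]

```python
# Illustrative sketch of what broadcasters pay a CDN for: a popular video is
# replicated to edge nodes, and each viewer is directed to the closest copy,
# so tens of thousands of streams never converge on one central link.
# All node names and distances below are assumptions for the example.

# Edge node -> assumed network "distance" (e.g. RTT in ms) from a viewer
EDGES = {"london": 8, "manchester": 12, "frankfurt": 25, "origin": 60}

def pick_edge(replicated_on: set) -> str:
    """Send the viewer to the closest edge holding the content,
    falling back to the origin server if nothing is replicated."""
    candidates = replicated_on & EDGES.keys() or {"origin"}
    return min(candidates, key=EDGES.__getitem__)

# A popular show has been pushed to the London and Frankfurt edges:
print(pick_edge({"london", "frankfurt"}))   # → london
# An unpopular clip exists only at the origin:
print(pick_edge(set()))                     # → origin
```

The design choice is the whole business model: the broadcaster pays so that the "last mile" is short, rather than paying the viewer's ISP for prioritised carriage, which is precisely the middleman role the ISPs now want for themselves.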
>
> Now the ISPs want to cut out the middleman and get video broadcasters to pay
> them instead of the CDNs for guaranteed bandwidth. So if, for example, the
> BBC wants to guarantee that TalkTalk customers can watch uninterrupted HD
> streams from iPlayer, it had better be willing to pay for the privilege. A
> senior executive at a major broadcaster told PC Pro that his company has
> already been approached by two leading ISPs looking to cut such a deal.
>
> Broadcasters willing to pay will be put into the “fast lane”; those who don’t
> will be left to fight their way through the regular internet traffic jams.
> Whether or not you can watch a video, perhaps even one you’ve paid for, may no
> longer depend on the raw speed of your connection or the amount of network
> congestion, but on whether the broadcaster has paid your ISP for a prioritised
> stream.
>
> “We absolutely could see situations in which some content or application
> providers might want to pay BT for a quality of service above best efforts,”
> admitted BT’s Simon Milner at a recent Westminster eForum. “That is the kind
> of thing that we’d have to explain in our traffic management policies, and
> indeed we’d do so, and then if somebody decided, ‘well, actually I don’t want
> to have that kind of service’, they would be free to go elsewhere.”
>
> It gets worse. Asked directly at the same forum whether TalkTalk would be
> willing to cut off access completely to BBC iPlayer in favour of YouTube if
> the latter was prepared to sign a big enough cheque, TalkTalk’s Andrew Heaney
> replied: “We’d do a deal, and we’d look at YouTube and we’d look at BBC and we
> should have freedom to sign whatever deal works.”
>
> That’s the country’s two biggest ISPs, with more than eight million broadband
> households between them, openly admitting they’d either cut off or
> effectively cripple video streams from an internet broadcaster if it wasn’t
> willing to hand over a wedge of cash.
>
> Understandably, many of the leading broadcasters are fearful. “The founding
> principle of the internet is that everyone from individuals to global
> companies has equal access,” wrote the BBC’s director of future media and
> technology, Erik Huggers, in a recent blog post on net neutrality. “Since the
> beginning, the internet has been ‘neutral’, and everyone has been treated the
> same. But the emergence of fast and slow lanes allows broadband providers to
> effectively pick and choose what you see first and fastest.”
>
> ITV also opposes broadband providers being allowed to shut out certain sites
> or services. “We strongly believe that traffic throttling shouldn’t be
> conducted on the basis of content provider; throttling access to content from
> a particular company or institution,” the broadcaster said in a recent
> submission to regulator Ofcom’s consultation on net neutrality.
>
> Sky, on the other hand, which is both a broadcaster and one of the country’s
> leading ISPs, and a company that could naturally benefit from shutting out
> rival broadcasters, raised no such objection in its submission to Ofcom.
> “Competition can and should be relied upon to provide the necessary consumer
> safeguards,” Sky argued.
>
> Can it? Would YouTube, which was initially run from a small office above a
> pizzeria before Google weighed in with its $1.65 billion takeover, have got
> off the ground if its three founders had been forced to pay ISPs across the
> globe to ensure its videos could be watched smoothly? It seems unlikely.
>
> Walled-garden web
>
> It isn’t only high-bandwidth video sites that could potentially be blocked by
> ISPs. Virtually any type of site could find itself barred if one of its rivals
> has signed an exclusive deal with an ISP, returning the web to the kind of AOL
> walled-garden approach of the late 1990s.
>
> This isn’t journalistic scaremongering: the prospect of hugely popular sites
> being blocked by ISPs is already being debated by the Government. “I sign up
> to the two-year contract [with an ISP] and after 18 months my daughter comes
> and knocks on the lounge door and says ‘Father, I can’t access Facebook any
> more’,” hypothesised Nigel Hickson, head of international ICT policy at the
> Department for Business, Innovation and Skills. “I say ‘Why?’. She says ‘It’s
> quite obvious, I have gone to the site and I have found that TalkTalk, BT,
> Virgin, Sky, whatever, don’t take Facebook any more. Facebook wouldn’t pay
> them the money, but YouTube has, so I have gone to YouTube’: Minister, is that
> acceptable? That is the sort of question we face.”
>
> Where’s the regulator?
>
> So what does Ofcom, the regulator that likes to say “yes”, think about the
> prospect of ISPs putting some sites in the fast lane and leaving the rest to
> scrap over the remaining bandwidth? It ran a consultation on net neutrality
> earlier this year, with spiky contributions from ISPs and broadcasters alike,
> but it appears to be coming down on the side of the broadband providers.
>
> “I think we were very clear in our discussion document [on net neutrality]
> that we see the real economic merits to the idea of allowing a two-sided
> market to emerge,” said Alex Blowers, international director at Ofcom.
>
> “Particularly for applications such as IPTV, where it seems to us that the
> consumer expectation will be a service that’s of a reasonably consistent
> quality, that allows you to actually sit down at the beginning of a film and
> watch it to the end without constant problems of jitter or the picture
> hanging,” he said. Taking that argument to its logical conclusion means that
> broadcasters who refuse to pay the ISPs’ bounty will be subject to stuttering
> quality.
>
> Broadcasters are urging the regulator to be tougher. “We are concerned that
> Ofcom isn’t currently taking a firm stance in relation to throttling,” ITV
> said in its submission to the regulator. The BBC also said it has “concerns
> about the increasing potential incentives for discriminatory behaviour by
> network operators, which risks undermining the internet’s character, and
> ultimately resulting in consumer harm”.
>
> Ofcom’s Blowers argues regulation would be premature as “there is very little
> evidence” that “the big beasts of the content, application and services world
> are coming together and doing deals with big beasts of the network and ISP
> world”.
>
> The regulator also places great faith in the power of competition: the theory
> that broadband subscribers would simply jump ship to another ISP if their
> provider started doing beastly things, for example cutting off services such
> as the iPlayer. It’s a theory echoed by the ISPs themselves. “If we started
> blocking access to certain news sites, you could be sure within about two to
> three minutes it would be up on a blog and we’d be chastised for it, quite
> rightly too,” said TalkTalk’s Heaney.
>
> Yet in the age of bundled packages, where broadband subscriptions are
> routinely sold as part of the same deal as TV, telephone or mobile services,
> hopping from one ISP to another is rarely simple. Not to mention the 18-month
> or two-year contracts broadband customers are frequently chained to. As the
> BBC pointed out in its submission to the regulator, “Ofcom’s 2009 research
> showed that a quarter of households found it difficult to switch broadband and
> bundled services”, with the “perceived hassle of the switching process” and
> “the threat of additional charges” dissuading potential switchers.
>
> “Once you have bought a device or entered a contract, that’s that,” argued the
> Open Rights Group’s Jim Killock. “So you make your choice and you lump it,
> whereas the whole point of the internet is you make your choice, you don’t
> like it, you change your mind.”
>
> The best hope of maintaining the status quo of a free and open internet may
> lie with the EU (although even its determination is wavering). The EU’s 2009
> framework requires national regulators such as Ofcom to promote “the ability
> of end users to access and distribute information or run applications and
> services of their choice”, and requires that ISPs are transparent about any
> traffic management.
>
> It even pre-empts the scenario of ISPs putting favoured partners in the “fast
> lane” and crippling the rest, by giving Ofcom the power to set “minimum
> quality of service requirements”, forcing ISPs to reserve a set amount of
> bandwidth so that their traffic management doesn’t hobble those sites that
> can’t afford to pay.
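[Editor's note: the "minimum service guarantee" idea reduces to a simple capacity split, sketched below. The 40% floor and 100Mbit/s link are invented numbers for illustration; the EU framework sets no specific figures.]

```python
# Sketch of a regulator-set bandwidth floor: however much capacity is sold
# to prioritised "fast lane" partners, a fixed share of the link is reserved
# for ordinary best-effort traffic. Floor and link size are assumptions.

LINK_CAPACITY_MBPS = 100
BEST_EFFORT_FLOOR = 0.40     # hypothetical regulatory minimum: 40% of the link

def split_capacity(paid_priority_demand_mbps: float) -> tuple:
    """Return (priority share, best-effort share) of the link in Mbit/s.
    Paid priority traffic can never eat into the reserved floor."""
    reserved = LINK_CAPACITY_MBPS * BEST_EFFORT_FLOOR
    priority = min(paid_priority_demand_mbps, LINK_CAPACITY_MBPS - reserved)
    return priority, LINK_CAPACITY_MBPS - priority

# Even if fast-lane partners would buy 90Mbit/s, they are capped at 60,
# leaving 40Mbit/s for sites that haven't paid:
print(split_capacity(90))   # → (60.0, 40.0)
```

Without the floor, nothing in the two-sided market stops the priority share from growing until "best efforts" means whatever scraps are left, which is exactly the scenario the BBC and the Open Rights Group want the power used against.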
>
> It’s a concept enthusiastically backed by the BBC and others, but not by the
> ISPs or Ofcom, which doesn’t have to use this new power handed down by
> Brussels and seems reluctant to do so. “There doesn’t yet seem to us to be an
> overwhelming case for a public intervention that would effectively create a
> new industry structure around this idea of a ‘guaranteed best efforts’
> internet underpinned by legislation,” said Ofcom’s Blowers.
>
> It’s an attitude that sparks dismay from campaigners. “Ofcom’s approach
> creates large risks for the open internet,” said Killock. “Its attempts to
> manage and mitigate the risks are weak, by relying on transparency and
> competition alone, and it’s unfortunate it hasn’t addressed the idea of a
> minimum service guarantee.”
>
> At least the EU is adamant that ISPs shouldn’t be permitted to block legal
> websites or services that conflict with their commercial interests. “First and
> foremost, users should be able to access and distribute the content, services
> and applications they want,” said European Commission vice president Neelie
> Kroes earlier this year. “Discrimination against undesired competitors, for
> instance those providing voice-over-the-internet services, shouldn’t be
> allowed.”
>
> Yet Ofcom doesn’t even regard this as a major issue. “When VoIP services were
> first launched in the UK, most [mobile] network operators were against
> permitting VoIP,” Blowers said. “We now know that you can find packages from a
> number of suppliers that do permit VoIP services. So I’m not as pessimistic
> as some may be that this kind of gaming behaviour around blocking services
> will be a real problem.”
>
> If the EU doesn’t drag the UK’s relaxed regulator into line with the rest of
> the world, it will be British internet users who have the real problem.
>
> Author: Barry Collins
>
>
> Read more: The end of the net as we know it | Broadband | Features | PC Pro
> <http://www.pcpro.co.uk/features/364573/the-end-of-the-net-as-we-know-it/print>
--
Janna Quitney Anderson
Director of Imagining the Internet
www.imaginingtheinternet.org
Associate Professor of Communications
Director of Internet Projects
School of Communications
Elon University
andersj at elon.edu
(336) 278-5733 (o)