[bestbits] (RightsCon) Confronting Online Hate Speech
Janna Anderson
andersj at elon.edu
Wed Mar 29 08:58:50 EDT 2017
IMPORTANT for all interested in the future of Online Discourse (hate and manipulation/fake news)
* Pew Research is releasing a new report at 10 a.m. today on this topic. For an early look, see the full report here: http://www.elon.edu/e-web/imagining/surveys/2016_survey/social_future_of_the_internet.xhtml
* The report will be posted on the Pew Internet home page http://www.pewinternet.org/ as well at 10 today. More than 80% of the experts surveyed said that uncivil and/or manipulative online speech will stay at the same level as today or increase, and only 19% expressed hope that some solution to this problem will cut down on such speech in the next decade…
* I am attaching some direct quotes from some expert respondents below, and I have also attached for you here a chart with common themes from this report.
* Pew and Elon also recently released “Code-Dependent,” an important report on experts’ concerns over the IMPACT of ALGORITHMS: http://www.elon.edu/e-web/imagining/surveys/2016_survey/algorithm_impacts.xhtml
Janna Anderson
Director of the Imagining the Internet Center, Elon University
Senior Contract Researcher, Pew Internet, Science & Technology Project
Pew Research | Elon University report - release 10 a.m. March 29, 2017
Top quotes from…
The Future of Free Speech, Trolls, Anonymity and Fake News Online
“The major internet platforms are driven by a profit motive. Very often, hate, anxiety, and anger drive participation with the platform. Whatever behavior increases ad revenue will not only be permitted, but encouraged, excepting of course some egregious cases.”
Frank Pasquale, University of Maryland law professor, author of The Black Box Society
https://twitter.com/FrankPasquale
“Distrust and trolling is happening at the highest levels of political debate, and the lowest. The Overton Window<https://en.wikipedia.org/wiki/Overton_window> has been widened considerably by the 2016 U.S. presidential campaign, and not in a good way… Trolling is a mainstream form of political discourse.”
Kate Crawford, researcher of AI, machine learning, power and ethics
https://twitter.com/katecrawford
“It’s a brawl, a forum for rage and outrage… If the overall tone is lousy, if the culture tilts negative, if political leaders popularize hate, then there’s good reason to think all of that will dominate the digital debate as well.”
Andrew Nachison, founder at We Media
https://twitter.com/anachison
“One of the biggest challenges will be finding an appropriate balance between protecting anonymity and enforcing consequences for the abusive behavior that has been allowed to characterize online discussions for far too long.”
Bailey Poland, author of Haters: Harassment, Abuse, and Violence Online
https://twitter.com/the_author_
"Cyber attacks, doxing, and trolling will continue, while social platforms, security experts, ethicists… The worst outcome is that we end up with a kind of Potemkin internet in which everything looks reasonably bright and sunny, which hides a more troubling and less transparent reality.”
Susan Etlinger, an industry analyst at Altimeter
https://twitter.com/setlinger
“I would very much love to believe that discourse will improve, but I fear the forces making it worse haven't played out at all yet. After all, it took us almost 70 years to mandate seatbelts. And we're not uniformly wise about how to conduct dependable online conversations, never mind debates on difficult subjects."
Jerry Michalski, founder at REX
https://twitter.com/jerrymichalski
"The application space on the internet today is shaped by large commercial actors, and their goals are profit-seeking, not the creation of a better commons… We are coming to a fork in the road—either a new class of actor emerges with a different set of motivations… or I fear the overall character of discourse will decline.”
David Clark, Internet Hall of Fame member and senior research scientist at MIT
https://twitter.com/MIT_CSAIL
"There’s very little on the horizon in terms of improving discourse. It’s all too easy for bad actors to organize and flood message boards and social media with posts that drive people away. Or, to paraphrase Gresham’s law, bad posts drive out the good.”
Jason Hong, associate professor at Carnegie Mellon University
https://twitter.com/jas0nh0ng
"Social media bring every bad event to our attention, making us feel as if they all happened in our backyards—leading to an overall sense of unease. The combination of bias-reinforcing enclaves and global access to bad actions seems like a toxic mix. It is not clear there is a way to counterbalance their socially harmful effects.”
Vint Cerf, Internet Hall of Fame member, Google vice president
https://twitter.com/vgcerf
"The internet is the natural battleground for whatever breaking point we reach to play out, and it’s also a useful surveillance, control, and propaganda tool for monied people hoping to forestall a redistributive future. That will create even more inflammatory dialogue, flamewars, polarized debates, etc.”
Cory Doctorow, activist-in-residence at MIT Media Lab and co-owner of Boing Boing
https://twitter.com/doctorow
“Unfortunately, I see the present prevalence of trolling as an expression of a broader societal trend across many developed nations, towards belligerent factionalism in public debate, with particular attacks directed at women as well as ethnic, religious, and sexual minorities.”
Axel Bruns, professor, Queensland University of Technology Digital Media Research Centre
https://twitter.com/snurb_dot_info
“[By 2026] social media and other forms of discourse will include all kinds of actors who had no voice in the past; these include terrorists, critics of all kinds of products and art forms, amateur political pundits, and more.”
Matt Hamblen, senior editor at Computerworld
https://twitter.com/matthamblen
“Social media's affordances, including increased visibility and persistence of content, amplify the volume of negative commentary. As more people get internet access—and especially smartphones, which allow people to connect 24/7—there will be increased opportunities for bad behavior.”
Jessica Vitak, an assistant professor at the University of Maryland
https://twitter.com/jvitak
"The number of venues [for trolling, other bad conduct and misinformation online] will rise with the expansion of the Internet of Things and when consumer-production tools become available for virtual and mixed reality.”
Bryan Alexander, president of Bryan Alexander Consulting
https://twitter.com/BryanAlexander
"Venture-backed tech companies have a huge bias toward algorithmic solutions that have tended to reward that which keeps us agitated. Very few media companies now have staff dedicated to guiding conversations online.”
Steven Waldman, founder and CEO of LifePosts
https://twitter.com/stevenwaldman
“Free speech will be amplified through peer-to-peer multicast, mesh network technologies. Earlier-generation platforms that enabled free speech… will usher in even broader and more pervasive person-to-person (P2P) communication technologies, powered by the Internet of Things."
Scott Amyx, CEO of Amyx+, an Internet of Things business consultancy
https://twitter.com/AmyxIoT
"New online structures something like affinity guilds will evolve that allow individuals to associate with and benefit from the protection of and curation of a trusted group. People need extremely well-designed interfaces to control the barrage of content coming to their awareness. Public discourse forums will increasingly use artificial intelligence, machine learning, and wisdom-of-crowds reputation-management techniques to help keep dialog civil."
Susan Price, digital architect at Continuum Analytics
https://twitter.com/ContinuumIO
"Blockchain technologies hold much promise for giving individuals this appropriate control over their attention, awareness, and the data we all generate through our actions. They will require being uniquely identified in transactions and movements, and readable to holders of the keys. A thoughtful, robust architecture and systems can give individuals control over the parties who hold those keys.”
Susan Price, digital architect at Continuum Analytics
https://twitter.com/ContinuumIO
"I imagine… digital ventriloquists that combine predictive analytics with algorithms that interpret the appropriateness of various remarks. The software could detect you’re communicating with a person that, historically, you’ve have hard time being civil with. It could data-mine past conversations and recommend a socially acceptable response to that person worded in your own personal style of writing.”
Evan Selinger, professor of philosophy, Rochester Institute of Technology
https://twitter.com/EvanSelinger
"The success of online communities will hinge on the extent to which they are able to prevent the emergence of a hostile environment in the spaces under their stewardship. Algorithms will play an increasing role, but probably never fully replace the need for human curators in keeping our online communities civil.”
Demian Perry, mobile director at NPR (U.S. National Public Radio)
https://twitter.com/perrydc @NPR
“We are still figuring out the netiquette of online social interaction. Networks seem to rearrange themselves over time (newsgroups -> IRC -> MySpace -> Facebook) and interaction becomes more inclusive and more structured. I believe we are at the point of highest integration but lowest regulation. Over the next decade … the benevolent majority becomes relatively more vocal and crowds out the trolls.”
Bart Knijnenburg, assistant professor, human-centered computing, Clemson University
https://twitter.com/usabart
“We are likely headed toward more-insular online communities where you speak to and hear more from like-minded people. Obvious trolls will become easier to mute or ignore (either manually or by algorithm) within these communities. This future is not necessarily desirable for meaningful social discourse that crosses ideologies but it is a trend that may emerge.”
Michael Whitaker, vice president of emerging solutions at ICF International
"The Internet has given voice to those who had been voiceless because they were oppressed minorities and to those who were voiceless because they are crackpots… It may not necessarily be ‘bad actors’—i.e., racists, misogynists, etc.—who win the day, but I do fear it will be the more strident."
Steven Waldman, founder and CEO of LifePosts
https://twitter.com/stevenwaldman
“The continuing diminution of general-interest intermediaries such as newspapers, network television, etc., means we have reached a point in our society where wildly different versions of ‘reality’ can be chosen and customized by people to fit their existing ideological and other biases. In such an environment there is little hope for collaborative dialogue and consensus.”
John Anderson, director of journalism and media studies at Brooklyn College
https://twitter.com/diymediadotnet
"The id unbound from the monitoring and control by the superego is both the originator of communication and the nemesis of understanding and civility… Like civility in all contexts, controlling the id is more social and personal in the end. Technologies will nonetheless attempt to augment civility—and to circumvent it."
Paul Jones, clinical professor and director of ibiblio.org, UNC-Chapel Hill
https://twitter.com/smalljones
“It is in the interest of the paid-for media and most political groups to continue to encourage ‘echo-chamber’ thinking and to consider pragmatism and compromise as things to be discouraged. While this trend continues, the ability for serious civilized conversations about many topics will remain very hard to achieve.”
David Durant, a business analyst at UK Government Digital Service
https://twitter.com/hashtag/GDS
“Misinformation and anti-social networking are degrading our ability to debate and engage in online discourse. When opinions based on misinformation are given the same weight as those of experts and propelled to create online activity, we tread a dangerous path. Online social behaviour, without community-imposed guidelines, is subject to many potentially negative forces."
Karen Blackmore, a lecturer in IT at the University of Newcastle
https://twitter.com/karen_blackmore
"Social online communities such as Facebook also function as marketing tools, where sensationalism is widely employed, and community members who view this dialogue as their news source, gain a very distorted view of current events and community views on issues. This is exacerbated with social network and search engine algorithms effectively sorting what people see to reinforce worldviews.”
Karen Blackmore, a lecturer in IT at the University of Newcastle
https://twitter.com/karen_blackmore
“The reason [online discourse] will probably get worse is that companies and governments are starting to realise that they can influence people's opinions. And these entities sure know how to circumvent any protection in place. Russian troll armies are a good example of something that will become more and more common in the future.”
Laurent Schüpbach, neuropsychologist at University Hospital in Zurich, Switzerland
“Unfortunately, most people are easily manipulated by fear. Donald Trump's success is a testament to this fact. Negative activities on the internet will exploit those fears, and disproportionate responses will also attempt to exploit those fears. Soon, everyone will have to take off their shoes and endure a cavity search before boarding the internet.”
David Wuertele, a software engineer at Tesla Motors
“As language-processing technology develops, technology will help us identify and remove bad actors, harassment, and trolls from accredited public discourse.”
Galen Hunt, a research manager at Microsoft Research NexT
“I anticipate that AIs will be developed that will rapidly decrease the impact of trolls. Free speech will remain possible, although AI filtering will make a major dent on how views are expressed, and hate speech will be blocked.”
Stowe Boyd, chief researcher at Gigaom
https://twitter.com/stoweboyd
"I expect we will create bots that would promote beneficial connections and potentially insert context-specific data/facts/stories that would benefit more positive discourse. Of course, any filters and algorithms will create issues around what is being filtered out and what values are embedded in algorithms.”
Marina Gorbis, executive director at the Institute for the Future
https://twitter.com/mgorbis
"First, conversations can have better containers that filter for real people who consistently act with decency. Second, software is getting better and more nuanced in sentiment analysis, making it easier for software to augment our filtering out of trolls. Third, we are at peak identity crisis and a new wave of people want to cross the gap in dialogue to connect with others before the consequences of being tribal get worse (Brexit, Trump, etc.).”
Jean Russell of Thrivable Futures
https://twitter.com/NurtureGirl
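For illustration only (not from the report): a minimal Python sketch of the kind of sentiment-assisted filtering Russell anticipates. The phrase list, weights, and threshold are invented placeholders; a real system would rely on a trained model rather than keyword matching.

    # Naive civility filter: score comments and hide the most hostile ones.
    HOSTILE_PHRASES = {"idiot", "moron", "shut up", "loser"}   # hypothetical list
    THRESHOLD = 0.5                                            # hypothetical cutoff

    def civility_score(comment: str) -> float:
        """Return 1.0 for apparently civil text, lower as hostile phrases appear."""
        text = comment.lower()
        hits = sum(1 for phrase in HOSTILE_PHRASES if phrase in text)
        return max(0.0, 1.0 - 0.35 * hits)

    def filter_comments(comments):
        """Keep only comments at or above the civility threshold."""
        return [c for c in comments if civility_score(c) >= THRESHOLD]

    if __name__ == "__main__":
        sample = ["Great point, thanks for sharing.",
                  "Shut up, you idiot, nobody asked you."]
        print(filter_comments(sample))   # only the first comment survives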
“My own research group is exploring several novel directions in digital commentary. In the not too distant future all this work will yield results. Trolling, doxxing, echo chambers, click-bait, and other problems can be solved. We will be able to ascribe sources and track provenance in order to increase the accuracy and trustworthiness of information online."
David Karger, a professor of computer science at MIT
https://twitter.com/karger
"We will create tools that increase people’s awareness of opinions differing from their own, and support conversations with and learning from people who hold those opinions…. The future Web will give people much better ways to control the information they receive, which will ultimately make problems like trolling manageable (trolls will be able to say what they want, but few will be listening).”
David Karger, a professor of computer science at MIT
https://twitter.com/karger
“We'll see some consolidation as it becomes easier to shape commercial interactive spaces to the desired audience. There will be free-for-all spaces and more-tightly-moderated walled gardens, depending on the sponsor's strategic goals. There will also be private spaces maintained by individuals and groups for specific purposes.”
Valerie Bock, of VCB Consulting
https://twitter.com/vcbock
“We’re in a spy vs. spy internet world where the faster that hackers and trolls attack, the faster companies [and] for-profits come up with ways to protect against them and then the hackers develop new strategies against those protections, and so it goes. I don’t see that ending."
Cathy Davidson, director, Futures Initiative at the City University of New York
https://twitter.com/CathyNDavidson
"I would not be surprised at more publicity in the future, as a form of cyber-terror. That’s different from trolls, more geo-politically orchestrated to force a national or multinational response. That is terrifying if we do not have sound, smart, calm leadership.”
Cathy Davidson, director, Futures Initiative at the City University of New York
https://twitter.com/CathyNDavidson
“Surveillance and censorship will become more systematic, even in supposedly free countries such as the U.S. Terrorism and harassment by trolls will be presented as the excuses, but the effect will be dangerous for democracy.”
Richard Stallman, Internet Hall of Fame member and president of the Free Software Foundation
https://twitter.com/Internet_HOF
“I’m very concerned about the future of free speech... Demands for governments and companies to censor and monitor internet users are coming from an increasingly diverse set of actors with concerns about safety and security, and concerns about whether civil discourse is becoming so poisoned as to make rational governance based on facts impossible.”
Rebecca MacKinnon, director of Ranking Digital Rights at New America
https://twitter.com/rmack
"There will be a move toward firm identities—even legal identities issued by nations—for most users of the Web…There would still be anonymity available, just as there is in the real world today. But there would be online activities in which anonymity was not permitted. Clearly this could have negative free-speech impacts.”
Michael Rogers, author and futurist at Practical Futurist
https://twitter.com/rogersma
“The regulation of online communications is a natural response to the identification of real problems, the maturing of the industry, and the increasing expertise of government regulators.”
Marc Rotenberg, executive director of the Electronic Privacy Information Center (EPIC)
https://twitter.com/MarcRotenberg @EPICPrivacy
“There is growing evidence that the Net is a polarizing force in the world. I don’t believe I completely understand the dynamic, but my surmise is that it is actually building more walls than it is tearing down.”
John Markoff, senior writer at the New York Times
https://twitter.com/markoff @nytimes
"With less anonymity and less diversity, the two biggest problems of the Web 1.0 era have been solved from a commercial perspective: fewer trolls who can hide behind anonymity. Yet, what are we losing in the process? Algorithmic culture creates filter bubbles, which risk an opinion polarisation inside echo chambers.”
Marcus Foth, a professor at Queensland University of Technology
https://twitter.com/sunday9pm
“Monitoring is and will be a massive problem, with increased government control and abuse. The fairness and freedom of the internet's early days are gone. Now it's run by big data, Big Brother, and big profits. Anonymity is a myth; it only exists for end-users who lack lookup resources.”
Thorlaug Agustsdottir of Iceland’s Pirate Party
https://twitter.com/thorlaug
"It is easy for governments to demand that social media companies do ‘more’ to regulate everything... We see this with the European Union's 'code of conduct' with social media companies. This privatisation of regulation of free speech (in a context of huge, disproportionate, asymmetrical power due to the data stored and the financial reserves of such companies) raises existential questions for the functioning of healthy democracies.”
Joe McNamee, executive director at European Digital Rights
https://twitter.com/edri
“Between troll attacks, chilling effects of government surveillance and censorship, etc., the internet is becoming narrower every day.”
Randy Bush, Internet Hall of Fame member and research fellow at Internet Initiative Japan
https://twitter.com/Internet_HOF
"It is far too easy right now for anyone to launch a large-scale public negative attack… This then can be picked up by others and spread. The 'mob mentality' can be easily fed, and there is little fact-checking... This will cause some governments to want to step in to protect citizens and thereby potentially endanger both free speech and privacy.”
Dan York, senior content strategist at the Internet Society
https://twitter.com/danyork @InternetSociety
“To quote everyone ever, things will get worse before they get better. We’ve built a system in which access and connectivity are easy, the cost of publishing is near zero, and accountability and consequences for bad action are difficult to impose or toothless when they do."
Baratunde Thurston, director’s Fellow at MIT Media Lab, Fast Company columnist
https://twitter.com/baratunde @MediaLab
"More people are getting online every day with no norm-setting for their behavior and the systems that prevail now reward attention-grabbing and extended time online. They reward emotional investment whether positive or negative. They reward conflict. So we’ll see more bad before more good because… the financial models backing these platforms remains largely ad-based and rapid/scaled user growth-centric.”
Baratunde Thurston, director’s Fellow at MIT Media Lab, Fast Company columnist
https://twitter.com/baratunde @MediaLab
“Now that everybody knows about this problem I expect active technological efforts to reduce the efforts of the trolls, and we should reach ‘peak troll’ before long. There are concerns for free speech. My hope is that pseudonymous reputation systems might protect privacy while doing this.”
Brad Templeton, chair for computing at Singularity University, EFF board member
https://twitter.com/bradtem @EFF
“Things will get somewhat better because people will find it tougher to avoid accountability. Reputations will follow you more than they do now…. There will also be clever services like CivilComments.com that foster crowdsourced moderation rather than censorship of comments. That approach will help."
Esther Dyson, founder of EDventure Holdings and technology entrepreneur
https://twitter.com/edyson
"Anonymity is an important right—and freedom of speech with impunity (except for actual harm, yada yada)—is similarly important. Anonymity should be discouraged in general, but it is necessary in regimes or cultures or simply situations where the truth is avoided and truth-speakers are punished.”
Esther Dyson, founder of EDventure Holdings and technology entrepreneur
https://twitter.com/edyson
"In the coming decade, you will have more and more conversations with operating systems, and especially with chatbots programmed to listen to, learn from and react to us. You will encounter bots first throughout social media, and during the next decade, they will become pervasive digital assistants ... Currently, there is no case law governing free speech of a chatbot."
Amy Webb, futurist and CEO at the Future Today Institute
https://twitter.com/amywebb @FTI
"There were thousands of bots created to mimic Latino/Latina voters supporting Donald Trump. If someone tweeted a disparaging remark about Trump and Latinos, bots… would target that person with tweets supporting Trump. Now, many of the chatbots we interact with aren’t so smart. But with improvements in AI, machine learning, that will change."
Amy Webb, futurist and CEO at the Future Today Institute
https://twitter.com/amywebb @FTI
"Without a dramatic change… we will realize a decade from now that we inadvertently encoded structural racism, homophobia, sexism, and xenophobia into the bots helping to power our everyday lives. When chatbots start running amok—targeting individuals with hate speech—how will we define 'speech'? At the moment, our legal system isn’t planning for a future in which we must consider the free speech infringements of bots.”
Amy Webb, futurist and CEO at the Future Today Institute
https://twitter.com/amywebb @FTI
"Harassment, trolling... these things thrive with distance, which favors the reptile brains in us all… As individuals and distributed solutions to problems (e.g., blockchain) gain more power and usage, we will see many more distributed solutions to fundamental social and business issues, such as how we treat each other.”
Doc Searls, journalist, speaker, and director of Project VRM at Harvard
https://twitter.com/dsearls @BKCHarvard
"We need systems that support pseudonymity: locally persistent identities. Persistence provides accountability: people are responsible for their words. Locality protects privacy: people can participate in discussions without concern that their government, employer, insurance company, marketers, etc., are listening in."
Judith Donath of Harvard, author of The Social Machine: Designs for Living Online
https://twitter.com/judithd @BKCHarvard
"We should have digital portraits that succinctly depict a (possibly pseudonymous) person’s history of interactions and reputation within a community. We need to be able to quickly see who is new, who is well-regarded, what role a person has played in past discussions. A few places do so now but their basic charts are far from the goal: intuitive and expressive portrayals."
Judith Donath of Harvard, author of The Social Machine: Designs for Living Online
https://twitter.com/judithd @BKCHarvard
"‘Bad actors’ and trolls (and spammers, harassers, etc.) have no place in most discussions—the tools we need for them are filters. We need to develop better algorithms for detecting destructive actions as defined by the local community. The more socially complex question is how to facilitate constructive discussions among people who disagree. Here, we need to rethink the structure of online discourse."
Judith Donath of Harvard, author of The Social Machine: Designs for Living Online
https://twitter.com/judithd @BKCHarvard
"The role of discussion host/ moderator is poorly supported by current tech—and many discussions would proceed much better in a model other than the current linear free for all. Our face-to-face interactions have amazing subtlety—we can encourage or dissuade with slight changes in gaze, facial expression, etc. We need to create tools for conversation hosts (think of your role when you post something on your own Facebook page that sparks controversy) to help gracefully steer conversations.”
Judith Donath of Harvard, author of The Social Machine: Designs for Living Online
https://twitter.com/judithd @BKCHarvard
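For illustration only (not from the report): a rough Python sketch of the kind of "digital portrait" Donath describes, a compact record of a pseudonymous member's history and standing within one community. The fields and the scoring formula are assumptions for the example, not anything Donath proposes.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class CommunityPortrait:
        """Compact summary of a pseudonymous member's history in one community."""
        pseudonym: str
        joined: date
        posts: int = 0
        upvotes_received: int = 0
        flags_received: int = 0
        roles: list = field(default_factory=list)   # e.g. ["moderator"]

        def is_newcomer(self, today: date) -> bool:
            """Flag accounts less than 30 days old so readers can see who is new."""
            return (today - self.joined).days < 30

        def standing(self) -> float:
            """Crude regard score: net community feedback per post (invented formula)."""
            if self.posts == 0:
                return 0.0
            return (self.upvotes_received - 2 * self.flags_received) / self.posts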
"We are seeing the release of a pressure valve (or perhaps an explosion) of pent-up speech: the ‘masses’ who for so long could not be heard can now speak, revealing their own interests, needs, and frustrations—their own identities distinct from the false media concept of the mass. Yes, it’s starting out ugly. But I hope that we will develop norms around civilized discourse."
Jeff Jarvis, a professor at the City University of New York Graduate School of Journalism
https://twitter.com/jeffjarvis
"I hope we develop norms around civilized discourse. Oh, yes, there will always be… trolls. What we need is an expectation that it is destructive to civil discourse to encourage them. Yes, it might have seemed fun to watch the show of angry fights. It might seem fun to media to watch institutions like the Republican Party implode. But it soon becomes evident that this is no fun."
Jeff Jarvis, a professor at the City University of New York Graduate School of Journalism
https://twitter.com/jeffjarvis
"A desire and demand for civil, intelligent, useful discourse will return; no society or market can live on misinformation and emotion alone. Or that is my hope. How long will this take? It could be years. It could be a generation. It could be, God help us, never.”
Jeff Jarvis, a professor at the City University of New York Graduate School of Journalism
https://twitter.com/jeffjarvis
“Most attempts at reasoned discourse on topics interesting to me have been disrupted by trolls in the last decade or so. Many individuals faced with this harassment simply withdraw... My wife’s reaction is ‘why are you surprised?’ in regard to seeing behavior online that already exists offline."
Mike Roberts, Internet Hall of Fame member and first president and CEO of ICANN
https://twitter.com/ICANN
"There is a somewhat broader question of whether expectations of ‘reasoned’ discourse were ever realistic. History of this, going back to Plato, is one of self-selection into congenial groups. The internet, among other things, has energized a variety of anti-social behaviors by people who get satisfaction from the attendant publicity."
Mike Roberts, Internet Hall of Fame member and first president and CEO of ICANN
https://twitter.com/ICANN
"Nowness is the ultimate arbiter: the value of our discourse will be weighted by how immediate or instantly seen and communicated the information is. Real-time search, geolocation, just-in-time updates, Twitter, etc., are making of now, the present moment, an all-subsuming reality that tends to bypass anything that isn’t hyper-current."
Barry Chudakov, founder and principal at Sertain Research and StreamFuzion Corp.
https://twitter.com/barrychudakov
"Given the lack [online] of ‘facework’ or immediate facial response that defined human response for millennia, we will ramp up the emotional content of messaging to ensure some kind of response, frequently rewarding the brash and outrageous over the slow and thoughtful.”
Barry Chudakov, founder and principal at Sertain Research and StreamFuzion Corp.
https://twitter.com/barrychudakov
"Our present ‘filter failure<https://www.youtube.com/watch?v=LabqeJEOQyI>,’ to borrow Clay Shirky’s phrase, is almost complete lack of context, reality check, or perspective. In the next decade we will start building better contextual frameworks for information."
Barry Chudakov, founder and principal at Sertain Research and StreamFuzion Corp.
https://twitter.com/barrychudakov
"The Volume Formula [in online discourse], the volume of content from all quarters—anyone with a keypad, a device—makes it difficult to manage responses, or even to filter for relevance but tends to favor emotional button-pushing in order to be noticed."
Barry Chudakov, founder and principal at Sertain Research and StreamFuzion Corp.
https://twitter.com/barrychudakov
“After Snowden’s revelations, and in the context of accelerating cybercrimes and cyberwars, it’s clear that every layer of the technology stack and every node in our networked world is potentially vulnerable. Meanwhile, both the magnitude and frequency of exploits are accelerating."
Mike Liebhold, senior researcher and distinguished fellow at the Institute for the Future
https://twitter.com/mikeliebhold @IFTF
"Users will continue to modify their behaviors and internet usage, and designers of internet services, systems, and technologies will have to expend growing time and expense on personal and collective security.”
Mike Liebhold, senior researcher and distinguished fellow at the Institute for the Future
https://twitter.com/mikeliebhold @IFTF
“The struggle we're facing is a societal issue that we have to address at all levels, and one that the structure of social media platforms can exacerbate. Social media companies will need to address this, beyond community policing and algorithmic shaping of our newsfeeds."
Jillian York, director, international freedom of expression, Electronic Frontier Foundation
https://twitter.com/jilliancyork @EFF
"There are many ways to [moderate discourse] while avoiding censorship; for instance, better-individualized blocking tools and upvote/downvote measures can add nuance to discussions. I worry that if we don't address the root causes of our current public discourse, politicians and companies will engage in an increasing amount of censorship.”
Jillian York, director, international freedom of expression, Electronic Frontier Foundation
https://twitter.com/jilliancyork @EFF
“Privacy as we tend to think of it nowadays is going to be further eroded, if only because of the ease with which one can collect data and identify people. Free speech, if construed as the freedom to say whatever one thinks, will continue to exist and even flourish, but the flip side will be a number of derogatory and ugly comments that will become more pervasive.”
Bernardo A. Huberman, senior fellow and director at Hewlett Packard Enterprise
https://twitter.com/bhuberman @HPEnterprise
"Perception of public discourse is shaped by… our experience of online public discourse and media reports concerning the nature of public discourse… we have evidence that there is a lot of influence from bad actors, harassment, trolls…. But a great deal of public online discourse consists of what we and others don’t see.”
Stephen Downes, researcher at National Research Council, Canada
https://twitter.com/oldaily
"We do not confront our neighbours/children/friends with antisocial behaviour. The problem is not [only] anonymous bullying: many bullies have faces and are shameless, and they have communities that encourage bullying. And government subsidies stimulate them—the most frightening aspect of all."
Marcel Bullinga, trendwatcher and keynote speaker
https://twitter.com/futurecheck
"We will see the rise of the social robots, technological tools that can help us act as polite, decent social beings (like the REthink app). But more than that we need to go back to teaching and experiencing morals in business and education: back to behaving socially."
Marcel Bullinga, trendwatcher and keynote speaker
https://twitter.com/futurecheck
“Today’s negative online user environment is supported and furthered by two trends that are unlikely to last into the next decade: anonymity in posting and validation from self-identified subgroups… The passing of anonymity will also shift the cost-benefit analysis of writing or posting something to appeal to only a self-identified bully group."
Patrick Tucker, author of The Naked Future and technology editor at Defense One
https://twitter.com/DefTechPat
"Trolling, harassment, etc., will remain commonplace, but not be the overwhelming majority of discourse. We’ll see repeated efforts to clamp down on bad online behavior through both tools and norms; some of these efforts will be (or seem) successful, even as new variations of digital malfeasance arise.”
Jamais Cascio, distinguished fellow at the Institute for the Future
https://twitter.com/cascio
“I expect the negative influences on social media to get worse, and the positive factors to get better. Networks will try to respond to prevent the worst abuses, but new sites and apps will pop up that repeat the same mistakes.”
Anil Dash, technologist
https://twitter.com/anildash
"Free speech is made possible and more freely distributed by technology. Capture (read: production) and distribution are burgeoning. The individual has more than a soapbox; he or she or they have video and streaming or 'store now and play later' with repositories in the cloud becoming cheaper by the moment.”
Dean Landsman, digital strategist and executive director of PDEC
https://twitter.com/DeanLand
"Social media allows people to take part in a public debate that they may have not previously had access to. But alongside this an increasing culture of attack language, trolls, and abuse online has the potential to do more damage to this potential.”
Tim Norton, chair of Digital Rights Watch
https://twitter.com/norton_tim
"Anonymity fuels a lack of accountability… producing at times an online Lord of the Flies (LoF) situation. [These] have persisted in human social groups for eons and are not created by the availability of online fora. Despite the poor behavior of some, the world of social discourse online is growing in depth, diversity, and levels of participation."
Andrew Walls, managing vice president at Gartner
https://twitter.com/Gartner_inc
"Groups that have previously had minimal access to mass communication are able to have a voice and those who previously had the microphone aren't quite sure how to react pleasantly. Technological evolution has surpassed the evolution of civil discourse. We'll catch up eventually. I hope. We are in a defining time.”
Ryan Sweeney, director of analytics at Ignite Social Media
https://twitter.com/ryantsweeney https://twitter.com/ignitesma
“As we connect our identity more to what we say, there will be more accountability. Since it is easier to say harsh words anonymously, the continued direction of transparency will lead to more civil discourse.”
Tiffany Shlain, filmmaker and founder of The Webby Awards
https://twitter.com/tiffanyshlain
"The problem is the tendency of the cacophony of negative media voices to increase the social schisms contributing to the rising anger over a world undergoing massive shifts. We are watching what happens when the audience becomes accustomed to 'having a voice' and begins to assume that being heard entitles one's opinion to dominate."
Pamela Rutledge, director of the Media Psychology Research Center
https://twitter.com/mediapsychology
"Unpleasant [and]… rising concentrations and complexity within society and within social and natural systems… makes us more vulnerable, while also making it harder to see cause and effect and assign accountability and responsibility to the injuries we suffer. Anger and frustration are a predictable consequence, and I expect public discourse online to reflect it.”
Chris Kutarna, a fellow at the Oxford Martin School and author of Age of Discovery
https://twitter.com/ChrisKutarna @oxmartinschool
“The internet will continue to serve as an outlet for voices to vent in ways that are both productive and necessary. Societal and political 'griping' and 'disgust' often are necessary mechanisms for fostering change. We are going to find ways to preserve anonymity where necessary but also evolve online mechanisms of trust and identity verification."
Scott McLeod, associate professor of educational leadership, University of Colorado-Denver
https://twitter.com/mcleod
"With more voices in the discussion, facilitated by the internet, negative elements have become more visible/audible in civil discourse. This could be seen as the body politic releasing toxins—and as they are released, we can deal with them and hopefully mitigate their effect.”
Jon Lebkowsky, CEO of Polycot Associates
https://twitter.com/jonl @polycotplus
“It's clear that the level of abusive attacks on sites like Twitter or those that leverage multiple sites and technologies operates at a vastly different scale than the more-confined spaces of the past.”
Wendy M. Grossman, a science writer and author of net.wars
https://twitter.com/wendyg
"It is of great concern that we have yet to find a funding model that will replace the Fourth Estate functions of the press... We desperately need to create interest in serious, fact-laden, truth-seeking discourse. The internet could be, but it largely isn’t, doing this.”
Glenn Ricart, Internet Hall of Fame member and founder/CTO of US Ignite
https://twitter.com/gricart @US_Ignite
“Highly regarded media outlets set the tone of public discourse to a great degree—when the media we see is brash, brazen, and inflammatory, we adopt that language. I hope we will see a conscious shift in social networks to promote diversity of ideas and of thinking… I fear that will only come when we are able to come up with business models that don’t depend on hyper-targeting content for advertising.”
Louisa Heinrich, founder at Superhuman Limited
https://twitter.com/customdeluxe
“There may be a segregation into different types of public discourse… Facebook is more likely to develop mechanisms where comments can be filtered, or people will learn to ignore comments on all but personal messages. (Recent announcements by Facebook about selecting fewer news stories are an indirect indicator. Heated debates about gun control don’t mix well with pictures of puppies.)”
Henning Schulzrinne, a professor at Columbia University and Internet Hall of Fame
https://twitter.com/sece5
“Particularly over the last five years, we have seen the growth of technologies of reputation, identity, and collaborative moderation. Newspapers that initially rejected comment streams because of their tendency of toxicity now embrace them. YouTube, once a backwater of horrible commentary, has been tamed. While there are still spaces for nasty commentary and activities, they are becoming destinations that are sought out by interested participants rather than the default.”
Alexander Halavais, director, MA in social technologies at Arizona State University
https://twitter.com/halavais
“Already automation (creating social bots on social media platforms) is amplifying the voices of the bad people most of the time. Terrorist organizations are able to recruit many young people through these platforms... Privacy and anonymity are double-edged swords online.”
Norah Abokhodair, information privacy researcher at the University of Washington
https://twitter.com/Norahak
“Humans universally respond to anger and fear. For balanced dialogue, this is a challenging combination.”
Susan Mernit, CEO and co-founder at Hack the Hood
https://twitter.com/susanmernit
“Reputation systems will evolve to diminish the negative impact that bad actors have on online public discourse, and will become as broadly and silently available as effective spam systems have become over the last decade.”
Robert Matney, COO at Polycot Associates
https://twitter.com/rmatney
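For illustration only (not from the report): a minimal Python sketch of how such a reputation layer might weigh community ratings quietly in the background, the way a spam filter weighs evidence. The decay half-life and visibility cutoff are invented for the example.

    import time

    DECAY_HALF_LIFE = 30 * 24 * 3600   # hypothetical: ratings lose half their weight in 30 days
    DOWNRANK_BELOW = -0.5              # hypothetical visibility cutoff

    def reputation(ratings, now=None):
        """Weighted average of community ratings (+1 helpful, -1 abusive),
        with older ratings counting for less."""
        now = time.time() if now is None else now
        num = den = 0.0
        for value, timestamp in ratings:   # value is +1 or -1
            weight = 0.5 ** ((now - timestamp) / DECAY_HALF_LIFE)
            num += value * weight
            den += weight
        return num / den if den else 0.0

    def should_downrank(ratings) -> bool:
        """Silently reduce the reach of accounts the community consistently flags."""
        return reputation(ratings) < DOWNRANK_BELOW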
"There is a great interest in helping to create 'safe' or self-regulating communities through the development of metrics of mutual ratification. However at the same time, there will be an enlargement in the opportunities and modes for engagement, through annotation or development of new forums, and these will be characterized by the same range of human discourse as we see now.”
Peter Brantley, director of online strategy at the University of California-Davis
https://twitter.com/naypinya
“Currently the common Western ideology is very much focused on individuals and the right to do whatever technologies allow us to do—the problem is that it might not be a very good approach from the perspective of humankind as a whole. More-focused ideas of what we would like human society to be as a whole would be much needed. The technology comes first after that.”
Isto Huvila, a professor at Uppsala University in Sweden
https://twitter.com/ihuvila
“We may have augmented-reality apps that help gauge whether assertions are factually correct, or flag logical fallacies, etc. Rather than just argue back and forth I imagine we’ll invite bots into the conversation to help sort out the arguments and tie things to underlying support data, etc.”
Ryan Hayes, owner of Fit to Tweet
https://twitter.com/fittotweet
“Humanity’s reaction to negative forces will likely contribute more to the ever-narrowing filter bubble, which will continue to create an online environment that lacks inclusivity... An increased demand for systemic internet-based AI will create bots that will begin to interact—as proxies for the humans that train them—with humans online in real-time."
Lisa Heinz, a doctoral student at Ohio University
"We will see bots become part of the filter bubble phenomenon as a sort of mental bodyguard that prevents an intrusion of people and conversations to which individuals want no part. The unfortunate aspect of this iteration of the filter bubble means that while free speech itself will not be affected, people will project their voices into the chasm, but few will hear them.”
Lisa Heinz, a doctoral student at Ohio University
"My fear is that because of the virtually unlimited opportunities for negative use of social media globally we will experience a rising worldwide demand for restrictive regulation. This response may work against support of free speech in the U.S."
Paula Hooper Mayhew, a professor of humanities at Fairleigh Dickinson University
"Donald Trump is demonstrating to other politicians how to effectively exploit such an environment. He wasn’t the first to do it, by far. But he’s showing how very high-profile, powerful people can adapt and apply such strategies to social media. Basically, we’re moving out of the ‘early adopter’ phase of online polarization, into making it mainstream."
Seth Finkelstein, writer and pioneering computer programmer
"Many of the worst excesses come from people who believe in their own minds that they are not bad actors at all, but are fighting a good fight for all which is right and true (indeed, in many cases, both sides of a conflict can believe this, and where you stand depends on where you sit)."
Seth Finkelstein, writer and pioneering computer programmer
"When reward systems favor outrage-mongering and attention-seeking almost exclusively, nothing is going to be solved by inveighing against supposed moral degenerates.”
Seth Finkelstein, writer and pioneering computer programmer
"Conversations are shaped by norms and what the environment enables. For example, seating 100 dinner guests at one long table will shape the conversations differently than putting them at ten tables of ten, or 25 tables of four. The acoustics of the room will shape the conversations. Assigning seats or not will shape the conversations. Even serving wine instead of beer may shape the conversations."
David Weinberger, senior researcher at Harvard's Berkman Klein Center for Internet & Society
"[The Internet's] global nature means that we have fewer shared norms, and its digital nature means that we have far more room to play with ways of bringing people together. We’re getting much better at nudging conversations into useful interchanges. I believe we will continue to get better at it.”
David Weinberger, senior researcher at Harvard's Berkman Klein Center for Internet & Society
“Currently, online discourse is becoming more polarized and thus more extreme, mirroring the overall separation of people with differing viewpoints in the larger U.S. population. Simultaneously, several of the major social media players have been unwilling or slow to take action to curb organized harassment."
Alice Marwick, a fellow at Data & Society
"The marketplace of online attention encourages so-called ‘clickbait’ articles and sensationalized news items that often contain misinformation or disinformation, or simply lack rigorous fact-checking."
Alice Marwick, a fellow at Data & Society
"Without structural changes in both how social media sites respond to conflict, and the economic incentives for spreading inaccurate or sensational information, extremism and therefore conflict will continue."
Alice Marwick, a fellow at Data & Society
"The geographical and psychological segmentation of the U.S. population into ‘red’ and ‘blue’ neighborhoods, communities, and states is unlikely to change. It is the latter that gives rise to overall political polarization, which is reflected in the incivility of online discourse.”
Alice Marwick, a fellow at Data & Society
“Some company will get rich by offering registered pseudonyms, so that individuals may wander the Web ‘anonymously’ and yet vouched-for and accountable for bad behavior. When this happens, almost all legitimate sites will ban the unvouched anonymous.”
David Brin, leader at UCSD's Arthur C. Clarke Center and author of The Transparent Society
“Communications in any medium (the internet being but one example) reflects the people communicating. If those people use profane language, are misogynistic, judge people on irrelevant factors such as race, gender, creed, or other such factors in other parts of their lives, they will do so in any medium of communication."
Fred Baker, fellow at Cisco Systems
"If we worry about the youth of our age ‘going to the dogs,’ are we so different from our ancestors? In Book III of Odes, circa 20 BC, Horace wrote: ‘Our sires’ age was worse than our grandsires. We, their sons, are more worthless than they; so in our turn we shall give the world a progeny yet more corrupt.’"
Fred Baker, fellow at Cisco Systems
"The human race is not doomed, not today any more than in Horace’s day. But we have the opportunity to choose to lead them to more noble pursuits, and more noble discussion of them.”
Fred Baker, fellow at Cisco Systems
"The pressure for more transparency and authenticity that comes with increasing connectivity and flow of information will tend to make life more difficult for the trolls…. Privacy will yield to ‘publicy’ in knowledge economy of abundance…. What we need is Network Publicy Governance instead of market individualism and bureaucratic hierarchies.”
David Krieger, director of the Institute for Communication & Leadership at IKF in Lucerne
“It seems clear—at least in the U.S.—that 'bad actors,' children of all ages who have never been effectively taught civility and cooperation, are becoming more and more free to 'enjoy' sharing the worst of their 'social' leanings.”
Jim Warren, internet pioneer and longtime technology entrepreneur and activist
“I expect digital public discourse to skew more negative for several reasons, including: the polarization of the country; the rise of websites, Twitter accounts, and Facebook pages dedicated to portraying an opponent in a bad light; and the awful online trolling and harassment of women who are active in social media."
Jan Schaffer, executive director at J-Lab
"It is possible that technological solutions may be developed to assign crowdsourced reputation values for what is posted online. This, in my opinion, will not stop people from consuming and re-posting information of low value provided it conforms with their way of thinking.”
Daniel Menasce, professor of computer science at George Mason University
"Perhaps we will see the development of social media sites with stringent requirements for traceable identity. These systems may have to demand evidence of real-world identity and impose strong (e.g., multifactor) authentication of identity. Even so, malefactors will continue to elude even the best of the attempts to enforce consequences for bad behavior.”
M.E. Kabay, a professor of computer information systems at Norwich University
“Most dangerous is the emerging monopoly-like power of Facebook and Google to impose their own censorship norms, hundreds of thousands of times… This is just one reason to reduce their market dominance by making sure others can take market share through interoperability and users’ ability to take their data (social graph) to new services.”
Dave Burstein, editor at fastnet.news
“The mass media encourages negative and hateful speech by shifting the bulk of their media coverage to hot-button click-bait.”
Jesse Drew, professor of digital media at the University of California-Davis
“I see the forces within the market, with Facebook in particular pushing us toward narrower and narrower spheres of interaction. My sense is that 'widespread demand' will be seen as re-affirming that push by social platforms.”
Oscar Gandy, emeritus professor of communication at the University of Pennsylvania
“The continued expansion of sale of personal data by social media platforms and browser companies is bound to expand to distasteful and perhaps criminal activities based on the availability of greater amounts of information about individuals and their relationships.”
Ian Peter, Internet pioneer and historian based in Australia
“The internet will serve as a reflection of society, good and bad… The Internet echo chamber allows for extreme political or social positions to gain hold. [In] online communities… there is no need for moderation or listening to different points of view; if you don't like what you're reading, you can leave; there is no loyalty. In an actual community... one is more likely to compromise, see differing viewpoints.”
Tse-Sung Wu, project portfolio manager at Genentech
From: Sarvjeet Singh <sarvjeet.singh at nludelhi.ac.in>
Reply-To: Sarvjeet Singh <sarvjeet.singh at nludelhi.ac.in>
Date: Tuesday, March 28, 2017 at 4:30 PM
To: bestbits <bestbits at lists.bestbits.net>
Subject: [bestbits] (RightsCon) Confronting Online Hate Speech: Identification and Strategies (Tomorrow – 10.30 am)
Dear All,
We would like to invite you to a discussion on Confronting Online Hate Speech: Identification and Strategies organised by the Centre for Communication Governance at National Law University, Delhi on March 29, 2017 (tomorrow) from 10:30 - 11:45 am at the Klimt (Ground Floor).
The session is the launch point (the first scheduled session) for Track 13 - Stop the Hate. The Stop the Hate track will focus on countering violent extremism and responding to hate speech.
The hashtag for the session is #UnPackingHateSpeech. Additional details about the session and the speakers are provided below.
We look forward to seeing you tomorrow.
Best,
Sarvjeet
A DISCUSSION ON
Confronting Online Hate Speech: Identification & Strategies
#UnPackingHateSpeech
10:30 – 11:45 am, March 29, 2017
at
Klimt
organised by
Centre for Communication Governance at National Law University, Delhi
Speakers
Professor David Kaye, UN Special Rapporteur on the Right to Freedom of Opinion and Expression and Clinical Professor of Law, UC Irvine School of Law
Dr. Rob Faris, Research Director, Berkman Klein Center for Internet and Society at Harvard University (moderator)
Siddharth Narrain, Research Associate, Sarai - Centre for the Study of Developing Societies (CSDS)
Professor Susan Benesch, Faculty Associate, Berkman Klein Center for Internet and Society at Harvard University and Director, Dangerous Speech Project
Professor (Dr.) Wolfgang Schulz, Director, Hans Bredow Institute and Chair, Council of Europe Committee of Experts on Internet Intermediaries
Chinmayi Arun, Executive Director, Centre for Communication Governance at National Law University, Delhi and Faculty Associate, Berkman Klein Center for Internet and Society at Harvard University
--
Sarvjeet Singh | Programme Manager
Centre for Communication Governance | National Law University, Delhi | Sector-14, Dwarka, New Delhi - 110078 | Cell: (+91) 999-023-2298 | Fax: (+91) 11-280-34256 | skype: sarvjeet.moond | www.ccgdelhi.org . www.ccgtlr.org . www.nludelhi.ac.in | Twitter: @ccgnlud . @sarvjeetmoond
PGP: 0x44F8 0166 | Fingerprint: 213D AC8B 5634 B8A2 EDB6 A78D 543B 11EF 44F8 0166