Five Pieces You Should Probably Read On: Fake News and Filter Bubbles
https://ensr.oii.ox.ac.uk/five-pieces-you-should-probably-read-on-fake-news-and-filter-bubbles/
27 January 2017

This is the second post in a series that will uncover great writing by faculty and students at the Oxford Internet Institute, things you should probably know, and things that deserve to be brought out for another viewing. This week: Fake News and Filter Bubbles!

Fake news, post-truth, “alternative facts”, filter bubbles — this is the news and media environment we apparently now inhabit, and that has formed the fabric and backdrop of Brexit (“£350 million a week”) and Trump (“This was the largest audience to ever witness an inauguration — period”). Do social media divide us, hide us from each other? How aware are you of what content is personalised for you, and of what you’re not seeing? How much can we do with machine-automated or crowd-sourced verification of facts? And are things really any worse now than when Bacon complained in 1620 about the false notions that “are now in possession of the human understanding, and have taken deep root therein”?


1. Bernie Hogan: How Facebook divides us [Times Literary Supplement]

27 October 2016 / 1000 words / 5 minutes

“Filter bubbles can create an increasingly fractured population, such as the one developing in America. For the many people shocked by the result of the British EU referendum, we can also partially blame filter bubbles: Facebook literally filters our friends’ views that are least palatable to us, yielding a doctored account of their personalities.”

Bernie Hogan says it’s time Facebook considered ways to use the information it has about us to bring us together across political, ideological and cultural lines, rather than hide us from each other or push us into polarized and hostile camps. He says it’s not only possible for Facebook to help mitigate the issues of filter bubbles and context collapse; it’s imperative, and it’s surprisingly simple.


2. Luciano Floridi: Fake news and a 400-year-old problem: we need to resolve the ‘post-truth’ crisis [the Guardian]

29 November 2016 / 1000 words / 5 minutes

“The internet age made big promises to us: a new period of hope and opportunity, connection and empathy, expression and democracy. Yet the digital medium has aged badly because we allowed it to grow chaotically and carelessly, lowering our guard against the deterioration and pollution of our infosphere. […] some of the costs of misinformation may be hard to reverse, especially when confidence and trust are undermined. The tech industry can and must do better to ensure the internet meets its potential to support individuals’ wellbeing and social good.”

The Internet echo chamber satiates our appetite for pleasant lies and reassuring falsehoods, and has become the defining challenge of the 21st century, says Luciano Floridi. So far, the strategy for technology companies has been to deal with the ethical impact of their products retrospectively, but this is not good enough, he says. We need to shape and guide the future of the digital, and stop making it up as we go along. It is time to work on an innovative blueprint for a better kind of infosphere.


3. Philip Howard: Facebook and Twitter’s real sin goes beyond spreading fake news

3 January 2017 / 1000 words / 5 minutes

“With the data at their disposal and the platforms they maintain, social media companies could raise standards for civility by refusing to accept ad revenue for placing fake news. They could let others audit and understand the algorithms that determine who sees what on a platform. Just as important, they could be the platforms for doing better opinion, exit and deliberative polling.”

Only Facebook and Twitter know how pervasive fabricated news stories and misinformation campaigns have become during referendums and elections, says Philip Howard — and allowing fake news and computational propaganda to target specific voters is an act against democratic values. But in a time of weakening polling systems, withholding data about public opinion is actually their major crime against democracy, he says.


4. Brent Mittelstadt: Should there be a better accounting of the algorithms that choose our news for us?

7 December 2016 / 1800 words / 8 minutes

“Transparency is often treated as the solution, but merely opening up algorithms to public and individual scrutiny will not in itself solve the problem. Information about the functionality and effects of personalisation must be meaningful to users if anything is going to be accomplished. At a minimum, users of personalisation systems should be given more information about their blind spots, about the types of information they are not seeing, or where they lie on the map of values or criteria used by the system to tailor content to users.”

A central ideal of democracy is that political discourse should allow a fair and critical exchange of ideas and values. But political discourse is unavoidably mediated by the mechanisms and technologies we use to communicate and receive information, says Brent Mittelstadt. And content personalisation systems and the algorithms they rely upon create a new type of curated media that can undermine the fairness and quality of political discourse.


5. Heather Ford: Verification of crowd-sourced information: is this ‘crowd wisdom’ or machine wisdom?

19 November 2013 / 1400 words / 6 minutes

“A key question being asked in the design of future verification mechanisms is the extent to which verification work should be done by humans or non-humans (machines). Here, verification is not a binary categorisation, but rather there is a spectrum between human and non-human verification work, and indeed, projects like Ushahidi, Wikipedia and Galaxy Zoo have all developed different verification mechanisms.”

‘Human’ verification, the checking of whether a particular report meets a group’s truth standards, is an acutely social process, says Heather Ford. If code is law, and if other aspects besides code determine how we can act in the world, then it is important to understand the contexts in which code is deployed. Verification is a practice that determines how we can trust information coming from a variety of sources — only by illuminating such practices, and the variety of impacts that code can have in different environments, can we begin to understand how code regulates our actions in crowdsourcing environments.


.. and just to prove we’re capable of understanding and acknowledging and assimilating multiple viewpoints on complex things, here’s Helen Margetts, with a different slant on filter bubbles: “Even if political echo chambers were as efficient as some seem to think, there is little evidence that this is what actually shapes election results. After all, by definition echo chambers preach to the converted. It is the undecided people who (for example) the Leave and Trump campaigns needed to reach. And from the research, it looks like they managed to do just that.”


The Authors

Bernie Hogan is a Research Fellow at the OII; his research interests lie at the intersection of social networks and media convergence.

Luciano Floridi is the OII’s Professor of Philosophy and Ethics of Information. His research areas are the philosophy of information, information and computer ethics, and the philosophy of technology.

Philip Howard is the OII’s Professor of Internet Studies. He investigates the impact of digital media on political life around the world.

Brent Mittelstadt is an OII Postdoc. His research interests include the ethics of information handled by medical ICT, theoretical developments in discourse and virtue ethics, and the epistemology of information.

Heather Ford completed her doctorate at the OII, where she studied how Wikipedia editors write history as it happens. She is now a University Academic Fellow in Digital Methods at the University of Leeds. Her forthcoming book “Fact Factories: Wikipedia’s Quest for the Sum of All Human Knowledge” will be published by MIT Press.

Helen Margetts is the OII’s Director, and Professor of Society and the Internet. She specialises in digital era government, politics and public policy, and data science and experimental methods. Her most recent book is Political Turbulence (Princeton).


Coming up! .. It’s the economy, stupid / Augmented reality and ambient fun / The platform economy / Power and development / Internet past and future / Government / Labour rights / The disconnected / Ethics / Staying critical

Five Pieces You Should Probably Read On: The US Election
https://ensr.oii.ox.ac.uk/five-pieces-you-should-probably-read-on-the-us-election/
20 January 2017

This is the first post in a series that will uncover great writing by faculty and students at the Oxford Internet Institute, things you should probably know, and things that deserve to be brought out for another viewing. This week: The US Election.

This was probably the nastiest Presidential election in recent memory: awash with Twitter bots and scandal, polarisation and filter bubbles, accusations of interference by Russia and the Director of the FBI, and another shock result. We have written about electoral prediction elsewhere: instead, here are five pieces that consider the interaction of social media and democracy — the problems, but also potential ways forward.


1. James Williams: The Clickbait Candidate

10 October 2016 / 2700 words / 13 minutes

“Trump is very straightforwardly an embodiment of the dynamics of clickbait: he is the logical product (though not endpoint) in the political domain of a media environment designed to invite, and indeed incentivize, relentless competition for our attention […] Like clickbait or outrage cascades, Donald Trump is merely the sort of informational packet our media environment is designed to select for.”

James Williams says that now is probably the time to have that societal conversation about the design ethics of the attention economy — because in our current media environment, attention trumps everything.


2. Sam Woolley, Philip Howard: Bots Unite to Automate the Presidential Election [Wired]

15 May 2016 / 850 words / 4 minutes

“Donald Trump understands minority communities. Just ask Pepe Luis Lopez, Francisco Palma, and Alberto Contreras […] each tweeted in support of Trump after his victory in the Nevada caucuses earlier this year. The problem is, Pepe, Francisco, and Alberto aren’t people. They’re bots.”

It’s no surprise that automated spam accounts (or bots) are creeping into election politics, say Sam Woolley and Philip Howard. Demanding bot transparency would at least help clean up social media — which, for better or worse, is increasingly where presidents get elected.


3. Phil Howard: Is Social Media Killing Democracy?

15 November 2016 / 1100 words / 5 minutes

“This is the big year for computational propaganda — using immense data sets to manipulate public opinion over social media. Both the Brexit referendum and US election have revealed the limits of modern democracy, and social media platforms are currently setting those limits […] these technologies permit too much fake news, encourage our herding instincts, and aren’t expected to provide public goods.”

Phil Howard discusses ways to address fake news, audit social algorithms, and deal with social media’s “moral pass” — social media is damaging democracy, he says, but can also be used to save it.


4. Helen Margetts: Don’t Shoot the Messenger! What part did social media play in the 2016 US election?

15 November 2016 / 600 words / 3 minutes

“Rather than seeing social media solely as the means by which Trump ensnared his presidential goal, we should appreciate how they can provide a wealth of valuable data to understand the anger and despair that the polls missed, and to analyse political behaviour and opinion in the times ahead.”

New social information and visibility bring change to social behaviour, says Helen Margetts, ushering in political turbulence and unpredictability. Social media made visible what might otherwise have remained a country’s dark secret (hatred of women, rampant racism, etc.), but it will also underpin any radical counter-movement that emerges in the future.


5. Helen Margetts: Of course social media is transforming politics. But it’s not to blame for Brexit and Trump

9 January 2017 / 1700 words / 8 minutes

“Even if political echo chambers were as efficient as some seem to think, there is little evidence that this is what actually shapes election results. After all, by definition echo chambers preach to the converted. It is the undecided people who (for example) the Leave and Trump campaigns needed to reach. And from the research, it looks like they managed to do just that.”

Politics is a lot messier in the social media era than it used to be, says Helen Margetts, but rather than blaming social media for undermining democracy, we should be thinking about how we can improve the (inevitably major) part that it plays.


The Authors

James Williams is an OII doctoral candidate, studying the ethics of attention and persuasion in technology design.

Sam Woolley is a Research Assistant on the OII’s Computational Propaganda project; he is interested in political bots, and the intersection of political communication and automation.

Philip Howard is the OII’s Professor of Internet Studies and PI of the Computational Propaganda project. He investigates the impact of digital media on political life around the world.

Helen Margetts is the OII’s Director, and Professor of Society and the Internet. She specialises in digital era government, politics and public policy, and data science and experimental methods. Her most recent book is Political Turbulence (Princeton).


Coming up .. Fake news and filter bubbles / It’s the economy, stupid / Augmented reality and ambient fun / The platform economy / Power and development / Internet past and future / Government / Labour rights / The disconnected / Ethics / Staying critical

#5OIIPieces
