- Jun 2024
-
disruptedjournal.postdigitalcultures.org
-
In this respect, we join Fitzpatrick (2011) in exploring “the extent to which the means of media production and distribution are undergoing a process of radical democratization in the Web 2.0 era, and a desire to test the limits of that democratization”
Comment by chrisaldrich: Something about this is reminiscent of WordPress' mission to democratize publishing. We can also compare it to Facebook, whose stated mission is to connect people, while its actual mission is to make money, seemingly by radicalizing people to the extremes of our political spectrum.
This highlights the fact that while many view content moderation on platforms like Facebook, such as removing voices or deplatforming people like Donald J. Trump or Alex Jones, as an anti-democratic move, it is not. Because Facebook actively accelerates extreme ideas by pushing them algorithmically, the platform itself is acting un-democratically. Democratic behavior on Facebook would look like one voice, one account, and reach commensurate only with that person's standing in real life. Instead, the algorithmic timeline gives far outsized influence and reach to some of the most extreme voices on the platform, which is patently un-democratic.
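A minimal sketch of that dynamic, with every number and the engagement curve invented purely for illustration (this is not Facebook's actual ranking): under a chronological feed, reach tracks follower count, i.e. standing; under an engagement-ranked feed where predicted engagement is assumed to rise with how extreme a post is, the least-followed but most extreme account ends up with the widest distribution.

```python
# Illustrative toy model, not Facebook's actual ranking: compare the reach a
# post gets under a follower-only (chronological) feed versus an
# engagement-ranked feed whose predicted engagement (by assumption here)
# grows with how extreme the post is.

users = [
    # (name, followers, extremity in [0, 1]) -- all values are hypothetical
    ("moderate_a", 5000, 0.10),
    ("moderate_b", 4000, 0.20),
    ("partisan_c", 1500, 0.70),
    ("extremist_d", 800, 0.95),
]

def chronological_reach(followers, extremity):
    # Reach commensurate with real-world standing: just the follower count.
    return followers

def engagement_ranked_reach(followers, extremity, audience=1_000_000):
    # Assumption: the ranker amplifies posts in proportion to predicted
    # engagement, and predicted engagement rises sharply with extremity.
    predicted_engagement = 0.01 + 0.20 * extremity ** 2
    return int(followers + audience * predicted_engagement)

for name, followers, extremity in users:
    print(f"{name:12s} chronological={chronological_reach(followers, extremity):>7,} "
          f"engagement-ranked={engagement_ranked_reach(followers, extremity):>8,}")
```

Under the assumed curve, the account with the fewest followers gets more than ten times the reach of the best-followed one, which is exactly the inversion of "reach commensurate with standing" the comment objects to.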
-
- Aug 2023
-
hackernoon.com
-
- for: tilting elections, voting - social media, voting - search engine bias, SEME, search engine manipulation effect, Robert Epstein
- summary
- research showing how search engine results can be biased towards a political candidate and can tilt an election in favor of a particular party.
-
In our early experiments, reported by The Washington Post in March 2013, we discovered that Google’s search engine had the power to shift the percentage of undecided voters supporting a political candidate by a substantial margin without anyone knowing.
- for: search engine manipulation effect, SEME, voting, voting - bias, voting - manipulation, voting - search engine bias, democracy - search engine bias, quote, quote - Robert Epstein, quote - search engine bias, stats, stats - tilting elections
- paraphrase
- quote
- In our early experiments, reported by The Washington Post in March 2013,
- we discovered that Google’s search engine had the power to shift the percentage of undecided voters supporting a political candidate by a substantial margin without anyone knowing.
- 2015 PNAS research on SEME
- http://www.pnas.org/content/112/33/E4512.full.pdf?with-ds=yes&ref=hackernoon.com
- stats begin
- search results favoring one candidate
- could easily shift the opinions and voting preferences of real voters in real elections by up to 80 percent in some demographic groups
- with virtually no one knowing they had been manipulated.
- stats end
- Worse still, the few people who had noticed that we were showing them biased search results
- generally shifted even farther in the direction of the bias,
- so being able to spot favoritism in search results is no protection against it.
- stats begin
- Google’s search engine
- with or without any deliberate planning by Google employees
- was currently determining the outcomes of upwards of 25 percent of the world’s national elections.
- This is because Google’s search engine lacks an equal-time rule,
- so it virtually always favors one candidate over another, and that in turn shifts the preferences of undecided voters.
- Because many elections are very close, shifting the preferences of undecided voters can easily tip the outcome.
- stats end
-
What if, early in the morning on Election Day in 2016, Mark Zuckerberg had used Facebook to broadcast “go-out-and-vote” reminders just to supporters of Hillary Clinton? Extrapolating from Facebook’s own published data, that might have given Mrs. Clinton a boost of 450,000 votes or more, with no one but Mr. Zuckerberg and a few cronies knowing about the manipulation.
- for: Hillary Clinton could have won, voting, democracy, voting - social media, democracy - social media, election - social media, facebook - election, 2016 US elections, 2016 Trump election, 2016 US election, 2016 US election - different results, 2016 election - social media
- interesting fact
- If Facebook had sent a "go out and vote" message to Clinton supporters on Election Day 2016, Clinton might have received a boost of 450,000 or more additional votes
- and the outcome of the election might have been different (a rough back-of-envelope version of this extrapolation follows below)
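A rough back-of-envelope version of that extrapolation. The baseline figures come from Facebook's own published 2010 election-day experiment (Bond et al., Nature 2012: an "I Voted" banner shown to roughly 61 million users was estimated to have produced about 340,000 additional votes); the number of Clinton-leaning users targeted is a purely hypothetical assumption, chosen only to show how the arithmetic lands in the ballpark Epstein cites.

```python
# Back-of-envelope extrapolation of the scenario quoted above.

# Published baseline (Bond et al., Nature 2012, Facebook's 2010 experiment):
# ~340,000 extra votes attributed to an "I Voted" banner shown to ~61M users.
baseline_extra_votes = 340_000
baseline_audience = 61_000_000
votes_per_exposed_user = baseline_extra_votes / baseline_audience  # ~0.0056

# Hypothetical assumption: the reminder is shown only to Clinton-leaning US
# users, guessed here at 80 million. This figure is illustrative, not sourced.
targeted_users = 80_000_000

extra_votes = targeted_users * votes_per_exposed_user
print(f"Estimated extra votes for the targeted candidate: {extra_votes:,.0f}")
# -> roughly 446,000, in the same ballpark as the "450,000 or more" quoted above
```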
Tags
- democracy - search engine bias
- 2016 US election
- Washington Post story - search engine bias
- search engine bias
- 2016 US election - different results
- search engine manipulation effect
- elections - bias
- voting - social media
- election - social media
- voting
- facebook - election
- democracy - social media
- quote - search engine bias
- Robert Epstein
- PNAS SEME study
- quote - Robert Epstein
- Trump could have lost
- democracy
- quote
- voting - search engine bias
- SEME
- stats - tilting elections
- stats
- elections - interference
- Hillary Clinton could have won
-
- Dec 2022
-
zephoria.medium.com
-
A lot has changed about our news media ecosystem since 2007. In the United States, it’s hard to overstate how the media is entangled with contemporary partisan politics and ideology. This means that information tends not to flow across partisan divides in coherent ways that enable debate.
Our media and social media systems, along with the people who use them, have been structured such that debate is stifled because information doesn't flow coherently across the partisan divide.
-
- May 2022
- Mar 2022
-
www.cs.umd.edu
-
The current mass media such as television, books, and magazines are one-directional, and are produced by a centralized process. This can be positive, since respected editors can filter material to ensure consistency and high quality, but more widely accessible narrowcasting to specific audiences could enable livelier decentralized discussions. Democratic processes for presenting opposing views, caucusing within factions, and finding satisfactory compromises are productive for legislative, commercial, and scholarly pursuits.
Social media has to some extent democratized access to media; however, there are not nearly enough processes for creating the negative feedback needed to dampen ideas that shouldn't or wouldn't have gained footholds in a mass society.
We need more friction in some portions of the social media space to prevent the dissemination of un-useful, negative, and destructive ideas from swamping out the positive ones. The accelerative force that algorithmic feeds give to the most extreme ideas in particular has been one of the most caustic developments of the last quarter century.
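A minimal cascade sketch of that friction point, with all parameters invented for illustration: each view of a post reshares it with some probability, and "friction" (a read-before-share prompt, an interstitial, a rate limit) simply lowers that probability. A small reduction is the difference between a cascade that grows exponentially and one that dies out.

```python
# Toy reshare cascade: how per-share "friction" changes how far a post
# spreads. All parameters are illustrative assumptions.

def cascade_views(p_share, followers_per_share=20, generations=12):
    """Expected total views when each view reshares with probability p_share."""
    total, new = 0.0, 1.0
    for _ in range(generations):
        total += new
        new *= p_share * followers_per_share  # branching factor per generation
    return total

frictionless = cascade_views(p_share=0.08)   # branching factor 1.6: grows
with_friction = cascade_views(p_share=0.04)  # branching factor 0.8: dies out

print(f"without friction: ~{frictionless:,.0f} views")
print(f"with friction:    ~{with_friction:,.0f} views")
```

Halving the reshare probability pushes the branching factor below one, so the same post reaches a handful of people instead of hundreds; that is the kind of dampening negative feedback the note above is asking for.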
-
- Feb 2022
-
www.joinexpeditions.com
-
Democracy in the age of social media. (n.d.). EXPeditions - Meet the World’s Best Minds. Retrieved February 5, 2022, from https://www.joinexpeditions.com/exps/43
-
- Sep 2021
-
www.nature.com
-
Zuckerman, E. (2021). Demand five precepts to aid social-media watchdogs. Nature, 597(7874), 9–9. https://doi.org/10.1038/d41586-021-02341-9
-
- Aug 2021
-
-
Fukuyama's work, which draws on both competition analysis and an assessment of threats to democracy, joins a growing body of proposals that also includes Mike Masnick's "protocols not platforms," Cory Doctorow's "adversarial interoperability," my own "Magic APIs," and Twitter CEO Jack Dorsey's "algorithmic choice."
Nice overview of current work on fixing the monopoly problem in the social media space. I hadn't heard about Fukuyama's or Daphne Keller's versions before.
I'm not sure Dorsey's is actually a thing. I suspect it has been vaporware from the word go.
IndieWeb has been working slowly at the problem as well.
-
- Mar 2021
-
www.theatlantic.com
-
There's a reasonably good overview here of some ideas for fixing the harms social media is doing to democracy, and it's well framed by history.
Much of it appears to be a synopsis from the perspective of someone who has only managed to attend Pariser and Stroud's recent Civic Signals/New_Public Festival.
It could have touched on other research in the social space, including work in the Activity Streams and IndieWeb communities, to provide some alternate viewpoints.
-
Tang has sponsored the use of software called Polis, invented in Seattle. This is a platform that lets people make tweet-like, 140-character statements, and lets others vote on them. There is no “reply” function, and thus no trolling or personal attacks. As statements are made, the system identifies those that generate the most agreement among different groups. Instead of favoring outrageous or shocking views, the Polis algorithm highlights consensus. Polis is often used to produce recommendations for government action.
An example of social media for proactive government action.
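A sketch of the kind of group-aware consensus ranking described above, under simplified assumptions: agree/disagree votes on short statements (as in the real Polis), but with the opinion groups given by hand rather than inferred from the vote matrix as Polis actually does (it clusters participants, roughly PCA plus k-means). Each statement is scored by its agreement rate in the least-agreeing group, so only statements that every group supports rise to the top.

```python
# Group-aware consensus ranking in the spirit of the Polis description above.
# Votes: 1 = agree, -1 = disagree. Group labels are hard-coded here; the real
# system infers them by clustering the vote matrix.

votes = {
    # participant: (opinion_group, {statement_id: vote})
    "p1": ("A", {"s1": 1, "s2": 1, "s3": -1}),
    "p2": ("A", {"s1": 1, "s2": 1, "s3": -1}),
    "p3": ("B", {"s1": 1, "s2": -1, "s3": 1}),
    "p4": ("B", {"s1": 1, "s2": -1, "s3": 1}),
}

def consensus_ranking(votes):
    # Tally agree counts per (group, statement).
    tallies = {}  # (group, statement) -> [agrees, total]
    for group, ballot in votes.values():
        for statement, vote in ballot.items():
            agrees, total = tallies.setdefault((group, statement), [0, 0])
            tallies[(group, statement)] = [agrees + (vote == 1), total + 1]

    # Consensus score = agreement rate in the *least* agreeing group.
    statements = {s for (_, s) in tallies}
    scores = {
        s: min(a / t for (g, s2), (a, t) in tallies.items() if s2 == s)
        for s in statements
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(consensus_ranking(votes))
# -> s1 ranks first (both groups agree on it); s2 and s3 are divisive and sink.
```

Because the model has no reply function either, nothing here rewards outrage; the only way for a statement to score well is to attract agreement across groups.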
-
-
journal.disruptivemedia.org.uk
-
In this respect, we join Fitzpatrick (2011) in exploring “the extent to which the means of media production and distribution are undergoing a process of radical democratization in the Web 2.0 era, and a desire to test the limits of that democratization”
Something about this is reminiscent of WordPress' mission to democratize publishing. We can also compare it to Facebook, whose stated mission is to connect people, while its actual mission is to make money, seemingly by radicalizing people to the extremes of our political spectrum.
This highlights the fact that while many view content moderation on platforms like Facebook, such as removing voices or deplatforming people like Donald J. Trump or Alex Jones, as an anti-democratic move, it is not. Because Facebook actively accelerates extreme ideas by pushing them algorithmically, the platform itself is acting un-democratically. Democratic behavior on Facebook would look like one voice, one account, and reach commensurate only with that person's standing in real life. Instead, the algorithmic timeline gives far outsized influence and reach to some of the most extreme voices on the platform, which is patently un-democratic.
-
- Oct 2020
-
www.hope-project.dk
-
The HOPE-project (http://hope-project.dk ) tracks public opinion during #covid19, sharing findings with the public & authorities. This graph is the most concerning yet: The # willing to use an approved COVID-vaccine recommended for them
-
- May 2020
-
-
@DFRLab. (2020, May 14). Op-Ed: The criminalization of COVID-19 clicks and conspiracies. Medium. https://medium.com/dfrlab/op-ed-the-criminalization-of-covid-19-clicks-and-conspiracies-3af077f5a7e7
-
- Dec 2019
-
www.theatlantic.com
-
Madison’s design has proved durable. But what would happen to American democracy if, one day in the early 21st century, a technology appeared that—over the course of a decade—changed several fundamental parameters of social and political life? What if this technology greatly increased the amount of “mutual animosity” and the speed at which outrage spread? Might we witness the political equivalent of buildings collapsing, birds falling from the sky, and the Earth moving closer to the sun?
Jonathan Haidt, you might have noticed, is a scholar I admire very much. In this piece, he and his colleague Tobias Rose-Stockwell ask the following question: Is social media a threat to our democracy? Let's read the article and think about their question together.
-
- Oct 2019
-
journals.sagepub.com
-
Differential Effects of Capital-Enhancing and Recreational Internet Use on Citizens’ Demand for Democracy
-
-
theconversation.com
-
Is internet freedom a tool for democracy or authoritarianism?
I chose this article because its topic speaks to our time. It also has a lot of great outbound links in case you want to learn more about the topics the authors discuss.
-
- Oct 2017
-
www.nytimes.com
-
Back in 2012, Zeynep Tufekci pointed out that election campaigns driven by big data and social media could be bad for democracy.
-
- Jul 2017
-
-
The backfire effect is getting turbocharged online. I think we’re getting more angry and convinced about everything, not because we’re surrounded by like-minded people, but by people who disagree with us. Social media allows you to find the worst examples of your opponents. It’s not a place to have your own views corroborated, but rather where your worst suspicions about the other lot can be quickly and easily confirmed.
-
- Apr 2017
-
idlewords.com
-
Obviously, in this situation whoever controls the algorithms has great power. Decisions like what is promoted to the top of a news feed can swing elections. Small changes in UI can drive big changes in user behavior. There are no democratic checks or controls on this power, and the people who exercise it are trying to pretend it doesn’t exist
-
On Facebook, social dynamics and the algorithms’ taste for drama reinforce each other. Facebook selects from stories that your friends have shared to find the links you’re most likely to click on. This is a potent mix, because what you read and post on Facebook is not just an expression of your interests, but part of a performative group identity.
So without explicitly coding for this behavior, we already have a dynamic where people are pulled to the extremes. Things get worse when third parties are allowed to use these algorithms to target a specific audience.
-
any system trying to maximize engagement will try to push users towards the fringes. You can prove this to yourself by opening YouTube in an incognito browser (so that you start with a blank slate), and clicking recommended links on any video with political content.
...
This pull to the fringes doesn’t happen if you click on a cute animal story. In that case, you just get more cute animals (an experiment I also recommend trying). But the algorithms have learned that users interested in politics respond more if they’re provoked more, so they provoke. Nobody programmed the behavior into the algorithm; it made a correct observation about human nature and acted on it.
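A toy version of the experiment described above, under the quoted assumption that users interested in politics respond more if they're provoked more: a simple epsilon-greedy bandit that only maximizes clicks, never sees an "extremity" label, and still drifts toward recommending the most provocative items.

```python
import random

# Toy engagement-maximizing recommender. Items have an extremity score in
# [0, 1]; the (assumed) user model clicks more often on more provocative
# items. The recommender sees only clicks, never the extremity score.

random.seed(0)
items = [i / 10 for i in range(11)]       # extremity 0.0 .. 1.0
clicks = {i: 1.0 for i in items}          # optimistic initial estimates
shows = {i: 1.0 for i in items}

def user_clicks(extremity):
    # Assumption matching the quoted claim: provocation drives response.
    return random.random() < 0.1 + 0.6 * extremity

history = []
for _ in range(5000):
    if random.random() < 0.1:                              # explore
        item = random.choice(items)
    else:                                                  # exploit best click rate
        item = max(items, key=lambda i: clicks[i] / shows[i])
    shows[item] += 1
    clicks[item] += user_clicks(item)
    history.append(item)

print("mean extremity, first 500 recommendations:", sum(history[:500]) / 500)
print("mean extremity, last 500 recommendations: ", sum(history[-500:]) / 500)
```

Nothing in the recommender refers to politics or extremity; the pull to the fringe falls out of maximizing the click signal alone, which is the observation the passage attributes to the algorithm.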
-
-
shareblue.com
-
Trump-Russia sock puppets posing as "Bernie bros" targeted women and black men with harassment on Twitter when they expressed support for Hillary Clinton.
-
- Dec 2016
-
-
http://digipo.io/doku.php
The Digital Polarization Initiative
"The primary purpose of this wiki is to provide a place for students to fact-check, annotate, and provide context to the different news stories that show up in their Twitter and Facebook feeds. It's like a student-driven Snopes, but with a broader focus: we don't aim to just investigate myths, but to provide context and sanity to all the news – from the article about voter fraud to the health piece on a new cancer treatment."
-
-
politicalbots.org
-
http://phys.org/news/2016-12-pro-trump-bot-colonised-pro-clinton-twitter.html
http://politicalbots.org/?p=787 Bots and Automation over Twitter during the U.S. Election
http://politicalbots.org/ An international research group studying political bots.
-
-
hybridpedagogy.org
-
The Web has become an insidious propaganda tool. To fight it, digital literacy education must rise beyond technical proficiency to include wisdom.
- Double-check every claim before you share.
- Be wary of casual scrolling. Everything you see affects your attitudes.
- Don't automatically disbelieve the surreal (or unpleasant).
- Do not exaggerate your own claims.
- Be prepared to repeat the truth over and over.
- Curate good resources, and share updates to them.
- It will reinforce the previous information.
- It will boost search engine rankings of the collection.
-
-
hapgood.us
-
A survey of voters asked if they remembered seeing a headline, and if so, whether they believed it was true.
It may come as no surprise that high percentages of Trump voters believed stories that favored Trump or demonized Clinton. But the percentage of Clinton voters who believed the fake stories was also fairly high!
familiarity equals truth: when we recognize something as true, we are most often judging if this is something we’ve heard more often than not from people we trust.
...
if you want to be well-informed it’s not enough to read the truth — you also must avoid reading lies.
-
-
www.theguardian.com
-
This is our internet. Not Google’s. Not Facebook’s. Not rightwing propagandists. And we’re the only ones who can reclaim it.
This is our nation, and our world. It is up to us to reclaim it.
-
- Nov 2016
-
www.npr.org
-
Interview with a man who has run several fake news sites since 2013.
Well, this isn't just a Trump-supporter problem. This is a right-wing issue.
...
We've tried to do similar things to liberals. It just has never worked, it never takes off. You'll get debunked within the first two comments and then the whole thing just kind of fizzles out.
-
-
-
Journalism faces an 'existential crisis' in the Trump era, Christine Amanpour
As all the international journalists we honor in this room tonight and every year know only too well: First the media is accused of inciting, then sympathizing, then associating -- until they suddenly find themselves accused of being full-fledged terrorists and subversives. Then they end up in handcuffs, in cages, in kangaroo courts, in prison
...
First, like many people watching where I was overseas, I admit I was shocked by the exceptionally high bar put before one candidate and the exceptionally low bar put before the other candidate.
It appeared much of the media got itself into knots trying to differentiate between balance, objectivity, neutrality, and crucially, truth.
...
The winning candidate did a savvy end run around us and used it to go straight to the people. Combined with the most incredible development ever -- the tsunami of fake news sites -- aka lies -- that somehow people could not, would not, recognize, fact check, or disregard.
...
The conservative radio host who may be the next White House press secretary says mainstream media is hostile to traditional values.
I would say it's just the opposite. And have you read about the "heil, victory" meeting in Washington, DC this past weekend? Why aren't there more stories about the dangerous rise of the far right here and in Europe? Since when did anti-Semitism stop being a litmus test in this country?
-
-
www.washingtonpost.com
-
Paul Horner publishes fake news that is often shared widely. He claims that his stories are intended to be taken as satire like The Onion.
Honestly, people are definitely dumber. They just keep passing stuff around. Nobody fact-checks anything anymore — I mean, that’s how Trump got elected. He just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn’t care because they’d already accepted it. It’s real scary. I’ve never seen anything like it.
My sites were picked up by Trump supporters all the time. I think Trump is in the White House because of me. His followers don’t fact-check anything — they’ll post everything, believe anything. His campaign manager posted my story about a protester getting paid $3,500 as fact. Like, I made that up. I posted a fake ad on Craigslist.
-
-
-
But as managing editor of the fact-checking site Snopes, Brooke Binkowski believes Facebook’s perpetuation of phony news is not to blame for our epidemic of misinformation. “It’s not social media that’s the problem,” she says emphatically. “People are looking for somebody to pick on. The alt-rights have been empowered and that’s not going to go away anytime soon. But they also have always been around.”
The misinformation crisis, according to Binkowski, stems from something more pernicious. In the past, the sources of accurate information were recognizable enough that phony news was relatively easy for a discerning reader to identify and discredit. The problem, Binkowski believes, is that the public has lost faith in the media broadly — therefore no media outlet is considered credible any longer. The reasons are familiar: as the business of news has grown tougher, many outlets have been stripped of the resources they need for journalists to do their jobs correctly.
The problem is not just social media and fake news. But most of the false stories do not come from mainstream media; the greatest evils of mainstream media are sensationalism and a willingness to spin stories the way their sources want.
-
-
www.npr.org
-
But a former employee, Antonio Garcia-Martinez, disagrees and says his old boss is being "more than a little disingenuous here."
...
"There's an entire political team and a massive office in D.C. that tries to convince political advertisers that Facebook can convince users to vote one way or the other," Garcia-Martinez says. "Then Zuck gets up and says, 'Oh, by the way, Facebook content couldn't possibly influence the election.' It's contradictory on the face of it."
-
-
-
Mike Caulfield says Facebook's feed algorithms are far from its only problem. The entire site design encourages sharing of items that users haven't inspected beyond reading the headline.
-
-
www.vox.com
-
Facebook hasn’t told the public very much about how its algorithm works. But we know that one of the company’s top priorities for the news feed is “engagement.” The company tries to choose posts that people are likely to read, like, and share with their friends. Which, they hope, will induce people to return to the site over and over again.
This would be a reasonable way to do things if Facebook were just a way of finding your friends’ cutest baby pictures. But it’s more troubling as a way of choosing the news stories people read. Essentially, Facebook is using the same criteria as a supermarket tabloid: giving people the most attention-grabbing headlines without worrying about whether articles are fair, accurate, or important.
-
- Oct 2016
-
www.theguardian.com
-
“Among millennials, especially,” [Ross] Douthat argues, “there’s a growing constituency for whom rightwing ideas are so alien or triggering, leftwing orthodoxy so pervasive and unquestioned, that supporting a candidate like Hillary Clinton looks like a needless form of compromise.”
...
“I don’t see sufficient evidence to buy the argument about siloing and confirmation bias,” Jeff Jarvis, a professor at the City University of New York’s graduate school of journalism, said. “That is a presumption about the platforms – because we in media think we do this better. More important, such presumptions fundamentally insult young people. For too long, old media has assumed that young people don’t care about the world.”
“Newspapers, remember, came from the perspective of very few people: one editor, really,” Jarvis said. “Facebook comes with many perspectives and gives many; as Zuckerberg points out, no two people on Earth see the same Facebook.”
-
- Jun 2016
-
www.lewrockwell.com
-
Automated posts from social media accounts pretending to be real individuals are being used to influence public opinion. (The Chinese government uses regular employees to post "real" messages at strategic times.)
-
- Apr 2016
-
blog.jonudell.net
-
Jon Udell on productive social discourse.
changeable minds
What’s something you believed deeply, for a long time, and then changed your mind about?
David Gray's Liminal Thinking points out that we all have beliefs that are built on hidden foundations. We need to carefully examine our own beliefs and their origins. And we need to avoid judgment as we consider the beliefs of others and their origins.
Wael Ghonim asks us to design social media that encourages civility, thoughtfulness, and open minds rather than self-promotion, click-bait, and echo chambers.
-