53 Matching Annotations
  1. Aug 2023
    1. Democracy and Education was written before the assembly line had achieved its dominant position in the industrial world and before mechanization had depopulated the farms of America.

      Interesting history and possible solutions.

      Dewey on the humanization of work, front-running the dramatic changes of and in work in an industrial age?


      Note here the potential coupling of democracy and education as dovetailing ideas rather than as separate ideas which can be used simultaneously. We should take care not to saddle society and culture with baggage the way scholasticism did by combining education and religion from the Middle Ages onward.

      • for: tilting elections, voting - social media, voting - search engine bias, SEME, search engine manipulation effect, Robert Epstein
      • summary
        • research showing how search engines can bias results toward a particular political candidate and thereby tilt an election in favor of one party.
    1. In our early experiments, reported by The Washington Post in March 2013, we discovered that Google’s search engine had the power to shift the percentage of undecided voters supporting a political candidate by a substantial margin without anyone knowing.
      • for: search engine manipulation effect, SEME, voting, voting - bias, voting - manipulation, voting - search engine bias, democracy - search engine bias, quote, quote - Robert Epstein, quote - search engine bias, stats, stats - tilting elections
      • paraphrase
      • quote
        • In our early experiments, reported by The Washington Post in March 2013,
        • we discovered that Google’s search engine had the power to shift the percentage of undecided voters supporting a political candidate by a substantial margin without anyone knowing.
        • 2015 PNAS research on SEME
          • http://www.pnas.org/content/112/33/E4512.full.pdf?with-ds=yes&ref=hackernoon.com
          • stats begin
          • search results favoring one candidate
          • could easily shift the opinions and voting preferences of real voters in real elections by up to 80 percent in some demographic groups
          • with virtually no one knowing they had been manipulated.
          • stats end
          • Worse still, the few people who had noticed that we were showing them biased search results
          • generally shifted even farther in the direction of the bias,
          • so being able to spot favoritism in search results is no protection against it.
          • stats begin
          • Google’s search engine 
            • with or without any deliberate planning by Google employees 
          • was currently determining the outcomes of upwards of 25 percent of the world’s national elections.
          • This is because Google’s search engine lacks an equal-time rule,
            • so it virtually always favors one candidate over another, and that in turn shifts the preferences of undecided voters.
          • Because many elections are very close, shifting the preferences of undecided voters can easily tip the outcome.
          • stats end
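
      A toy back-of-the-envelope illustration of that last point, with invented numbers rather than figures from the research above: in a close race, nudging even a modest fraction of undecided voters is enough to flip the outcome.

      ```python
      # Toy model (illustrative numbers only, not Epstein's data): how shifting a
      # fraction of undecided voters can flip a close race.

      def election_margin(total_voters, undecided_share, base_split, shift):
          """Return candidate A's final vote margin.

          total_voters    -- number of voters who turn out
          undecided_share -- fraction of voters still undecided
          base_split      -- fraction of *decided* voters already backing A
          shift           -- extra fraction of undecided voters pushed toward A
                             (e.g. by biased rankings); 0.0 means a neutral 50/50 split
          """
          decided = total_voters * (1 - undecided_share)
          undecided = total_voters * undecided_share
          votes_a = decided * base_split + undecided * (0.5 + shift)
          votes_b = total_voters - votes_a
          return votes_a - votes_b

      # A race that is 49.5/50.5 among decided voters, with 10% undecided:
      print(election_margin(1_000_000, 0.10, 0.495, 0.00))  # A loses by about 9,000
      print(election_margin(1_000_000, 0.10, 0.495, 0.10))  # a 10-point tilt of the
                                                            # undecided flips it to +11,000
      ```
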
    2. What if, early in the morning on Election Day in 2016, Mark Zuckerberg had used Facebook to broadcast “go-out-and-vote” reminders just to supporters of Hillary Clinton? Extrapolating from Facebook’s own published data, that might have given Mrs. Clinton a boost of 450,000 votes or more, with no one but Mr. Zuckerberg and a few cronies knowing about the manipulation.
      • for: Hillary Clinton could have won, voting, democracy, voting - social media, democracy - social media, election - social media, facebook - election, 2016 US elections, 2016 Trump election, 2016 US election, 2016 US election - different results, 2016 election - social media
      • interesting fact
        • If Facebook had sent a "Go out and vote" message on election day of the 2016 election, Clinton might have received a boost of 450,000 additional votes
          • and the outcome of the election might have been different
      • for: polycrisis, collapse, tweedledums, tweedledees, wicked problem, social mess, stuck, stuckness, complexity
      • title
        • Is This How Political Collapse Will Unfold?
      • author
        • Dave Pollard
      • date
        • Aug 3, 2023
      • comment
        • thought provoking
        • honest, diverse, open thinking
        • a good piece of writing to submit to SRG / Deep Humanity analysis for surfacing insights
        • adjacency
          • complexity
          • emptiness
          • stuckness
            • this word "stuckness" stuck out to me (no pun intended) today - so many intractable, stuck problems at all levels of society, because we oversimplify complexity to the point of harmful abstraction.
      • definition

        • Tweedledums

          • This is a Reactionary Caste that believes that salvation lies in a return to a non-existent nostalgic past, characterized by respect for
            • authority,
            • order,
            • hierarchy,
            • individual initiative, and
            • ‘traditional’ ways of doing things,
          • governed by a
            • strict,
            • lean,
            • paternalistic elite
          • that leaves as much as possible up to individual families guided by
            • established ‘family values’ and
            • by their interpretation of the will of their god.
        • Tweedledees

          • This is a PM (Professional-Managerial) Caste that believes that salvation lies in striving for an impossibly idealistic future characterized by
            • mutual care,
            • affluence, and
            • relative equality for all,
          • governed by a
            • kind,
            • thoughtful,
            • educated,
            • informed and
            • representative
          • elite that appreciates the role of public institutions and regulations, and is guided by principles of
            • humanism and
            • ‘fairness’.
      • references
        • Aurélien
      • source
        • led here by reading Dave Pollard's other article
  2. Dec 2022
    1. A lot has changed about our news media ecosystem since 2007. In the United States, it’s hard to overstate how the media is entangled with contemporary partisan politics and ideology. This means that information tends not to flow across partisan divides in coherent ways that enable debate.

      Our media and social media systems, along with the people who use them, have been structured such that debate is stifled because information doesn't flow coherently across the partisan divide.

  3. Oct 2022
    1. Mosca backs up his thesis with this assertion: It's the power of organization that enables the minority always to rule. There are organized minorities and they run things and men. There are unorganized majorities and they are run.

      In a democracy, is it not just rule by the majority, but rule by the most organized that ends up dominating society?

      Perhaps C. Wright Mills' work on the elite has some answers?

      The Republican Party's use of organization to create gerrymandered districts is a clear example of using extreme organization to create minority rule. Cross reference: Slay the Dragon, which lays this issue out, including the use of a tiny amount of money to carefully gerrymander maps for outsized influence, plus top-down outlines to imprint broad ideas from a central location onto smaller individual constituencies (state and local).

  4. May 2022
  5. Mar 2022
    1. The current mass media such as television, books, and magazines are one-directional, and are produced by a centralized process. This can be positive, since respected editors can filter material to ensure consistency and high quality, but more widely accessible narrowcasting to specific audiences could enable livelier decentralized discussions. Democratic processes for presenting opposing views, caucusing within factions, and finding satisfactory compromises are productive for legislative, commercial, and scholarly pursuits.

      Social media has to some extent democratized access to media; however, there are not nearly enough processes for creating the negative feedback needed to dampen ideas which shouldn't or wouldn't have gained footholds in a mass society.

      We need more friction in some portions of the social media space to keep un-useful, negative, and destructive ideas from swamping out the positive ones. The accelerative force of algorithmic feeds for the most extreme ideas in particular is one of the most caustic developments of the last quarter century.

  6. Feb 2022
    1. Also, we shouldn't underestimate the advantages of writing. In oral presentations, we easily get away with unfounded claims. We can distract from argumentative gaps with confident gestures or drop a casual "you know what I mean" irrespective of whether we know what we meant. In writing, these manoeuvres are a little too obvious. It is easy to check a statement like: "But that is what I said!" The most important advantage of writing is that it helps us to confront ourselves when we do not understand something as well as we would like to believe.

      In modern literate contexts, it is easier to establish doubletalk in oral settings than in written ones, as writing is more easily reviewed for clarity and concreteness. Verbal tics like "you know what I mean", "it's easy to see/show", and other hand-waving moves that indicate gaps in thinking and argument are far easier to identify in writing than in speech, where social pressure may cause the audience to agree without actually following the thread of the argument. Writing certainly allows for timeshifting, but it also explicitly expands the time frame for grasping and understanding a full argument in a way not commonly seen in oral settings.

      Note that this may not be the case in primarily oral cultures which may take specific steps to mitigate these patterns.

      Link this to the anthropology example from Scott M. Lacy of the (Malian?) tribe that made group decisions by repeating a statement from the lowest to the highest and back again to ensure understanding and agreement.


      This difference in communication between oral and literate modes is one which leaders can take advantage of in leading their followers astray. An example is Donald Trump, who actively eschewed written communication, and even reading in general, in favor of oral and highly emotional speech. This generally freed him from the need to make coherent and useful arguments.

  7. Oct 2021
  8. Sep 2021
  9. Aug 2021
    1. Fukuyama's work, which draws on both competition analysis and an assessment of threats to democracy, joins a growing body of proposals that also includes Mike Masnick's "protocols not platforms," Cory Doctorow's "adversarial interoperability," my own "Magic APIs," and Twitter CEO Jack Dorsey's "algorithmic choice."

      Nice overview of current work on fixing monopoly in the social media space. I hadn't heard about Fukuyama's or Daphne Keller's versions before.

      I'm not sure I think Dorsey's is actually a thing. I suspect it is actually vaporware from the word go.

      IndieWeb has been working slowly at the problem as well.

  10. Mar 2021
    1. There's a reasonably good overview of some ideas about fixing the harms social media is doing to democracy here and it's well framed by history.

      Much of it appears to be a synopsis from the perspective of one who's only managed to attend Pariser and Stroud's recent Civic Signals/New_Public Festival.

      There could have been some touches of other research in the social space including those in the Activity Streams and IndieWeb spaces to provide some alternate viewpoints.

    2. Tang has sponsored the use of software called Polis, invented in Seattle. This is a platform that lets people make tweet-like, 140-character statements, and lets others vote on them. There is no “reply” function, and thus no trolling or personal attacks. As statements are made, the system identifies those that generate the most agreement among different groups. Instead of favoring outrageous or shocking views, the Polis algorithm highlights consensus. Polis is often used to produce recommendations for government action.

      An example of social media for proactive government action.
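
      A minimal sketch of the kind of consensus scoring the quote describes; this is not Polis's actual implementation, just an illustration of ranking statements by agreement across opinion groups rather than by raw engagement (the vote data and group labels below are invented).

      ```python
      # Illustrative sketch (not Polis's real code): rank short statements by how much
      # agreement they draw across pre-identified opinion groups.
      # Votes: +1 agree, -1 disagree, 0 pass/unseen.

      from collections import defaultdict

      def consensus_scores(votes, group_of):
          """votes: iterable of (participant, statement, vote); group_of: participant -> group.

          A statement's score is its minimum agreement rate across groups, so only
          statements that every group tends to agree with rise to the top.
          """
          agree = defaultdict(lambda: defaultdict(int))  # statement -> group -> agree count
          total = defaultdict(lambda: defaultdict(int))  # statement -> group -> vote count
          for participant, statement, vote in votes:
              group = group_of[participant]
              total[statement][group] += 1
              if vote == 1:
                  agree[statement][group] += 1
          scores = {}
          for statement, per_group in total.items():
              rates = [agree[statement][g] / n for g, n in per_group.items() if n > 0]
              scores[statement] = min(rates) if rates else 0.0
          return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

      votes = [("a", "s1", 1), ("b", "s1", 1), ("c", "s1", 1),   # s1: both camps agree
               ("a", "s2", 1), ("b", "s2", 1), ("c", "s2", -1)]  # s2: divisive
      groups = {"a": "left", "b": "left", "c": "right"}
      print(consensus_scores(votes, groups))  # s1 ranks above s2
      ```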

    1. In this respect, we join Fitzpatrick (2011) in exploring “the extent to which the means of media production and distribution are undergoing a process of radical democratization in the Web 2.0 era, and a desire to test the limits of that democratization”

      Something about this is reminiscent of WordPress' mission to democratize publishing. We can also compare it to Facebook, whose (stated) mission is to connect people, while its actual mission is to make money by seemingly radicalizing people to the extremes of our political spectrum.

      This highlights the fact that many may look at content moderation on platforms like Facebook, such as removing voices or deplatforming people like Donald J. Trump or Alex Jones, as an anti-democratic move. In fact it is not. Because of Facebook's active move to accelerate extreme ideas by pushing them algorithmically, the platform itself is being actively un-democratic. Democratic behavior on Facebook would look like one voice and one account, with reach only commensurate with that person's standing in real life. Instead, the algorithmic timeline gives far outsized influence and reach to some of the most extreme voices on the platform. This is patently un-democratic.

  11. Feb 2021
  12. Oct 2020
  13. Sep 2020
  14. Aug 2020
  15. May 2020
  16. Apr 2020
  17. Feb 2020
    1. socialists do not support capitalism, meaning they want workers to control the means of production

      Workers controlling the means of production sounds like co-operative industries. This paradigm is not antithetical to 'capitalism' in the sense that there is still private ownership of the means of production. I disagree with the statement that democratic socialists do not support capitalism.

      A good debate on this topic here - https://politics.stackexchange.com/questions/323/are-worker-cooperatives-socialist-capitalist-or-their-own-category

  18. Dec 2019
    1. Madison’s design has proved durable. But what would happen to American democracy if, one day in the early 21st century, a technology appeared that—over the course of a decade—changed several fundamental parameters of social and political life? What if this technology greatly increased the amount of “mutual animosity” and the speed at which outrage spread? Might we witness the political equivalent of buildings collapsing, birds falling from the sky, and the Earth moving closer to the sun?

      Jonathan Haidt, you might have noticed, is a scholar I admire very much. In this piece, he and his colleague Tobias Rose-Stockwell ask the following question: Is social media a threat to our democracy? Let's read the article and think about their question together.

  19. Oct 2019
  20. Nov 2018
    1. What is decisive is that they remain masters of the process - and develop a vision for the new machine age.

      It doesn't really look to me as though we were ever the "masters of the process." And that is also what Marx is about. I think.

  21. Oct 2017
  22. Sep 2017
  23. Jul 2017
    1. The backfire effect is getting turbocharged online. I think we’re getting more angry and convinced about everything, not because we’re surrounded by like-minded people, but by people who disagree with us. Social media allows you to find the worst examples of your opponents. It’s not a place to have your own views corroborated, but rather where your worst suspicions about the other lot can be quickly and easily confirmed.

  24. Apr 2017
    1. Obviously, in this situation whoever controls the algorithms has great power. Decisions like what is promoted to the top of a news feed can swing elections. Small changes in UI can drive big changes in user behavior. There are no democratic checks or controls on this power, and the people who exercise it are trying to pretend it doesn’t exist

    2. On Facebook, social dynamics and the algorithms’ taste for drama reinforce each other. Facebook selects from stories that your friends have shared to find the links you’re most likely to click on. This is a potent mix, because what you read and post on Facebook is not just an expression of your interests, but part of a performative group identity.

      So without explicitly coding for this behavior, we already have a dynamic where people are pulled to the extremes. Things get worse when third parties are allowed to use these algorithms to target a specific audience.

    3. any system trying to maximize engagement will try to push users towards the fringes. You can prove this to yourself by opening YouTube in an incognito browser (so that you start with a blank slate), and clicking recommended links on any video with political content.

      ...

      This pull to the fringes doesn’t happen if you click on a cute animal story. In that case, you just get more cute animals (an experiment I also recommend trying). But the algorithms have learned that users interested in politics respond more if they’re provoked more, so they provoke. Nobody programmed the behavior into the algorithm; it made a correct observation about human nature and acted on it.
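
      A minimal toy sketch of the dynamic described above, under the assumption of a simple epsilon-greedy recommender and made-up click-through rates (this is not YouTube's actual system): with no explicit instruction to radicalize anyone, an engagement-maximizing loop still drifts toward whichever category of content gets clicked the most.

      ```python
      # Toy epsilon-greedy recommender with hypothetical click-through rates.
      # Nothing here encodes "push people to the fringes"; the drift toward the
      # provocative category emerges purely from maximizing observed engagement.

      import random

      # Hypothetical average click-through rates the algorithm does NOT know in advance.
      TRUE_CTR = {"mainstream politics": 0.05, "provocative politics": 0.12, "cute animals": 0.08}

      def run(recommendations=5000, epsilon=0.1, seed=0):
          rng = random.Random(seed)
          clicks = {c: 0 for c in TRUE_CTR}
          shows = {c: 0 for c in TRUE_CTR}
          for _ in range(recommendations):
              if rng.random() < epsilon:   # explore occasionally
                  choice = rng.choice(list(TRUE_CTR))
              else:                        # otherwise exploit the best click rate seen so far
                  choice = max(TRUE_CTR, key=lambda c: clicks[c] / shows[c] if shows[c] else 1.0)
              shows[choice] += 1
              if rng.random() < TRUE_CTR[choice]:
                  clicks[choice] += 1
          return shows

      # The provocative category typically ends up recommended most often,
      # purely as a side effect of the engagement feedback loop.
      print(run())
      ```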

  25. Dec 2016
    1. http://digipo.io/doku.php<br> The Digital Polarization Initiative<br> "The primary purpose of this wiki is to provide a place for students to fact-check, annotate, and provide context to the different news stories that show up in their Twitter and Facebook feeds. It's like a student-driven Snopes, but with a broader focus: we don't aim to just investigate myths, but to provide context and sanity to all the news – from the article about voter fraud to the health piece on a new cancer treatment."

    1. The Web has become an insidious propaganda tool. To fight it, digital literacy education must rise beyond technical proficiency to include wisdom.

      • Double-check every claim before you share.
      • Be wary of casual scrolling.<br> Everything you see affects your attitudes.
      • Don't automatically disbelieve the surreal (or unpleasant).
      • Do not exaggerate your own claims.
      • Be prepared to repeat the truth over and over.
      • Curate good resources, and share updates to them.
        • It will reinforce the previous information.
        • It will boost search engine rankings of the collection.
    1. A survey of voters asked if they remembered seeing a headline, and if so, whether they believed it was true.

      It may come as no surprise that high percentages of Trump voters believed stories that favored Trump or demonized Clinton. But the percentage of Clinton voters who believed the fake stories was also fairly high!

      familiarity equals truth: when we recognize something as true, we are most often judging if this is something we’ve heard more often than not from people we trust.

      ...

      if you want to be well-informed it’s not enough to read the truth — you also must avoid reading lies.

    1. This is our internet. Not Google’s. Not Facebook’s. Not rightwing propagandists. And we’re the only ones who can reclaim it.

      This is our nation, and our world.<br> It is up to us to reclaim it.

  26. Nov 2016
    1. Interview with a man who has run several fake news sites since 2013.

      Well, this isn't just a Trump-supporter problem. This is a right-wing issue.

      ...

      We've tried to do similar things to liberals. It just has never worked, it never takes off. You'll get debunked within the first two comments and then the whole thing just kind of fizzles out.

    1. Journalism faces an 'existential crisis' in the Trump era, Christine Amanpour

      As all the international journalists we honor in this room tonight and every year know only too well: First the media is accused of inciting, then sympathizing, then associating -- until they suddenly find themselves accused of being full-fledged terrorists and subversives. Then they end up in handcuffs, in cages, in kangaroo courts, in prison

      ...

      First, like many people watching where I was overseas, I admit I was shocked by the exceptionally high bar put before one candidate and the exceptionally low bar put before the other candidate.

      It appeared much of the media got itself into knots trying to differentiate between balance, objectivity, neutrality, and crucially, truth.

      ...

      The winning candidate did a savvy end run around us and used it to go straight to the people. Combined with the most incredible development ever -- the tsunami of fake news sites -- aka lies -- that somehow people could not, would not, recognize, fact check, or disregard.

      ...

      The conservative radio host who may be the next white house press secretary says mainstream media is hostile to traditional values.

      I would say it's just the opposite. And have you read about the "heil, victory" meeting in Washington, DC this past weekend? Why aren't there more stories about the dangerous rise of the far right here and in Europe? Since when did anti-Semitism stop being a litmus test in this country?

    1. Paul Horner publishes fake news that is often shared widely. He claims that his stories are intended to be taken as satire like The Onion.

      Honestly, people are definitely dumber. They just keep passing stuff around. Nobody fact-checks anything anymore — I mean, that’s how Trump got elected. He just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn’t care because they’d already accepted it. It’s real scary. I’ve never seen anything like it.

      My sites were picked up by Trump supporters all the time. I think Trump is in the White House because of me. His followers don’t fact-check anything — they’ll post everything, believe anything. His campaign manager posted my story about a protester getting paid $3,500 as fact. Like, I made that up. I posted a fake ad on Craigslist.

    1. But as managing editor of the fact-checking site Snopes, Brooke Binkowski believes Facebook’s perpetuation of phony news is not to blame for our epidemic of misinformation. “It’s not social media that’s the problem,” she says emphatically. “People are looking for somebody to pick on. The alt-rights have been empowered and that’s not going to go away anytime soon. But they also have always been around.”

      The misinformation crisis, according to Binkowski, stems from something more pernicious. In the past, the sources of accurate information were recognizable enough that phony news was relatively easy for a discerning reader to identify and discredit. The problem, Binkowski believes, is that the public has lost faith in the media broadly — therefore no media outlet is considered credible any longer. The reasons are familiar: as the business of news has grown tougher, many outlets have been stripped of the resources they need for journalists to do their jobs correctly.

      The problem is not JUST social media and fake news. But most of the false stories do not come from mainstream media. The greatest evils of mainstream media are sensationalism, and being too willing to spin stories the way their sources want them to.

    1. But a former employee, Antonio Garcia-Martinez, disagrees and says his old boss is being "more than a little disingenuous here."

      ...

      "There's an entire political team and a massive office in D.C. that tries to convince political advertisers that Facebook can convince users to vote one way or the other," Garcia-Martinez says. "Then Zuck gets up and says, 'Oh, by the way, Facebook content couldn't possibly influence the election.' It's contradictory on the face of it."

    1. Mike Caulfield says Facebook's feed algorithms are far from its only problem. The entire site design encourages sharing of items that users haven't inspected beyond reading the headline.

    1. Facebook hasn’t told the public very much about how its algorithm works. But we know that one of the company’s top priorities for the news feed is “engagement.” The company tries to choose posts that people are likely to read, like, and share with their friends. Which, they hope, will induce people to return to the site over and over again.

      This would be a reasonable way to do things if Facebook were just a way of finding your friends’ cutest baby pictures. But it’s more troubling as a way of choosing the news stories people read. Essentially, Facebook is using the same criteria as a supermarket tabloid: giving people the most attention-grabbing headlines without worrying about whether articles are fair, accurate, or important.
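
      A compact sketch of the ranking criterion described above, with invented stories and scores (not Facebook's actual code): when the objective is predicted engagement alone, accuracy and importance never enter the ordering, which is exactly the tabloid criterion; adding even a crude quality signal changes the ranking.

      ```python
      # Engagement-only ranking versus a blended ranking (all numbers invented).

      stories = [
          {"headline": "Shocking claim about candidate X!", "p_click": 0.30, "accuracy": 0.2},
          {"headline": "Budget committee report",           "p_click": 0.04, "accuracy": 0.9},
      ]

      def engagement_rank(items):
          # Engagement-only objective: accuracy never enters the score.
          return sorted(items, key=lambda s: s["p_click"], reverse=True)

      def blended_rank(items, quality_weight=0.5):
          # One possible correction: blend predicted engagement with an accuracy signal.
          return sorted(items, key=lambda s: (1 - quality_weight) * s["p_click"]
                                             + quality_weight * s["accuracy"], reverse=True)

      print([s["headline"] for s in engagement_rank(stories)])  # tabloid-style ordering
      print([s["headline"] for s in blended_rank(stories)])     # the accurate story rises
      ```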

  27. Oct 2016
    1. “Among millennials, especially,” [Ross] Douthat argues, “there’s a growing constituency for whom rightwing ideas are so alien or triggering, leftwing orthodoxy so pervasive and unquestioned, that supporting a candidate like Hillary Clinton looks like a needless form of compromise.”

      ...

      “I don’t see sufficient evidence to buy the argument about siloing and confirmation bias,” Jeff Jarvis, a professor at the City University of New York’s graduate school of journalism, said. “That is a presumption about the platforms – because we in media think we do this better. More important, such presumptions fundamentally insult young people. For too long, old media has assumed that young people don’t care about the world.”

      “Newspapers, remember, came from the perspective of very few people: one editor, really,” Jarvis said. “Facebook comes with many perspectives and gives many; as Zuckerberg points out, no two people on Earth see the same Facebook.”

  28. Jun 2016
    1. Automated posts from social media accounts pretending to be real individuals are being used to influence public opinion. (The Chinese government uses regular employees to post "real" messages at strategic times.)

  29. May 2016
  30. Apr 2016
    1. Jon Udell on productive social discourse.

      changeable minds<br> What’s something you believed deeply, for a long time, and then changed your mind about?

      David Gray's Liminal Thinking points out that we all have beliefs that are built on hidden foundations. We need to carefully examine our own beliefs and their origins. And we need to avoid judgment as we consider the beliefs of others and their origins.

      Wael Ghonim asks us to design social media that encourages civility, thoughtfulness, and open minds rather than self-promotion, click-bait, and echo chambers.

  31. Feb 2016
    1. At some dark day in the future, when considered versus the Google Caliphate, the NSA may even come to be seen by some as the “public option.” “At least it is accountable in principle to some parliamentary limits,” they will say, “rather than merely stockholder avarice and flimsy user agreements.”

      In the last few years I've come to understand that my tolerance for most forms of surveillance should be considered in terms of my confidence in the judiciary.