938 Matching Annotations
  1. Jun 2020
    1. Altay, S., de Araujo, E., & Mercier, H. (2020, June 4). “If this account is true, it is most enormously wonderful”: Interestingness-if-true and the sharing of true and false news.

  2. May 2020
  3. Apr 2020
    1. “It seems kind of intuitively obvious that if you put something—whether it’s a scarf or a mask—in front of your nose and mouth, that will filter out some of these viruses that are floating around out there,” says Dr. William Schaffner, professor of medicine in the division of infectious diseases at Vanderbilt University. The only problem: that’s not effective against respiratory illnesses like the flu and COVID-19. If it were, “the CDC would have recommended it years ago,” he says. “It doesn’t, because it makes science-based recommendations.”
  4. Mar 2020
    1. Right now, Facebook is tackling “misinformation that has imminent risk of danger, telling people if they have certain symptoms, don’t bother going getting treated … things like ‘you can cure this by drinking bleach.’ I mean, that’s just in a different class.”
    2. Many of the most alarmist claims about misinformation are themselves misleading.
  5. Apr 2019
    1. They had been told that the insulin that their son was supposed to be taking for his Type 1 diabetes was poison, Solis testified. Timothy Morrow, a herbalist based in Torrance, Calif., had instead told them to rub lavender oil on the boy’s spine and prescribed herbal medicine that he claimed would cure Lopez for life.
  6. Sep 2018
    1. How can we get back to that common ground? We need new mechanisms—suited to the digital age—that allow for a shared understanding of facts and that focus our collective attention on the most important problems.
    2. Deluged by apparent facts, arguments and counterarguments, our brains resort to the most obvious filter, the easiest cognitive shortcut for a social animal: We look to our peers, see what they believe and cheer along. As a result, open and participatory speech has turned into its opposite. Important voices are silenced by mobs of trolls using open platforms to hurl abuse and threats. Bogus news shared from one friend or follower to the next becomes received wisdom. Crucial pieces of information drown in so much irrelevance that they are lost. If books were burned in the street, we would be alarmed. Now, we are simply exhausted.
    3. For the longest time, we thought that as speech became more democratized, democracy itself would flourish. As more and more people could broadcast their words and opinions, there would be an ever-fiercer battle of ideas—with truth emerging as the winner, stronger from the fight. But in 2018, it is increasingly clear that more speech can in fact threaten democracy. The glut of information we now face, made possible by digital tools and social media platforms, can bury what is true, greatly elevate and amplify misinformation and distract from what is important.
    4. But in the digital age, when speech can exist mostly unfettered, the big threat to truth looks very different. It’s not just censorship, but an avalanche of undistinguished speech—some true, some false, some fake, some important, some trivial, much of it out-of-context, all burying us.
  7. Aug 2018
    1. The first of the two maps in the GIF image below shows the US political spectrum on the eve of the 2016 election. The second map highlights the followers of a 30-something American woman called Jenna Abrams, a following gained with her viral tweets about slavery, segregation, Donald Trump, and Kim Kardashian. Her far-right views endeared her to conservatives, and her entertaining shock tactics won her attention from several mainstream media outlets and got her into public spats with prominent people on Twitter, including a former US ambassador to Russia. Her following in the right-wing Twittersphere enabled her to influence the broader political conversation. In reality, she was one of many fake personas created by the infamous St. Petersburg troll farm known as the Internet Research Agency.
    2. Instead of trying to force their messages into the mainstream, these adversaries target polarized communities and “embed” fake accounts within them. The false personas engage with real people in those communities to build credibility. Once their influence has been established, they can introduce new viewpoints and amplify divisive and inflammatory narratives that are already circulating. It’s the digital equivalent of moving to an isolated and tight-knit community, using its own language quirks and catering to its obsessions, running for mayor, and then using that position to influence national politics.
    3. However, as the following diagrams will show, the middle is a lot weaker than it looks, and this makes public discourse vulnerable both to extremists at home and to manipulation by outside actors such as Russia.
  8. Jul 2018
    1. "The internet has become the main threat — a sphere that isn't controlled by the Kremlin," said Pavel Chikov, a member of Russia's presidential human rights council. "That's why they're going after it. Its very existence as we know it is being undermined by these measures."
    2. Gatov, who is the former head of Russia's state newswire's media analytics laboratory, told BuzzFeed the documents were part of long-term Kremlin plans to swamp the internet with comments. "Armies of bots were ready to participate in media wars, and the question was only how to think their work through," he said. "Someone sold the thought that Western media, which specifically have to align their interests with their audience, won't be able to ignore saturated pro-Russian campaigns and will have to change the tone of their Russia coverage to placate their angry readers."
    3. "There's no paradox here. It's two sides of the same coin," Igor Ashmanov, a Russian internet entrepreneur known for his pro-government views, told BuzzFeed. "The Kremlin is weeding out the informational field and sowing it with cultured plants. You can see what will happen if they don't clear it out from the gruesome example of Ukraine."
    4. The trolls appear to have taken pains to learn the sites' different commenting systems. A report on initial efforts to post comments discusses the types of profanity and abuse that are allowed on some sites, but not others. "Direct offense of Americans as a race are not published ('Your nation is a nation of complete idiots')," the author wrote of fringe conspiracy site WorldNetDaily, "nor are vulgar reactions to the political work of Barack Obama ('Obama did shit his pants while talking about foreign affairs, how you can feel yourself psychologically comfortable with pants full of shit?')." Another suggested creating "up to 100" fake accounts on the Huffington Post to master the site's complicated commenting system.
    5. According to the documents, which are attached to several hundred emails sent to the project's leader, Igor Osadchy, the effort was launched in April and is led by a firm called the Internet Research Agency. It's based in a Saint Petersburg suburb, and the documents say it employs hundreds of people across Russia who promote Putin in comments on Russian blogs.
    6. The documents show instructions provided to the commenters that detail the workload expected of them. On an average working day, the Russians are to post on news articles 50 times. Each blogger is to maintain six Facebook accounts publishing at least three posts a day and discussing the news in groups at least twice a day. By the end of the first month, they are expected to have won 500 subscribers and get at least five posts on each item a day. On Twitter, the bloggers are expected to manage 10 accounts with up to 2,000 followers and tweet 50 times a day.
    7. Russia's campaign to shape international opinion around its invasion of Ukraine has extended to recruiting and training a new cadre of online trolls that have been deployed to spread the Kremlin's message on the comments section of top American websites. Plans attached to emails leaked by a mysterious Russian hacker collective show IT managers reporting on a new ideological front against the West in the comments sections of Fox News, Huffington Post, The Blaze, Politico, and WorldNetDaily. The bizarre hive of social media activity appears to be part of a two-pronged Kremlin campaign to claim control over the internet, launching a million-dollar army of trolls to mold American public opinion as it cracks down on internet freedom at home.
    1. creating a new international news operation called Sputnik to “provide an alternative viewpoint on world events.” More and more, though, the Kremlin is manipulating the information sphere in more insidious ways.
    1. The New Yorker’s Sasha Frere-Jones called Twitter a “self-cleaning oven,” suggesting that false information could be flagged and self-corrected almost immediately. We no longer had to wait 24 hours for a newspaper to issue a correction.
    1. We’ve built an information ecosystem where information can fly through social networks (both technical and personal). Folks keep looking to the architects of technical networks to solve the problem. I’m confident that these companies can do a lot to curb some of the groups who have capitalized on what’s happening to seek financial gain. But the battles over ideology and attention are going to be far trickier. What’s at stake isn’t “fake news.” What’s at stake is the increasing capacity of those committed to a form of isolationist and hate-driven tribalism that has been around for a very long time. They have evolved with the information landscape, becoming sophisticated in leveraging whatever tools are available to achieve power, status, and attention. And those seeking a progressive and inclusive agenda, those seeking to combat tribalism to form a more perfect union —  they haven’t kept up.
    1. Dissemination Mechanisms: Finally, we need to think about how this content is being disseminated. Some of it is being shared unwittingly by people on social media, clicking retweet without checking. Some of it is being amplified by journalists who are now under more pressure than ever to try and make sense and accurately report information emerging on the social web in real time. Some of it is being pushed out by loosely connected groups who are deliberately attempting to influence public opinion, and some of it is being disseminated as part of sophisticated disinformation campaigns, through bot networks and troll factories.
    2. When messaging is coordinated and consistent, it easily fools our brains, already exhausted and increasingly reliant on heuristics (simple psychological shortcuts) due to the overwhelming amount of information flashing before our eyes every day. When we see multiple messages about the same topic, our brains use that as a short-cut to credibility. It must be true we say — I’ve seen that same claim several times today.
    3. I saw Eliot Higgins present in Paris in early January, and he listed four ‘Ps’ which helped explain the different motivations. I’ve been thinking about these a great deal and, using Eliot’s original list, have identified four additional motivations for the creation of this type of content: Poor Journalism, Parody, to Provoke or ‘Punk’, Passion, Partisanship, Profit, Political Influence or Power, and Propaganda. This is a work in progress but once you start breaking these categories down and mapping them against one another you begin to see distinct patterns in terms of the types of content created for specific purposes.
    4. Back in November, I wrote about the different types of problematic information I saw circulate during the US election. Since then, I’ve been trying to refine a typology (and thank you to Global Voices for helping me to develop my definitions even further). I would argue there are seven distinct types of problematic content that sit within our information ecosystem. They sit on a scale, one that loosely measures the intent to deceive.
    5. By now we’ve all agreed the term “fake news” is unhelpful, but without an alternative, we’re left awkwardly using air quotes whenever we utter the phrase. The reason we’re struggling with a replacement is because this is about more than news, it’s about the entire information ecosystem. And the term fake doesn’t begin to describe the complexity of the different types of misinformation (the inadvertent sharing of false information) and disinformation (the deliberate creation and sharing of information known to be false).
  9. Nov 2017
  10. Oct 2017
    1. Anti-vaccination groups, for example, have relied on viral videos to sell the panic of vaccination side-effects

      Unfortunately, this is very true, and we can say the same about fake news. Such practices can undermine the validity of the overall data: the Twitter data is not gathered through systematic investigation or systematic collection methods, and it relies heavily on “public opinion.” That said, if one wants to gauge general public sentiment or opinion, this is a great way to do it.

  11. Sep 2017
  12. Jun 2016
    1. which can be very sophisticated

      Worth noting that they are becoming even more sophisticated, greatly surpassing non-sophisticated voters.

  13. Dec 2015
    1. RAJ: Good morning, Paul. I am glad to hear from you this morning. I know yesterday was a rugged day for you, as it also was for Susan.¹ PAUL: I do not understand why it was necessary. However, I do not want to dwell on that level or in those feelings.

      Answer to the question of what Raj is up to when he tells Paul that Maitreya (Christ) will make himself known on March 14 (yesterday in the timeframe of this chapter).

      The footnote explains that Raj was making a point with Paul not to look outside himself for answers. They were expecting a life-changing announcement by a being revealing himself as the return of Christ. That's huge, and Raj was playing them: telling them outright this was going to happen as proposed by Benjamin Creme.

      This could feel like total and deliberate deceit on the part of Raj. The teacher must really know what he is doing...

      The lesson: don't seek answers outside of your Self.

  14. Oct 2015
    1. It is true that there is infinite progression—infinite, universal progression. It is true that there is more to the infinite progression of Being than you can imagine at the present time. But, it is also true that there is an abundance of misinformation available on your planet regarding these subjects, which apparently have come through Masters, but which were coming through individualities who were caught in great mental complexities. They had not truly grown to the point of being the open Door. They were communicating their own theories and concepts. Tonight I let you have a taste of such theories and concepts.