270 Matching Annotations
  1. Feb 2024
    1. we 00:11:13 have a media that needs to survive based on clicks and controversy and serving the most engaged people

      for - quote - roots of misinformation, quote - roots of fake news, key insight - roots of misinformation

      key insight - roots of misinformation - (see below)

      quote - roots of misinformation - "We have a media that needs to survive based on clicks and controversy and serving the most engaged people, so they both-sides the issues. They lift up facts and lies as equivalent in order to claim no bias, but that in itself is a bias, because it gives more oxygen to the lies and the disinformation that is really dangerous to our society. We are living through the impacts of those errors and that malpractice done by media in America."

    2. for - misinformation - media misinformation

  2. Apr 2023
    1. Now, I've made a number of documentaries about fake news. And what interests me is the first person to use the phrase mainstream media was Joseph Goebbels. And he, in one of his propaganda sheets, said “It's very important that you don't read the mainstream media because they'll tell you lies.” You must read the truth by the ramblings of his boss and his associated work. And you do have to watch this. This is a very, very well-established technique of fascists, is to tell you, don't read this stuff, read our stuff.<br /> —Ian Hislop, Editor, Private Eye Magazine 00:16:00, Satire in the Age of Murdoch and Trump, The Problem with Jon Stewart Podcast

  3. Mar 2023
    1. The sheer volume exceeds what book publication can accommodate; the complex, multidimensional structure of a networked information base cannot be reproduced in print; and finally, the dynamics of a constantly growing and constantly-to-be-corrected body of material do not fit the rigid rhythm of book production, in which every expanded and corrected new edition entails considerable effort. A book publication could only ever offer a snapshot of such a database, reduced to a particular perspective. That too can occasionally be very useful, but it does not solve the problem of publishing the material as a whole.

      link to https://hypothes.is/a/U95jEs0eEe20EUesAtKcuA

      Is this phenomenon of "complex narratives" related to misinformation spread within the larger and more complex social network/online network? At small, local scales, people know how to handle data and information which is locally contextualized for them. On larger internet-scale communication social platforms this sort of contextualization breaks down.

      For lack of a better word, let's temporarily refer to this as "complex narratives" to get a handle on it.

    1. Title: Fox News producer files explosive lawsuits against the network, alleging she was coerced into providing misleading Dominion testimony

      // - This is an example of how big media corporations can deceive the public and compromise the truth. - It helps create a nation of misinformed people, which destabilizes political governance. - The workplace sounds toxic. - The undertone of this story: the pathological transformation of media brought about by capitalism. - It is the need for ratings, the indicator of profit in the marketing world, that has corrupted the responsibility to report truthfully. - Making money becomes the consumerist dream at the expense of everything else of intrinsic value within a culture. - Knowledge is what enables culture to exist; modernity is based on cumulative cultural evolution. - This is an example of non-conscious, or pathological, cumulative cultural evolution.

  4. Feb 2023
    1. “It makes me feel like I need a disclaimer because I feel like it makes you seem unprofessional to have these weirdly spelled words in your captions,” she said, “especially for content that's supposed to be serious and medically inclined.”

      Where's the balance for professionalism with respect to dodging the algorithmic filters for serious health-related conversations online?

      link to: https://hypothes.is/a/uBq9HKqWEe22Jp_rjJ5tjQ

  5. Dec 2022
    1. Furthermore, our results add to the growing body of literature documenting—at least at this historical moment—the link between extreme right-wing ideology and misinformation8,14,24 (although, of course, factors other than ideology are also associated with misinformation sharing, such as polarization25 and inattention17,37).

      Misinformation exposure and extreme right-wing ideology appear associated in this report. Others find that it is partisanship that predicts susceptibility.

    2. We also find evidence of “falsehood echo chambers”, where users that are more often exposed to misinformation are more likely to follow a similar set of accounts and share from a similar set of domains. These results are interesting in the context of evidence that political echo chambers are not prevalent, as typically imagined

  6. Nov 2022
  7. Oct 2022
    1. Edgerly noted that disinformation spreads through two ways: The use of technology and human nature. Click-based advertising, news aggregation, the process of viral spreading and the ease of creating and altering websites are factors considered under technology. “Facebook and Google prioritize giving people what they ‘want’ to see; advertising revenue (are) based on clicks, not quality,” Edgerly said. She noted that people have the tendency to share news and website links without even reading its content, only its headline. According to her, this perpetuates a phenomenon of viral spreading or easy sharing. There is also the case of human nature involved, where people are “most likely to believe” information that supports their identities and viewpoints, Edgerly cited. “Vivid, emotional information grabs attention (and) leads to more responses (such as) likes, comments, shares. Negative information grabs more attention than (the) positive and is better remembered,” she said. Edgerly added that people tend to believe in information that they see on a regular basis and those shared by their immediate families and friends.

      Spreading misinformation and disinformation is easy in this day and age because of how accessible information is and how much of it there is on the web, as Edgerly explains precisely. As noted in this part of the article, there is a business in the spread of disinformation, particularly in our country. There are people who pay what we call online trolls to spread disinformation and capitalize on how “chronically online” Filipinos are, among many other factors (e.g., many Filipinos’ information illiteracy due to poverty and lack of educational attainment, and how easy it is to interact with content we see online, regardless of its authenticity). Disinformation also leads to misinformation through word of mouth. As Edgerly states in this article, “people tend to believe in information… shared by their immediate families and friends”; because it is human nature to trust information shared by loved ones, someone who is not information literate will not question newly received information. Lastly, it certainly does not help that social media algorithms rely on what users interact with: the more a user interacts with certain information, the more the platforms will feed them that information. Not all social media sites have fact-checkers, and users can freely spread disinformation if they choose to.

    1. Trolls, in this context, are humans who hold accounts on social media platforms, more or less for one purpose: To generate comments that argue with people, insult and name-call other users and public figures, try to undermine the credibility of ideas they don’t like, and to intimidate individuals who post those ideas. And they support and advocate for fake news stories that they’re ideologically aligned with. They’re often pretty nasty in their comments. And that gets other, normal users, to be nasty, too.

      Not only are programmed accounts created, but also troll accounts that propagate disinformation and spread fake news with the intent to wreak havoc. In short, once they start with a malicious comment, some people engage with it, which leads to more rage comments and disagreements. That is what they do: they provoke people into engaging with their comments so that the content spreads further and produces more fake news. These troll accounts are especially prominent during elections; in the Philippines, for example, some speculate that certain candidates have run troll farms just to spread fake news all over social media, which some people then engage with.

    2. So, bots are computer algorithms (set of logic steps to complete a specific task) that work in online social network sites to execute tasks autonomously and repetitively. They simulate the behavior of human beings in a social network, interacting with other users, and sharing information and messages [1]–[3]. Because of the algorithms behind bots’ logic, bots can learn from reaction patterns how to respond to certain situations. That is, they possess artificial intelligence (AI). 

      In all honesty, since I don't usually dwell on technology and coding, I thought that a "bot" was controlled by another user, a real person; I never knew it was programmed and created to learn people's usual posting patterns, be it on Twitter, Facebook, or other social media platforms. I think it is important to properly understand how bots work to avoid misinformation and disinformation, most importantly in this time of widespread social media use.

  8. Sep 2022
    1. https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

      Good overview article of some of the psychology research behind misinformation in social media spaces including bots, AI, and the effects of cognitive bias.

      Probably worth mining the story for the journal articles and collecting/reading them.

    2. “Limited individual attention and online virality of low-quality information,” by Xiaoyan Qiu et al., in Nature Human Behaviour, Vol. 1, June 2017

      The upshot of this paper seems to be "information overload alone can explain why fake news can become viral."

  9. Aug 2022
    1. Many U.S. educators believe that increasing political polarization combines with the hazards of misinformation and disinformation in ways that underscore the need for learners to acquire the knowledge and skills required to navigate a changing media landscape (Hamilton et al. 2020a)

  10. Apr 2022
    1. Katherine Ognyanova. (2022, February 15). Americans who believe COVID vaccine misinformation tend to be more vaccine-resistant. They are also more likely to distrust the government, media, science, and medicine. That pattern is reversed with regard to trust in Fox News and Donald Trump. Https://osf.io/9ua2x/ (5/7) https://t.co/f6jTRWhmdF [Tweet]. @Ognyanova. https://twitter.com/Ognyanova/status/1493596109926768645

    1. Mike Caulfield. (2021, March 10). One of the drivers of Twitter daily topics is that topics must be participatory to trend, which means one must be able to form a firm opinion on a given subject in the absence of previous knowledge. And, it turns out, this is a bit of a flaw. [Tweet]. @holden. https://twitter.com/holden/status/1369551099489779714

    1. Dr. Syra Madad. (2021, February 7). What we hear most often “talk to your health care provider if you have any questions/concerns on COVID19 vaccines” Vs Where many are actually turning to for COVID19 vaccine info ⬇️ This is also why it’s so important for the media to report responsibly based on science/evidence [Tweet]. @syramadad. https://twitter.com/syramadad/status/1358509900398272517

  11. Mar 2022
    1. Prof Peter Hotez MD PhD. (2021, December 30). When the antivaccine disinformation crowd declares twisted martyrdom when bumped from social media or condemned publicly: They contributed to the tragic and needless loss of 200,000 unvaccinated Americans since June who believed their antiscience gibberish. They’re the aggressors [Tweet]. @PeterHotez. https://twitter.com/PeterHotez/status/1476393357006065670

  12. Feb 2022
    1. Stephan Lewandowsky. (2022, January 15). This is an extremely important development. The main vector for misinformation are not fringe websites but “mainstream” politicians who inherit and adapt fringe material. So keeping track of their effect is crucial, and this is a very welcome first step by @_mohsen_m @DG_Rand 1/n [Tweet]. @STWorg. https://twitter.com/STWorg/status/1482265289022746628

  13. Jan 2022
  14. Dec 2021
    1. Timothy Caulfield. (2021, December 30). #RobertMalone suspended by #twitter today. Reaction: 1) Great news. He has been spreading harmful #misinformation. (He has NOT contributed to meaningful/constructive scientific debate. His views demonstrably wrong & polarizing.) 2) What took so long? #ScienceUpFirst [Tweet]. @CaulfieldTim. https://twitter.com/CaulfieldTim/status/1476346919890796545

    1. Health Nerd. (2021, December 13). Accusing everyone you disagree with of being a shill for pharmaceutical companies is a very simple way to tell anyone with even the slightest insight that you have absolutely no idea what you’re talking about and no desire to do simple things to educate yourself [Tweet]. @GidMK. https://twitter.com/GidMK/status/1470287869168152576

  15. Nov 2021
  16. Oct 2021