- Feb 2024
-
pluralistic.net
-
Broderick makes a more important point: AI search is about summarizing web results so you don't have to click links and read the pages yourself. If that's the future of the web, who the fuck is going to write those pages that the summarizer summarizes? What is the incentive, the business-model, the rational explanation for predicting a world in which millions of us go on writing web-pages, when the gatekeepers to the web have promised to rig the game so that no one will ever visit those pages, or read what we've written there, or even know it was us who wrote the underlying material the summarizer just summarized? If we stop writing the web, AIs will have to summarize each other, forming an inhuman centipede of botshit-ingestion. This is bad news, because there's pretty solid mathematical evidence that training a bot on botshit makes it absolutely useless. Or, as the authors of the paper – including the eminent cryptographer Ross Anderson – put it, "using model-generated content in training causes irreversible defects"
Broderick: https://www.garbageday.email/p/ai-search-doomsday-cult, Anderson: https://arxiv.org/abs/2305.17493
AI search hides the authors of the material it presents: summarising abstracts the authors away. It doesn't bring readers to those authors, it just presents a summary to the searcher as the end result. Take it or leave it. At the same time, if you search for something you know about, you see those summaries are always somewhat off. Which leaves you guessing how off they are when you search for something you don't know about. Search should never be the endpoint, always a starting point. I think that is my main aversion to AI search tools. Despite those clamoring 'it will get better over time' I don't think it easily will, because neither the tool nor its makers necessarily have any interest in the quality of the output, and they definitely can't assess it. So what's next, humans fact-checking AI output? Why not prevent the bs at its source? Nice ref to Maggie Appleton's centipede metaphor in [[The Expanding Dark Forest and Generative AI]]
-
- May 2023
-
maggieappleton.com
-
These are machine-learning models that can generate content that before this point in history, only humans could make. This includes text, images, videos, and audio.
Appleton posits that the waves of generative AI output will expand the dark forest enormously, in the sense that as a human voice online you will feel all alone in an otherwise automated sea of content.
-
However, even personal websites and newsletters can sometimes be too public, so we retreat further into gatekept private chat apps like Slack, Discord, and WhatsApp. These apps allow us to spend most of our time in real human relationships and express our ideas, with things we say taken in good faith and opportunities for real discussions. The problem is that none of this is indexed or searchable, and we’re hiding collective knowledge in private databases that we don’t own. Good luck searching on Discord!
Appleton sketches a layering of the dark forest web (mainly silos), the cozy web (personal sites and newsletters, public but intentionally with less reach), and private chat groups, where you are in pseudo-closed or closed groups. These are not searchable, so any knowledge gained or expressed there is inaccessible to the wider community. Another issue, I think, is that these closed groups only feel private, but in fact are not. The examples mentioned, Slack, Discord and WhatsApp, are definitely not private. The landlord is watching over your shoulder and gathering data as much as the silos up in the dark forest.
-
The overwhelming flood of this low-quality content makes us retreat away from public spaces of the web. It's too costly to spend our time and energy wading through it.
Strickler compares this to the black zones described in [[Three Body Problem _ Dark Forest by Cixin Liu]]: withdrawing into something smaller that is safe, but also permanently excluding yourself from the greater whole. Liu describes worlds that lower the speed of light around themselves on purpose so that they can no longer escape their own system, which makes others leave them alone, because they can't approach them either.
-
This is a theory proposed by Yancey Striker in 2019 in the article The Dark Forest Theory of the Internet. Yancey describes some trends and shifts around what it feels like to be in the public spaces of the web.
Hardly a 'theory', rather a metaphor re-applied to the experience of online interaction. (Strickler, not Striker.)
The internet feels lifeless: ads, trolling factories, SEO optimisation, crypto scams, all automated, no human voices. The internet unleashes predators: aggressive behaviour at scale if you do show yourself to be human. This is the equivalent of the dark forest.
Yancey Strickler https://onezero.medium.com/the-dark-forest-theory-of-the-internet-7dc3e68a7cb1 https://onezero.medium.com/beyond-the-dark-forest-a905e2dd8ae0 https://www.ystrickler.com/
-
the dark forest theory of the universe
A specific proposed solution to the [[Fermi Paradox 20201123150738]]: where is everybody? A dark forest is full of life, but if you walk through it it seems empty. The universe seems empty of intelligent life to us as well, because life forms know that if you let yourself be heard or seen you'll be attacked by predators. A leading theme in [[Three Body Problem _ Dark Forest by Cixin Liu]]
-
https://web.archive.org/web/20230503150426/https://maggieappleton.com/forest-talk
Maggie Appleton on the impact of generative AI on the internet, with a focus on it being a place for humans and human connection. Take out some of the concepts as shorthand; some of the examples mentioned are new to me --> add to lists, sketch out the line of argumentation and the arguments. The talk is an updated version of the earlier essay https://maggieappleton.com/ai-dark-forest, which I probably want to go through next for additional details.
-