Mastodon Feed: Post

Boosted by cstanhope@social.coop ("Your weary 'net denizen"):
elilla@transmom.love ("elilla&: back to Néojaponisme: 改") wrote:

@LilaHexe ok so I've got covid and I don't have the energy to elaborate on this point right now, but I'm getting convinced that in addition to the problem I was describing (the irreversible pollution of humanity's knowledge commons), and in addition to the *massive* environmental damage, the plagiarism, the labour issues, the concentration of wealth, and other well-discussed problems, there's one insidious kind of damage from LLMs that is still underestimated.

I will make without argument the following claims:

claim 1: every regular LLM user is undergoing "AI psychosis". every single one of them, no exceptions.

the Cloudflare person who wrote a self-congratulatory blog post about their "Matrix implementation" that turned out to be nothing but placeholder comments is one step along a continuum with the people whom the "AI" convinced that they're Machine Jesus. the difference is one of degree, not of kind.

claim 2: this happens because LLMs have accidentally tapped into some poorly understood weakness of human psychology, related to the social and iterative construction of reality.

claim 3: this LLM exploit is an algorithmic implementation of the feedback loop between a cult leader and their followers, with the chatbot performing the "follower" role.

claim 4: postindustrial capitalist societies are hyper-individualistic, which makes human beings miserable. LLM chatbots deliberately exploit this by offering an artificial substitute for having friends. generating code was not enough; they made the bots *feel* like they're *talking to you*, they pretend a chatbot is *someone*. this is a predatory business practice that reinforces the loneliness epidemic rather than solving it.

n.b. while the reality-formation exploit is accidental, the imaginary-friend exploit is by design.

corollary #1: every "legitimate" use of an LLM would be better done by talking to another human being instead (for example, a human coding tutor or a trainee dev rather than Claude Code). by "better" I mean: higher quality, more reliably, with costs that are prosocial, while making everybody happier. what LLMs offer instead is: faster, in larger quantities, with more convenience, while atrophying empathy.

corollary #2: capitalism had already created an artificial scarcity of friends, making working communally artificially hard. LLMs have made this much worse, in the same way that an abundance of cheap fast food makes it harder for impoverished folks to reach nutritional self-sufficiency.

corollary #3: the combination of claim 4 (we live in individualist loneliness hell) and claim 3 (LLMs are something like a pocket cult follower) will have absolutely devastating sociological effects.