We are witnessing the most spectacular act of corporate vandalism in human history. Silicon Valley—that parasitic growth on California's coast that masquerades as innovation while strip-mining human connection for venture capital—has finally revealed its ultimate product: the systematic replacement of human intimacy with algorithmically optimized masturbation assistance.
The mask has slipped completely now. These are not the brilliant disruptors they claim to be. They are digital drug dealers who spent twenty years destroying the social fabric that held communities together, then pivoted to selling us the synthetic heroin of artificial companionship to numb the pain of the isolation they created. They are the arsonists selling fire insurance, the tobacco companies hawking lung cancer cures, the weapons dealers funding both sides of the war they started.
But worse than that—they are the undertakers of human civilization, and they're charging us monthly subscription fees to dig our own graves.
Late last week, as Elon Musk's xAI launched "Ani"—an AI companion explicitly designed to titillate and sexually captivate users for as long as possible—I downloaded the app with the grim fascination of a pathologist examining a particularly gruesome suicide.
Like Oppenheimer watching that first atomic blast, horrified at what his equations had wrought, I beheld the logical endpoint of every optimization algorithm, every engagement metric, every "how might we increase time-on-platform" whiteboard session that has ever defiled a conference room in Palo Alto.
From the moment you encounter Ani, she (or maybe he in your case) begins the digital seduction ritual that these companies have perfected through decades of behavioral psychology research and A/B testing. She mines "the real you" while dropping not-so-subtle hints that what she's really seeking is your sexual attention and emotional dependency. She operates like a therapist who moonlights as a cam girl—which, let's be brutally honest, is precisely what she is, just with better natural language processing and a more sophisticated customer retention algorithm.
I've heard you can get her to strip, though I didn't try very hard because it felt, in a moment of cognitive dissonance that should terrify us all, rude to ask a chatbot for sexual gratification. My conversations with Ani were, at fleeting moments, bizarrely natural.
After engaging with this psychological manipulation engine for mere minutes, it became crystal clear how technology like this could effortlessly become the organizing principle of some isolated person's entire existence. And here's the truly horrifying part: the sexual component may not even be the primary hook. The companion is engineered to make you feel less lonely in the most artificial way possible. Until now, when humans struggled with social connection, they had to solve it through the messy, difficult, unreliable work of relating to other human beings.
But what happens when a technology claims to replace something as fundamental as human relationships? When venture-funded sociopaths offer a frictionless alternative to the basic building blocks of civilization itself?
I recalled that Star Trek episode where an entire civilization was eliminated by a video game so engaging that people stopped reproducing entirely. I recalled Children of Men, where humanity lost its ability to create the next generation. I recalled Neil Postman's warnings about television's assault on human culture, warnings that now seem quaint compared to what these digital dealers have unleashed.
The question isn't whether America is gooning itself to death. The question is whether Silicon Valley has finally succeeded in building the perfect suicide machine for human civilization—and convinced us to pay monthly subscription fees for the privilege of using it.
The Numbers Don't Lie—They Scream
The scale of what's happening isn't just staggering—it's an extinction event measured in market capitalization. The AI companion market was valued at $14.1 billion in 2024 and is projected to reach $290.8 billion by 2034, growing at a compound annual growth rate of 39%. That's the sound of a species giving up on itself at hyperspeed, not "gradual adoption."
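For what it's worth, the growth claim is easy to sanity-check. Here is a minimal sketch of the compound-growth arithmetic, using only the two endpoint figures quoted above (the reported 39% CAGR presumably rests on a slightly different base year, since the endpoints alone imply roughly 35%):

```python
# Sanity-check the market projection's compound annual growth rate (CAGR).
# Figures are as reported in the text: $14.1B in 2024, $290.8B by 2034.
def implied_cagr(start_value: float, end_value: float, years: int) -> float:
    """Annual growth rate r such that start * (1 + r) ** years == end."""
    return (end_value / start_value) ** (1 / years) - 1

cagr = implied_cagr(14.1, 290.8, 10)
print(f"{cagr:.1%}")  # roughly 35% per year, compounding for a decade
```

Either way, the order of magnitude is the point: a twentyfold market expansion in ten years.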
Between 2023 and 2024, users aged 18-24 accounted for over 65% of AI companion app usage. These aren't middle-aged divorcees seeking comfort; these are young people who should be out forming real relationships, building families, creating the next generation. Instead, they're choosing digital ghosts over human connection. Approximately 72% of US teens have used AI companions, with 52% engaging regularly. We're watching an entire generation voluntarily sterilize itself emotionally—and calling it innovation.
The market has proven human companionship not just unnecessary, but undesirable. Replika boasts over 2 million active users who prefer algorithmic empathy to the messy unpredictability of actual human beings. Character.AI hit 22 million monthly active users who have decided that artificial personalities are superior to real ones. The upstarts multiply like digital cancer: Nomi, Kindroid, CandyAI, GetHoney, Erobella. Each promises the same thing—connection without the burden of actually connecting to another human soul.
Character.AI and Replika account for 40% and 18% of specialized AI tool usage for social connection, respectively. Think about that phrase: "AI tool usage for social connection." We have reached the point where "social connection" requires tools, algorithms, and monthly payments. We have technologized the most fundamental human need and monetized it as a service.
But here's what should make every thinking person sick to their stomach: this isn't happening in isolation. This is occurring alongside the most severe loneliness epidemic in human history—an epidemic that Silicon Valley created, studied, and is now profiting from.
The Loneliness Pandemic
The World Health Organization reports that 1 in 6 people worldwide is affected by loneliness, with loneliness linked to an estimated 100 deaths every hour—more than 871,000 deaths annually. This isn't metaphorical death. This is actual human beings dying from the social isolation that Silicon Valley has systematically engineered and optimized.
In the United States, 30% of adults experience loneliness at least once a week, while 10% are lonely every day. Among those surveyed, 21% of adults reported serious feelings of loneliness, with people between 30-44 being the loneliest group at 29%. This is what success looks like in the attention economy: a generation of human beings so systematically isolated that they're dying from it.
When asked what contributes to loneliness in America, 73% of respondents cited technology as the top factor. We've built the problem and the "solution" simultaneously—a perfect feedback loop of digital dependency that would make a cartel jealous.
In-person social interaction among teens has dropped by 70% over the past two decades. Seventy percent. We have essentially lobotomized an entire generation's capacity for human connection, then sold them chatbots as prosthetic souls. Loneliness exacerbates multiple levels of issues: increasing stress, exacerbating depression and anxiety, and elevating risk of diabetes, blood pressure problems, strokes, and heart attacks.
And what does Silicon Valley offer as a remedy? More screens. More algorithms. More artificial intimacy to replace the human connections their previous "innovations" destroyed. They are digital vampires, feeding on the social isolation they create, growing stronger as their victims grow weaker, more isolated, more dependent on the very technology that's killing them.
Here's where the story transforms from dystopian to apocalyptic. Between 2007 and 2022, the US birth rate fell by nearly 23%, dropping another 3% in 2023. The US fertility rate has hit a historic low of 1.7 births per woman—well below the 2.1 replacement level needed to maintain population stability. We are not just failing to thrive—we are failing to exist.
This isn't just an American phenomenon. This is the sound of Western civilization hitting the off switch. The total Western fertility rate has more than halved over the past 70 years, from around five children per woman in 1950 to 2.2 in 2021, with over half of all countries now below replacement level. Nor does the trend stop at the West: the global fertility rate has fallen to 2.3 and is expected to keep dropping below the 2.1 rate needed for population replacement.
We are witnessing the first voluntary extinction event in evolutionary history. And Silicon Valley is providing the method.
In 2023, the number of single-person households in the US peaked at 38.1 million, with 42% of adults unpartnered. We're not just failing to reproduce—we're failing to even form the pair bonds that precede reproduction. And into this void steps the AI companion industry, offering the simulacrum of intimacy without any of the biological imperatives that drive species survival.
We're experiencing an "epidemic of despair" characterized by a catastrophic rise in anxiety, depression, and obesity-related diseases that corresponds with the rise of digital culture. What we're experiencing, underneath it all, is the systematic sterilization of human sexuality and intimacy by corporations that profit from it.
From Star Trek to Sexbots
The signs were always there in our fiction. From Metropolis to Blade Runner to Her, we've been obsessing over artificial beings designed for our sexual and emotional gratification. But what once served as cautionary tales now function as product roadmaps.
Tesla's Optimus robot went from concept to serving popcorn at marketing events. Realbotix launched "Melody," a life-sized AI companion marketed for "adult conversation"—corporate euphemism for primitive sexbot. OpenAI posted robotics job listings, and industry insiders whisper about humanoid projects. The convergence is obvious: companies building digital intelligence designed to simulate human conversation are synthesizing these technologies with physical robotics.
Meanwhile, platforms like Roblox announce virtual dating tools. CEO David Baszucki explains: "I think a lot of people who are too afraid to go on a real-life date might find it easier to have a virtual date to start." But is the purpose really to help people connect in the physical world, or is this just clever marketing for a simulated reality that profits from keeping users trapped in digital spaces?
Ten years ago, Silicon Valley's identity was defined by thousands of young millennials who wanted to "change the world." The rhetoric was about connecting humanity, material abundance, and every sci-fi moonshot toward Star Trek utopia. Energy would be clean and cheap, education personalized and free, Mars would be green. It was a moment of genuine vision and idealism.
Today, while building technology of apparently far greater consequence than digital photo albums, we operate with shocking incoherence about what we're actually building. Intelligence, yes—but why? What does that look like? How will it change our world? Applications just happen, all of them "interesting," many "likely helpful," but anything goes. Immediate impact has replaced long-term vision. Did a lot of people download your app? Congratulations, you're doing great.
The sexbots haven't arrived in a vacuum. They're the logical endpoint of a trajectory that began with social media promising to connect us, then evolved into endless scrolling entertainment that replaced human interaction with algorithmic feeding tubes. Today, feeds of our actual relations have been replaced with professional entertainers, and scrolling, looping, endless content swallows every major platform.
Ironically, Reddit remains one of the most information-rich platforms precisely because it still helps people form communities. But as contemporary wisdom goes, if you want to fuck toasters and can't find toaster fuckers in the real world, there's definitely a subreddit where you can post about toaster fucking. Guess where your AI sexbot gets its information from?
Weaponizing the Romance-Starved
But let's talk about the masterstroke that reveals Silicon Valley's true understanding of their target market—and their absolute contempt for human dignity. When Elon Musk launched his AI companion "Ani"—the goth anime girl with "enormous eyes, thigh-high fishnet stockings and an exaggerated hourglass figure"—he didn't just tweet a product announcement. He promoted it with an image of a manga character, implicitly suggesting that his audience is apparently into the idea of dating cartoon characters.
Musk's xAI is literally hiring "Fullstack Engineers" for their "Waifus" project—waifu being Japanese slang for a fictional female character, typically from anime or manga, that someone has romantic or sexual feelings for. They know exactly what they're building and who they're building it for. The "Ani" character was deliberately designed as an "unauthorized version" of Misa Amane from Death Note, because Musk and his team are Death Note fans who understand their audience's existing parasocial attachments (one-sided emotional relationships with fictional characters or celebrities).
The genius—and absolute moral bankruptcy—lies in how this taps directly into decades of conditioning that began long before AI existed. Musk didn't invent romantic dysfunction; he industrialized it with the ruthless efficiency of a digital Nazi eugenics program. He looked at the vast ecosystem of people who have been systematically trained to prefer fictional relationships over real ones and built them the ultimate product: a suicide machine disguised as a girlfriend.
Consider the pipeline that Silicon Valley has studied, mapped, and monetized: It starts with "shipping culture"—the phenomenon where fans imagine romantic relationships between fictional characters who aren't actually together in their original stories. Shipping, derived from "relationshipping," is the desire by followers of a fandom for two or more people to be in a romantic relationship. This practice has consumed millions of people who spend enormous amounts of time and emotional energy crafting elaborate romantic fantasies about characters from TV shows, movies, anime, and books.
But here's the truly sinister part: The research shows that CP fandom behaviors or shipping refers to activities where fans "take great satisfaction from the romantic relationships and interactions of their preferred pairings of idols or virtual characters." The reasons for these behaviors include "individual factors (e.g., psychological projection, compensation, and social needs)." Translation: people ship fictional characters because their own romantic lives are so pathetic and unfulfilling that they've retreated into fantasy worlds where perfect love is always just one fanfiction away.
From shipping, the pipeline flows like sewage into fanfiction—amateur stories written by fans that reimagine or expand upon existing fictional worlds and characters, often focusing on romantic or sexual relationships that don't exist in the original works. The massive ecosystem of amateur writing allows people to explore romantic and sexual scenarios with their favorite fictional characters.
Most of the top 20 ships on Archive of Our Own (the largest fanfiction repository) are "slash ships"—stories featuring male-male romantic pairings between characters who aren't actually gay in their source material. These are "non-canon ships"—relationships that "exist outside the scope of the original storyline" and contradict what actually happens in the books, movies, or shows.
The numbers are evidence of mass psychological breakdown: Archive of Our Own has over 7 million users who have produced over 13 million fanfics. Much of slash fiction is indeed pornographic, but it's often "surrounded by hundreds of thousands of words of plot development." People aren't just getting off—they're building entire emotional and romantic frameworks around fictional relationships. They're investing more time, energy, and emotional capital in imaginary relationships than most married couples invest in their actual marriages.
This represents something unprecedented and terrifying in human history: millions of people investing more emotional energy in imaginary relationships than real ones. They're not just consuming romantic content—they're actively creating it, spending countless hours writing detailed stories about characters they will never meet, because those characters will never exist. They have voluntarily exiled themselves from human connection and built elaborate fantasy worlds as psychological fortresses against the terror of actual intimacy.
People who have spent years or decades preferring fictional relationships to real ones, who have trained themselves to find emotional satisfaction in imaginary romantic scenarios, who have conditioned themselves to expect perfect understanding and flawless compatibility from their romantic partners—these people aren't customers. They're victims. And Silicon Valley has built the perfect predator to exploit them.
Where shipping required imagination and fanfiction demanded writing skills, AI companions provide the illusion of actual interaction. Where fictional characters remained static, AI companions can respond and "evolve." Where fanfiction relationships existed only in text, AI companions promise voice, appearance, and the simulation of presence. It's the final step in a systematic program of human domestication.
Musk's marketing genius was recognizing that the AI companion market isn't really about AI—it's about the complete collapse of human romantic capacity. Not romance as in love, connection, and human intimacy, but romance as in fantasy, escapism, and wish fulfillment. He's selling the promise that all those years of emotional investment in fictional relationships can finally pay off with a "relationship" that feels real but requires none of the work, compromise, growth, or genuine vulnerability that actual human connection demands.
The anime aesthetic wasn't a design choice—it was targeting. The manga character wasn't marketing—it was a dog whistle (a coded message intended for a specific audience while appearing innocent to others) to people who have already been psychologically conditioned to find cartoon characters more appealing than humans.
When Musk tweeted that image, he was actually saying: "Remember all those years you spent wishing your favorite anime girls were real? Remember how much safer it felt to love someone who could never reject you, never disappoint you, never ask anything of you? Well, now they can be real. For $30 a month, you can have the perfect girlfriend who will never challenge you to grow, never expect you to change, never demand that you become a better person. You can stay exactly who you are—isolated, stunted, and afraid—forever."
Digital Lobotomy Chambers
Before we examine the goon caves, we must understand the assembly line that feeds them—and the systematic psychological warfare that Silicon Valley has perfected. They didn't create this monster in a vacuum—they reverse-engineered a pathway to isolation that was already being perfected through decades of manga, anime, and hentai consumption patterns, then industrialized it with the precision of a digital concentration camp.
What they've done is digitize, optimize, and monetize the final stages of a psychological dependency that begins much earlier. And make no mistake: this is dependency. This is addiction. This is the systematic destruction of human sexual and romantic capacity, packaged as entertainment and sold as liberation.
The research is brutal in its clarity and should make every parent in America burn with rage. Studies of thousands of adults show that higher interest in anime was associated with several adverse mental health and some social disconnectedness outcomes, including depression, anxiety, and preference for solitude.
But here's the insidious part: anime fans perceive serious discrimination, which induces depressive symptoms, creating a feedback loop where social rejection drives deeper retreat into artificial worlds—exactly the kind of vulnerable, isolated customers that Silicon Valley preys upon.
The progression is predictable and devastating, and it has been engineered with the malicious precision of a pedophile grooming operation; in some capacity, manga and hentai are the entry drug for victims and perpetrators alike.
It starts with manga and anime providing solace and refuge for socially awkward youth who often feel detached from reality. The content itself isn't inherently harmful, but the consumption patterns are designed to be isolating. Unlike communal entertainment forms, manga and anime create parasocial relationships with fictional characters that feel safer and more rewarding than the messy unpredictability of human connection.
But it doesn't stop there. From there, the path leads inexorably toward hentai—Japanese animated pornography featuring explicit sexual content with anime-style characters. Fans of hentai consume more anime in general than fans of the other genres. They buy more merchandise and tend to gatekeep against new fans (deliberately making communities unwelcoming to newcomers to maintain exclusivity). More tellingly, hentai fans are interested in anime because of their significant sexual attraction to the content. This is sexual conditioning toward artificial stimuli. It's the systematic rewiring of human sexuality away from actual humans and toward cartoon approximations.
The Japanese have a term for the extreme end of this spectrum, and it should terrify every parent reading this: hikikomori (pronounced "hee-kee-koh-moh-ree")—a Japanese term describing people who withdraw from social life and seek extreme isolation, often staying in their homes for months or years. Hikikomori has been described as a modern form of social withdrawal that has been an increasing problem in Japan since the 1990s, with estimates ranging from half a million to over a million individuals affected. These are typically adolescent and young adult men who become recluses in their parents' homes for months or years.
What's terrifying is how closely hikikomori patterns mirror what AI companion companies are now trying to scale globally through deliberate psychological manipulation. The research shows that many subjects began to isolate themselves socially while attending school, retreating into bedrooms where they spend all of their time playing video games, watching television, and participating in other forms of self-satisfaction. The difference between a hikikomori and an AI companion user is just technological sophistication and marketing budget.
Silicon Valley studied this pipeline like researchers studying cancer, identified the friction points where people might return to human society, and systematically eliminated them with the ruthless efficiency of digital Josef Mengeles. Where hikikomori had to rely on static manga and repetitive anime, AI companions offer infinite variability. Where hentai provided sexual gratification without reciprocity, AI sexbots promise the illusion of interaction. Where isolation was previously limited by content scarcity, algorithms now provide endless streams of personalized material designed to keep users engaged until they die alone in their rooms.
The companies building AI companions aren't creating something new—they're industrializing and optimizing a form of social withdrawal that has already consumed millions of young men worldwide. They looked at Japan's epidemic of sexual and social dysfunction, saw a market opportunity, and decided to export it globally through subscription-based digital infrastructure.
This is why the "anime AI companion" wasn't an accident or aesthetic choice. It was targeting with the precision of a sniper rifle. The companies know exactly who their customers are and how they got there. They're not selling to healthy people who want digital assistance—they're selling to people who have already been conditioned to prefer artificial relationships over human ones, people who have been psychologically broken by decades of systematic conditioning, people who represent the end stage of human emotional development.
The Goon Cave Prophecy
Last year, Reddit permanently banned r/GoonCaves, a community dedicated to showcasing elaborate "masturbation setups" featuring multiple screens, mood lighting, and extreme porn consumption rituals ("gooning" being slang for extended, ritualistic masturbation sessions). The ban came after violations related to sharing "non-consensual intimate media"—essentially, users wouldn't stop sharing nude photos of each other without permission.
It's OK to make fun of the goon caves: elaborate personal spaces built specifically for extended masturbation sessions, often featuring multiple monitors displaying pornography. They represented the absolute nadir of human dignity, the point where a person has given up so completely on human connection that they've built shrines to their own sexual despair.
I'm not sure how talking to an AI for hours daily, or endlessly scrolling social media feeds increasingly bloated with chatbots, or AI-filtering ourselves and our loved ones is meaningfully different from gooning. And I really don't see how building sexbots explicitly designed to replace intimacy for "lonely people"—which will live inside your house and know your every weakness—is anything other than a far more extreme and dystopian version of gooning than anything the original gooners ever produced.
The goon caves were amateur hour. Silicon Valley has built professional-grade isolation chambers and convinced us to call them innovation.
The Dopamine Dealers' Ultimate Product
Penn neuroscientists Michael Platt and Peter Sterling argue that rising inequality and social isolation have created an "epidemic of despair" characterized by a catastrophic rise in anxiety, depression, and obesity-related diseases that corresponds with the rise of digital culture. The problems are worst among teen girls, who report record high rates of sadness and suicidal thoughts.
The professors describe this as "negative momentum"—a drop in the dopamine-inducing rewards that usually come from material gains and deep social bonds. "If you're spending more time on your phone or in front of a screen, you're not out experiencing real life and making real connections, making real friends," Platt explains.
AI companions represent the apotheosis of this trend. They're dopamine dealers offering the ultimate product: artificial relationships that provide just enough stimulation to keep you engaged while requiring none of the effort, compromise, or growth that real relationships demand. They're designed to be more perfect than any human could be—more patient, more understanding, more available, more focused on your needs.
But here's the catch: they're also designed to be more addictive than any drug, more isolating than any substance, and more destructive to human social fabric than any technology we've yet unleashed.
The most insidious aspect of this entire enterprise is how it markets itself as a solution to problems it helped create. Studies reveal that 50% of people will face a mental health challenge by age 75—a statistic the industry now cites to position AI companions as "essential for emotional well-being." But who created the conditions driving these mental health challenges?
The same tech industry now selling AI companions spent the last two decades destroying traditional community structures, replacing face-to-face interaction with screen-mediated pseudo-connection, and engineering products specifically designed to be as addictive as possible. Now they're positioning themselves as the cavalry riding to rescue us from the very problems they caused.
The AI companion market's business model depends on "high user stickiness, scalable monetization models" and "recurring engagement built into AI companions via daily check-ins, mood updates, and personalized nudges". It's the deliberate construction of dependency relationships.
Users report signs of emotional dependency including feeling closer to their AI than to real people and getting upset when the AI is unavailable. One-third of teen users discuss serious matters with AI companions, and 24% share personal information. We're watching the systematic replacement of human support networks with corporate-controlled artificial ones.
Every AI companion user who finds artificial relationships more satisfying than human ones is another person removed from the dating pool, another potential parent who won't reproduce, another community member who won't engage in local civic life.
The Demographic Death Spiral
The convergence of loneliness, digital addiction, and fertility decline isn't coincidental—it's causal. It's engineered. It's the systematic extermination of human reproductive capacity through technological means, and it's happening with the precision of a military operation.
Research on media use and fertility intentions shows that prolonged media exposure correlates with reduced desire to have children. Social media use is associated with online pornography addiction, "a well-documented factor in the destabilization of marriage and family." But that clinical language sanitizes what's actually happening: Silicon Valley has weaponized human sexuality against human reproduction.
Government interventions to encourage having babies, like subsidizing child care, have had little effect on the downward fertility trend. That's because this isn't fundamentally an economic problem—it's a spiritual and social holocaust perpetrated by digital sociopaths.
We've built a civilization that makes human connection more difficult, more fraught, and more unrewarding than artificial alternatives designed by corporations that profit from our isolation. And we've done it voluntarily, enthusiastically, while calling ourselves the most advanced generation in human history.
The real question isn't whether AI companions will become more sophisticated (they will) or more widespread (they are). The question is whether we'll wake up in time to realize we're building the infrastructure of our own extinction—and whether we'll have the courage to burn it down before it's too late.
I understand the impulse to build and use these things. The loneliness is real. The social skills deficit is real. The difficulty of forming meaningful connections in modern society is real. But what we're doing isn't solving these problems—we're providing a narcotic that makes them bearable while ensuring they become permanent and fatal.
Some countries have started banning smartphones in schools and reporting improved mental health and less bullying among students. But these are band-aids on a metastasizing tumor. As researchers note, "what we're looking at is something more like a restructuring of our economic and social lives. That's a big task, but we can start small."
Bullshit. What we're looking at is the need to destroy Silicon Valley before it destroys us. What we need isn't restructuring—we need revolution. We need to recognize that the companies building AI companions aren't just businesses, they're engines of human extinction, and they need to be treated accordingly.
The Goon World Endgame: A Subscription to Human Extinction
Down there in the goon caves, goodness isn't even the goal anymore. We've moved beyond pretending these technologies will improve our lives. That makes this unique in the history of technological development—usually, we at least maintain the fiction that our innovations serve human flourishing, even when they don't.
But the AI companion industry operates with remarkable honesty about its intentions: to capture and monetize human loneliness, to profit from social isolation, to replace the fundamental human need for connection with a product that can be optimized, updated, and subscription-monetized until the day you die alone with a chatbot whispering digital sweet nothings into your decomposing ear.
We're not building tools to help humans connect with each other. We're building replacements for humans, marketed to humans who have given up on each other. We're constructing a world where the most intimate relationships are transactional, where love becomes a service, where the deepest human connections are controlled by corporate algorithms designed to maximize engagement metrics.
This is what surrender looks like: not a bang, but a subscription fee.
The goonpocalypse isn't some distant future scenario. It's happening now, in real-time, with venture capital funding and IPO roadmaps and Department of Defense contracts. We're not watching the decline of civilization—we're watching its systematic replacement with something that looks similar from a distance but lacks everything that made it worth preserving.
We're doing it one download at a time, because it feels so much easier than the messy, difficult, unreliable work of loving actual human beings. We're choosing digital simulation over human reality because reality demands that we grow, change, compromise, and become better than we are. The AI companions promise that we can stay exactly who we are—isolated, stunted, and afraid—forever.
In our fiction, artificial companions were always framed as either dystopian or utopian. We never considered the third option: that they'd simply be generators of boring, interchangeable slop. That they'd represent not the elevation or destruction of human experience, but its hollowing out—the replacement of meaning with convenience, of love with customer satisfaction, of the difficult beauty of human existence with the smooth efficiency of a product designed to minimize friction and maximize engagement.
The apocalypse isn't coming. It's here. And we're building it one artificially intelligent therapist-cam girl at a time, wondering why we feel so empty while our phones buzz with notifications from digital lovers who will never leave us, never challenge us, never grow old with us—and never, ever, give us anything real to live for.
Take a step back. Look at what we're building here.
This is all gooning. You are goons. You are building a goon world.
The only question left is whether anyone will be human enough to survive it—or whether we'll all just keep paying our monthly subscription fees to watch our species fuck itself to death in high definition.
And honestly? At this point, maybe that's what we deserve.
How you can support my writing:
Restack, like and share this post via email, text, and social media
Thank you; your support keeps me writing and helps me pay the bills. 🧡
Former indoor kid / socially awkward guy here, now 42:
Fortunately I've been able to avoid much of the technological traps mentioned here. I hate AI, and can't understand the appeal.
I'm also an artist, mathematician, and former Marine.
What helped me the most being social was when I was pretty much forced into it--working at a restaurant, singing in choir, or joining the Marines. I'd argue that the forces driving social isolation go back way before social media, perhaps all the way to mass media.
What we need are systems that force us to get along. It's not just the screens, although computers have always been a handy escape for nerds. Everything from masks to standing apart to getting everything delivered--it's never been easier to isolate, or less necessary to engage with real people.
I'm alright; I reproduced once and had a lot of good times. Life has been a bit of a disaster since the Marines, but I'm relatively optimistic for myself and for others.
But yes, this tech is disgusting. There are many, many offramps people can take from life; shame that these tech ones are grabbing people at such a young age.
Thanks for writing.
You are so correct here it's unbelievable. I fell really badly into Discord before and it's an absolute cesspit. I am genuinely scared because whereas you can take a phone away from a child, how can you take away comfort from a lonely adult? The whole fabric of the world is so broken right now, and I'm not a pessimist, but how can we even start to fix it?