
The Future Is Too Easy

An Ai Me robot from Chinese company TCL is displayed during the 2025 Consumer Electronics Show (CES) in Las Vegas.
Zeng Hui/Xinhua via Getty Images

LAS VEGAS — There is something unstable at the most basic level about any space with too much capitalism happening in it. The air is all wrong, there's simultaneously too much in it and not enough of it. Everyone I spoke to about the Consumer Electronics Show before I went to it earlier this month kept describing it in terms that involved wetness in some way. I took this as a warning, which I believe was the spirit in which it was intended, but I felt prepared for it. Your classically damp commercial experiences have a sort of terroir to them, a signature that marks a confluence of circumstances and time- and place-specific appetites; I have carried with me for decades the peculiar smell, less that of cigarette smoke than cigarette smoke in hair, that I remember from a baseball card show at a Ramada Inn that I attended as a kid. Only that particular strain of that particular kind of commerce, at that moment, gave off that specific distress signal. It was the smell of a living thing, and the dampness in the (again, quite damp) room was in part because that thing was breathing, heavily.

CES, which this year claimed 141,000 attendees and 4,500 exhibitors, could on the other hand have been anywhere, or everywhere. That it was held across several vast convention spaces in Las Vegas certainly helped with that placelessness, or everyplacefulness, but the overwhelming volume and scale of the thing ensured it. The air in every space was charged with all that commerce, somehow shortened by the competition for it, but it wasn't close in the ways that I had prepared myself for.

Instead, the experience felt oddly distant even when and where it was crowded. On the bus taking attendees from the hotel that hosted CES to the parts of it at the Las Vegas Convention Center, people blithely joined Zoom calls and talked about "deep partnerships" in competition with a recording that played, over and over, in English and then in Spanish, warning people not to stand in the rear door. Conversations between lanyard-draped coworkers rustled under all that in various languages; the only word I could identify with any certainty from those was "Covid." The show itself was sometimes loud and sometimes quiet, seldom linear and never embarrassed, shabby or grand from one booth and one moment to the next, but for the most part it stubbornly refused to cohere. The overwhelming theme, which various attendees I talked to said was basically a rerun of the previous year's version, amounted to turning your life over, bit by bit and moment by moment, to artificial intelligence technology that would do ever larger amounts of that living for you.

The Tesla-branded Hyperloop that moved attendees around the sprawling convention center feels like the apotheosis of something or other—mass transit made dumb and inefficient through its insistent refusal to honor the concept; a futuristic aesthetic gone janky around the edges, in service of a howling and willful category error; neither a solution nor quite a problem, but more a strange and showboat-y rephrasing of the question. I wrote in my notebook, from the front seat of a silent Tesla carrying me, its driver, and one other person through a pastel-toned tunnel, that "nothing can prepare you for how fucking stupid this shit is." But I think even a few hours at CES actually did a pretty good job of that.


"I don't know exactly when it'll come," the AI CEO Dario Amodei said last week, in Davos. "I don't know if it'll be 2027. I think it's plausible it could be longer than that. I don't think it will be a whole bunch longer than that when AI systems are better than humans at almost everything. Better than almost all humans at almost everything. And then eventually better than all humans at everything."

AI executives love saying stuff like this, but they are not just saying it for their own pleasure. They're selling, of course, but also this grandiose and mystified mixture of awe and dread—something amazing is happening very quickly just out of sight, and will be here soon despite always being exactly 18-36 months away, and you will need to be protected from it, but also it will improve your life—is as much the product as anything else. It is pitched less at the general public, to whom it might reasonably sound like The Jigsaw Killer explaining why he had placed a bear trap on their head, than at the investors and politicians whose faith keeps the industry afloat. Those dire promises will stand in for the product until such time as there is a product worth selling; the speculative stuff will continue either way.

The tech industry has cycled with increasing speed over the last few years through a number of boomlets and bubbles that were sold in the same way—a new virtual world in which we will all live and play and flourish and mostly go to meetings, accessible through some sort of wearable technology or suite of carbon-intensive gimmicks, and within which all the same people and companies will be exactly as in charge as they are in this one. Everyone will be there, albeit remotely; somehow only the same few creeps ever showed up.

The comparison between AI and true-blue boondoggles like the Metaverse is admittedly not entirely fair. There really are some useful applications for machine learning, although none of those are big enough to justify the scale that the companies pushing AI are seeking and, crucially, none of them are the things that those companies have made central to their products, which mostly produce flabby, haplessly uncanny dross. At this moment, there is just not much positive public-facing utility or appeal to much that AI does—it's a way to automate various online annoyances; a low-effort homework-evasion tool; a way to make different kinds of stuff, all of which suck, at industrial scale and with industrial-scale externalities. Its most practical use-case remains Doing Crimes.

A robot dog and its admirers. Photo by Zeng Hui/Xinhua via Getty Images

There is nothing much to sell yet, is the problem. New iterations of these companies' signature products are reliably bad; that any of it might someday become something different, not even along the lines of Amodei's darkly fanciful Creating Humanity 2.0 gambit but just as a thing that actually does what it purports to do, is all there is to market. The idea of a machine that thinks and feels and cares has fascinated and tantalized and frightened and titillated humans for as long as we've had technology, and that plus the familiar cultural deference given to things that are large, rich, and loud has created an oddly abstracted discourse around AI—one that is extremely attentive to all its promised possibilities and pathologically patient with the goofy results that repeatedly surface instead.

It's deranging, but in a familiar way. The industry's leading powers say ridiculous things, the people whose job it is to report on the industry repeat them, and everyone downstream is left to reconcile the distance and rationalize the difference between those soaring possibilities and the reliably shabby fact of the thing. For bosses, to whom a great deal of this technology is being marketed, the choice is between deploying a technology that bungles even relatively simple tasks and continuing to entrust that work to people who are much likelier to do it correctly, but who will also periodically need to go to the doctor or just to bed; whether AI ever gets "better than" humans at any of these things matters less to this audience than how quickly it might become acceptably bad, and both matter much less than the speculative economy surrounding it. On the user end of things, at the moment, the choice is either to offer that objectively scammy reality some qualified credit or to avoid it to the extent possible.

Some percentage of those users will simply accept that this is already magic, because they are inclined to believe or easily impressed or just because it is more fun to believe that than not. Some other percentage will see the lapse between what is promised and what is delivered and feel, understandably, like they are being lied to. The people covering this technology, by and large, have remained very polite about it all. "If your question is 'what can Operator do better than existing tools?' the answer is not clear," Platformer's Casey Newton wrote about OpenAI's new "agent" product. "It can take action on your behalf in ways that are new to AI systems—but at the moment it requires a lot of hand-holding, and may cause you to throw up your hands in frustration." This assessment gets more complicated when Newton explains what those actions were:

My most frustrating experience with Operator was my first one: trying to order groceries. “Help me buy groceries on Instacart,” I said, expecting it to ask me some basic questions. Where do I live? What store do I usually buy groceries from? What kinds of groceries do I want? 

It didn’t ask me any of that. Instead, Operator opened Instacart in the browser tab and began searching for milk in grocery stores located in Des Moines, Iowa.

At that point, I told Operator to buy groceries from my local grocery store in San Francisco. Operator then tried to enter my local grocery store’s address as my delivery address. 

Newton ultimately had to take over and do the work himself. "In the end," he writes, "adding six bananas, a 12-pack of seltzer, and a package of raspberries to a cart had taken me 15 minutes." This is worth mentioning because the biggest companies at CES were all pushing the idea that technology like this—well, not like this, but a notional version of it that actually works—will usher in a new way to live. The technology currently lavishly fucking up your grocery order in a supervised setting will soon make you breakfast and drive you to work and help raise your child and manage both your glucose levels and those of your pet. It will know everything about you, and it will also care about you. At CES, the Korean tech giant LG called their version of this Affectionate Intelligence, which will allow users to "experience affectionate moments" on the road or in their smart home, thanks to a technology that "truly gets you and manages your life with care." This will leave you more time in which to, uh.


Climbing the food chain at CES was an escalating process of getting lost. This was a very big show, and not all of it was aimed at the general public. A whole floor of the Las Vegas Convention Center's South Hall was given over to the white-label electronics brands whose products dominate Amazon searches, and that floor was silent and strange and mostly empty of pedestrian traffic when I visited. There was nothing but business to be done up there, bulk orders to be placed for USB cables or smartphone cases or headphones or whatever with companies whose names were designed to be forgotten—Marvo Business Group, Soonleader, Shenzhen EarFun. A man stepped out of one of these booths, silently handed me a fun-size Milky Way, and then retreated back into it.

The problem that these companies are solving is easy to grasp. It was occasionally uncanny in how it presented itself—downstairs and elsewhere, as Daniel Cooper noted at Engadget, were four different concerns advertising different products under the Kodak brand, all of which held some of the 1,100 patents that company sold off as part of its bankruptcy in 2012. There was also a company selling e-bikes and scooters under the old Memorex trademark; the people in Radio Shack polo shirts in the Radio Shack booth were not representatives of the bankrupt American franchise but the familiars of a zombie brand now owned by the El Salvador-based Unicomer group. But, Cooper wrote, it was also more or less what it looked like: "nothing more than names and logos slapped on products that are shipped in from various manufacturers."

JaVale McGee was there, taking a selfie with a lollipop of some kind in his mouth and some kind of headphone thing on his head. Artur Widak/NurPhoto via Getty Images

This was supply rising to meet a global demand, businesses whose success or failure will, just given how the internet works now, necessarily have less to do with quality or coherent brand identity than with cheaply and quickly addressing some need. "I once loved Memorex's VHS tapes, so I will now buy this Memorex e-scooter," is on the merits an absurd value proposition, but not much more or less absurd than anything else in that space. These are real businesses, but in this ecosystem they are effectively plankton—small organisms that are integral to the broader system's survival and mostly food.

More legible were the booths that featured products with more obvious uses. If the Marvo-EarFun tranche exists to fill an unreasoning appetite for "earbuds, any kind, immediately," these were answers to more discrete problems. The solution to "I do not have a soft-serve ice cream machine in my home but would like to" weighs 76 pounds, costs about $3,000, works a bit like a Keurig coffee machine, and is made in Tom Glavine's Massachusetts hometown. (The ice cream was good; if basically everything about my life was different, I'd want one.) The solution to "I want a pet door that only opens for my dog, and not for burglars or possums" was smart, smartly presented, and made of aircraft-grade aluminum that withstood hundreds of blows from a sledgehammer in the company's booth without any visible dents. Not all of these problems were what you might call urgent, or even what you might call problems: A woman working in the booth for a sex toy manufacturer caught me looking at a Fleshlight-style device moving up and down on an arm synched, she told me, "to whatever you're watching," which in this case was a video playing on a laptop of a smiling woman bouncing on a trampoline. "Usually," she explained, "the device would be set up the other way," meaning with the aperture facing down. I don't remember what I said in response.

These were things, in short, and in this superheated context they were comforting in their legibility. It became more abstract further up the food chain. It is one thing for some concern or other to try to gamify masturbation—the game was called Fappy Bunny, it was "powered by Fluffer," and I do not want to talk about it—or disrupt the gaming chair space, and quite another to meet investors' grandiose ambitions. Practical technologies tended to emphasize impractical and fantastical potential uses that scale-obsessed investors might care more about: Wearable accessibility tech that allows people without the use of their arms or legs to use a computer through small facial gestures was talked up as a hands-free retail tool; a home saliva-testing kit that could be of use for trans people doing DIY hormone therapy was instead pitched more or less at that one unsettling rich guy who is trying to age in reverse. The fantasy of growth, on the metastatic scale that investors demand, outstrips every other concern. Even the most miraculous plowshare must have some potential utility as a sword.

At the top, there was more abstraction than anything else. Big corporations necessarily deal in more abstruse problems and more elaborate solutions, and tell different stories. The challenge for them is that there are only so many plausible or even passably appealing Whole New Ways To Watch Television, regardless of how much better the screens get. That those screens really are getting a lot better only means so much; at this level, the work is not solving any particular problem but consolidating a suite of overarching semi-solutions around all of them. This gambit inevitably bumps up against the outer boundaries of the possible in terms of individual solutions to structural problems: Even a notional future AI assistant—one that can do the things contemporary ones can't, one that Truly Gets You—can only do so much about poorly designed or badly maintained public spaces. The worldview that gave us the Hyperloop does not have solutions to any of that; ideologically, and just at a more elemental level, it doesn't believe in them.

A mannequin freestyling on a unicycle-type thing in front of a sign reading Unlock Immersive Meetings. Nothing more extreme than unlocking immersive meetings. David Roth/Defector

Consider the problems that, taken altogether, add up to our shameful and unworkable political moment. It's the abandonment of not just any sense of a common cause but a workable consensus reality; it's the swamping of any collective effort or any nascent social consciousness in favor of individuals assiduously optimizing and competing and refining and selling themselves, not so much alongside the rest of humanity as in constant competition with all of it; it's the rich buffing all human friction from every aspect of their days so that they can more cleanly and passively move through them, a circuit of Teslas circling silently underground forever; it's everyone else, somewhere offscreen, leaving whatever those restless protagonists have ordered on the doorstep and getting tipped 10 percent for it; it's an efflorescence of dead-eyed scams and ever taller fences. The fantasy and utility of AI, for the unconscionably wealthy and relentlessly wary masters of this space, converge in a high and lonesome abstraction—technology designed less to do every human thing for you than to replace all those human things with itself, and then sell that function back to you as a monthly subscription. This device will play with and talk to your child; this furry mouthless robot with enormous attentive eyes will replace your pet; your coffee is ready and your clothes for the day have been picked out for you. Or not.

It is both the nature and the business of casinos to make the outside world disappear, but there was a greater recession at work here—all these miracles and potential miracles worked to push users into the same stilted and solitary prisons of ease. Steve Jobs's belief that people don't know what they want until it is shown to them has long been a catechism in this cohort; Silicon Valley types have spent nearly two decades now showing people things they mostly do not want and insisting that they actually want them. If there is anything new about Silicon Valley's triumphal AI push, it is the extent to which its exponents are no longer asking whether anyone wants what they're selling and simply asserting its inevitability. "A world in which human wages crash due to AI—logically, necessarily—is a world in which productivity goes through the roof, and prices for goods and services crash to near zero," the reactionary venture capitalist Marc Andreessen tweeted last Friday. "Consumer cornucopia."

The familiar Jobs-ian notes of wonder and inspiration just do not sound very convincing coming from this notably less visionary cohort, both because they seem so ignorant about what people actually like to do and because their answer to that is flabby and mediocre surveillance technologies whose only promise to users is an ever more optimized and atomized self. At any rate, the offer is clear: In exchange for self-determination or dignity or privacy or agency, you will be granted airless post-human convenience mediated and enforced by proprietary algorithms that currently do not work, on a pay-to-play basis. The immediate circumstances of the AI industry changed dramatically for its incumbent powers just days after Andreessen's post due to the furor around the Chinese AI technology DeepSeek, whose success raised the question of whether you actually need $500 billion worth of infrastructure and world-historic energy consumption in order to make a fun little tool that can summarize emails. Still, the fundamental terms of the industry's offer remain the same. It's not a negotiation; they are not asking. But.

There is another way to read this, though, which is as a tapped-out super-class attempting to rush its preferred future into existence in the absence of any broader justification or appetite for any of it. It is rich cynics trying to make something lifeless grow in the way that living things do, and lock the dying present they rule in for the foreseeable future by effectively removing everyone from it but them. They are impatient not just because they are high-handed and avaricious, but because they know that the only future they can rule in the way they want is one that is passive, stupid, small and shrinking. There is in this morbid turning inward a latent and terribly sad admission of defeat—it is a future you'd accept only if you had given up on every other richer and more human and more generous one. The only people who've actually chosen it, so far, are the ones insisting that there's no other future to choose.

