Itemoids

Sam Altman

Why Is Everyone Talking About Getting ‘Oneshotted’?

The Atlantic

www.theatlantic.com › culture › archive › 2025 › 02 › oneshotted-going-online › 681774

Ever since Elon Musk bought X, the platform’s production of novel slang, metonyms, catchphrases, and other neologisms has fallen precipitously. The site that gave us milkshake duck (shorthand for discovering an overnight celebrity’s racist, sexist, or otherwise objectionable posts) has, post-2022, contributed noticeably less in the way of argot. This decline seems partly attributable to falling engagement—down as much as 30 percent in the past two years—and partly to X Premium, which allows subscribers to pay a monthly fee to make their replies more visible, among other features. This pay-to-win offer has interfered with the process of natural selection—in which popular posts would multiply through reposts, mutate in iterative references, and rise above the overwhelming majority of posts that got no interest at all—that used to be the platform’s defining quality.

But if the evolution of weird language on X has slowed, new species of expression are still surfacing from the muck. One of the most colorful in recent months is oneshotted, a term that means, roughly, to be destroyed and subsequently remade by a single experience. The word was all over X last month, suggesting that it had expanded beyond the relatively narrow sense in which it had been used for years.

Prior to its recent popularity, oneshotted was gamer slang: When an opponent kills you in a single blow, you have been oneshotted. Oneshotting tends to happen when you venture outside the places that the designers of the game intended for a character of your current abilities—in other words, when you go somewhere you’re not supposed to be. In its second act on X and other non-gamer-specific social media, however, oneshotted is less physical than existential: a crisis not in the health of your character but in the continuity of yourself.

The current popularity of oneshotted seems traceable to a 2023 post from the X user @LandsharkRides, who wrote: “‘Ayahuasca’ is insane because it appears to be one of the most legitimately dangerous drugs with the potential to gigafry your brain but is exclusively taken by literal turbonormies who unironically want to like ‘heal internalized racism trauma’ and basically get oneshotted by it.”

What we have here is a rich text. Its premise is the fad for ayahuasca among tech entrepreneurs and other corporate types, which goes back at least as far as 2016, when Business Insider reported on a Peruvian ayahuasca retreat marketed toward start-up founders, called Entrepreneurs Awakening. Last fall, Sam Altman of OpenAI told a podcast host that taking ayahuasca while on a retreat in Mexico transformed him from a “very anxious, unhappy person” to “calm,” presumably allowing him to more efficiently build job-eating robots. Such people seem to be the turbonormies @LandsharkRides had in mind: go-getters who see ayahuasca as a way to innovate solutions and maximize shareholder value, possibly in conjunction with mindfulness.

This use is decidedly off-label. An extremely potent hallucinogen, ayahuasca was originally consumed by Indigenous South Americans for ritual purposes. Users would take the drug and enter a trancelike state, after which many reported encounters with the spirits of dead ancestors or personifications of abstract forces such as time and death. This is the opponent—the “Mesoamerican 6D demon” that makes them quit work and become a “traveling circus stripper,” as @LandsharkRides put it in a subsequent post—that oneshots the turbonormies, who never would have gotten near it in the universe as correctly designed. That type-A entrepreneurs have come to regard ayahuasca as a performance-enhancing drug is grimly ironic, given its potential to render them incapable of or uninterested in basic functioning.

The viral spread of the original @LandsharkRides post was Phase 1 in the outbreak of oneshotted. Phase 2 began when the word started appearing in new X posts that alluded to the original. These allusions signaled the familiarity with significant posts that passes for status on text-based social media, creating incentives for more people to learn the term and demonstrate that they, too, can use it correctly. (This phenomenon sounds esoteric and modern in this context, but it should be familiar to anyone who remembers that year of elementary school when everyone starts swearing like longshoremen.) The consequent ubiquity of the word has brought us to Phase 3, in which people now say oneshotted without quotation marks—not in reference to the original post but rather as a vehicle for conveying its own meaning. In one interview, for example, two podcasters theorized that the accused killer Luigi Mangione got oneshotted, as in he lost his faculties of judgment as a result of psychedelic-drug use. To hear them tell it, the term carries a whiff of grudging admiration for how the oneshotted has achieved a desirable transcendence.

Phase 4 swiftly approaches, at which point the whole thing will be over, so say it while you can. It’s ironic that this term has entered our vocabulary through a process of gradual accretion, in post after post that calls on us to read and interpret the word until we find ourselves using it without being able to say exactly when we started, because that is pretty much the opposite process of what oneshotted describes. But what drives this exposure to thousands of repetitive posts if not our desire to be oneshotted?

[Read: Social media broke slang. Now we all speak phone.]

Let us agree, for the purposes of this argument, that social media totally rots. It contains some good stuff, but overall, it is like a cookie the size of a football field with three chocolate chips in it, plus an equivalent number of staples. We keep opening our phones and sifting through the dross, though, because we want to find that single, elusive thing that will blow our mind. These things are, for the most part, not available online. Maybe you will get oneshotted by a romantic relationship, or your first surfing lesson, or the touring production of Hamilton; whatever it is, you will probably not be holding your phone when it happens. In fact, the less one uses the internet—due to incarceration or hiking or whatever—the more one realizes that being online actually prevents a variety of oneshot-type experiences. But screen addicts (myself included) keep going back, hoping to get oneshotted by an internet that, characterized by ephemerality and continuous renewal though it may be, is also marked by a crushing sameness.

The ubiquity of oneshotted on social media is paradoxical, because the impression you get from long-term scrolling is that nothing will change. Screen time is not linear; it is episodic, even cyclical. Of course the torrent of news is massive, and lately, any given article is likely to feel shocking, but little to none of it has the power to oneshot you. That’s a shame, because the thing about getting oneshotted is that it’s supposed to be bad, but it’s secretly appealing.

No one wants to gigafry their brain, and the turbonormie who gets oneshotted by ayahuasca has undoubtedly lost something of irreplaceable complexity. Yet one suspects that he is happier. His remaking frees him from the responsibilities and even the values that kept him from living a more interesting life, and that freedom is something many of us long for, even if the circumstances of his new life aren’t. Although the correct use of oneshotted is denotatively negative, it is not entirely derisive, because to be oneshotted is to be released—released by an event that is destructive, yes, but also swift enough that it is over before the old self can be much immiserated by it, and a new self emerges in the aftermath, likely to be objectively gigafried but subjectively happier.

The subtle implication of oneshotted, and what I think accounts for the term’s popularity, is that it is enviable. Getting oneshotted is frustrating when it happens in a game, but when it happens in real life, it’s sublime—when it happens figuratively, anyway. I do wish to be metaphorically oneshotted as soon as possible. You might too. The most reliable way to keep it from happening is to pick up the phone.

President Trump could act as a bridge between Sam Altman and Elon Musk, exec says

Quartz

qz.com › president-trump-bridge-sam-altman-and-elon-musk-ai-feud-1851760819

Deepak Puri, Chief Investment Officer of Deutsche Bank Private Bank (DB), spoke with Quartz for the latest installment of our “Smart Investing” video series.

The False AI Energy Crisis

The Atlantic

www.theatlantic.com › technology › archive › 2025 › 02 › ai-energy-crisis-fossil-fuels › 681653

Over the past few weeks, Donald Trump has positioned himself as an unabashed bull on America’s need to dominate AI. Yet the president has also tied this newfound and futuristic priority to a more traditional mission of his: to go big with fossil fuels. A true AI revolution will need “double the energy” that America produces today, Trump said in a recent address to the World Economic Forum, days after declaring a national energy emergency. And he noted a few ways to supply that power: “We have more coal than anybody. We also have more oil and gas than anybody.”

When the executives of AI companies talk about their ambitions, they tend to shy away from the environmental albatross of fossil fuels, pointing instead to renewable and nuclear energy as the power sources of the future for their data centers. But many of those executives, including OpenAI’s Sam Altman and Microsoft’s Satya Nadella, have also expressed concern that America could run out of the energy needed to sustain AI’s rapid development. An electricity shortage for AI chips, Elon Musk predicted last March, would arrive this year.

Both Trump and the oil and gas industry—which donated tens of millions of dollars to his presidential campaign—seem to have recognized an opportunity in the panic. The American Petroleum Institute has repeatedly stressed that natural gas will be crucial in powering the AI revolution. Now the doors are open. The oil giants Chevron and Exxon have both declared plans to build natural-gas-powered facilities connected directly to data centers. Major utilities are planning large fossil-fuel build-outs in part to meet the forecasted electricity demands of data centers. Meta is planning to build a massive data center in Louisiana for which Entergy, a major utility, will construct three new gas-powered turbines. And both the $500 billion Stargate AI-infrastructure venture and Musk’s AI supercomputer reportedly rely, or will rely, on some fossil fuels.

If one takes the dire warnings of an energy apocalypse at face value, there’s a fair logic to drawing from the nation’s existing sources, at least in the near term, to build a more sustainable, AI-powered future. The problem, though, is that the U.S. is not actually in an energy crunch. “It is not a crisis,” Jonathan Koomey, an expert on energy and digital technology who has extensively studied data centers, recently told me. “There is no explosive electricity demand at the national level.” The evidence is ambiguous about a pending, AI-driven energy shortage, offering plenty of reason to believe that America would be fine without a major expansion in oil, coal, or natural-gas production—the latter of which the U.S. is already the world’s biggest exporter of. Rather than necessitating a fossil-fuel build-out, AI seems more to be a convenient excuse for Trump to pursue one. (The White House and its Office for Science and Technology Policy did not respond to requests for comment.)

Certainly, data centers will drive up U.S. energy consumption over the next few years. An analysis conducted by the Lawrence Berkeley National Laboratory (LBNL) and published by the Department of Energy in December found that data centers’ energy demand doubled from 2017 to 2023, ultimately accounting for 4.4 percent of nationwide electricity consumption—a number that could rise to somewhere between 6.7 and 12 percent by 2028. Some parts of the country will be affected more than others. Northern Virginia has the highest concentration of data centers in the world, and the state is facing “the largest growth in power demand since the years following World War II,” Aaron Ruby, a spokesperson for Dominion Energy, Virginia’s largest utility, told me. Georgia Power, similarly, is forecasting significant demand growth, likely driven by data-center development. In the meantime, Microsoft, Google, and Meta are all rapidly building out power-hungry data centers.

But as Koomey, who co-authored the LBNL forecast, argued, that forecasted growth does not seem likely to push the nation’s electricity demands past some precipice. Overall U.S. electricity consumption grew by 2 percent in 2024, according to federal data, and the Energy Information Administration predicted similar growth for the following two years. A good chunk of that growth has nothing to do with AI, but is the result of national efforts to electrify transportation, heating, and various industrial operations—factors that, in their own right, will continue to substantially increase the country’s electricity consumption. Even then, the U.S. produced more energy than it consumed every year from 2019 to 2023, as well as for all but one month for which there is data in 2024. An EIA outlook published last month expects natural-gas-fired electricity use to decline through 2026. John Larsen, who leads research into U.S. energy systems and climate policy at the Rhodium Group, analyzed the EIA’s power-plant data and found that 90 percent of all planned electric-capacity additions through 2028 will be from renewables or storage—and that the remaining additions, from natural gas, will be built at two-thirds the rate they have been over the past decade.

None of this discounts the fact that the AI industry is rapidly expanding. The near-term electricity-demand growth is likely real and “a little surprising,” Eric Masanet, a sustainability researcher at UC Santa Barbara and another co-author of the LBNL forecast, told me. More people are using AI products, tech companies are building more data centers to serve their customers, and more powerful bots may also need more power. Last year, Rene Haas, the CEO of Arm Holdings, which designs semiconductors, attracted much attention for his prediction that data centers around the world may use more electricity than the entire country of India by 2030. Some regional utilities have projected much higher demand growth into the late 2030s than nationwide estimates suggest. And chatbots or not, building enough electricity generation and power lines for transportation, heating, and industry in the coming years will be a challenge.

Still, tremendous uncertainty exists around just how power-hungry the AI industry will be in the long term. State utilities, for instance, are likely exaggerating demand, according to a recent analysis from the Institute for Energy Economics and Financial Analysis. That might be because utilities are overestimating the number of proposed data centers that will actually be built in their territories, according to a new Bipartisan Policy Center report that Koomey co-authored. And AI could still turn out not to be as world-changing and money-making as its makers want everyone to believe. Even if it does, the energy costs are not straightforward. Last month, the success of DeepSeek—an AI model from a Chinese start-up that matched top American models at lower costs—suggested that AI can be developed with lower resource demands, although DeepSeek’s cost and energy efficiency are still being debated. “It’s really not a good idea” to look beyond the next two to three years, Masanet said. “The uncertainties are just so large that, frankly, it’s kind of a futile exercise.”

If AI and data centers drive sustained, explosive electricity demand, natural gas and coal need not be the energy sources of choice. For now, utilities are likely planning to use some fossil fuels to meet short-term demand, because these facilities are more familiar and much quicker to integrate into the grid than renewable sources, Larsen told me. Plus, natural-gas turbines can operate around the clock and be ramped up to meet surges in demand, unlike solar and wind. But clean energy will also meet much of that short-term demand, if for no reasons other than cost and inertia: Solar panels, wind turbines, and batteries are becoming cost-competitive with natural gas and getting cheaper, while a growing number of industries are turning to renewable energy sources. The tech firms leading the AI race are major purchasers of and investors in clean energy, and many of these companies have also made substantial investments in nuclear power.

Using natural gas, coal, or oil to power the way to an AI future will not be the inevitable result of the physics, chemistry, or economics of electricity generation so much as a decision driven by politics and profit. AI proponents and energy companies “have an incentive to argue there’s going to be explosive demand,” Koomey told me. Tech firms benefit from the perception that they are building something so awe-inspiring and expensive that they need every possible source of energy they can get. Any federal blessing for data-center construction, as Trump granted Stargate, is a boon to production. Meanwhile, oil and gas companies want to sell more energy; utilities earn higher profits the more they spend on infrastructure; and the Republican Party, Trump included, can cite that demand as a pretext to ramp up fossil-fuel production.

Of course, AI needn’t precipitate a national energy shortage to add to a different crisis. Microsoft and Google, despite promising to significantly reduce and offset their carbon footprints, both emit more greenhouse gases across their operations than they did a few years ago. Google’s emissions grew 48 percent from 2019 to 2023, the most recent year for which there is public data, and Microsoft’s are up 29 percent since 2020, an increase driven substantially by data centers. These companies want more power, and the fossil-fuel industry wants to supply it. While AI’s energy needs remain uncertain, the environmental damages of fossil-fuel extraction do not.

Elon Musk and Sam Altman are fighting about OpenAI again. Here's what they've said

Quartz

qz.com › sam-altman-elon-musk-openai-offer-sale-ai-action-summit-1851760224

Sam Altman and Elon Musk are trading barbs about OpenAI again — this time after Musk’s reported offer to buy the artificial intelligence startup’s assets for $97.4 billion.

What Is AI Distillation?

The Atlantic

www.theatlantic.com › newsletters › archive › 2025 › 02 › what-is-ai-distillation › 681616

This is Atlantic Intelligence, a newsletter in which our writers help you wrap your mind around artificial intelligence and a new machine age. Sign up here.

OpenAI has said that it believes that DeepSeek, the Chinese start-up behind the shockingly powerful AI model that launched last month, may have ripped off its technology. The irony is rich: We’ve known for some time that generative AI tends to be built on stolen media—books, movie subtitles, visual art. The companies behind the technology don’t seem to care much about the creatives who produced that training data in the first place; Sam Altman said early last year that it would be “impossible” to make powerful AI tools without copyrighted material, and that he feels the law is on his side.

If DeepSeek did indeed rip off OpenAI, it would have done so through a process called “distillation.” As Michael Schuman explained in an article for The Atlantic this week, “In essence, the firm allegedly bombarded ChatGPT with questions, tracked the answers, and used those results to train its own models. When asked ‘What model are you?’ DeepSeek’s recently released chatbot at first answered ‘ChatGPT’ (but it no longer seems to share that highly suspicious response).” In other words, DeepSeek is impressive—about as capable as other cutting-edge models, and developed at a much lower cost—but it may be so only because it was effectively built on top of existing work. (DeepSeek did not respond to Schuman’s request for comment.)
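The process Schuman describes—query a large "teacher" model at scale, record its answers, and train a smaller "student" on those answers—can be sketched in miniature. Everything below is hypothetical: the teacher is a stand-in function, not a real API, and the student simply memorizes the distilled pairs, where a real lab would fine-tune a neural network on them.

```python
def teacher(prompt: str) -> str:
    """Stand-in for a proprietary model's API (hypothetical)."""
    canned = {
        "What model are you?": "ChatGPT",
        "What is the capital of France?": "Paris",
    }
    return canned.get(prompt, "I don't know.")

def distill(prompts):
    """Bombard the teacher with prompts and record its answers
    as synthetic training data for the student."""
    return [(p, teacher(p)) for p in prompts]

class Student:
    """A toy 'model' that memorizes distilled pairs; real distillation
    would fine-tune a neural network on this synthetic data."""
    def __init__(self):
        self.memory = {}

    def train(self, pairs):
        for prompt, answer in pairs:
            self.memory[prompt] = answer

    def answer(self, prompt):
        return self.memory.get(prompt, "I don't know.")

student = Student()
student.train(distill(["What model are you?",
                       "What is the capital of France?"]))
# The student inherits the teacher's telltale self-identification,
# echoing the "What model are you?" anecdote from the article.
print(student.answer("What model are you?"))
```

The telltale final line illustrates why DeepSeek's chatbot reportedly answering "ChatGPT" looked suspicious: a student trained purely on a teacher's outputs absorbs the teacher's quirks along with its knowledge.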

“What DeepSeek is accused of doing is nothing like hacking, but it’s still a violation of OpenAI’s terms of service,” Schuman writes. “And if DeepSeek did indeed do this, it helped the firm to create a competitive AI model at a much lower cost than OpenAI.” (The Atlantic recently entered into a corporate partnership with OpenAI.) Whether or not DeepSeek distilled OpenAI’s technology, others will likely find a way to do the same thing. We may be approaching the era of the AI copycat. For a time, it took immense wealth—not to mention energy—to train powerful new AI models. That may no longer be the case.


DeepSeek and the Truth About Chinese Tech

By Michael Schuman

When the upstart Chinese firm DeepSeek revealed its latest AI model in January, Silicon Valley was impressed. The engineers had used fewer chips, and less money, than most in the industry thought possible. Wall Street panicked and tech stocks dropped. Washington worried that it was losing ground in a vital strategic sector. Beijing and its supporters concurred: “DeepSeek has shaken the myth of the invincibility of U.S. high technology,” one nationalist commentator, Hu Xijin, crowed on Chinese social media.

Then, however, OpenAI, which operates ChatGPT, revealed that it was investigating DeepSeek for having allegedly trained its chatbot using ChatGPT. China’s Silicon Valley–slayer may have mooched off Silicon Valley after all.

Read the full article.

What to Read Next

Americans are trapped in an algorithmic cage: “The private companies in control of social-media networks possess an unprecedented ability to manipulate and control the populace,” Adam Serwer writes.

The government’s computing experts say they are terrified: “Four IT professionals lay out just how destructive Elon Musk’s incursion into the U.S. government could be,” Charlie Warzel and Ian Bogost report.