What Went Wrong at Blizzard Entertainment

The Atlantic

www.theatlantic.com › technology › archive › 2024 › 10 › blizzard-entertainment-play-nice › 680178

Over the past three years, as I worked on a book about the history of the video-game company Blizzard Entertainment, a disconcerting question kept popping into my head: Why does success seem so awful? Even typing that out feels almost anti-American, anathema to the ethos of hard work and ambition that has propelled so many of the great minds and ideas that have changed the world.

But Blizzard makes a good case for the modest achievement over the astronomical. Founded in Irvine, California, by two UCLA students named Allen Adham and Mike Morhaime, the company quickly became well respected and popular thanks to a series of breakout franchises such as StarCraft and Diablo. But everything changed in 2004 with the launch of World of Warcraft (or WoW), which became an online-gaming juggernaut that made billions of dollars. I started writing Play Nice because I wanted to examine the challenging relationship between Blizzard and the parent corporation that would eventually call the shots. After conducting interviews with more than 300 current and former Blizzard staff members, I found a tragic story—a cautionary tale about how the pursuit of endless growth and iteration can devastate a company, no matter how legendary its status.

When Blizzard was founded, the video-game industry had not yet become the $200 billion business it is today. The Super Nintendo console hadn’t arrived in America, and Tetris was still one of the hottest things going. But Adham and Morhaime saw the unique appeal of the medium. With games, you didn’t just watch things happen—you controlled them.

Adham and Morhaime started the company in 1991 with a little seed money from their families, some college-level programming knowledge, and a handful of artists and engineers. Within a decade, their games were critical and commercial hits, selling millions of copies and winning over players worldwide. None of these titles invented a genre, exactly—the original Warcraft and StarCraft followed strategy games such as Dune II and Herzog Zwei, while Diablo shared some DNA with games such as Rogue and Ultima—but Blizzard had a working formula. The company’s games were streamlined and approachable, in contrast with more arcane competitors that, especially in the early days of PC gaming, seemed to demand that players reference dense manuals at every turn. Yet Blizzard games also maintained enough complexity to separate amateur and expert players. Most anyone could play these games, much as anyone could pick up a bat and smack a baseball—but there are Little Leaguers and then there is Shohei Ohtani.

Crucially, each game contained modes that allowed people to compete or cooperate with one another, first via local networks and then, beginning with 1995’s Warcraft II, through the internet. Blizzard’s success was tied to the rise of the web, and it even developed its own platform, Battle.net, which allowed customers to play online for free (an unusual move at the time). This was a bold approach back when fewer than 10 percent of Americans were regularly going online.

[From the July/August 2023 issue: ‘Hell welcomes all’]

The company’s bet paid off wildly with the release of WoW, an online game that had not just multiplayer matches but a persistent universe, allowing players to inhabit a vivid fantasy realm full of goblins and centaurs that existed whether or not they were playing. Unlike Blizzard’s previous games, WoW required players to pay a $15 monthly fee to offset server costs, so Adham and Morhaime didn’t know what to expect ahead of release. They thought they might be lucky to hit 1 million subscribers. Instead, they reached 5 million within a year. Employees popped champagne, and colorful sports cars began dotting the parking lot as WoW’s designers and programmers received bonus checks that outpaced their salaries.

The company hired armies of developers and customer-service reps to keep up with the unprecedented demand, swelling from hundreds to thousands of employees. Within a few years, Blizzard had moved to a sprawling new campus, and its parent company had merged with a competitor, Activision, to become Activision Blizzard, the largest publicly traded company in gaming. By 2010, WoW had more than 12 million subscribers.

No company can scale like this without making changes along the way. For WoW to thrive, it would have to siphon talent from elsewhere. Players expected a never-ending stream of updates, so Blizzard moved staff from every other team to imagine new monsters and dungeons. Other projects were delayed or canceled as a result. WoW’s unprecedented growth also tore away at Blizzard’s culture. Staff on Team 2, the development unit behind the game, would snark to colleagues in other departments that their game was paying everyone else’s salaries.

Innovating, as the company had done so successfully for years after its founding, seemed to become impossible. Blizzard attempted to create a new hit, Titan, with an all-star team of developers. Mismanagement and creative paralysis plagued the team, but most of all, it struggled under the pressure of trying to create a successor to one of the most lucrative games in history. Titan was stuffed full of so many ideas—the shooting and driving of Grand Theft Auto alongside the house-building of The Sims—that it wound up feeling unwieldy and incoherent. In the spring of 2013, after seven years of development and a cost of $80 million, Blizzard canceled the game.

To Bobby Kotick, the CEO of Blizzard’s corporate parent, this cancellation was a massive failure—not just a money drain but a wasted opportunity. Meanwhile, WoW was on the decline, losing subscribers every quarter, and an ambitious plan to release new expansions annually had not panned out. By 2016, the company had managed to release two more big hits: a digital card game called Hearthstone, based on the Warcraft universe, and a competitive shooting game, Overwatch, that was salvaged from Titan’s wreckage. But both projects were almost canceled along the way in favor of adding more staff to WoW. And they weren’t enough for Kotick, who watched Blizzard’s profits rise and fall every year and wanted to see more consistent growth. He pushed the company to hire a new chief financial officer, who hired a squad of M.B.A.s whose suggestions about boosting profits sounded a whole lot like demands. In the early days, Blizzard’s philosophy had been that if it made great games, the money would follow; now the logic was flipped.

In October 2018, Morhaime resigned, writing, “I’ve decided it’s time for someone else to lead Blizzard Entertainment.” The pressure from Activision would only increase in the following years, leading to the departures of so many veterans and leaders that the company stopped sending emails about them. Blizzard faced endless public-relations disasters, the cancellation of more projects, and frustration from Activision executives as its next two planned games, Diablo and Overwatch sequels, were delayed for years. In 2020, the company released its first bad game, a graphical remaster of an earlier Warcraft title, which was widely panned for its glitches and missing features.

Then things got even worse. In 2021, the state of California sued Activision Blizzard for sexual misconduct and discrimination in a complaint that largely focused on Blizzard. Current and former Blizzard staff spoke out on social media and with reporters about the harassment and discrimination they said they had faced. Blizzard replaced its president, fired or reprimanded dozens of employees, and even changed the names of characters in its games who had been named after alleged offenders. (The lawsuit was later settled for $54 million.) Microsoft agreed to purchase the disgraced game maker for $69 billion one year later.

Today, Blizzard is clearly not the company it once was. Although it retains millions of players and its games are successful, it has not released a new franchise in nearly a decade, and it is still reckoning with the reputational and institutional damage of the past few years. Many factors played a role, but you can draw a straight line from Blizzard’s present-day woes all the way back to the billions of dollars generated by WoW. If not for that sudden success and the attempts to supercharge growth, Blizzard would be a very different company today—perhaps one following a steadier, more sustainable path.

[Read: The quiet revolution of Animal Crossing]

Other video-game makers have run into similar problems. Epic Games, once known for a variety of games and technological innovations, released Fortnite in 2017 and watched it turn into a cultural phenomenon; Epic grew exponentially and abandoned most other projects as that game exploded. Rockstar, the company behind Grand Theft Auto, has not released a new entry in the series since 2013, largely because of the billions of dollars generated by the previous game, which has sold some 200 million copies, and its online component, which has demanded extensive resources. The independent makers of smash hits such as Hollow Knight and Stardew Valley have struggled to deliver successors in a timely fashion, undoubtedly at least in part because of the creative pressures of surpassing art that millions of people loved.

Not everyone plays video games. But many people have felt the effects of enormous success changing something they once cherished, be it a rock band watering down its music to appeal to larger audiences or a search engine embracing AI garbage to appeal to insatiable investors. Why dedicate your resources to incubating new products when the old one makes so much money? Creative people often find themselves hoping for that one big hit to propel them on a course to greatness, but getting there can also mean losing your soul along the way. As one former Blizzard designer told me: “When millions turn into billions, everything changes.”

Bankrupt EV startup Fisker left its abandoned headquarters in 'complete disarray'

Quartz

qz.com › fisker-bankruptcy-evs-headquarters-lawsuit-1851667041

Fisker can’t seem to do anything right, and that includes closing up shop. Apparently, the La Palma, California, headquarters of the now-dead automaker was abandoned and left in “complete disarray,” with full-size clay models, automotive equipment, EV batteries, and hazardous waste left behind.

America Is Lying to Itself About the Cost of Disasters

The Atlantic

www.theatlantic.com › science › archive › 2024 › 10 › hurricane-helene-cost-disasters › 680168

The United States is trapped in a cycle of disasters bigger than the ones our systems were built for. Before Hurricane Helene made landfall late last month, FEMA was already running short on funds; now, if another hurricane hits, the agency will run out altogether, the Homeland Security secretary, Alejandro Mayorkas, told reporters on Wednesday. At the same time, the Biden administration has announced that the federal government will fully reimburse local expenses to fix hurricane damage in several of the worst-affected states.

This mismatch, between catastrophes the government has budgeted for and the actual toll of overlapping or supersize disasters, keeps happening—after Hurricane Harvey, Hurricane Maria, Hurricane Florence. Almost every year now, FEMA is hitting the same limits, Carlos Martín, who studies disaster mitigation and recovery for the Brookings Institution, told me. Disaster budgets are calculated based on past events, but “that’s just not going to be adequate” as events grow more frequent and intense. Over time, the U.S. has been spending more and more money on disasters in an ad hoc way, outside its main disaster budget, according to Jeffrey Schlegelmilch, the director of the National Center for Disaster Preparedness at Columbia Climate School.

Each time, the country manages to scrape by, finding more money to help people who need it. (And FEMA does have money for immediate Helene response.) But each time, when funds get too low, the agency winds up putting its other relief work on hold in favor of lifesaving measures, which can slow down recovery and leave places more vulnerable when the next storm hits. In theory, the U.S. could keep doing that, even as costs keep growing, until at some point, these fixes become either unsustainable or so normalized as to be de facto policy. But it’s a punishing cycle that leaves communities scrambling to react to ever more dramatic events, instead of getting ahead of them.

The U.S. is facing a growing number of billion-dollar disasters, fueled both by climate change and by increased development in high-risk places. Helene alone could cost up to $34 billion, Moody’s Analytics estimated. Plus, the country is simply declaring more disasters over time, in part because of “shifting political expectations surrounding the federal role in relief and recovery,” according to an analysis by the Brookings Institution.

Meanwhile, costs of these disasters are likely to balloon further because of gaps in insurance. In places such as California, Louisiana, and Florida, insurers are pulling out or raising premiums so high that people can’t afford them, because their business model cannot support the current risks posed by more frequent or intense disasters. So states and the federal government are already taking on greater risks as insurers of last resort. The National Flood Insurance Program, for instance, writes more than 95 percent of the residential flood policies in the United States, according to an estimate from the University of Pennsylvania. But the people who hold those policies are almost all along the coasts, in specially designated flood zones. Inland flooding like what Helene brought doesn’t necessarily conform to those hazard maps; less than 1 percent of the homeowners in Buncombe County, North Carolina, where the city of Asheville was badly hit, had flood insurance.

For Helene-affected areas, after the immediate lifesaving operations are done, this is the question that most haunts Craig Fugate, the FEMA administrator under President Barack Obama: “How do you rebuild or provide housing for all those folks?” The Stafford Act, the legislation that governs U.S. disaster response, was written with the idea that most people would use insurance to cover their losses; it was not built for the current reality of mass damage to essentially uninsured homes, he told me. “The insurance model is no longer working, and the FEMA programs are not designed to fill those gaps,” Fugate said.

Fugate would like to see major investments in preparing homes and infrastructure to withstand disasters more gracefully. This is a common refrain among the people who look most closely at these problems: Earlier this week, another former FEMA administrator, Brock Long, told my colleague David A. Graham that the country should be rewarding communities for smarter land-use planning, implementing new building codes, and working with insurance companies “to properly insure their infrastructure.” They keep hitting this note for good reason. A study by the U.S. Chamber of Commerce found that every dollar of disaster preparedness saves communities $13 in damages, cleanup costs, and economic impacts. But since 2018, the government has set aside just 6 percent of its total post-disaster grant spending for pre-disaster mitigation.

That actually counts as a major increase in federal funding for resilience, Fugate told me, but it’s still nothing compared with the trillions of dollars needed to protect infrastructure from current risk. Disaster costs are only going to keep growing unless the country invests in rebuilding its infrastructure for the future. Martín put it to me like this: “If I were to have a heart attack, heaven forbid, and I survived it, I would say, Okay, I’m going to start eating better. I’m going to start exercising. I’m going to do all the things to make sure it doesn’t happen again.” The country keeps sustaining shocks to its system, and they won’t stop without that kind of work.

But some of these measures, such as adopting stronger building codes, tend to be unpopular with the states that hold the authority to change them. “There is a sort of quiet tension between states and the federal government in terms of how to do this,” Schlegelmilch said. The way things work right now, states and local governments would likely end up shouldering more of the cost of preparing for disasters. But they know the federal government will help fund recovery.

Plus, spending money on disaster recovery helps elected officials win votes in the next election. “The amount of funding you bring in has a very strong correlation to votes—how many you get, how many you lose,” Schlegelmilch said. But the same cannot be said for preparedness, which has virtually no correlation with votes. Nonprofits working on disasters face a similar problem. Schlegelmilch told me that some have websites that they keep dark, and then fill in “like a Mad Libs” when disasters inevitably hit. “Insert the disaster name here, insert a photo here, and then they’re up and ready to go, in terms of fundraising, because that’s when people give.” That is natural enough: People want to help people who are obviously in distress. It’s more abstract to imagine helping before any danger arrives, even if that would be more effective.

None of these dynamics are going away, and Schlegelmilch thinks changing them could mean rethinking federal emergency management altogether, “the way we reimagined homeland security after 9/11,” he said. He counts as many as 90 disaster-assistance programs across as many as 20 different agencies; a reorganization into a central disaster department would at least streamline these. “I say this knowing full well that the creation of the Department of Homeland Security was a mess,” he told me. But, he added, “We have to get ahead of this with a greater investment in preparedness and resilience. And greater efficiency and coordination.”

Fugate’s expectations are more pragmatic. “Have you ever seen a committee chairman in Congress willingly give up their program areas?” he asked. (Notably, even after DHS was created, its first secretary, Tom Ridge, had to navigate 88 congressional committees and subcommittees that took an interest in the department’s work.) He would like to see the U.S. establish a National Disaster Safety Board, similar to the National Transportation Safety Board—an organization funded by Congress, and separate from any executive agency—that would assess storm responses and make recommendations.

But he isn’t sure the country has gone through enough yet to fundamentally change this cycle of expensive, painful recoveries. “Every time I think there’s some event where you go, Okay, we’re going to come to our senses, we seem to cope enough that we never get to that tipping point,” he said. Some catastrophic failures—Hurricane Katrina, for example—have changed disaster policy. But Americans have yet to change our collective mind about preparing for disaster adequately. People still can’t even agree about climate change, Fugate noted. “I mean, you keep thinking we’re going to get one of these storms, that we’re going to hit the tipping point and everybody’s going to go, Yeah, we got a problem.” So far, at least, we haven’t.