What Happens When You’ve Been on Ozempic for 20 Years?

The Atlantic

In December 1921, Leonard Thompson was admitted to Toronto General Hospital so weak and emaciated that his father had to carry him inside. Thompson was barely a teenager, weighing all of 65 pounds, dying of diabetes. With so little to lose, he was an ideal candidate to be patient No. 1 for a trial of the pancreatic extract that would come to be called insulin.

The insulin did what today we know it can. “The boy became brighter, more active, looked better and said he felt stronger,” the team of Toronto researchers and physicians reported in March 1922 in The Canadian Medical Association Journal. The article documented their use of insulin on six more patients; it had seemingly reversed the disease in every case. As John Williams, a diabetes specialist in Rochester, New York, wrote of the first patient on whom he tried insulin later that year, “The restoration of this patient to his present state of health is an achievement difficult to record in temperate language. Certainly few recoveries from impending death more dramatic than this have ever been witnessed by a physician.”

Of all the wonder drugs in the history of medicine, insulin may be the closest parallel, in both function and purpose, to this century’s miracle of a metabolic drug: the GLP-1 agonist. Sold under now-familiar brand names including Ozempic, Wegovy, and Mounjaro, these new medications for diabetes and obesity have been hailed as a generational breakthrough that may one day stand with insulin therapy among “the greatest advances in the annals of chronic disease,” as The New Yorker put it in December.

But if that analogy is apt—and the correspondences are many—then a more complicated legacy for GLP-1 drugs could be in the works. Insulin, for its part, may have changed the world of medicine, but it also brought along a raft of profound, unintended consequences. By 1950, the new therapy had tripled the number of years that patients at a major diabetes center could expect to live after diagnosis. It also kept those patients alive long enough for them to experience a wave of long-term complications. Leonard Thompson would die at 27 of pneumonia. Other young men and women who shared his illness also died far too young, their veins and arteries ravaged by the disease, and perhaps—there was no way to tell—by the insulin therapy and associated dietary protocols that had kept them alive in the first place.

In the decades that followed, diabetes, once a rare disorder, would become so common that entire drug-store aisles are now dedicated to its treatment-related paraphernalia. Roughly one in 10 Americans is afflicted. And despite a remarkable, ever-expanding armamentarium of drug therapies and medical devices, the disease—whether in its type 1 or type 2 form—is still considered chronic and progressive. Patients live far longer than ever before, yet their condition is still anticipated to get worse with time, requiring ever more aggressive therapies to keep its harms in check. One in every seven health dollars is now spent on diabetes treatment, amounting to $800 million every day.

The advent of insulin therapy also changed—I would even say distorted—the related medical science. In my latest book, Rethinking Diabetes, I document how clinical investigators in the 1920s abruptly shifted their focus from trying to understand the relationship between diet and disease to that between drug and disease. Physicians who had been treating diabetes with either fat-rich diets absent carbohydrates (which had been the accepted standard of care in both the U.S. and Europe) or very low-calorie “starvation” diets came to rely on insulin instead. Physicians would still insist that diet is the cornerstone of therapy, but only as an adjunct to the insulin therapy and in the expectation that any dietary advice they gave to patients would be ignored.

With the sudden rise of GLP-1 drugs in this decade, I worry that a similar set of transformations could occur. Dietary therapy for obesity and diabetes may be sidelined in favor of powerful pharmaceuticals—with little understanding of how the new drugs work and what they really tell us about the mechanisms of disease. And all of that may continue despite the fact that the long-term risks of taking the drugs remain uncertain.

“The ebullience surrounding GLP-1 agonists is tinged with uncertainty and even some foreboding,” Science reported in December, in its article declaring these obesity treatments the journal’s Breakthrough of the Year. “Like virtually all drugs, these blockbusters come with side effects and unknowns.” Yet given the GLP-1 agonists’ astounding popularity, such cautionary notes tend to sound like lip service. After all, the FDA has deemed these drugs safe for use, and doctors have been prescribing products in this class to diabetes patients for 20 years with little evidence of long-term harm.

Yet the GLP-1 agonists’ side effects have been studied carefully only out to seven years of use, and that was in a group of patients on exenatide—an early, far less potent product in this class. The study offered no follow-up on the many participants in that trial who had discontinued use. Other long-term studies have followed patients on the drugs for at least as many years, but they’ve sought (and failed to find) only very specific harms, such as pancreatic cancer and breast cancer. In the meantime, a 2023 survey found that more than two-thirds of patients prescribed the newer GLP-1 agonists for weight loss had stopped using them within a year. Why did they quit? What happened to them when they did?

The stories of Leonard Thompson and the many diabetes patients on insulin therapy who came after may be taken as a warning. The GLP-1 drugs have many traits in common with insulin. Both treatments became very popular very quickly. Within years of its discovery, insulin was being prescribed for essentially every diabetic patient whose physician could obtain the drug. Both insulin and GLP-1 agonists were originally developed as injectable treatments to control blood sugar. Both affect appetite and satiety, and both can have remarkable effects on body weight and composition. The GLP-1s, like insulin, treat only the symptoms of the disorders for which they are prescribed. Hence, the benefits of GLP-1s, like those of insulin, are sustained only with continued use.

The two treatments are also similar in that they work, directly or indirectly, by manipulating an unimaginably complex physiological system. When present in their natural state—as insulin secreted from the pancreas, or GLP-1 secreted from the gut (and perhaps the brain)—they’re both involved in the regulation of fuel metabolism and storage, what is technically known as fuel partitioning. This system tells our bodies what to do with the macronutrients (protein, fat, and carbohydrates) in the foods we eat.

Chris Feudtner, a pediatrician, medical historian, and medical ethicist at the University of Pennsylvania, has described this hormonal regulation of fuel partitioning as that of a “Council of Food Utilization.” Organs communicate with one another “via the language of hormones,” he wrote in Bittersweet, his history of the early years of insulin therapy and the transformation of type 1 diabetes from an acute to a chronic disease. “The rest of the body’s tissues listen to this ongoing discussion and react to the overall pattern of hormonal messages. The food is then used—for burning, growing, converting, storing, or retrieving.” Perturb that harmonious discourse, and the whole physiological ensemble of the human body reverberates with corrections and counter-corrections.

This is why the long-term consequences of using these drugs can be so difficult to fathom. Insulin therapy, for instance, did not just lower patients’ blood sugar; it restored their weight and then made them fatter still (even as it inhibited the voracious hunger that was a symptom of uncontrolled diabetes). Insulin therapy may also be responsible, at least in part, for diabetic complications—atherosclerosis and high blood pressure, for instance. That possibility has been acknowledged in textbooks and journal articles but never settled as a scientific matter.

With the discovery of insulin and its remarkable efficacy for treating type 1 diabetes, diabetologists came to embrace a therapeutic philosophy that is still ascendant today: Treat the immediate symptoms of the disease with drug therapy and assume that whatever the future complications, they can be treated by other drug or surgical therapies. Patients with diabetes who develop atherosclerosis may extend their lives with stents; those with hypertension may go on blood-pressure-lowering medications.

A similar pattern could emerge for people taking GLP-1s. (We see it already in the prospect of drug therapies for GLP-1-related muscle loss.) But the many clinical trials of the new obesity treatments do not and cannot look at what might happen over a decade or more of steady use, or what might happen if the injections must be discontinued after that long. We take for granted that if serious problems do emerge, far down that distant road, or if the drugs have to be discontinued because of side effects, newer treatments will be available to solve the problems or take over the job of weight maintenance.

In the meantime, young patients who stick with treatment can expect to be on their GLP-1s for half a century. What might happen during those decades—and what might happen if and when they have to discontinue use—is currently unknowable, although, at the risk of sounding ominous, we will find out.

Pregnancy is another scenario that should generate serious questions. A recently published study found no elevated risk of birth defects among women taking GLP-1 agonists for diabetes right before or during early pregnancy, as compared with those taking insulin, but birth defects are just one obvious and easily observable effect of a drug taken during pregnancy. Children of a mother with diabetes or obesity tend to be born larger and have a higher risk of developing obesity or diabetes themselves later in life. The use of GLP-1 agonists during pregnancy may reduce—or exacerbate—that risk. Should the drugs be discontinued before or during pregnancy, any sudden weight gain (or regain) by the mother could similarly affect the health of her child. The consequences cannot be foreseen and might not manifest themselves until these children reach their adult years.

The rise of GLP-1 drugs may also distort our understanding of obesity itself, in much the way that insulin therapy distorted the thinking in diabetes research. With insulin’s discovery, physicians assumed that all diabetes was an insulin-deficiency disorder, even though this is true today for only 5 to 10 percent of diabetic patients, those with type 1. It took until the 1960s for specialists to accept that type 2 diabetes was a very different disorder—a physiological resistance to insulin, inducing the pancreas to respond by secreting too much of the hormone rather than not enough. And although the prognosis today for a newly diagnosed patient with type 2 diabetes is better than ever, physicians have yet to establish whether the progression and long-term complications of the disease are truly inevitable, or whether they might be, in fact, a consequence of the insulin and other drug therapies that are used to control blood sugar, and perhaps even of the diets that patients are encouraged to eat to accommodate these drug therapies.

Already, assumptions are being made about the mechanisms of GLP-1 agonists without the rigorous testing necessary to assess their validity. They’re broadly understood to work by inhibiting hunger and slowing the passage of food from the stomach—effects that sound benign, as if the drugs were little more than pharmacological versions of a fiber-rich diet. But changes to a patient’s appetite and rate of gastric emptying only happen to be easy to observe and study; they do not necessarily reflect the drugs’ most important or direct actions in the body.

When I spoke with Chris Feudtner about these issues, we returned repeatedly to the concept that Donald Rumsfeld captured so well with his framing of situational uncertainty: the known unknowns and the unknown unknowns. “This isn’t a you-take-it-once-and-then-you’re-done drug,” Feudtner said. “This is a new lifestyle, a new maintenance. We have to look down the road a bit with our patients to help them think through some of the future consequences.”

Patients, understandably, may have little time for a lecture on all that we don’t know about these drugs. Obesity itself comes with so many burdens—health-related, psychological, and social—that deciding, after a lifetime of struggle, to take these drugs in spite of potential harms can always seem a reasonable choice. History tells us, though, that physicians and their patients should be wary as they try to balance known benefits against a future, however distant, of unknown risk.

Chocolate Might Never Be the Same

Good chocolate, I’ve come to learn, should taste richly of cocoa—a balanced blend of bitter and sweet, with notes of fruit, nuts, and spice. My favorite chocolate treat is nothing like that. It’s the Cadbury Creme Egg, an ovoid milk-chocolate shell enveloping a syrupy fondant center. To this day, I look forward to its yearly return in the weeks leading up to Easter.

Most popular chocolate is like this: milky, sugary, and light on actual cocoa. Lots of sugary sweets contain so little of the stuff that they are minimally chocolate. M&M’s, Snickers bars, and Hershey’s Kisses aren’t staples of American diets because they are the best—rather, they satisfy our desire for chocolate while costing a fraction of a jet-black bar made from single-origin cocoa.

But chocolate isn’t as economical as it once was. By one estimate, retail prices for chocolate rose by 10 percent just last year. This is now the third consecutive year of poor cocoa harvests in West Africa, where most of the world’s cocoa is grown. Late last month, amid fears of a worsening shortage, cocoa prices soared past $10,000 per metric ton, up from about $4,000 in January. To shoulder the costs, chocolate companies are gearing up to further hike the price of their treats in the coming months. Prices might not fall back down from there. Chocolate as we know it may never be the same.

Chocolate has had “mounting problems for years,” Sophia Carodenuto, an environmental scientist at the University of Victoria, in Canada, told me. The farmers who grow cocoa are chronically underpaid. And cocoa trees—the fruits of which contain beans that are fermented and roasted to create chocolate—are tough to grow, and thrive only in certain conditions. A decade ago, chocolate giants warned that the cocoa supply, already facing environmental challenges, would soon be unable to keep up with rising demand. “But what we’re seeing now is a little bit of an explosion” in the crop’s struggles, Carodenuto said.

The simplest explanation for the ongoing cocoa shortage is extreme weather, heightened by climate change. Exceptionally hot and dry conditions in West Africa, partly driven by the current El Niño event, have led to reduced yields. Heavier-than-usual rains have created ideal conditions for black pod disease, which causes cocoa pods to rot on the branch. All of this has taken place while swollen shoot, a virus fatal to cocoa plants, is spreading more rapidly in cocoa-growing regions. Global cocoa production is expected to fall by nearly 11 percent this season, Reuters reported.

In the past, when supply fell and prices rose, farmers were motivated to plant more cocoa, which led to a boost in supply five years later, when the new trees began to bear fruit, says Nicko Debenham, the former head of sustainability at the chocolate giant Barry Callebaut. Already, some West African farmers are racing to plant new trees. But they may not be able to plant their way out of future cocoa shortages. “Climate change is definitely a challenge” because it will make rainfall less predictable, which is a problem for moisture-sensitive cocoa trees, Debenham told me. Furthermore, rising temperatures and more frequent droughts will render some cocoa-growing regions unusable.

Climate change isn’t the only problem. Cocoa crops in Côte d’Ivoire and Ghana, where 60 percent of the world’s cocoa comes from, may already be in “structural decline,” Debenham said, citing disease, aging cocoa trees, and illegal gold mining on farmland. More important, the farmers who tend to the crops can’t afford to invest in their farms to increase their yields and bolster resilience against climate change. The bleak outlook for cocoa farmers threatens to doom cocoa-growing in the region altogether. In Ghana, the average cocoa farmer is close to 50 years old. A new generation of farmers is needed to maintain the cocoa supply, but young people may simply walk away from the industry.

No matter how you look at it, the future of cocoa doesn’t look good. With less cocoa available all around, chocolate may become more expensive. For high-end chocolate brands, whose products use lots of cocoa, the recent price hikes are reportedly an existential threat. Barry Callebaut has predicted that the companies it supplies with cocoa will raise chocolate prices by up to 8 percent in the next few months. Because companies buy beans in advance, it will take some time before retail prices reflect the current shortage, so further increases are likely.

When cocoa prices go up, companies start reducing bar sizes and adding substitutes such as fruit and nuts to reduce the cocoa content. “They’ll try and use every trick in the book to keep the consumption levels up,” Debenham said. My beloved Cadbury Creme Egg, for example, is markedly smaller than it used to be. Now, as Bloomberg has noted, companies are promoting candies that contain less chocolate, such as the new Reese’s caramel Big Cup from Hershey’s, or treats that have no chocolate at all, such as gummies.

Cocoa shortages will affect all kinds of chocolate, but mass-produced sweets may change beyond just the prices. The erratic temperatures brought about by climate change could change the flavor of beans, depending on where they are grown. Variability is a concern for commercial chocolate makers, who need to maintain consistent flavors across their products. They may counteract discrepancies among different batches of beans by combining them, then roasting them at a higher temperature, Johnny Drain, a food-science expert and co-founder of the cocoa-free chocolate brand Win-Win, told me. Doing so can eliminate unwanted qualities, but it may also remove desirable ones, resulting in a less interesting flavor overall. Even if an M&M contains a minimal amount of actual chocolate, a longtime consumer could notice a change in flavor.

Commercial chocolate makers may also tweak their recipes to amp up or mimic chocolate flavors without using more cocoa. These candies contain relatively little cacao to begin with; only 10 percent of a product’s weight must be cocoa in order to qualify as chocolate in the eyes of the FDA. Some already use chocolatelike ingredients such as cocoa-butter equivalents, cocoa extenders, and artificial cocoa flavors. In some cases, the swaps are noticeable: Cadbury’s use of an emulsifying filler to reduce the amount of cocoa butter in its Caramello bars diminished “the rich creaminess of the original,” Bon Appétit noted in 2016.

Newer chocolate alternatives may provide more satisfying counterfeits. Win-Win isn’t the only start-up producing cocoa-free chocolate, which is similar in concept to animal-free meat. The company uses plant ingredients to emulate the flavor and texture of chocolate—as do its competitors Foreverland and Voyage Foods. Another firm, California Cultured, grows actual cacao cells in giant steel tanks.

[Read: Silicon Valley is coming for your chocolate]

Cocoa-free chocolate is currently far more expensive than chocolate, but Drain hopes it will eventually become “cheaper than the cheapest chocolate.” At that point, he said, it’ll likely find its niche at the lowest end of the market, where chocolate plays a supporting role rather than a starring one—think chocolate-coated ice creams and granola bars with chocolate chips. Already, some of these products are labeled as having “chocolate flavor” or being “chocolatey” instead of “chocolate,” which has a strict FDA definition.

Yet change is always tough to swallow. So much of the appeal of cheap chocolate is that it’s always been there—whether in the form of a Hershey’s Kiss, Oreo cookies, a bowl of Cocoa Puffs, or the shell of a fondant-filled egg. “You grow up with those tastes. It’s hard to fathom how pervasive it has been,” Carodenuto said. Chocolate lovers have weathered minor tweaks to these candies over the years, but the shifts happening today may be less tolerable—or at the very least more noticeable. The change that has been hardest to ignore is that cheap chocolate is no longer that cheap.