The Moral Matter of Childbearing

The Atlantic

In December 1941, Etty Hillesum, a young Jewish woman living in Amsterdam, found herself unexpectedly pregnant. Hers was not a wanted pregnancy; we know from her diaries that she had never desired children, and had even considered a hysterectomy “in a rash and pleasure-loving moment.” Hillesum wanted above all to be a writer. Like many women before (and after) her, Hillesum self-managed her abortion; she mentions swallowing “twenty quinine pills” and assaulting herself with “hot water and blood-curdling instruments.” She left behind an account not just of her methods, but of her reasoning. “All I want is to keep someone out of this miserable world. I shall leave you in a state of unbornness, rudimentary being that you are, and you ought to be grateful to me. I almost feel a little tenderness for you,” she wrote. Hillesum was aware of the dire political circumstances around her, but her rationale was entirely personal. As she explained to the entity growing within her, her “tainted family” was “riddled with hereditary disease.” She swore that “no such unhappy human being would ever spring from my womb.”

Eighty-three years later, the Dutch philosopher Mara van der Lugt looks to Hillesum in contemplating a central question she believes that everyone must attempt to answer for themselves: that of whether or not to have children. In her new book, Begetting: What Does It Mean to Create a Child?, van der Lugt locates in Hillesum’s words no less than “the beginning of an ethics of creation,” an earnest wrestling with the act of bringing a new person into the world. She argues that childbearing is too often framed as a matter of desire and capacity—wanting or not wanting children, being able or unable to have them—when it should be a moral one. Procreation, she proposes, is a “problem—a personal, ethical and philosophical problem, especially in a secular age.” Perhaps, she ventures, it is “the greatest philosophical problem of our time.”

Asking such a question in an era when two-thirds of the global population live in places with fertility rates below replacement level may seem counterintuitive (and to pronatalist policy makers, downright counterproductive). Clearly, many people of reproductive age have decided against parenthood, even though it is still the far more common path. (Decades after contraception was legalized for unmarried people in the U.S., more than 84 percent of women in their 40s had given birth.) But van der Lugt is less interested in the outcomes, or even in the reasons people give for having or not having children, than in the question itself. At the core of her argument are two facts: First, a person cannot consent to being born; second, there is a high likelihood they will experience at least some suffering in their lifetime. As incontrovertible as these assertions are, I’ve rarely heard people outside of environmentalist circles talk about their hypothetical children in these terms.

These two facts, van der Lugt maintains, should be sufficient to trouble common assumptions about begetting—chief among them the notion that having children is inherently good. She wants her readers to reconsider the language people use about childbearing, which usually revolves around choice or preferences. Instead, she argues, begetting “should be seen as an act of creation, a cosmic intervention, something great, and wondrous—and terrible”: Hardly something one should undertake without pausing to examine why.

In her 20s, van der Lugt looked around her peer group and saw people becoming parents without what appeared to be much consideration, sometimes, “seemingly, just for fun.” One day, at a restaurant in Rotterdam, a friend she calls Sylvia tells her, “I actually believe having children is immoral.” Sylvia reasons that because “life always contains some suffering”—ordinary or severe mental or physical illness, emotional pain, and all sorts of other potential harms—bringing a child into the world inevitably adds to that misery. Van der Lugt is shocked, and unconvinced by Sylvia’s argument. The two begin an ongoing debate about the morality of childbearing, which is eventually joined by a third friend. These discussions spur van der Lugt to reexamine her long-held assumptions, a process that forms the basis of the book.

[Read: Why are women freezing their eggs? Look to the men.]

Van der Lugt draws on a wide and eclectic mix of sources as she builds her arguments. Among them: Lord Byron’s Cain: A Mystery, for its explicit connection of “the problem of suffering and evil” to procreation, and Hanya Yanagihara’s novel A Little Life, in which one character asserts that being a friend is enough to make a meaningful existence. Insights from popular media such as The West Wing and The Hunger Games are put in conversation with the work of philosophers including Arthur Schopenhauer, Friedrich Nietzsche, Michael Sandel, and the early ecologist Peter Wessel Zapffe.  

She begins by examining the ideas of several antinatalist philosophers. Antinatalists come in many stripes, ranging from those who believe that humans threaten the well-being of nonhuman animals and the environment to some who are simply misanthropic; the most worthwhile of these arguments, van der Lugt believes, are the ones that are grounded in concern for the welfare of fellow people. She engages extensively with the controversial South African philosopher David Benatar, who wrote in his 2006 book, Better Never to Have Been: The Harm of Coming Into Existence, that “so long as a life contains even the smallest quantity of bad, coming into existence is a harm.” This idea carries with it, in Benatar’s view, an obligation not to procreate; the duty to avoid harm far outweighs the possibility of bestowing a benefit, especially on someone whose consent cannot be obtained. (The logical conclusion of this view is eventual human extinction.) Benatar dismisses the notion of life being good and worth living as the product of the human tendency to hold more tightly to our positive experiences than negative or painful ones. But surely, as van der Lugt counters, “we are an authority on this, the value of our own lives”?

Still, the possibility of suffering does make any act of procreation a gamble with someone else’s life, irrespective of how valuable, good, or even sacred we deem our own lives, or human life in general. So how do we apply this bleak calculus to our individual choices? One’s intuitive response might be “to distinguish mere possibility from probability.” Most people, van der Lugt continues, likely believe, at least in the abstract, that we shouldn’t create people “who will most probably lead miserable lives,” such as a child with a hereditary disease that will cause them immense physical pain and an early death. But they probably wouldn’t argue that we “have a duty to avoid creating people who might just possibly lead miserable lives.” She is careful to note that making such a judgment on behalf of others is a dicey prospect, one reason she is unconvinced by some people’s assertion that life is, on net, bad. The late disability-rights activist Harriet McBryde Johnson, for instance, asserted that the “presence or absence of a disability doesn’t predict quality of life,” in response to arguments like those of the philosopher Peter Singer, who has said that parents should have the option to euthanize disabled babies if they judge that their infant’s life will be “so miserable as not to be worth living.”

Of course, this question of possibility versus probability falls unevenly on the shoulders of different groups. “Any child you bring into existence could be assaulted, raped, tortured, or murdered,” writes Benatar. “It could be sent to war. It could be kidnapped, abducted, imprisoned, or executed.” Well, yes. But in a profoundly unequal society, some people are, statistically, far more likely to suffer the sorts of harms that Benatar mentions. We know that Black Americans are about five times more likely to be incarcerated in state prisons than white Americans. We know that in the U.S., women are seven times more likely to be rape victims than men. We know that the children of poor parents are far more likely to end up poor themselves.

Van der Lugt’s book does not engage enough with how we might figure these realities into discussions on begetting, or what the implications of doing so would be. Although she is clear that moral debates about childbirth should be kept separate from legal or policy guidelines, we have long lived in a society that regulates birth—either through racist and classist messages about who should and shouldn’t reproduce, or through legislation, such as the current broad restrictions on abortion in the United States. The Buck v. Bell decision of 1927 authorized sterilization for “imbeciles,” and in 1983 the Milwaukee legislature passed a bill that made artificially inseminating welfare recipients medical malpractice. Then there’s our insurance regime, in which Medicaid beneficiaries can generally get contraception but not fertility care. “Insurers pay for the poor to get birth control and for the rich to get IVF,” the historian Laura Briggs has written, a system underpinned by reasoning she calls “precisely eugenic.” If the logical end point of certain antinatalist arguments is that groups bearing the burden of living in an unjust society must subject their family planning to additional moral scrutiny, perhaps something is wrong with the premise.

[Read: Why parents struggle so much in the world’s richest country]

Probability and possibility come into play again in van der Lugt’s treatment of the climate crisis, which has generated ambivalence about begetting; these hesitations have been perhaps most loudly voiced by people—white, middle-class, college-educated—whose reproduction has historically been encouraged. She acknowledges that the apparent inexorability of climate change makes the possibility of suffering far more of a certainty for many more people. “If there is anything we can be certain of, it is that the world is changing, and not for the better,” she writes. Yet to say that creating children is a uniquely vexed question today is to engage in what van der Lugt calls “temporal exceptionalism,” because life involves pain no matter what. Even if we were to solve climate change tomorrow, she points out, the concerns raised by the antinatalists—the potential harm and horror of human life—are still on the table. “When the question of climate has been answered, the question of begetting remains,” she writes.

Are there any good reasons to have children? Van der Lugt finds all of the most common ones wanting. Among the “worse reasons” she cites are “to remain ‘in-step’ with [one’s] peers,” to save a relationship, or out of fear of regret or missing out. Uncritically accepting “the Biological Narrative,” as she calls “the language of biology, of hormones, of physical urges,” demeans the procreative act. Giving little credence to the evolutionary drive to propagate the human species, she instead suggests that “we might do better to emphasise not the urge itself, but the ability consciously to act, or not to act, upon it.” Other stock answers on the “better” end of the spectrum, such as “happiness, fulfillment, meaningfulness,” are also deemed insufficient. In van der Lugt’s view, expecting a child to provide those things places too great a burden on the child. Even the most obvious reason, “love” (my instinctive answer), is dismissed as logically inadequate. “Even if it is possible to experience love for a non-existent child,” van der Lugt writes, “love alone cannot justify all things.” After all, she notes, when it comes to existing people, mere love (or what she says is more accurately termed “longing” in the case of a child one hasn’t yet met) is not an adequate reason to do anything to them without their consent.

If the question has no one simple answer, it is still, van der Lugt insists, vital to ask it, and to ask it in the correct way, using language that moves away from entitlement and desire (“having” or “wanting” children) and toward “a concept of fragility and accountability”—the idea that we are entrusted with children, responsible for them. Although many people speak of childbearing as “giving the gift of life,” van der Lugt argues that this unidirectional characterization is mistaken. “If life is ‘given’ at all, it is given both to the parents and to the child: neither is giver, but to both it is bestowed,” she writes.

Thus, perhaps, one possible approach to begetting is to begin with humility, combined with a deep appreciation for the fragility of existence. Van der Lugt’s model for this stance is once again Etty Hillesum. Writing in the Nazi transit camp of Westerbork, where she remained for several months before boarding a train to Poland, where she and her family were killed, Hillesum insists that “life is glorious and magnificent,” even as she bears witness to the misery around her. Her searching examination of her own existence left her full of gratitude, yet still did not compel her to give life to someone else, for how could she insist, or predict, that that person would meet the adversity she experienced with the same extraordinary grace? As van der Lugt writes, “The principle of gratitude and acceptance, according to which life is worth living ‘despite everything,’ is one that she applies firmly to herself, but only hesitatingly to others.”

Those who do choose to beget might also adopt this same humility. Bidding someone forth, conjuring a new person from a couple of cells, is an act of tremendous magnitude, one whose meaning is perhaps too great and abstract to grasp or articulate with any precision. Before undertaking it, we should commit to the same unsparing self-examination. This, in the end, is van der Lugt’s request of us: to pose the question of begetting to ourselves, and to answer it for only ourselves.

The Sadistic History of Reality Television

The Atlantic

More than a decade after watching it, I still get twitchy thinking about “White Bear,” an early episode of Black Mirror that stands as one of the most discomfiting installments of television I’ve seen. A woman (played by Lenora Crichlow) groggily wakes up in a strange house whose television sets are broadcasting the same mysterious symbol. When she goes outside, the people she encounters silently film her on their phones or menacingly wield shotguns and chainsaws. Eventually, trapped in a deserted building, the woman seizes a gun and shoots one of her tormentors, but the weapon surprises her by firing confetti instead of bullets. The walls around her suddenly swing open; she’s revealed to be the star of a sadistic live event devised to punish her repeatedly for a crime she once committed but can’t remember. “In case you haven’t guessed … you aren’t very popular,” the show’s host tells the terrified woman, as the audience roars its approval. “But I’ll tell you what you are, though. You’re famous.”

“White Bear” indelibly digs into a number of troublesome 21st-century media phenomena: a populace numbed into passive consumption of cruel spectacle, the fetishistic rituals of public shaming, the punitive nature of many “reality” shows. The episode’s grand reveal, a television staple by the time it premiered in 2013, is its own kind of punishment: The extravagant theatrics serve as a reminder that everything that’s happened to the woman has been a deliberate construction—a series of manipulations in service of other people’s entertainment.

The contrast between the aghast subject and the gleeful audience, clapping like seals, is almost too jarring to bear. And yet a version of this moment really happened, as seen about an hour into The Contestant, Hulu’s dumbfounding documentary about a late-’90s Japanese TV experiment. For 15 months, a wannabe comedian called Tomoaki Hamatsu (nicknamed “Nasubi,” or “eggplant,” in reference to the length of his head) has been confined, naked, to a single room filled with magazines, and tasked with surviving—and winning his way out, if he could hit a certain monetary target—by entering competitions to win prizes. The entire time, without his knowledge or consent, he’s also been broadcast on a variety show called Susunu! Denpa Shōnen.

Before he’s freed, Nasubi is blindfolded, dressed for travel, transported to a new location, and led into a small room that resembles the one he’s been living in. Wearily, accepting that he’s not being freed but merely moved, he takes off his clothes as if to return to his status quo. Then, the walls collapse around him to reveal the studio, the audience, the stage, the cameras. Confetti flutters through the air. Nasubi immediately grabs a pillow to conceal his genitals. “My house fell down,” he says, in shock. The audience cackles at his confusion. “Why are they laughing?” he asks. They laugh even harder.

Since The Contestant debuted earlier this month, reviews and responses have homed in on how outlandish its subject matter is, dubbing it a study of the “most evil reality show ever” and “a terrifying and bizarre true story.” The documentary focuses intently on Nasubi’s experience, contrasting his innocence and sweetness with the producer who tormented him, a Machiavellian trickster named Toshio Tsuchiya. Left unstudied, though, is the era the series emerged from. The late ’90s embodied an anything-goes age of television: In the United States, series such as Totally Hidden Video and Shocking Behavior Caught on Tape drew millions of viewers by humiliating people caught doing dastardly things on camera. But Tsuchiya explains that he had a more anthropological mission in mind. “We were trying to show the most basic primitive form of human being,” he tells The Contestant’s director. Nasubi was Tsuchiya’s grand human experiment.

The cruelty with which Nasubi was treated seems horrifying now, and outrageously unethical. Before he started winning contests, he got by on a handful of crackers fed to him by the producers, then fiber jelly (one of his first successful prizes), then dog food. His frame whittles down in front of our eyes. “If he hadn’t won rice, he would have died,” a producer says, casually. The question of why Nasubi didn’t just leave the room hangs in the air, urgent and mostly unexamined. “Staying put, not causing trouble is the safest option,” Nasubi explains in the documentary. “It’s a strange psychological state. You lose the will to escape.”

But the timing of his confinement also offers a clue about why he might have stayed. The year 1998, when the comedian was first confined, was a moment in flux, caught between the technological innovations that were rapidly changing mass culture and the historical atrocities of the 20th century. Enabled by the internet, lifecasters such as Jennifer Ringley were exposing their unfiltered lives online as a kind of immersive sociological experiment. Webcams allowed exhibitionists and curious early adopters to offer themselves up for observation as novel subjects in a human zoo. Even before the release of The Truman Show, which came out in the U.S. a few months after Nasubi was first put on camera, a handful of provocateur producers were brainstorming new formats for unscripted television, egged on by the uninhibited bravado and excess of ’90s media. These creators acted as all-seeing, all-knowing authorities whose word was absolute. And their subjects, not yet familiar with the “rules” of an emerging genre, often didn’t know what they were allowed to contest. Of Tsuchiya, Nasubi remembers, “It was almost like I was worshiping a god.”

In his manipulation of Nasubi, Tsuchiya was helping pioneer a new kind of art form, one that would lead to the voyeurism of 2000s series such as Big Brother and Survivor, not to mention more recent shows such as Married at First Sight and Love Is Blind. But the spectacle of Nasubi’s confinement also represented a hypothesis that had long preoccupied creators and psychologists alike, and that reality television has never really moved on from. If you manufacture absurd, monstrous situations with which to torment unwitting dupes, what will they do? What will we learn? And, most vital to the people in charge, how many viewers will be compelled to watch?

Some popular-culture historians consider the first reality show to be MTV’s The Real World, a 1992 series that deliberately provoked conflict by putting strangers together in an unfamiliar environment. Others cite PBS’s 1973 documentary series An American Family, which filmed a supposedly prototypical California household over several months, in a conceit that the French philosopher Jean Baudrillard called the “dissolution of TV in life, dissolution of life in TV.”

But the origins of what happened to Nasubi seem to lie most directly in a series that ran on and off from 1948 to 2014: Candid Camera. Its creator, Allen Funt, was a radio operator in the Army Signal Corps during World War II; while stationed in Oklahoma, he set up a “gripe booth” for soldiers to record their complaints about military service. Knowing they were being taped, the subjects held back, which led Funt to record people secretly in hopes of capturing more honest reactions. His first creative effort was The Candid Microphone, a radio show. The series put its subjects in perplexing situations to see how they’d respond: Funt gave strangers exploding cigarettes, asked a baker to make a “disgusting” birthday cake, and even chained his secretary to his desk and hired a locksmith to “free” her for her lunch break. “With the candid microphone, we are at the beginning of the Age of the Involuntary Amateur,” one critic wrote in 1947. “The possibilities are limitless; the prospect is horrifying.” Sure enough, a TV series soon followed.

For all that critic’s revulsion, Funt was earnest about the potentially revelatory power of his shows. He was seemingly influenced by two parallel trends. One was a sociological school of thought that was trying urgently to analyze human nature following a wave of real barbarities: the Holocaust, the bombing of Hiroshima and Nagasaki, Stalin’s great purges. The other was an interest in art that captured the contours of real life, an outgrowth of the late-19th-century naturalist movement. Émile Zola, one of its practitioners, argued in The Experimental Novel that fiction writers were essentially omnipotent forces dropping characters into realistic situations to consider how they might respond. Literature, he argued, was “a real experiment that a novelist makes on man.”

The invention of television, as the academic Tony E. Jackson has argued, offered a more literal and scientific medium within which creators could manipulate real human subjects. This was where Candid Camera came into play. Funt’s practical jokes—setting up a subject in an elevator in which every other person suddenly turns their back to him—tended to consider the nature of compliance, and what humans will go along with rather than be outliers. Candid Camera was considered so rich a work that Funt was asked to donate episodes to Cornell University’s psychology department for further study.

Funt was also highly influential to Stanley Milgram, a social psychologist who turned his Yale studies on conformity into a documentary titled Obedience. The Milgram experiment, conducted in 1961, asked members of the public to administer electric shocks—faked, unknown to them—to fellow subjects when ordered to do so by an authority figure. Inspired by the 1961 trial of the Nazi war criminal Adolf Eichmann, and the experience of his own family members who’d survived concentration camps, Milgram tweaked the Candid Camera model to more explicitly study how far people would follow orders before they objected. As the film professor Anna McCarthy has written, Milgram paid particular attention to the theatrical elements of his work. He even considered using recordings of humans screaming in real, rather than simulated, pain to maximize the authenticity of the subject’s experience. “It is possible that the kind of understanding of man I seek is an amalgam of science and art,” Milgram wrote in 1962. “It is sure to be rejected by the scientists as well as the artists, but for me it carries significance.”

This studied interest in human nature continued in PBS’s An American Family; its presentation of ordinary life up close, the anthropologist Margaret Mead once argued, was “as important for our time as were the invention of the drama and the novel for earlier generations—a new way for people to understand themselves.” Throughout the later decades of the 20th century, television was similarly fixated on exposure, although shock value quickly took priority over genuine curiosity and analysis. During the ’90s, on talk shows such as The Jerry Springer Show and Maury, people confessed their most damning secrets to anyone who cared to watch. Series including Cops and America’s Most Wanted offered a more lurid, voyeuristic look at crime and the darkness of human nature.

[Read: The paranoid style in American entertainment]

By the time Tsuchiya had the idea to confine a man to a single apartment to see whether he could survive the ordeal, the concept of humiliation-as-revelation was well established. “I told [Nasubi] that most of it would never be aired,” the producer explains in The Contestant. “When someone hears that, they stop paying attention to the camera. That’s when you can really capture a lot.” As an organizing principle for how to get the most interesting footage, it seems to stem right from Funt’s secret recordings of people in the 1940s. Tsuchiya appeared to be motivated by his desire to observe behavior that had never been seen before on film—“to capture something amazing … an aspect of humanity that only I, only this show, could capture.” And extremity, to him, was necessary, because it was the only way to provoke responses that would be new, and thus thrilling to witness.

The reality-show boom of the early 2000s was intimately informed by this same intention. When Big Brother debuted in Holland in 1999, it was broadly advertised as a social experiment in which audiences could observe contestants under constant surveillance like rats in a lab; the show was compared by one Dutch psychologist to the Stanford prison experiment. (Another called the show’s design “the wet dream of a psychological researcher.”) The 2002 British show The Experiment even directly imitated both the Stanford setup and Milgram’s work on obedience. But although such early series may have had honest intentions, their willingness to find dramatic fodder in moments of human calamity was exploited by a barrage of crueler series that would follow. The 2004 series There’s Something About Miriam had six men compete for the affections of a 21-year-old model from Mexico, who was revealed in the finale to be transgender—an obscene gotcha moment that mimics the structure of Candid Camera. Without a dramatic conclusion, a nonfiction series is just a filmed record of events. But with a last-act revelation, it’s a drama.

Contemporary audiences, blessedly, have a more informed understanding of ethics, of entrapment, and of the duty of care TV creators have to their subjects. In 2018, the British show Love Island spawned a national debate about gaslighting after one contestant was deemed to be manipulating another. There’s no question that what happened to Nasubi would trigger a mass outcry today. But reality TV is still built on the same ideological imperatives—the desire to see people set up in manifestly absurd scenarios for our entertainment. The Emmy-nominated 2023 series Jury Duty is essentially a kinder episode of Candid Camera extended into a whole season, and the internet creator known as MrBeast, the purveyor of ridiculous challenges and stunts, has the second most-subscribed channel on all of YouTube. What’s most remarkable about The Contestant now is how its subject managed to regain his faith in human nature, despite everything he endured. But the ultimate goal of so many contemporary shows is still largely the same as it was 25 years ago: to manufacture a novel kind of social conflict, sit back, and watch what happens.

Dutch government veers sharply right after four-party coalition deal

Euronews

When presenting the coalition agreement, PVV leader Geert Wilders said he was very much looking forward to the upcoming collaboration with NSC, BBB, and the VVD. He also said that “the sun will shine again in the Netherlands.”