Is FBI Director James Comey “fiercely independent,” as President Barack Obama described him three years ago when nominating him for his current job, or fiercely self-serving? That’s the question that looms large in the wake of Comey’s unprecedented decision to wade into a presidential race eleven days before election day by announcing that he’d reopened an investigation into Democratic candidate Hillary Clinton concerning her private email server—an investigation that he had brought to a close in a major press conference in July. The announcement, which came in a very short letter to members of Congress Friday, violated two long-established rules governing criminal investigations. But apparently Comey believes that being independent means not having to follow the rules.
The first rule Comey disregarded requires law-enforcement officials to avoid public comment on ongoing investigations. That rule is designed to prevent prosecutors and law enforcement officials from using their authority to cast innuendo on the reputations of citizens; their job is to conduct investigations, not offer public character assessments. Comey’s original sin in this regard was his highly unusual press conference four months ago, in which, rather than simply noting that he had found no evidence of Clinton violating any laws, he went on to castigate her as “extremely careless.” The last I checked, carelessness is not a federal crime, and therefore, as FBI director, Comey had no business offering his opinion. The authority of the nation’s top law-enforcement investigator comes with responsibility to weigh one’s words and actions carefully; if anything, it was Comey who was “extremely careless.”
Having violated the rule against public comment once, Comey then doubled down, using his own prior violation to justify Friday’s repeat infraction. In an email to his staff, Comey defended his extraordinary step of publicly announcing the resumption of a closed criminal investigation by stating that he had promised to keep Congress informed when he testified in July about the closure of the investigation. The job of the FBI, however, is not to offer Congress public progress reports on criminal investigations, but to conduct those investigations in confidence unless and until they warrant an indictment or dismissal. And his “report,” such as it was, was misleading. What prompted Friday’s announcement was the discovery of emails from Huma Abedin, Clinton’s deputy chief of staff at the State Department, on the computer of Abedin’s now-estranged husband, Anthony Weiner, who is under a separate investigation for allegedly “sexting” with a minor. But Comey’s letter to Congress failed to note that neither Comey nor his underlings had actually read any of the newly discovered emails, and therefore had no basis for believing that they constituted evidence of wrongdoing.
The second rule Comey violated is even more fundamental. The rule, which has been endorsed by attorneys general of both parties for decades, requires law-enforcement officers to avoid filing even completed indictments against candidates for political office within sixty days of an election. It applies to all elections, from those for local school boards to those for our nation’s highest office. The reason for this is that there is too great a risk that the power of the criminal process could interfere in an election, even where—as is decidedly not the case here—prosecutors have sufficient information to indicate that a candidate has committed a crime. That is, even if the FBI had acquired enough evidence to charge Hillary Clinton formally with knowingly transmitting classified information or otherwise violating criminal law, the rule would counsel against making that fact public until after the election. Yet Comey chose to go public, just eleven days before a presidential election, without even a basis for believing the “new” evidence he had come across was actually new, much less evidence of a crime.
Not only had FBI officials not seen the emails on Weiner’s computer at the time of Friday’s announcement. They had not even received a warrant to examine them. (They only got the warrant over the weekend.) Accordingly, the FBI had no basis for believing that the emails offered any evidence of criminal conduct on Clinton’s part. For all we know, the emails may simply be duplicates of those on Clinton’s and other computers that FBI investigators have already combed through.
Comey prides himself on being independent. While serving as deputy attorney general under President George W. Bush, he bravely stood up to White House Counsel Alberto Gonzales, who was seeking to pressure a hospitalized and sedated Attorney General John Ashcroft to approve an NSA program the legality of which many Justice Department officials had questioned. But independence can be taken too far. As another former deputy attorney general for George W. Bush, George Terwilliger, said, “there’s a difference between being independent and flying solo.” There are compelling reasons for rules limiting public comment on ongoing criminal investigations and interfering in upcoming elections. “Independence” doesn’t excuse breaking them.
Comey was no doubt worried that had he not informed Congress of the new evidence, and had it come out after the election that he had kept it quiet—and assuming the emails were of any importance, which has yet to be shown—he might have been subject to criticism from his own Republican Party. He certainly would have been condemned by some. Trump and his supporters have been calling for Clinton’s prosecution ever since the email controversy surfaced—notwithstanding the absence of any evidence that she actually broke any laws. In the wake of an election, they would have directed at least some of their anger at Comey, and their concerns would not likely have been assuaged by the fact that he had actually followed the rules.
But those are the burdens a responsible government official must bear. Following the rules was the right thing to do, even if it might have come at some cost to Comey’s personal reputation. By elevating his own concerns about his personal reputation above the rules, Comey will forever be remembered as the FBI director who abused the power of his office to interfere in an imminent presidential election.
As the country’s major political parties have become foreign countries to each other—with their own languages, press, moral philosophies, realities—a new kind of political literature has emerged, inspired by Thomas Frank’s What’s the Matter with Kansas? (2004). These books are written not by historians but by sociologists, anthropologists, and reporters; a partial list would include Joe Bageant’s Deerhunting with Jesus: Dispatches from America’s Class War (2007), Kate Zernike’s Boiling Mad: Inside Tea Party America (2010), and Theda Skocpol and Vanessa Williamson’s The Tea Party and the Remaking of Republican Conservatism (2012). These are studies of political groups, but they are not chiefly political in nature; they tend to be written in the manner of Coming of Age in Samoa or Notes on the Balinese Cockfight. Our present-day investigators file intimate reports from exotic locales like Flagstaff, Arizona, and Middletown, Ohio, where they embed among the natives.
Though both parties consider their opposite number to be the un-American interlopers, only observers on the left seem to be writing these books; we’re still waiting for “What’s the Matter with Manhattan?” or “Brattleboro Elegy.” Often the author is a reformed member of the lost tribe: Frank and Bageant grew up in the places they later chronicled as outsiders, which allows them a degree of scorn their more clinical colleagues avoid. But all of the authors approach their subject with a puzzlement lined with despair. How, they ask, can so many people live in an upside-down reality, denouncing everything the writers consider virtuous, embracing everything they consider immoral? As Frank wrote on the first page of his book, “How could so many people get it so wrong?”
Arlie Russell Hochschild’s Strangers in Their Own Land is the most satisfying example yet of this fish-out-of-water approach, with a premise out of Preston Sturges. A distinguished, sunny-dispositioned, white-haired sociologist from Berkeley travels to southwestern Louisiana to observe the Tea Party in the seat of its misery and hellfire rage. Before embarking on her five-year research mission, Hochschild tells us, she was not friends with a single conservative or southerner. “Who were they?” she wonders. “How did they come to hold their views? Could we make common cause on some issues?”
She is forced to reconcile everything she knows—“New York Times at the newsstand…organic produce in grocery stores…foreign films in movie houses…small cars…bicycle lanes, color-coded recycling bins…gluten-free entrees”—with everything they know: prayer, fried food, plus-sized clothing, and the economic and cultural dominance of the petrochemical industry. In preparation she rereads Ayn Rand, though she fails to encounter anyone who shows the slightest interest in literature; at a bookstore in Lake Charles she finds that three aisles are reserved for Bibles.
Hochschild hopes to overcome what she calls the “Empathy Wall” that separates political groups, preventing deep understanding of the other side. “Is it possible,” she asks, “without changing our beliefs, to know others from the inside, to see reality through their eyes, to understand the links between life, feeling, and politics; that is, to cross the empathy wall? I thought it was.” But while Hochschild tries to tear down her empathy wall, the Tea Partiers speak fervidly of trying to erect a great wall on the Mexican border—a national antipathy wall. When she introduces herself to one of her subjects, explaining that she is a sociologist from Berkeley who hopes “to understand the deepening divide in our country,” he replies: “Berkeley? So y’all must be communist!” Hochschild tries to reassure her readers that he is only kidding.
Hochschild is deeply sympathetic to the plight of the Tea Partiers, just not enough for their satisfaction. A gospel singer named Madonna Massey, an avid Rush Limbaugh listener who believes that the Rapture will arrive within the next millennium and that the City of Heaven is a cube in the sky 1,500 miles square, complains to Hochschild that “liberals” malign her unfairly. “Oh, liberals think that Bible-believing Southerners are ignorant, backward, rednecks, losers. They think we’re racist, sexist, homophobic, and maybe fat.”
Hochschild’s research appears to confirm this characterization to the letter; she even notes euphemistically that in Lake Charles there are “fewer petite sizes” in the clothing stores. But she is steadfast in her refusal to accept the common stereotypes. She makes a point of being on good terms with the people she studies. She refers to them as her “friends,” dedicates the book to them, and praises them for being “caring,” “bright,” “warm, intelligent, generous—not like people out of the frightening pages of Ayn Rand.”
Yet these warm and caring people say the darnedest things, particularly when it comes to immigrants: “The Syrians should stay, take a stand, and fight for what they believe in. If you flee, in my mind, you’re a traitor unto yourself.” Or: “Have you ever seen a Muslim charity event for people in need, or soup kitchen for the homeless? A Muslim Thanksgiving? Where is the Muslim name on the Declaration of Independence?” One person argues that handing out guns would be the best way to bring peace to the Middle East: “If everybody had a gun and ammunition, they could solve their own differences.”
Hochschild shows such an excess of good faith that it comes to seem a form of naiveté. She cites, for instance, several academic and international studies finding that strict environmental regulation does not kill jobs, as Tea Partiers believe, but stimulates economic growth. “If this was the growing consensus among Organisation for Economic Co-operation and Development (OECD) economists,” writes Hochschild, “I wondered why my Tea Party friends weren’t hearing about it.” How could Madonna Massey have missed the latest findings of the OECD?
Hochschild chooses southwest Louisiana for her “journey into the heart of the right” because it represents the most graphic version of what she calls the Great Paradox:
Across the country, red states are poorer and have more teen mothers, more divorce, worse health, more obesity, more trauma-related deaths, more low-birth-weight babies, and lower school enrollment. On average, people in red states die five years earlier than people in blue states. Indeed, the gap in life expectancy between Louisiana (75.7) and Connecticut (80.8) is the same as that between the United States and Nicaragua.
The more conservative you are, the worse off you are likely to be and the sooner you are likely to die. This holds even on the county level, Hochschild finds, after an analysis of EPA data shows a correlation between political views and exposure to pollution. Yet the very people most damaged by conservative policies are most likely to vote for them.
Nowhere is the abuse as frightening as in Louisiana—with the exception, perhaps, of its neighbor to the east (“Thank God for Mississippi!” is the unofficial state motto). Louisiana is the second-poorest state and second-to-last in human development, a composite measure of health, education, and income. The state’s rate of fatal cancers is about 30 percent higher than the national average. For all its antifederalism, Louisiana is fourth in accepting government welfare, with 44 percent of its budget coming from Washington. (Many of Hochschild’s Tea Party friends are beneficiaries of federal welfare programs.) Louisiana has the highest rate of death by gunfire (nearly double the national average), the highest rate of incarceration, and is the fifth-least-educated, reflecting the fact that it spends the fifth-least on education. It is sixth in the nation in generating hazardous waste, and third in importing it, since it makes a side business out of storing other states’ trash.
Louisiana’s governor is among the most powerful chief executives in the nation, a legacy that dates back to Huey Long’s administration, and under Governor Bobby Jindal’s dictatorship, between 2008 and 2016, the state’s prospects declined with unprecedented severity. After he reduced corporate income taxes and expanded the exemptions granted to oil and gas companies, the state’s revenue tumbled roughly $3 billion. He transferred $1.6 billion from public schools and hospitals to oil companies in the form of new tax incentives, under the theory that the presence of oil and a robust petrochemical infrastructure were not incentives enough. (The Louisiana Legislature is not only soaked with oil and gas lobbyists—during a recent session there were seventy for 144 legislators—but many lawmakers themselves hold industry jobs while serving in office.) Jindal fired 30,000 state employees, furloughed many others, cut education funding by nearly half, and sold off as many state-owned parking lots, farms, and hospitals as he could.
Despite these punishing cuts, he managed over the course of his administration to turn a $900 million budget surplus into a $1.6 billion deficit. National agencies downgraded the state’s credit rating. The damage was so great that it helped to bring about one of the most unlikely election results in recent American history. Jindal’s successor is John Bel Edwards, a Democrat—the only one to hold statewide office. Edwards is vehemently pro-life and agnostic about climate change, but he is determined to hold the oil and gas industry responsible for funding their share of coastal restoration. He currently enjoys a 62.5 percent approval rating. Almost a year into his first term, however, despite several emergency measures, the state remains in arrears.
The paradox that most baffles Hochschild is the question of environmental pollution. Even the most ideologically driven zealots don’t want to drink poisoned water, inhale toxic gas, or become susceptible to record flooding. Yet southwestern Louisiana combines some of the nation’s most fervently antiregulatory voters with its most toxic environmental conditions. It is a center of climate change denial despite the fact that its coast faces the highest rate of sea-level rise on the planet.
Hochschild discovers a walking personification of these ironies in a Cajun oil rig engineer named Mike Schaff. In August 2012, Schaff was entering his home in Bayou Corne, about seventy miles west of New Orleans, when he was jolted by a tremor. His concrete living room floor cracked apart. The sound, said a neighbor, was like a “garbage truck had dropped a dumpster.”
More than a mile beneath the bayou, a Houston-based drilling company named Texas Brine had drilled into a vast salt dome, ignoring warnings from its own engineer, with the complicity of the state’s useless Department of Environmental Quality. (In Louisiana, environmental regulators are, in the words of an EPA investigation, “expected to protect industry.”) Texas Brine drills for salt, which it sells to chlorine manufacturers, but other companies had used sections of the salt dome to store chemicals and oil. Texas Brine drilled too closely to an oil deposit and the structure ruptured, sucking down forest and causing seismic damage to the homes of 350 nearby residents. Officials began referring to Schaff’s neighborhood as the “sacrifice zone.”
Texas Brine refused to take responsibility for the accident. It claimed that earthquakes were common in the area (they are not) before blaming a different salt dome tenant for the collapse. If that wasn’t enough, Texas Brine asked the state for permission to dump toxic wastewater into the very sinkhole it created. Jindal did not visit the site for seven months, though it is only forty miles south of the capital. Four years later the sinkhole is 750 feet deep at its center and has grown to thirty-five acres. Methane and other gases bubble up periodically. Residents who defied evacuation orders avoided lighting matches.
After seeing his house, neighborhood, and way of life destroyed by corporate greed and state-sanctioned contempt for the natural environment, and many of his neighbors diagnosed with cancer, Schaff was forever changed. “They think we’re just a bunch of ignorant coonasses,” he told a Mother Jones reporter. Schaff became an environmental activist, railing against the “disrespect that we have been shown by both Texas Brine and our state officials themselves.” He marched on the statehouse, wrote fifty letters to state and federal officials, granted dozens of interviews to local, national, and foreign press. When state officials claimed they had detected no oil in the bayou, he demanded that the EPA check their work.
But Schaff continued to vote Tea Party down the line. He voted for the very politicians who had abetted Texas Brine at every turn, who opposed environmental regulation of any kind. He voted to “abolish” the EPA, believing that it “was grabbing authority and tax money to take on a fictive mission…lessening the impact of global warming.” The violent destruction of everything he held dear was not enough to change his mind.
Schaff’s story is an extreme but representative example of what so many Louisianian voters have brought upon themselves. “The entire state of Louisiana,” writes Hochschild, “had been placed into a sinkhole.” When confronted with the contradictions in their political logic, Hochschild’s subjects fall into “long pauses.” Cognitive dissonance reduces them to childlike inanity. When asked about catastrophic oil spills that result from lax regulation, one woman says, “It’s not in the company’s own interest to have a spill or an accident…. So if there’s a spill, it’s probably the best the company could do.” Madonna Massey says: “Sure, I want clean air and water, but I trust our system to assure it.” Jackie Tabor, whom Hochschild describes as “an obedient Christian wife,” says: “You have to put up with things the way they are…. Pollution is the sacrifice we make for capitalism,” which is a gentler way of saying that premature death is the sacrifice we make for capitalism. Janice Areno, who worked at Olin Chemical without a facial mask as an inspector of phosgene gas and suffers mysterious health ailments that she believes are “probably related to growing up near the plants,” finds comfort in an anthropomorphic analogy: “Just like people have to go to the bathroom, plants do too.”
But Hochschild is not interested in merely documenting the familiar ways in which this stratum of white Americans has consistently voted against its own interests, economic and otherwise. She wants to understand the cause. She finds all of the familiar explanations lacking. She rejects the argument that billionaires and oil companies, through sophisticated and elaborate public relations campaigns, have bamboozled an entire population into voting against government regulation by exploiting religious and cultural anxieties. Thomas Frank, quoted by Hochschild, described this strategy as follows:
Vote to stop abortion; receive a rollback in capital gains taxes…. Vote to get government off our backs; receive conglomeration and monopoly everywhere from media to meat packing. Vote to strike a blow against elitism, receive a social order in which wealth is more concentrated than ever before in our lifetimes.
Hochschild finds this “too simple an idea.” Her skepticism does seem justified in light of the Republican Party’s recent capitulation to a businessman opposed by the Koch brothers and nearly every other major right-wing éminence grise, and unendorsed by a single CEO of a Fortune 100 company.
Hochschild is also unpersuaded by Colin Woodard’s argument for regionalism as the main factor in shaping political views, and Alec McGillis’s argument that those in red states who most need government services vote at a much lower rate than wealthier conservatives. She finds incomplete Jonathan Haidt’s view, in The Righteous Mind (2013), that Tea Party voters are not misled, but instead care more deeply about cultural values than economic principles. While each of these theories may have some merit, she writes,
I found one thing missing in them all—a full understanding of emotion in politics. What, I wanted to know, did people want to feel, think they should or shouldn’t feel, and what do they feel about a range of issues?
This is politics as advertising: emotion over common sense. Such an analysis is overdue at a time when questions of policy and legislation and even fact have all but vanished from the public discourse, replaced by debates about the candidates’ character, “temperament,” and brand. (A Columbia Journalism Review analysis of the first presidential debate found that it “focused more on personality than any other in US history.”) How, then, do Tea Party voters feel? They’re angry, bitter, resentful—that much is obvious. Hochschild goes further, however. She develops for them what in brand marketing is referred to as the “back story,” a story that provides a unifying emotional logic to a set of beliefs. She calls it the “deep story.”
The deep story that Hochschild creates for the Tea Party is a parable of the white American Dream. It begins with an image of a long line of people marching across a vast landscape. The Tea Partiers—white, older, Christian, predominantly male, many lacking college degrees—are somewhere in the middle of the line. They trudge wearily, but with resolve, up a hill. Ahead, beyond the ridge, lies wealth, success, dignity. Far behind them the line is composed of people of color, women, immigrants, refugees. As pensions are reduced and layoffs absorbed, the line slows, then stalls.
An even greater indignity follows: people begin cutting them in line. Many are those who had long stood behind them—blacks, women, immigrants, even Syrian refugees, all now aided by the federal government. Next an even more astonishing figure jumps ahead of them: a brown pelican, the Louisiana state bird, “fluttering its long, oil-drenched wings.” Thanks to environmental protections, it is granted higher social status than, say, an oil rig worker. The pelican, writes Hochschild,
needs clean fish to eat, clean water to dive in, oil-free marshes, and protection from coastal erosion. That’s why it’s in line ahead of you. But really, it’s just an animal and you’re a human being.
Meanwhile the Tea Partiers are made to feel less than human. They find themselves reviled for their Christian morality and the “traditional” values they have been taught to honor from birth. Many speak of “sympathy fatigue,” the sense that every demographic group but theirs receives sympathy from liberals. “People think we’re not good people if we don’t feel sorry for blacks and immigrants and Syrian refugees,” one Tea Partier tells Hochschild. “But I am a good person and I don’t feel sorry for them.”
When Hochschild tells her deep story to some of the people she’s come to know, they greet it rapturously. “You’ve read my mind,” says one. “I live your analogy,” says Mike Schaff. She concludes that they do not vote in their economic interest but in their “emotional self-interest.” What other choice do they have?
Hochschild suggests that economic despair is the central motivation behind the Tea Partiers’ rage, while admitting that race, gender, and class biases contribute. But it’s difficult not to consider racial fear the formative aspect of this story, given our national history and the repeated expressions of racial disdain by her subjects, all of whom are white. Further evidence can be found in Donald Trump’s ability to cast off fundamental Republican economic views without sacrificing Tea Party support, while emphasizing positions dear to the hearts of white supremacists. Either way, race and the economy have for the far right become inseparable. Who’s to blame for lost jobs and opportunities? African-Americans boosted by affirmative action, immigrant laborers, Mexicans, Indians, Chinese.
Whether by calculation or intuition, Trump has emphasized the core elements of Hochschild’s deep story more tenaciously than any previous Republican presidential nominee. In doing so he has overcome a bitter contradiction at the heart of the Tea Party. Though they hold the notion of victimhood in the deepest contempt, Hochschild writes, they have been forced to brave “the worst of an industrial system, the fruits of which liberals enjoyed from a distance in their highly regulated and cleaner blue states.” The Tea Partiers live in a single giant “sacrifice zone.”
They may not have it worse than some other demographic groups in America today, but they have fallen the furthest. Hochschild quotes the economist Phillip Longman’s finding that fifty-somethings today are the first generation of Americans who, “at every stage of adult life,…have less income and less net wealth than people their age ten years before.” Trump’s pitch is simple: under his leadership, the Tea Party losers will be winners. They will win so much that they’ll be sick.
While Hochschild’s Tea Partiers contemplate a triumphant, if hazy, future, she leaves her readers with Grand Guignol images of their present day. We visit the tranquil bayou home belonging to Harold and Annette Areno, who, in the last two decades, have found themselves hemmed in by a chlorinated hydrocarbon manufacturing facility, a coal-fired power plant, a hazardous waste landfill, and the Conoco Docks, the site of one of the largest chemical leaks in North American history. The family’s cows, goats, and chickens fall over dead after drinking from the bayou, and the turtles go blind, their eyes turning white.
We watch Bob Hardey, mayor of Westlake, as he dutifully tends a family cemetery that is about to be surrounded on all sides by a humongous Sasol gas-to-liquids plant to which he has sold much of his property. Louisiana has given the facility permission to emit benzene at eighty-five times the entire state’s “threshold rate.” And we meet a safety inspector at a Ford battery plant who is called a sissy by plant employees for wearing a respirator. When the workers cackle he sees that their teeth have been eroded by sulfuric acid mist.
We come to know Hochschild’s subjects intimately: their thoughts, their prejudices, and most of all their fears, which form the foundation of their worldview. But we never get the sense that they know themselves. On the rare occasions in which Hochschild directly challenges their views, they tend to respond with platitudes taken from Fox News or by changing the subject. While acknowledging the strands of “greed, selfishness, racial intolerance, homophobia” in her subjects’ nature, Hochschild emphasizes their virtues, many of them shaped by biblical teachings: “their capacity for loyalty, sacrifice, and endurance.”
Of these, endurance is the defining quality, and the most tragic. Their suffering is not merely a personal or demographic crisis but a national tragedy. It threatens to capsize the entire republic. The Tea Partiers can’t be blamed for making a virtue of their anguish. Their situation, after all, is unlikely to improve anytime soon. Their suffering seems fated to last until Judgment Day—which should not be confused with Election Day.
For some twenty-five years, Charles LeDray has been surprising and delighting, and sometimes mystifying, the audience for contemporary art. Now fifty-six, LeDray is a kind of realist sculptor whose pieces—in part because his subjects are familiar but not what we would expect in a gallery setting, and in equal measure because he works with such small, essentially miniaturist sizes—have the power of making almost every object he handles seem new to our eyes.
When his art was getting underway, in the late 1980s, he altered the idea we generally have of stuffed animals. Using bears mostly, and sometimes designing them with their limbs askew or separated from their trunks, he gave them the distilled presence of purist, abstract sculpture, and this somehow made their plight seem as much inward and psychological as it was physical. Going on to work as a potter, he has exhibited large vertical glass display cases where on every glass shelf we see scores of neatly placed bowls, pots, jugs, and so on, each about an inch high and each, unbelievably, freshly designed. And as a carver, LeDray has, strangely, used human bone to create phenomenally delicate pieces of household furniture, or buttons of every possible description, or a sheath of wheat.
All these types of works are on view in a well-chosen small show of LeDray’s art at the Craig Starr Gallery in New York. It is the most wide-ranging exhibition the artist has had since his traveling retrospective, which was organized by the Institute of Contemporary Art in Boston and came to the Whitney in 2010. The current show also includes first-rate examples of what is most engaging about LeDray’s art: his sculptures of clothes. He has designed and sewn diminutive blazers, ties, pants, bathrobes, lumber jackets, suits, hats, overcoats, and other items. They can be presented hanging before us on a gallery wall, where they might be called relief sculpture. But they can be shown naturalistically, too, held by the hooks of a little wood clothing pole, or fixed in place by a similarly small mannequin prop. However they face us they often suggest, in pleasingly ambiguous ways, a story, a person, or a larger atmosphere.
Usually between one and two feet high, LeDray’s clothes are predominantly men’s apparel. But if there is a larger point to the artist’s concern with what men wear it is not readily discernible. He leaves larger meanings up to us, or as he said in an interview from 2002 about his work in general, “It’s not for me to say what it is or means.” Some of the sculptures appear to be about costumes, or how we dress for work. The 1993 Army, Navy, Air Force, Marines, for example (which is not in the current show), consists, wittily, of four separate outfits, hung one after another on a wall, each a uniform or work clothes associated with one of the armed forces.
The appealing 1994 S.A.M. (which is not in the show either) is also about a job and might be called a self-portrait of the sculptor. The letters stand for Seattle Art Museum, where LeDray, who is from Seattle, worked as a guard. The roughly two-foot-high piece, facing us on the wall, might be a memory of what LeDray saw in his locker there: gray pants on a hanger, draped over by a white shirt, a dark tie, and a blue blazer with a round patch on the outside pocket reading “Seattle Art Museum Security.”
But other sculptures seem to be about clothing itself. This, I think, is the case with an adorable but also enigmatic 1992 work at the show called Becoming/Mister Man. It is a gray, subtly patterned winter suit draped on a wire hanger, itself hanging from a nail hammered into the gallery wall. Measuring fourteen inches high, the piece is guaranteed to make you, at least at first, laugh and grin. We feel we are encountering somebody.
Yet the sculpture’s spirit is elusive. With its smallness, and especially when it is seen in reproduction, it suggests the presence of a little boy. Because the jacket is bulky and pushes out toward the hips, and we imagine the wearer being oblivious of the word “svelte,” we can think we are looking at a pudgy English schoolboy. But the work itself, seen in a gallery, seems as much symbolic, possibly talismanic—even confrontational. And while LeDray’s point does not seem to entail sexual confusion, the suit’s slightly too-big buttons, and the way they have been placed down the jacket’s whole front, make us think of a woman’s suit jacket. Like so many of LeDray’s pieces, this winter garment sends our minds roaming in any number of directions.
LeDray’s tiny carvings of furniture or other items from bone, and his skill in fashioning countless different clay vessels—in some vitrines, each vessel has its own painted decoration—put him in the class of master craftsmen from any age. But when his subject is clothes something funnier, and more novelistic and contemporary, takes over. In these pieces LeDray’s craftsmanship is certainly phenomenal, but it is a kind of skill that allows the viewer to participate with the artist in experiencing how the whole affair has come into being. A good amount of the pleasure LeDray provides comes simply from our knowing that he has made, or had cast, every small thing before our eyes, down to the somewhat old-fashioned wood clothing pole with its hooks, the plastic and wood hangers, and the sometimes visible inside clothing labels.
Freedom Train, from 2013–2015, one of the highlights of the current show, provides a bonanza of customized details to enjoy. It presents an orange check sports jacket (which might be about twenty inches high), along with a waterfall of ties and a few (very appealing) leather belts, hanging from a variety of hangers in the center of a painted pink pegboard “wall.” At the bottom of the wall there is an electrical outlet, and at the top, stuck on with push pins, there is a felt pennant of the Freedom Train—the train that toured the country in the late 1940s and again during the Bicentennial and was full of documents and objects celebrating American accomplishment. LeDray has created everything we see, including the push pins, the pennant, and the outlet, so as to mesh with the size of the sports jacket, which in effect is our protagonist. (The exception, I think, is the ties, which seem too long.)
LeDray initially wanted to be a painter, and Freedom Train, with its lively mixture of colors, has been thought out as a painting. In its centered construction, the piece has the structure, and something of the memorial spirit, of certain classic indirect American painted portraits. It brought to my mind the late-nineteenth-century artist John F. Peto’s portrait-like paintings of various possessions and souvenirs—stained sheet music, a lantern, a dented cornet—hanging off a nicked, old green door. One can think, too, of Marsden Hartley’s Portrait of a German Officer in the Metropolitan Museum, which employs aspects of clothes, set out in a centered way, to evoke a departed person.
Freedom Train uses the seemingly innocuous and everyday subject of clothes hanging on a wall to suggest a larger world. And, as with other artists involved with the very small, which can imply a concern with seeing things at a distance—and with having a distanced sense of people—LeDray’s art is at times almost as much about vastness and multitudinousness as it is about an intimate connection with tiny, singular things. His glass cases full of little pots and bowls convey a sense of sheer abundance as immediately as any sculpture one can think of. Nor is our experience of Village People (2003–2006)—an assembly of twenty-one different hats (hard hat, beanie, jester’s cap, and so on)—especially cozy, emotionally or physically. The work says, neutrally, that all of us, in our choice of identity, are equally interesting, and when Village People is displayed on a wall it takes up, according to LeDray’s specifications, over seventeen feet of wall space.
The artist’s greatest colonizer of space may be Mens Suits, a piece dated 2006–2009 that was part of the show that came to the Whitney and is undoubtedly a masterpiece. It mixes together a “real” room that we walk around in with, on the floor, a miniaturist’s vision of what might be a cross between a resale shop and a clothing warehouse. Feeling a little like Gulliver, or God, we look down on three different areas where we see jackets hung on metal coat racks and tables full of shirts and ties. Empty hangers are jumbled together, while one neatly holds a row of gloves. There are canvas bins on wheels full of clothes, and here and there a stack of pallets, a ladder, a broom thrown down, and an overturned dolly.
The larger room our resale shop is set in is fairly dark. What lighting there is comes from industrial-type hanging fixtures placed over the zones where the clothes are. The fixtures, which only emit light downward, descend to about four feet from the ground, which means that we viewers often crouch a little to see more fully the illuminated realm on the floor. We bend to marvel at what we would normally not find marvelous, and we are particularly aware of how big we are. LeDray’s piece is a kind of answer to Richard Serra’s overpoweringly monumental torqued ellipse sculptures in steel, which we also tend to experience communally. With Mens Suits, we are the dwarfing ones.
Long after he began showing, LeDray remains an artist who seemingly has far from exhausted the possibilities of diminutive sizes. At the same time, it remains unclear how to bring together the markedly different facets of his art. How, for instance, do we square his purely enjoyable blazers, belts, and bathrobes with the sense of damage and hurt in his early stuffed bears—or with the shiver we feel when looking at his carvings of this and that in human bone? He takes this note far further in a few clothing pieces where the garments have been tattered or have had parts cut out of them. In works where he has taken scissors to his creations and made drastic cuts in them, one feels a deep violation. The vandalism seems random. That may be the point.
LeDray keeps us on our toes, uncertain of what might come next. It may be enough to say for now that he has found an original, and still evolving, way to have us think about smallness and distance, even memory and mortality—and certainly clothes.
Has the convergence of Halloween and a presidential election ever seemed so apt? Is there anything more frightening than seeing Trump’s orange mop tousled by Jimmy Fallon? Has the ridiculous ever been so closely aligned with the horrific?
Some two hundred years ago, Coleridge had already zoomed in on the linkage. “The terrible, however paradoxical it may appear, will be found to touch on the verge of the ludicrous,” he remarked in his lectures on Shakespeare. “Both arise from the perception of something out of the common nature of things, something out of place.” Coleridge’s prime example was the appearance of the ghost in Hamlet:
I am thy father’s spirit,
Doom’d for a certain term to walk the night,
And for the day confined to fast in fires,
Till the foul crimes done in my days of nature
Are burnt and purged away.
Poe was surely thinking of Coleridge when he appended (under an alias) an introductory note to “The Raven,” the greatest of all Halloween poems, and singled out for praise “the curious introduction of some ludicrous touches amidst the serious and impressive.” Poe explicitly introduced humor into the “grave and stern” theme of the possibly undead lover Lenore, amid a riot of internal (and patently ludicrous) rhyming:
Then this ebony bird beguiling my sad fancy into smiling,
By the grave and stern decorum of the countenance it wore,
“Though thy crest be shorn and shaven, thou,” I said, “art sure no craven,
Ghastly grim and ancient raven wandering from the Nightly shore—
Tell me what thy lordly name is on the Night’s Plutonian shore!”
Quoth the raven, “Nevermore.”
The proximity of horror and humor, as Scott Bruce makes clear in his new anthology, The Penguin Book of the Undead, long predates Shakespeare; selections from Hamlet are merely his most recent examples of “fifteen hundred years of supernatural encounters.” “In most cases,” Bruce notes reassuringly, “the souls of the recently deceased made the journey to their subterranean realm (Hades) without any trouble, but certain kinds of death agitated the soul, causing it to linger on the threshold of the world of the living as a ghost.” Foremost among such unquiet dead were suicides, or those killed and left unburied on the battlefield or in shipwrecks.
The earliest instance of a ghost in European literature, according to Bruce, is Elpenor. If you don’t remember Elpenor, you’re hardly alone. His own shipmates couldn’t remember him either. The youngest member of Odysseus’s crew, he was (in Robert Fitzgerald’s translation) “no mainstay in a fight nor very clever.” One of the victors at Troy, he is not even mentioned in the Iliad, one of the countless ordinary warriors caught up in the mayhem engineered by their bombastic, self-pitying generals. Elpenor means “hope.” This is one of several jokes Homer makes at his expense.
When, shamed by his men, Odysseus tells the seductive sorceress Circe (or Kirke) that it’s time to leave her enchanted island, she gives him bad news: “Home you may not go/unless you take a strange way round and come/to the cold homes of Death and pale Persephone” to hear the wisdom and advice of Tiresias. The next morning, he rouses his crew and all set sail for the land of the dead. All, that is, except Elpenor.
And this one, having climbed on Kirke’s roof
to taste the cool night, fell asleep with wine.
Waked by our morning voices, and the tramp
of men below, he started up, but missed
his footing on the long steep backward ladder
and fell that height headlong. The blow smashed
the nape cord, and his ghost fled to the dark.
The scene is doubly heartbreaking: first, because Elpenor is the kind of kid who’d go up on the roof to see the stars, rather than join in the macho misbehavior of the rest of the riotous crew; and second, because no one notices his absence.
It is only when Odysseus reaches the land of the dead and pours the blood of sacrificed sheep into a ditch—the prescribed rite of necromancy, or summoning the dead—that he learns the fate of his overlooked comrade. “One shade came first—Elpenor, of our company,/Who lay unburied still on the wide earth.” Odysseus can’t resist another joke:
“How is this, Elpenor,
how could you journey to the western gloom
swifter afoot than I in the black lugger?”
It was, evidently, the absurdity of Elpenor’s death, the way it combined the tragic and the ludicrous, that explained in part his curious appeal to later writers. When Virgil, in his own epic, named his Elpenor counterpart Palinurus—his name meaning, according to one of Martial’s epigrams, “the incontinent one,” or “he who urinates again”—he was playing with the same paradox. Cyril Connolly adopted Palinurus, Aeneas’s hapless ship-pilot who apparently falls asleep on the job and drowns, as his pen name in his own elegy for World War II, The Unquiet Grave. Undone in his quest for literary greatness by what he called the “enemies of promise” (like journalism), Connolly saw himself as a similar figure of ridicule: “Palinurus clearly stands for a certain will-to-failure or repugnance-to-success, a desire to give up at the last moment, an urge toward loneliness, isolation, and obscurity.”
Also among Elpenor’s descendants is the ironically named Eutychus, “good fortune.” In the twentieth chapter of Acts, Eutychus falls asleep on a window sill while listening to a midnight sermon by St. Paul and falls three floors, where he is “taken up dead,” only to be revived (and presumably embarrassed) by Paul, who “went down, and fell on him, and embracing him said, Trouble not yourselves; for his life is in him.” There is some debate among scholars about whether Eutychus actually died and was miraculously resurrected from the dead, or had merely appeared to be dead.
Europe, with its dark and violent past, would seem to be a rich breeding ground for ghosts like these hapless young men. “Always the past enshrouds the present with its dark crimes and tumultuous passions,” the literary critic Leo Braudy notes in his new book, Haunted. Part of the mythology of America, Braudy notes in a discussion of Hawthorne, is that ours is “a country without a mysterious past and therefore one without passion and romance.”
And yet, our literature and popular culture are saturated with the Gothic, never more so than today, with our walking dead, our vampire cult, our creepy clowns, our atavistic politics. Have our past “foul crimes”—our original national sin of slavery, our still insufficiently recognized genocide of the American Indians, our outsized appetite for misogyny, xenophobia, and guns—caught up with us? “We are now fifty years or so into a horror phase that started in the late 1960s and early 1970s,” according to Braudy.
So familiar have the aesthetic conventions of horror become that it is increasingly difficult to distinguish “real” Halloween movies from parodies. Something similar has occurred in our political life. Alec Baldwin has often seemed a more convincing—a more terrifyingly ludicrous—Donald Trump than the real thing, a phenomenon not lost on Trump, who, most conspicuously in the third debate, appears to have borrowed mannerisms from Baldwin. One of our candidates for president has cut his teeth in so-called “reality television,” in which what is real is never quite what it seems.
Under such shaky circumstances, we are likely to feel, more than ever, Coleridge’s unsettling “perception of something out of the common nature of things, something out of place.” Will there ever be a return to normalcy, as the corrupt President Harding famously promised? Was Gerald Ford perhaps a bit over-optimistic in his inaugural address, when he tried to bury the ghost of Richard Nixon forever? Will we ever awaken from our long national nightmare? Or will we wake up only to follow the tragically ludicrous fate of Elpenor?
Waked by our morning voices, and the tramp
of men below, he started up, but missed
his footing on the long steep backward ladder…
Even those who think they know all there is to know about the Brontë family will likely be surprised by many of the documents and artifacts included in “Charlotte Brontë: An Independent Will,” currently on view at New York’s Morgan Library. Many of these revelations have to do with size and scale, with the contrast between the breadth and depth of Charlotte Brontë’s imagination and her physical delicacy, between the forcefulness of her and her siblings’ prose and the neat, astonishingly minuscule handwriting (not unlike Robert Walser’s microscript) in which she, Emily, and their brother Branwell penned their early work.
The first thing we see, on entering the gallery, is a glass case containing one of Charlotte Brontë’s dresses and a pair of her shoes, objects that make us acutely aware—more effectively than any description or photograph of these items could—of how diminutive (by modern standards) this strong and resilient woman was. Tiny books and magazines, including a copy of a satirical play about the art of writing, The Poetaster, that Charlotte wrote when she was fourteen, offer a view of the way in which the Brontë children saw writing as an imaginative game; to them, these miniature, handmade volumes—meticulously printed, and in some cases illustrated with watercolors—were, essentially, toys. Included also is the manuscript of a poem that Emily Brontë wrote when she was nineteen, a work of three hundred words, divided into forty-six lines, on a page that is only ten centimeters tall.
Brontë Parsonage Museum/Photography by Graham S. Haber: Charlotte Brontë’s earliest surviving miniature manuscript book with watercolor drawings; “There once was a little girl and her name was Ane [sic],” circa 1828
National Portrait Gallery, London/Photography by Graham S. Haber: Branwell Brontë: Portrait of Anne, Emily, and Charlotte Brontë, circa 1834
The Morgan Library & Museum/Photography by Graham S. Haber: The Poetaster: A Drama by Lord Charles Wellesley, part II, by Charlotte Brontë, June 8–July 12, 1830
Brontë Parsonage Museum/Photography by Graham S. Haber: Charlotte Brontë: Study of Noses, pencil drawing, February 1831
Bonnell Collection/Photography by Graham S. Haber: Charlotte Brontë’s doodles in her copy of the General Atlas of Modern Geography, circa 1830
The Morgan Library & Museum/Photography by Graham S. Haber: Elizabeth Gaskell’s engraving of Haworth parsonage and churchyard, frontispiece to The Life of Charlotte Brontë, vol. 2, 1857
Bonnell Collection/Photography by Graham S. Haber: A manuscript written in Charlotte Brontë’s tiny script at Roe Head school, February 4, 1836
The Morgan Library & Museum/Photography by Graham S. Haber: A pencil sketch by Charlotte Brontë, possibly a self-portrait, on a blank page in her school atlas, circa 1830
Anne Brontë’s Bible and a group of family prayer books provide a sense of the intensely religious atmosphere in which the siblings were raised by their clergyman father. Other volumes—a world atlas that Charlotte decorated with doodled portraits of women, Thomas Bewick’s History of British Birds—increase our understanding of what the family read, and of the ways in which they supplemented the sparse and punitive education that Charlotte and Emily received at the nightmarish boarding school that appears, thinly disguised as Lowood, in Jane Eyre. Among the most affecting documents are letters and journal entries in which Charlotte expresses the unhappiness and loneliness she experienced as a teacher (“neither is my heart in the task, the theme or the exercise”), as a governess (“I am miserable when I allow myself to dwell on the necessity of spending my life as Governess”), and as a student in Brussels (“I am tired of being amongst foreigners it is a dreary life”).
Curated by Christine Nelson, the exhibition reinforces our notions of Charlotte Brontë’s daring, ambition, and courage, and of the tragic circumstances over which she prevailed. In one letter, Charlotte describes the 1848 visit to London during which she and her sister Anne revealed to her publisher that the novels the three sisters had submitted under male pseudonyms (Currer, Ellis, and Acton Bell) had in fact been written by women (“Mystery is irksome, and I was glad to shake it off”). The publisher was initially surprised, but nevertheless decided to show the sisters around London, introducing them not as authors but as his “country cousins” the Misses Brown. In another, bordered in black, Charlotte mourns her brother’s death.
Taken from the frontispiece of Elizabeth Gaskell’s The Life of Charlotte Brontë, a brooding, romantic engraving of Haworth Parsonage, where the Brontës grew up, positions the family home between a cemetery in the foreground and the dark moors in the distance. A famous group portrait by Branwell Brontë, done when he was about seventeen, portrays his three rather pretty sisters looking fully as serious—and as haunted—as we imagine them to be. The painting is all the more haunted when one realizes that Branwell painted himself out of it. Early editions of the novels that the three sisters wrote, Charlotte’s marriage certificate (at the age of thirty-seven, she wed Arthur Bell Nicholls, her father’s curate) and her last will and testament provide an illuminating overview and a moving visual mini-biography of this extraordinary artist.
“Charlotte Brontë: An Independent Will” is on view at New York’s Morgan Library through January 2, 2017.
At the end of the street where I lived in Wellington, at the southern end of the North Island of New Zealand, a street of gracious two-story houses set in large gardens that were planted with oak and ash and maple, with English herbaceous borders and flowering fruit trees and shrubberies, was a park.
“A park?” you say.
A park, yes. But not a park as you know a park to be, not what you would call a park. It was the place where we went to play, at the end of our street.
This park was set with games areas for children, as is the case with most parks, with railings around the green, and a swimming pool at the entrance. There was an avenue of magnolia trees that led from the front gate to the main picnic lawn. Places were set aside for different kinds of swings and slides, depending upon how adventurous you felt. Might it be the scary Witch’s Hat? Or the spinner Carousel for babies? The long red and blue see-saw or the short yellow one? You choose. In summer you ran around in your swimming things, wet from the pool. You laid out your towel to dry in the hot sun and ate an ice cream from the little kiosk by the changing rooms that sold sweets and fizzy drinks.
But at the end of this “green and pleasant land” was a thick pelt of New Zealand bush, starting at the border of the swings and slides where the smooth lawn finished, and spreading into the hills behind it—a dark presence waiting at the end of all the brightness and play. It clambered back up and over the rises and falls in the landscape behind the city, clogging gullies and ditches, stopping when the suburbs gave out to open farmland, paddocks—never “fields”—but even then rising up again in pockets of thick growth, only really giving up when it reached the sea.
That bush ran in thick seams through the New Zealand of my childhood. It waited at the ends of roads, growing dense and dark at the edge of clusters of houses; it might be there when you turned a corner, a broad sunlit street giving way suddenly to a narrow track that was cut through the side of a hill. Certainly, our park could barely hold it back. The second playground—”intermediate” it was called, where teenagers would never go although it had been created for grown-up children, with swings that went higher and a slide that was long and damaged with a frightening kink in the middle that could hurl you off halfway down—had bush growing right around it as though it were intended. It was dark and wet-smelling; half the things in it were rotting and the other half in bud. New Zealand bush does not have a season.
“Well, it’s bush,” you say. “Bush doesn’t.”
It grows and dies and grows again—all at the same time, all through the year, scattering seeds and rotting, hosting worms and larvae and beetles even as it puts forth new shoots and brighter leaves.
Not woods. Not a forest. Nor a copse or a dell or a glade.
“It’s bush. It doesn’t translate.”
Perhaps it can change us, though?
After all, she could still hear the picnic, and see it, out of the corner of her eye, but with every step the girl was getting further and further from them. At one point, one of them called out: “Girl! Don’t you go too far! You be careful, hear? Stay close!”—but after that, nothing. There was a burst of laughter, raucous and yet dim, like she were listening to them through blankets, and though for a minute she’d stopped, just in case they were still watching, she knew she could walk on. She could go into the bush now if she wanted and she was going to; she was going to go right in. The heat… That was partly why she’d edged away, to find some relief from it, away from their bright picnic and their loud talking and songs, away from their picnic rugs and the boxes filled with beer and juice, their flasks of tea. Why did adults always have to make such a big fuss about everything, the girl had wondered earlier, when her mother and the aunts were unpacking the boxes and cooler bins they’d brought with them, the bags of swimming costumes and sun lotion and jerseys and towels. Why did they have to be like that? Be so busy? The girl had stood there, to one side, seeing them go backwards and forwards with their arms full. They were always packing and unpacking like that, she’d thought, making things to do for themselves when the weather was too hot to do anything but swim.
Her father had said in the car as they drove toward the park, “Don’t worry, there’s bound to be a swimming pool there. You’ll have fun with the rest of them…,” meaning those pakeha brothers and sisters of his, and his brother’s children, her cousins. “You will,” he’d told her, catching her eye in the rear-view mirror. The hot wind had entered through the open window, all yellow and green and the blue sky overhead and she’d wanted to close her eyes and laugh right into it. Instead, her father gave her a look again, and said, “It will be good for you seeing your cousins again. These parks down here, they’ve got everything, swimming pools and tennis courts. Plenty of facilities. You will have a good time. You will.”
But when they arrived at the big park on the outskirts of town, there was no swimming pool, only swings and slides and a roundabout and she was too old for swing parks, she wasn’t a baby. She was tall for her age, heavy and thick set, “with a long torso” her mother said; the girl didn’t know what a “torso” was but it sounded like something rude. She had short strong legs, though, and strong legs were good. She knew that about herself, that she was strong, that there were things she could do that other girls couldn’t, knew it like she knew other secrets about her body and how it was changing… Hair, blood, and no longer being flat beneath her singlet and cotton dresses but pushing out and growing there… Didn’t her mother see? What was happening to her? Or did she not want to see? The girl was used to hiding from her mother at bath-times now, she hunched over when her parents talked to her so that she wouldn’t seem so tall and strong. She drew her shoulders together when either of them came close, so the folds of her clothing might cover her body and all of her self that her mother said she should be careful not to show. “Keep yourself covered up, hear me?” She nodded, mute, ashamed, ashamed of her mother’s shame. It was this morning her mother had said that, with the blood coming and the terrible pains in her belly…”Not another word,” she’d said, after she’d shown the girl what to do.
Because really, there’s not another word for it. Only bush, bush everywhere here, and everywhere it stays the same—bush, just bush, collective and uncompromising. Neither singular nor plural, but both, resisting always the indefinite article that would make it some dainty shrub or hedge as well as the metonym that would have it stand for something larger. Bush. Bush. Bush. And it was growing, all this, at the edge of the “park” at the end of our street. Look at what I’ve written in my notebook:
How can we have ever played here? How can we have made this place our home for childish games? The fort where we pretended we were Daniel Boone hiding from the Indians is pitch black inside and damp, and the pongas and flax that were planted to be like a garden around it have half grown up, have covered it with a sort of spoor and in the vegetation have bred wetas and huge spiders that rattle out from under the dead leaves when we go into the dark interior. Then, over here, is the little stream running down by the picnic green—we used to call it “the burn,” remember? That Scottish word for a little stream, how could we have called it that?—when, look at it! Choked with weeds and mud slick, water sluggish and engorged with rotting leaves…My brother tried to dam it, to create a pool to sail his boat, but no toy boats could sail there!
“You make it seem like it was a terrible place, this park. You make it seem dangerous and unpleasant.”
And yet we did play there. We had walks. We had games, make-believe. We had tents, we made bows and arrows, we built shelters.
“You were frightened?”
Yes. It was terrifying there.
And she wasn’t a baby. She wasn’t. Her cousins, though, were teenagers, and older than her. Her parents had said that she would enjoy seeing them again, but the cousins were mean. The girl remembered, the minute she saw them. The boy picked at her clothes and the others stared and whispered. They mocked her silence. “Listen to her, Terry!” said the oldest girl. “Listen to what?” another one replied. “There’s nothing to listen to, dummy. She’s a rock. Big things like that don’t talk.” The cousin came up close to her, opened her mouth and closed it again, mouthing a silent word as though she were a frightening fish. “She’s like a maori-girl… Ain’t ya?” the cousin said. “Because I don’t think she speaks at all.”
“Cut it out, Bethan!” the boy said then. “You sound like white shit.” He took one of the girl’s heavy plaits between his thumb and forefinger as if to measure it, weigh it. “Anyhow, I like her,” he said. “With her funny little country ways and her little itsy-bitsy dresses. And this hair-do here…” He ran his thumb and forefinger down the length of the plait while she stood, stock-still, closing her eyes against him. “I like it too.”
None of the adults heard, of course—and if they had? They would have laughed, or his mother would have cuffed Terry over the ears and he would have said, “Hey, steady, Ma.” He would have smiled his bright white smile that made him look like a movie star. The older girl and the twins, they just stared at her mostly, with a fixed expression on their faces like: What kind of a thing are you?
“Does your mother make you wear your hair like that?” said one of them, after their brother had turned away. “Cause it’s really ugly.”
Maybe the dictionary can help: Bush: n. a woody plant in size between a tree and an undershrub. That’s from the Chambers Twentieth Century Dictionary, a British Dictionary with “up-to-date English” as it states on the flyleaf, with “everyday words.” And sure enough, right before us, here’s the little British bush we all know so well, with its flowering leaves and dainty wooden branches, that “woody plant.” But coming straight after, there it is: Bush: wild uncultivated country: such country covered with bushes; the wild.—v.i. to grow thick or bushy.—v.t. to set bushes about: to cover. adj. bushed, lost in the bush: bewildered.
And that word: bewildered…
Though, the girl thought, she herself, she knew, could never be lost there…
Even so, bewildered, yes.
But never lost there.
In the New Zealand of my childhood people used an expression within which was the nightmare of a place so overridden with manuka and scrub you might never escape it. This phrase, it seemed, existed permanently in a kind of future tense—that you might walk into and inhabit and be lost in: “Going Bush.” My father used to say it, about a friend who’d made the decision to go into remote and difficult country, allowing himself to be altered by that experience. “Ah yes, old Malcolm. He went bush in the end and you couldn’t get much sense out of him by then.” As though, my father seemed to imply, by walking into that dense growth, a person would never afterwards be freed from it; changed forever. “Poor old Malcolm wasn’t good for anything after that,” my father said, “he went bush, alright.” And friends of my parents would talk about it at parties, how they might go bush themselves—as a way of escaping from the easy daily routine, joking with each other about it, topping up their whiskies and laughing. The men saying that they might take a rifle in there and a fishing rod and think about never ever coming home again. “That sounds just fine to me!” Or people would use the phrase at the beginning of the summer, as a way of describing how they were planning to relax, like they were “going native,” another expression that intimated the horror of an irreversible change. Women did not use this expression, they had no need. For them, in those days, there was no other life that might claim them—or that is what we children thought. They refreshed their lipstick and shook their heads when their husbands offered them another sherry. Only men went in there, into the Tarawheras or the Ureweras or the Kaimanawa Ranges. They came home, sun-blackened and with beards or stubble on their faces, laughing and smelling of earth and drink and something else—seeds or mould or blood.
The girl remembered all of this, what the adults were like, what her cousins were like, the things they did and said, from last time, a long time ago when her parents had driven down South to meet her father’s relations. Then, she’d been little and they had been tall. Now they were older but she was the tall one; she knew she was strong. Still, there were four of the cousins and only one of her. And how hot she was in the sun, in her dress that covered her up and the cardigan that went across her back like a rug. She wouldn’t be able to run away from them like last time when she’d fled them to find her mother who had taken her by the hand and let her stay close. Now she would never be able to escape fast enough, not with the vest and dress and cardigan all stifling her like a heavy trap.
“Why doesn’t she take all that ugly stuff off?” said the oldest girl, who was wearing a bikini and nothing else, and who kept running her thumb along the waistband low down on her belly.
“She must be baking, in all that stuff,” said another.
“She’s cooking herself,” said the boy, and he caught the girl’s eye and smiled.
“To be unobtainable,” my father tells me. “Going Bush. That’s what the expression means.”
Te Ara-A-Hongi: Hongi’s Track.
Puketapu: Sacred hill or mound.
With roots, and mould, and spoor, and the tangle of new growth over everything, over any chance of a summer holiday at the beach, by a river or a lake.
Puarenga: The stream which flows through Whakarewarewa.
Te Puia: The burial hill at the edge of the geyser.
All keeping you in there, stopping you at the names of the places that were in the midst of all the growth.
Tarukenga: Place of slaughter.
Whakarewarewa: A place of uprising.
All bush, bush. The girl knows. How there at the edge of the bright sand, on the other side of the sparkling water, it was waiting for you.
Whangapipiro: Evil-smelling place.
When you swam to the other side of the bay where the water was in shadow, or turned a bend in the river and suddenly you were in deep.
Because she was country, the girl was. So country, her cousins said, that they couldn’t understand a word her parents were saying. “Did you hear the way he asked for a beer?” said the one called Bethan, talking about the girl’s father. “Like straight off the pa…”
“They’re all maoris up there,” said the boy.
“They sure look that way,” said one of the twins, eyeing the girl as she stood, off to one side, trying not to see them, trying not to hear.
And all these landmarks, locations, these long site-words, naming-words, complex, reaching and growing words, all seeding further meanings… Te Ara-A-Hongi, Puketapu…
They had no meaning then, for me. Those words. The words that might break bush down, separate it, make of it this place here, or that place there. I didn’t have the parts of speech, the articulations, pronunciations. I didn’t have a way in, and bush gave, absolutely it gave, no quarter.
“Don’t go in there again,” her mother had said. “It’s dangerous there, beyond the park where we’re going.”
Because that dark would not be colonized by our games! We were little white children, little Pakehas the Maoris would have called us, lost in the midst of it: no synonym in English for the English noun that it devoured with its own reality.
“Don’t go in there,” her mother said.
Adapted from Kirsty Gunn’s Going Bush, number 27 in the Cahier Series, published this month by Sylph Editions. Gunn’s My Katherine Mansfield Project will be published on November 15 by New York Review Books.
Around 1840, Niels Christian Kierkegaard began a sketch of his second cousin Søren. The face remains, like the portrait itself, unfinished. Though the sitter was already twenty-seven you would swear he was still an adolescent. The gaze is intense, the eyes innocent and preternaturally large. You can just glimpse the light sideburns he grew to cover his cheeks after the fashions of mid-nineteenth-century masculinity. But the mouth, held firm in conviction, betrays him: the lips are too delicate, sensuous, petulant. A year later he would cancel his engagement to Regine Olsen to begin a life of celibacy that would also mark the start of his philosophical career: “My engagement to her and the breaking of it,” he wrote, “is really my relationship to God.”
Kierkegaard is widely considered the most important religious thinker of the modern age. This is because he dramatized with special intensity the conflict between religion and secular reason, between private faith and the public world, and he went so far as to entertain the thought that a genuine reconciliation between them is impossible. Society, for Kierkegaard, is a place of leveling conventions, and the ethical principles that bind us together ignore the genuine self. It is faith alone, uncontaminated by public understanding, that distinguishes the authentic individual, and faith is something wholly interior, a leap into paradox.
At their limit such arguments suggest religious absolutism; they extol the believer even if his belief runs against all accepted codes of humanity. In reading Kierkegaard’s works one begins to fear that the individual whom he celebrated as “the knight of faith” too closely resembles that figure upon whom we have heaped so many of the anxieties of our own time: the religious fanatic.
But Kierkegaard’s thinking is more subtle than this. A lover of irony, he signed many of his works with pseudonyms: Vigilius Haufniensis (“Watchman of Copenhagen”), Johannes de Silentio, Anti-Climacus, and, perhaps best of all, Hilarius Bookbinder. Everyone in Copenhagen knew that Søren was the author of his works, but this did not deter him from giving his pseudonymous personae a further twist of the pen, at times adding to the title page of a book the scholarly note that it had been “edited by Søren Kierkegaard.”
But at other times he was deadly serious, a moralizing Lutheran who excoriated the high officials of the Danish state church and held forth obsessively on themes of faith, sin, and anxiety. Among his most famous works are bitter satires and invectives against bourgeois conformity that are interlaced with the same veins of explosive resentment that Dostoevsky would mine in his Notes from Underground (which was published less than a decade after Kierkegaard’s death in 1855).
No single line of inheritance connects Kierkegaard to our present day. One tradition bridges an unlikely divide, linking the pious Kierkegaard to the atheist Friedrich Nietzsche. In the 1920s and 1930s philosophers such as Martin Heidegger and Jean-Paul Sartre began to pluck from Kierkegaard’s writings the central themes of existential philosophy. Though they did not share his theism, they borrowed his image of the human being as incorrigibly mortal, condemned to a worldly existence bereft of all rational certainties. The French philosopher Jean Wahl assigned Kierkegaard a principal part in his Petite Histoire de l’existentialisme (1947): “The word ‘existence’ in the philosophical sense that it is used today,” Wahl declared, was originally “discovered by Kierkegaard.”
In histories of philosophy it is still commonplace to name Kierkegaard the founder of existentialism. But another legacy connects him to the rebellious movement known as “crisis theology,” associated chiefly with Karl Barth, the Swiss Reformed pastor whose Epistle to the Romans transformed the landscape of twentieth-century religious thought. It is not hard to see why Barth found instruction in Kierkegaard, whose writings meditate to an obsessive degree on the absolute chasm between God and humanity. For Kierkegaard, as for Barth, God remains “wholly other” and cannot be pressed into service for mundane causes.
But Kierkegaard was an unbending conservative, and the political consequences of his religious absolutism remain uncertain. His hatred of the mob, for instance, fosters a healthy skepticism toward political conformity but also a disabling contempt for the public good. A rather different line of influence connects him to illiberal critics of modern democracy such as Carl Schmitt, the Nazi legal theorist who cited the Dane as an authority when he claimed that the ultimate problems in politics require radical decision, not reasonable deliberation. The world of Kierkegaard scholarship is thick with complaints that he has been misunderstood and that he was in fact neither an arch-conservative nor an “irrationalist” (a standard charge). But the truth is that his influence spans ideologies of all kinds, and his legacy is contested only because its meanings overflow all boundaries of doctrine and argument.
The bicentennial of Kierkegaard’s birth in 1813 has brought him surging back into view, with the publication of Joakim Garff’s monumental biography (first in Danish, 2000; English translation, 2005) and the no less lengthy Oxford Handbook of Kierkegaard, along with a second work in Danish by Garff, Regines gåde: Historien om Kierkegaards forlovede og Schlegels hustru (Regine’s Mystery: The Story of Kierkegaard’s Fiancée and Schlegel’s Wife), also published in 2013.
Nor can we neglect Walter Lowrie’s A Short Life of Kierkegaard (reprinted by Princeton University Press for the bicentennial). First issued in 1938, the Lowrie biography still deserves to be read in its unabridged form, not least for its author’s unembarrassed prose. Lowrie, an Episcopalian clergyman schooled at Princeton Theological Seminary, was clearly besotted with Kierkegaard, whom he affectionately calls “S.K.,” a nickname scholars still use today. In his introduction Lowrie confesses: “S.K….spoke of a time when ‘my lover’ will come—and the reader will easily discern that this book was written by a lover.”
Daphne Hampson, the author of Kierkegaard: Exposition and Critique, is not a lover of Kierkegaard’s ideas. But perhaps for that reason she serves as a more trustworthy guide. A professor emerita at the University of St. Andrews and associate of the Faculty of Theology and Religion at Oxford, she holds two doctorates, in history from Oxford and in theology from Harvard, where, in her mid-twenties, she first encountered Kierkegaard in an independent study with the professor of divinity Richard Reinhold Niebuhr (nephew of the American theologian Reinhold Niebuhr). Though Hampson has spent much of her life with S.K. as an imagined companion, her book is less a work of appreciation than an exercise in stock-taking. Its structure is straightforward: each chapter addresses one of Kierkegaard’s major works, offering a careful reconstruction before turning to criticism.
The shift can be jarring. Hampson is a dutiful expositor and her summaries can be a bit dry when she keeps her opinions in check. But once she has departed from the task of exposition her critical voice is thoughtful and unconstrained by orthodoxy. Surely the explanation for this is, in part, the fact that Hampson does not share Kierkegaard’s beliefs. Although she favors some kind of “spirituality,” she rejects traditional monotheism, especially its patriarchal themes, and she considers the notion of a God who resides “outside” the world the remnant of a defunct metaphysics. In the preface, after she has listed her debts to teachers and colleagues she permits herself the unusual gesture of thanking S.K. himself:
My life would have been subtly different had I not encountered Kierkegaard. He has been a source of delight and edification with his insights and perspicacity. I am moved by his love of God, his sensitivity to others, and his sparkling wit.
But, she adds, Kierkegaard has also helped her to grasp “with greater clarity why I should not wish to be Christian.”
Hampson, like many lay scholars who read Kierkegaard, wishes to clarify with as much sympathy as possible what he meant to say. But on issues vital to his faith she clearly dissents, and she is willing to entertain the sorts of thoughts that Kierkegaard himself would have found intolerable.
Kierkegaard’s own devotion to his faith was unqualified. Although he was born into financial comfort, his father had arrived as a nearly destitute youth in the Danish metropolis of Copenhagen, and he brought with him a lingering nostalgia for the Moravian pietism that flourished in the Jutland countryside. Søren himself was schooled in the evangelical Lutheran Church of Denmark, but he was racked by doubt about whether its respectable teachings were genuinely Christian. During Denmark’s so-called Golden Age (the era that spans the first half of the nineteenth century from the nation’s defeat in the Napoleonic Wars to the democratic revolutions of 1848), the country remained culturally divided between its enduring agricultural traditions and the relatively small but powerful aristocracy that clustered in the capital. Popular taste hugged the shores of convention, and the tales of Hans Christian Andersen typified the nation’s longing for rural virtues.
The prestige of the official church was personified by Jakob Peter Mynster, a prominent theologian who by 1834 had secured the lofty title of Bishop of Zealand and Primus. As a boy Kierkegaard was devoted to Mynster—in fact it was Mynster who performed his confirmation—but his father also brought him on occasion to gatherings of the Moravian Brethren. This deep split in Kierkegaard’s character is still visible in the 1840 portrait sketched by his cousin: the ardent defender of Christian simplicity can be found in the eyes of the young man who is, in all outward respects, a sophisticated son of the Copenhagen bourgeoisie.
That same year, at age twenty-seven, Kierkegaard became engaged to Regine Olsen. Garff informs us in his biography that Kierkegaard already realized the next day that he had made a mistake. But it took him until the following summer to annul the engagement, and when he returned his ring he gave the false excuse that he was just a young cad who needed a “lusty young girl” for excitement. He told himself that this was “a necessary cruelty” since it guaranteed that Regine would not continue to hope for a liaison with him, though it also meant that Kierkegaard himself would spend nights alone in tears. Regine went on to marry Johan Frederik Schlegel, a high official of the Danish West Indies; after his death she returned to Copenhagen where she helped to curate the legacy of her former fiancé, who had left her his entire inheritance.
One is tempted to read this sad tale of eros interrupted as a foreshadowing of Kierkegaard’s mature philosophy. Its plot—in which he betrayed bourgeois convention to achieve pious solitude—would resound in nearly all of his later works, though its echoes are loudest in Either/Or, a meditation on the conflict between romance and ethics, which he wrote at a feverish pace during a long winter in Berlin after the break with Regine. He also spent the time in Berlin attending lectures by Friedrich Schelling, alongside Hegel one of the luminaries of German Idealism (he considered the lectures a waste of time). He returned to Copenhagen with his manuscript complete, and its publication in the spring of 1843 scandalized Danish society and established his reputation, despite the fact that it appeared under a pseudonymous editor named “Victor Eremita” (Victorious Hermit).
Hampson does not devote a chapter to the book, which is a pity. The Marxist theoretician Georg Lukács suggested that Kierkegaard’s entire philosophy could be found in his separation from Regine. Either/Or is in fact a meditation on the conflict between two modes of existence, the hedonistic or aesthetic (as described in the “Diary of a Seducer”) and the ethical (in which one awakens to remorse and “holiness”). To the individual who must choose either one life or the other, Kierkegaard says, reason offers no guidance. But at the book’s conclusion we learn that neither path is right, for in prayer alone we find truth: “We gladly confess that in relation to [God] we are always in the wrong.”
In the same year Kierkegaard also published what surely remains his most famous work, Fear and Trembling: A Dialectical Lyric. It is here, hiding behind the pseudonym of “Johannes de Silentio,” that S.K. insists on the stark opposition between public obligation and private faith. The book offers a meditation on the “Akedah,” the biblical episode in which Abraham appears ready to sacrifice his beloved son Isaac in obedience to God’s command and is only prevented from doing so by divine intervention. If this book arouses controversy today it is partly because its author does not flinch from considering the most extreme implications of the biblical tale.
Kierkegaard grants that any reasonable person who wishes to follow social norms must condemn Abraham as a would-be murderer. But we are asked to consider the alternative possibility, that Abraham’s obedience signifies the genuine “paradox” of individual faith without which religion is impossible. From the observer’s point of view this faith must appear preposterous, and we are right to conclude that Abraham’s love of God is “incommensurable with the whole of actuality.”
But for Kierkegaard this is precisely the point: faith entails a paradox by which the individual cannot make himself understood and yet “the particular is higher than the universal.” Abraham embraces the impossible proposition that through sacrifice he will receive his son alive: “Only he who draws the knife gets [back] Isaac,” Kierkegaard wrote. It is commitment, heroic yet absurd, that distinguishes Abraham as a “knight of faith.”
In the nearly endless commentaries on the Akedah some have argued that God never meant for Abraham to go through with the sacrifice. The medieval Jewish commentator Maimonides was especially troubled by the proposal that God “tested” Abraham since this implied that God did not know the outcome of the test—a violation of the principle of divine foreknowledge. For Christians the unrealized sacrifice of Isaac has an important part in Old Testament prophecy since it was thought to prefigure the actual sacrifice of Jesus Christ. For Muslims Abraham’s faith is praiseworthy but they hasten to explain that it was not Isaac who was readied for sacrifice but Ishmael, ancestor to all who worship Allah.
Modern readings of the episode are no less varied. Many have followed Kierkegaard in praising Abraham’s devotion, though few have matched him in analytical intensity. But Hampson is troubled (and rightly so, I think) by a religious disposition that violates our common understandings of humanity. Appealing to feminist criticism, she sees in the God of Kierkegaard a distinctly patriarchal ideal of stern (even violent) authority.
The charge is familiar but it reflects our own cultural preconceptions of male and female conduct. Indeed, the notion that God has gender at all involves an unwarranted lapse into anthropomorphism. The abstract deity of the Protestant West is customarily called “God the Father,” but as a matter of philosophical principle it makes no more sense to imagine God as a conventional man (the strong, silent type) than it does to picture God with the attributes we conventionally assign to women.
More worrisome, in my view, is not Kierkegaard’s lapse into anthropomorphism but rather the opposite: his readiness to amplify the doctrine of Protestant abstraction to a limit where the divine exceeds all understanding. Kierkegaard’s God lies at such a great remove from everyday categories as to contravene our most fundamental and enduring norms of morality. When a parent believes he hears voices that command him to take a knife to his own child our proper response should be not praise for his piety but horror at his self-evident lunacy. Kierkegaard is of course aware that this is our customary belief. But he sees in Abraham’s conduct a “teleological suspension of the ethical,” which is to say, he entertains the thought that religion imposes a higher purpose on us than what ethical reasoning demands.
Here, as Hampson reminds us, Kierkegaard belongs to the tradition called “divine command theory.” God does not command an action because it is good; rather, an action is good only because God commands it. One can admire Kierkegaard for the candor with which he pursues this principle to its perverse conclusion. But it smacks of authoritarianism. If a parent hears such an obscene command we should extol him not if he falls in line but only if he disobeys. Resistance to barbarism, even when it is commanded by the highest of all authorities, is the true mark of the blessed.
There is a painting called The Sacrifice of Isaac by Caravaggio that hangs in the Uffizi, about which the philosopher J.M. Bernstein has recently written a telling commentary.* The look in Isaac’s eyes as Abraham holds the knife to his throat bespeaks not just terror but protest. Though he wishes to be an obliging son, he cannot assent to what his father is determined to do. His resistance is visceral, the cry of a human animal whose desire to live calls into question any ennobling ideas of divine obligation. For Bernstein, the painting is an allegory for the birth of secular consciousness: it expresses a dawning awareness that if faith demands barbarism, then it is faith that must yield. But Kierkegaard’s argument runs in precisely the other direction: when faith and humanity conflict, faith supervenes. He is most blessed who persists in his piety even if he has made himself utterly unintelligible to those around him.
Such arguments run through many of Kierkegaard’s most celebrated works, which ruminate on the chasm between God and humanity, between the individual and the collective. It is not surprising that Kierkegaard therefore despised Hegel, because it is Hegel most of all who developed the idea that individual consciousness cannot exist in the way Kierkegaard supposed. During Kierkegaard’s lifetime Hegel’s philosophy had gained prestige in Denmark thanks to the promotional efforts of Hans Lassen Martensen, the court preacher who succeeded Mynster as bishop upon the latter’s death in 1854, and whom Kierkegaard saw as the embodiment of everything that was rotten in Denmark. In 1846, he published his most sustained assault on Hegel in Concluding Unscientific Postscript to the Philosophical Fragments, a title that, like many others, was deeply ironic, since in this case the “postscript” was a great deal longer than the “fragments” (literally, “crumbs”) that Kierkegaard had published before.
Kierkegaard’s antipathies toward Hegel are due chiefly to the German philosopher’s confidence in reason as the medium of universal reconciliation. Both philosophers recognized that modern life was increasingly beset with tensions—between individual and community, freedom and necessity, science and faith. But where Kierkegaard saw contradiction, Hegel saw the possibility for rational development. Reason, he argued, is a kind of agency, or “Spirit” (Geist) that strives to manifest itself in the world.
The trial of self-realization has a dialectical pattern: reason overcomes its own imperfection, much as a student must confront occasional difficulties in learning but eventually through her struggles gains in both maturity and insight. Politically, the dialectic meant the emergence of constitutional government and the perfection of a society in which no individual remains unfree. A fully modern society for Hegel was therefore a fully rational one, and—this is the crucial point—rationality was therefore a quality shared in common by all individuals. In a rational world the individual would no longer feel any contradiction between subjective interest and the public good. The Hegel scholar Terry Pinkard calls this theme “the sociality of reason.”
In the early nineteenth century many students of theology looked to Hegel as a guide for understanding the social and historical character of religion. Among them were radicals such as David Friedrich Strauss and Ludwig Feuerbach, who concluded that religion was a human creation and lacked metaphysical reality. The Hegelian critique of religion dissolved into anthropology: the modern celebration of human qualities.
But some students of the dialectic felt differently. Without contesting its metaphysical truth, they felt the Christian religion should transform both self and society, binding the pious individual to public reason for the betterment of modern civilization. It was this project that Martensen found appealing, and Kierkegaard deplored. To “go beyond” the Christian faith, Kierkegaard feared, would not enhance its reality; it would only sacrifice its inner truth for the sake of outer decorum. Christianity’s truth would vanish into mere Christendom. As Hampson explains, “Hegelian religion à la Martensen [was] only too at home in the world.”
In his own system Hegel consigned the desire to remain “outside” of society to a lower stage in the dialectic that he called “the unhappy consciousness.” For Kierkegaard, however, the sociality of reason is a kind of totalitarianism; it will not tolerate any kind of difference within its domain. (The charge that reason itself is therefore “intolerant” inspired an unlikely revival of interest in Kierkegaard among poststructuralists such as Jacques Derrida, who often understate the ease with which Kierkegaard contests reason’s authority only to submit to an authority that is far more absolute.)
Still, one can appreciate why readers across the ages would find his criticism sympathetic. His protest against the worldly power of the Danish evangelical church ranks him in a long line of reformers spanning the millennia from Luther back to Jesus Christ (the thorns in the side of priestly authority) and Hebrew prophets such as Amos and Isaiah. Nor are the comparisons confined to religion. Kierkegaard often writes of Socrates as an ironist and gadfly who isolated himself “above every relationship.”
A complicating factor is that Kierkegaard was hardly a critic of authority. With the coming of revolution in 1848 the feudal system in Denmark was destroyed and in its stead the national liberals laid the foundation for a constitutional order that would outlast any of the other midcentury revolutions across Western Europe. The franchise was not universal and the monarchy was not wholly abolished; but for the first time most laborers could vote. Kierkegaard, however, considered the revolution a “catastrophe.” What he loathed most of all was the fact that the Danish church was ready to reconcile itself to the new situation. It had (in Hampson’s phrase) “baptized the world,” leaving all essentially unchanged. The dissoluteness of brothels remained the same, Kierkegaard wrote, only “they have become ‘Christian’ brothels.”
Hampson explores such criticism without undue censure. At stake, she writes, was a basic question regarding the place of religion in modern society:
Should there be a broad state church, a spiritual home for the Danish people, which could provide a focus for the nation in times of crisis or rejoicing, able also to offer guidance and comfort to individuals whether or not they normally attended Christian services, a place where they might negotiate life’s transitions. Or by contrast should there be…a “confessing church,” standing for defined and stringent Christian beliefs, based on a certain reading of the New Testament, its members apparently ready…for “martyrdom”?
The question is still with us today, and not only for Christianity but for all faiths that confront a choice between solitude and society, individual purity and the comforts of belonging. But there is a deep irony in Kierkegaard’s rebellion. Although he imagined himself a critic of modern convention, his individualist bid to wrest himself free of social constraint was a highly modern ideal. As the philosopher Charles Taylor has explained, in the secular age even those who cleave to a conventional faith conceive of their religion as one option among many: it is something an individual must will to have and no longer something one is merely given as an artifact of collective history. The age of religious reform was but one stage in the historical process that Taylor calls the “disembedding” of the self from shared traditions of meaning.
Ironically, the desire to stand as an authentic individual beyond all such traditions is the greatest conceit of the bourgeois era and Kierkegaard was in this respect far more conformist than he cared to admit. And yet none of us wishes wholly to surrender this desire for authenticity since it is also the very sign of possibility itself, the hope that life might be otherwise than it is. To abandon this hope is to give up on possibility altogether. Against all the forces that counsel resignation Kierkegaard remains not just the knight of faith but something more: the eternal child.
* J.M. Bernstein, “Forgetting Isaac: Faith and the Philosophical Impossibility of a Postsecular Society,” in Habermas and Religion, edited by Craig Calhoun, Eduardo Mendieta, and Jonathan VanAntwerpen (Polity, 2013), pp. 154–178.
The story of the Beach Boys is a kind of philosophical problem. Not that they didn’t make some albums still eminently worth hearing, if we go by the unit of the album: Pet Sounds, from 1966, is the prize pony, full of confident hits as well as deep-purple self-absorption (“God Only Knows,” “Wouldn’t It Be Nice,” “Caroline, No”). For anyone justifiably wary of the whole idea of the pop masterpiece, Summer Days (And Summer Nights!!), from 1965, is a great uncool record—cornball, hard-sell with its lists of proper nouns and tour-stop shout-outs (“Amusement Parks U.S.A.,” “Salt Lake City,” even “California Girls”), but getting grand-scaled in tone and mass. SMiLE, started in 1966 and not really finished till 2011, is the art-song project, a kind of underground labyrinth of melody, exhaustingly effective in its final form.
But time and social change have been rough on the Beach Boys. Their best-known hits (say, “California Girls,” “Help Me, Rhonda,” “I Get Around”) are poems of unenlightened straight-male privilege, white privilege, beach privilege. It is hard to imagine that they helped anyone toward self-determination or achieving their social rights. Brian Wilson’s great integrative achievement as a songwriter and producer was absorbed in bits and pieces by others—Paul McCartney especially—but it mostly worked for him alone. In their rhythm and humor the Beach Boys sound squarer all the time compared to Motown, the Beatles, and the Stones, and a lot of Phil Spector. Of course, it comes down to individual songs.
We all like songs. But we also tend to regard a cultural institution like the Beach Boys—a fifty-year-old rock band, heavy with institutional honor—as having some kind of fixed identity and owning some kind of essential rightness. What is, or was, the essence of the Beach Boys, and what were they right about? In this time of curation and reassessment, we have cash in hand and we are here to understand the Beach Boys. What are we paying into? What are we understanding?
Is it the Beach Boys as a performing and/or recording entity, neither of which since the death of Dennis Wilson in 1983 ever quite seemed the real thing, whatever “real” is? (The Beach Boys have never broken up per se, the way the Beatles did. But as of this writing in 2016, you can see two different versions of them: a touring band led by the singer Mike Love called the Beach Boys, which includes the non-original member and long-time on-and-off Beach Boy Bruce Johnston; or a touring band led by Brian Wilson, who does not have the legal right to call his band the Beach Boys, along with the original Beach Boy Al Jardine, playing many of the same songs as Love does, as well as songs that were recorded under the name the Beach Boys but are basically Wilson’s.) Is it the corpus of the Beach Boys’ music, which is totally disunified: pre- and post-Pet Sounds, pre- and post-LSD, pre- and post-Dennis Wilson, with and without Brian as songwriter, with and without Brian as producer, and on and on? Or is it the projection, through time, of the individual members’ personalities, which have been widely simplified and are, finally, unknowable?
Brian Wilson’s storied vulnerability will never be quite squared with his demonstrated ambition: in a short amount of time, and before the age of thirty, he had dealt with great amounts of American musical culture. One class of his songs, like “Shut Down,” is built on Chuck Berry’s updated rhythm-and-blues—a driving beat under a twelve-bar blues pattern and wise patter about the will to win a drag race. (Roger Christian, a Los Angeles DJ, wrote the suggestive lyrics: “He’s hot with ram induction but it’s understood/I got a fuel-injected engine sittin’ under my hood.”) A different class, like “Let Him Run Wild” and “The Warmth of the Sun,” took up the lessons of the Four Freshmen’s vocal arrangements, moving them out of a jazz context and toward a new kind of rock and roll song—diffuse, harder to reduce, written with harmonic tension and shifting keys. For “California Girls,” Wilson composed a twenty-second prelude for electric guitar, bass, cymbals, and saxophone long-tones that suggested an American pastoral symphony. For “Guess I’m Dumb”—written in 1964 for Glen Campbell—you can hear him competing with Burt Bacharach, in sophisticated rhythmic phrasing and harmony, and with Phil Spector, in the imagining of sound as physical mass in a physical space.
The narrators of Beach Boys songs used their time as they liked: amusement parks, surfing, drag racing, dating, sitting in their rooms. Listeners through the mid-Sixties —I wasn’t there—must have responded to the way ordinary leisure and ordinary kicks could be enshrined by a cool, modern, prosperity-minded sentimentality. (Something similar had happened with bossa nova in Brazil, four years before the Beach Boys made their first records.) After that, listeners may have seen the paradox inside the Beach Boys’ music as a whole: the drive to be a man, to know the score, to win in small-stakes battles—the animating force of “Shut Down” and “I Get Around”—versus the drive to retreat and regress or live in a world of one’s own invention, which is the drift of Pet Sounds and SMiLE. Both sides of the paradox suggest a naive state. A lot of the allure of the Beach Boys may be about not knowing: about us not knowing them, which is pretty common in the relationship between pop stars and their audiences, but also about them—in some way, if only a performed way—not knowing themselves.
One possible standard for the Beach Boys as a band, as content, as knowable historical reality, is their appearance on Good Morning America, ABC television, December 1980. Joan Lunden, in New York, interviews the band remotely, through a television monitor. The occasion is the band’s twentieth anniversary, one of the non-musical publicity hooks that have kept Beach Boys music flowing through American commerce.
The group, in Los Angeles, sits on a semicircular couch: the singer and guitarist Al Jardine (diffident, white suit over Hawaiian shirt), the lead singer Mike Love (diplomatic, ball cap and silver racing jacket), and the three Wilson brothers: the maestro Brian (puffy and impassive, thick beard, hair parted on the right), the singer and guitarist Carl (youngest and most eager to please, light suit over aqua-blue, open-necked shirt), and the singer and drummer Dennis (haggard and twitchy in the early morning after his thirty-sixth birthday—brown V-neck long-sleeve shirt and Native American-print knit vest).
Lunden: “Why do you think you’ve had such, you know, continued popularity?”
Dennis makes a rude, exaggerated shrug: Who knows? He is so, so tired. Three years later he would be dead.
Carl gives it a try. “Well, it’s the music, obviously,” he begins. “People enjoy sort of the happy, easygoing sort of spirited music that we make. The sound we all make together.” (What does all of that mean? Is that a style, an organizing principle, or the basis for anything?)
“You have teenagers, some of you,” notes Lunden, a bit later. “Do they like the music you play?”
“Yes,” answers Brian, expressionless. He speaks out of the left side of his mouth because he is deaf in his right ear. “I have two daughters, Carnie and Wendy.” Lunden: “How old?” Wilson: “Twelve and ten. And they both love our music. I mean, they play our records on their record players.” (Off-camera, Dennis rumbles: “If they don’t, they go to bed early.”) A little later, Lunden again: “What do you listen to when you’re at home?”
Brian Wilson pauses for two seconds. “I listen to a record called ‘Be My Baby,’ by the Ronettes.”
Lunden laughs, perhaps because it is such an awkwardly specific answer, and because the idea that he might listen to one record over and over is funny. Mike Love begins a canned chuckle before Wilson has finished. “Ah, the legend lives on, right?”
The clip and everything about it—the occasion for its happening, Dennis’s shrug, Al Jardine’s silence, Carl’s gameness, Brian’s self-absorption, Mike Love’s critical stewardship of the narrative—seems to say a lot, in seven minutes, about who the Beach Boys are and how they worked together. In Mike Love’s skeptical, detail-oriented new memoir, Good Vibrations, Love mentions the Good Morning America debacle in an implied can-you-believe-I-had-to-put-up-with-this-bullshit kind of way, a tone that obtains through most of the book. He is known for preferring predictable success to artistic nuance—but this can’t quite be squared with the care he claims to put into his lyrics in songs such as “Good Vibrations” and “I Get Around,” nor with his lifelong interest in transcendental meditation. We shouldn’t necessarily think we know him.
Wilson doesn’t mention Good Morning America in his own new memoir (his second), I Am Brian Wilson, a book almost the opposite of Love’s: forgiving, chaotically associative. But he often brings up “Be My Baby,” and the song’s ability to “make emotions through sound.” You sense that this is where Wilson really lives: in emotions triggered by sound. The more the book makes that clear, the better it gets.
You will read of Wilson, at a party in the mid-1970s, drunk on chocolate liqueur, commandeering the turntable and playing the song’s drum intro ten times, until told to stop; “then I played it ten more times,” he remembers. (In Peter Ames Carlin’s Catch A Wave: The Rise, Fall & Redemption of the Beach Boys’ Brian Wilson, published in 2006, the engineer Steve Desper tells of making a tape loop for Wilson consisting of only “Be My Baby”’s chorus, and leaving Wilson at home to listen to it. “I must have been gone for about four hours,” Desper told Carlin. “And when I came back, he was still listening to that loop over and over, in some kind of a trance.”) In describing his reaction to the song upon first hearing it, he remembers shooting a BB gun, as a boy, at a man in a bean field sitting on a motorcycle; pop!—the man fell off his motorcycle. It was like that, he explains: “Be My Baby” was the BB pellet, shot by someone crouching out of sight; Wilson was the motorcyclist, unaware of what was going to happen to him.
SMiLE may have been Wilson’s peak of creative ambition, perhaps of learning, perhaps of feeling that he knew himself and could believe in his own powers. The record is more Stephen Foster and less Chuck Berry, episodic and digressive, advanced and childlike, stylish and repetitive (he was entranced by the Rolling Stones’ “My Obsession,” a monument of repetition, which he saw being recorded in Los Angeles while making SMiLE). And it was a further step beyond Pet Sounds into new sounds: slide-whistles, marimbas, banjos, harpsichords, glockenspiel, body percussion, saws, hammers, and wild amounts of reverb.
It was also his Waterloo of drugs and psychosis. In I Am… Wilson appears fascinated and baffled by the question of what SMiLE means to people—the book is like Dennis’s shrug, but prolonged and more curious—even though he feels the record was “too much music—not too complicated but too rhapsodic, with too many different sections.” (Those are among the qualities for which many people worship it, and what his great underground-pop idolators of the 1990s—Neutral Milk Hotel, Animal Collective, the High Llamas—were most able to use.)
But back to “Be My Baby.” Wilson’s obsession with that song has indeed inflated into a myth, and Brian Wilson has perpetuated the myth. Neither Wilson nor the Beach Boys as such released a studio version of the song. The closest Wilson ever came to the curve of its melody in his own best work was in “Don’t Worry Baby,” recorded by the Beach Boys in 1964. But guess who did record and release a version, on his only solo album, Looking Back With Love, in 1981? Mike Love’s version is garish, cheesy, almost robotic. Its approach to a great song, Brian Wilson’s sacrament, is seemingly not to celebrate or interpret or amplify or investigate—only to reduce. Strangely enough, Wilson produced it. It feels almost like an insult, or a well-aimed BB shot: possibly a greatest hit. But it’s unclear who, or what, the target is.
I Am Brian Wilson: A Memoir by Brian Wilson with Ben Greenman is published by Da Capo. Good Vibrations: My Life As A Beach Boy by Mike Love with James S. Hirsch is published by Blue Rider.
Donald Trump made the 2016 presidential election his plaything from the outset, so it’s no surprise that he’s treating it that way to the end. The electoral process wasn’t something he respected; it was another tool for promoting himself, his long-running reality TV show having taken him far but not far enough for his ambitions to be super-famous. His entry into the presidential race now seems so long ago that we tend to forget the extent to which his becoming a candidate was considered a joke. And then for a while many of us found him amusing: his arrows hit his rivals for the nomination with deadly aim, and his disdain for normal political behavior seemed refreshing.
In time we may know authoritatively why Trump entered the race, though his “thinking” is likely to have several interpretations by several witnesses. If he ran to enhance the value of his “brand,” which is a possibility, he made a calamitous miscalculation. The Trump brand had been affixed to all sorts of products, from bottled water to buildings, and entering the presidential race was to be the greatest advertisement of all. But his businesses have already been hurt by his political venture. His opulent new five-star hotel in Washington that he’s shown off in campaign events has rooms going begging even at heavily discounted rates, and tenants in two of his residences on New York’s Upper West Side have signed petitions asking that his name be removed from their buildings.
It still seems questionable that Trump had a great desire to become president, a job that would probably bore him and certainly restrict his free-wheeling life. It’s been reported that when Trump and his top advisers reached out to John Kasich about joining him on the ticket they said that as vice president Kasich could run domestic and foreign policy. Around this time, then-campaign chairman Paul Manafort said, “He needs an experienced person to do the part of the job he doesn’t want to do. He sees himself more as the chairman of the board than even the CEO, let alone the COO.” (Kasich is one of the few elected Republicans who will emerge from this race with his integrity intact. He said he’d have nothing to do with a presidential candidate Trump and that’s how he proceeded.)
In the final weeks of his campaign, Trump has been surrounded by an unappetizing mix of advisers: Steve Bannon, the former editor of Breitbart News and high priest of the alt-right; Trump’s son-in-law Jared Kushner, whose guidance has included favoring the stunt of presenting at the second debate three women who accused Bill Clinton of sexual mistreatment; Stephen Miller, a heretofore obscure Capitol Hill aide who has worked for Michele Bachmann and most recently for ultra-nativist Alabama senator Jeff Sessions; and Kellyanne Conway, whose experience had led people to believe that she’d introduce an element of moderation into the campaign (though she’d most recently supported Ted Cruz) but has always been on hand to try to explain away his odd or even dangerous statements.
So extreme had the campaign leadership become that Trump’s national field director, Steve Jolly, quit with less than three weeks to go. Roger Ailes departed some time ago amid public complaints by others in the circle that he hadn’t been helpful in the debate preparation, which had been his main purpose. (Ailes has been having his own problems with charges of sexual exploitation of Fox women.) And Chris Christie had put some distance between himself and the campaign. This left the strangely mercurial and self-demeaned Rudolph Giuliani as Trump’s traveling buddy and introducer.
Thus, once the initial shock wore off, it wasn’t so surprising that Trump refused to pledge to accept the outcome of the election in the third and last debate, in Las Vegas on October 19. A passionate consumer of polls, Trump realized his campaign was in danger of losing after the fallout from the Access Hollywood tape. So what to do? Discredit the outcome by charging that the election was “rigged.” Fall back on the Republican myth of widespread “voter fraud.” According to an NBC-SurveyMonkey poll, 45 percent of Republicans might not accept the results of the election as legitimate if their candidate doesn’t win. Trump had been telling them the election will be “rigged” for several weeks. Might that make it all the harder for Clinton to govern? What did he care? Trump was prepared to pull down the temple’s pillars.
When it came to Trump’s unwillingness in Las Vegas to pledge that he’d concede if Clinton won, Conway and other advisers fell back on what might be called the Gore Defense. On election night in 2000 Al Gore first accepted that he’d lost to George W. Bush and later challenged the result in Florida, where there had been numerous reports of blacks being misled as to where and how to vote and other irregularities. In the end the evidence was that more people went to the polls intending to vote for Gore than were found to have done so. With the help of Bush’s brother, the governor of Florida, and then of a Supreme Court that divided 5-4 along partisan lines to stop the state recount, George W. Bush was declared president by a 537-vote margin in Florida—though Gore had won the national popular vote. Gore then gave a concession speech in which he said that he “strongly” disagreed with the Court’s ruling but that he accepted it.
Drawing on Florida in 2000, if Trump lost a crucial state by just a few hundred votes and the electoral college vote was very close, he might be justified in contesting the outcome in that state. But the 2016 presidential contest was reaching the point where Trump would have a lot of states to contest and wouldn’t have won enough of the rest to put him near 270 electoral votes. He was living in fantasyland.
Once more, Trump forced his defenders to turn themselves into pretzels. Before the final debate, Mike Pence, Ivanka Trump, and Conway had said that Trump would accept the outcome of the vote. After the debate, Pence, executing one of several backflips he’s had to perform in this race, said Trump has every right to challenge “questionable” results. The ever-limber Conway also lugged in the novel idea that the media was rigging the election.
Of course for quite a while the television media found it lucrative to have Trump on the air, and it took considerable time for the media of any kind to catch up with his fables about his so-called fabulous company and foundation. By early October, Trump’s foundation had been barred by the New York State attorney general from raising money in New York because of Trump’s various alleged misuses of its resources, including to settle lawsuits. (Trump used to say that he never settled, but that’s another story.) The reports that Trump and Kushner have been looking into establishing their own TV network take on more validity when one examines just what Trump, should he lose, will have to go back to. But there are questions as to how well this would work.
Trump’s campaign may turn out to have been the ruin of his popular image as a fabulously successful businessman. Some biographers had raised questions about his business acumen and his true wealth (a question that drives him crazy), and other New York billionaires—assuming that Trump is one—have long quietly pointed out that his business was, in the words of one, “a joke.” (The Wall Street Journal reported last March that he could not get loans from the major US banks and had to borrow from the troubled Deutsche Bank.) But when Trump entered the race the legend was largely intact.
This, by the way, may be the value of the much-maligned long election. The longer it goes on the more we find out about the people to whom we might entrust incredible and terrifying power. I’ve never quite understood the complaints about a long election; people can always stop watching if they so choose. Had this campaign been limited to six weeks, as the British elections are, we wouldn’t have known how either candidate would hold up over time. And in regard to Trump we probably wouldn’t have known about his record as a sexual predator plus his willingness to demean his accusers. We might not have been aware of how easily and frequently he lies; or caught on to his willingness to do most anything for publicity.
Which is what I think his refusal to say ahead of time that he’d accept the outcome of the election is heavily about. Had he, when asked, twice, in the third debate if he’d respect the outcome of the election, instead of his “I will keep you in suspense,” said, “Of course I will,” would he have gotten anywhere near the same attention that he’s been receiving of late? Trump is the contemporary adherent to the tenet that there’s no such thing as bad publicity, a line attributed to Trump’s spiritual ancestor, P.T. Barnum. After the furor over Melania’s purloining of segments of Michelle Obama’s speech died down, Trump tweeted, with his characteristic understatement, “Good news is Melania’s speech got more publicity than any in the history of politics especially if you believe that all press is good press!”
Thus while Clinton and other Democrats as well as some Republicans and pundits went on—with reason—about the challenge to the democratic system inherent in Trump’s remarks, he may well have been smiling inwardly. So much attention so easily attained. And if enough people believed his charges of a “rigged” election that just might alleviate the humiliation of the likely loss. It took little to start with the notion of “voter fraud” and, a la Trump, expand it into a national conspiracy to steal the election from him.
As for the voting fraud, it’s been shown time and again that actual instances of it are extremely rare; but Republicans have found that alleging it is a useful tool for discrediting losses and an excuse for writing state laws that make it more difficult for groups—especially blacks—who support Democrats to vote. In the George W. Bush administration Karl Rove engineered the firing of seven US attorneys for failing to find non-existent voter fraud. Since then, various states with a Republican governor and Republican-dominated legislatures have passed laws requiring voter ID and creating other hindrances to voting by blacks, the elderly, and students. Some of these restrictions were pushed back by the courts but, combined with the Supreme Court’s gutting of the Voting Rights Act, the odds were that in some states members of these groups would be deprived of the right to vote. (It made the difference in the 2014 North Carolina Senate race, in which a Republican defeated a Democratic incumbent by fewer votes than the number of people prevented from voting.)
Trump said other things in the third debate that might have received more attention had he gone the expected route of saying he’d accept the election outcome. For example, Trump charged that the just-begun attack on Mosul was being undertaken because Clinton was “running for the office of president and they want to look tough…she wanted to look good for the election.” Or his statement that “she should never have been allowed to run for the presidency based on what she did with e-mails and so many other things.” But unsurprisingly Trump’s interjection that Clinton was “such a nasty woman”—in response to Clinton’s jab that his Social Security taxes would go up “assuming he can’t figure out how to get out of it”—led to an uproar, in particular among women, creating a new meme reclaiming the epithet “nasty woman,” and a tool for Clinton to try to pull still more women into her corner. Trump clearly couldn’t stand being outwitted by a woman, and he lacked the discipline to resist responding.
Legend has it among Trump supporters and some members of the press that Trump had put in a quite good performance in the third debate, especially in the first thirty minutes (just as he did in the first debate), but had undercut himself with his lines about possibly not honoring the outcome of the election. Well, he was relatively coherent in the early period, but what he said made little sense. It was striking that he knew what the Heller Supreme Court decision about guns was. But his homage to the NRA and his extreme example of abortions—he said, “If you go with what Hillary is saying, in the ninth month you can take the baby and rip the baby out of the womb of the mother just prior to the birth of the baby”—weren’t so impressive.
Once again, style and substance were getting confused. Trump’s grasp on current events is tenuous at best. For at least the second time he called out US officials for letting it be known that an attack on Mosul was imminent, but that’s because he doesn’t understand that it’s a deliberate tactic to warn civilians to take flight if they choose, and that it’s considered a good thing if it encourages ISIS leaders to run, since it’s easier to target them out in the open than huddled in a warren of buildings or underground.
Clinton was back to her assured and composed style of the first debate, smiling when Trump trapped himself, enjoying watching him flail when she poked him. Trump looked puffy and ill at ease. Once again, her intense preparation paid off. Clinton made it cool again for a girl to do her homework, and for a woman to be a nerd. One foreign policy matter that came out of her comments in the third debate and may point to Clinton changing an Obama policy—a point that Tim Kaine echoed on Morning Joe the following day—was that once in office she would be at least somewhat more aggressive in Syria, by setting up a no-fly zone, which some observers find risky now that anti-aircraft missile batteries have been installed.
Trump’s gracelessness carried over the next night to the annual Al Smith dinner, which every four years invites the two major presidential candidates to speak, and the expectation is that they’ll be witty and self-deprecating. Trump just couldn’t do it. He started off on a sour note (about how people, presumably some in the audience, had come to him for money and had treated him like a bosom friend but then turned on him as soon as he entered the race as a Republican). But aside from one good joke—about Melania’s supposedly delivering the exact same speech that Michelle Obama had—his lumbering, loutish performance elicited such boos from the audience that Chuck Todd the following day called Trump the first candidate to “lose the Al Smith dinner.”
On Saturday, October 22, Trump was to take the occasion of a speech in Gettysburg to lay out his goals for his first hundred days. Why not? Nothing to be lost by that. While Clinton has a lot of programs but is still without an overarching message (“Stronger Together” doesn’t cut it), Trump has a theme (“Make America Great Again”) but has been almost devoid of concrete programs. So in his own Gettysburg address, Trump said he would call for a constitutional amendment invoking term limits on members of Congress; repeal and replace Obamacare (Trump didn’t usually bother with the replace part of it); and tighten immigration policy. In a new twist, he said that the US would pay for the wall he’d build on the Mexican border but that the US would get Mexico to repay it. This was apparently to signal that he is serious about building that wall.
But on the same day as the Gettysburg speech, an eleventh woman came forward to accuse Trump of sexual misconduct. The “adult film” actress charged that in 2006—a year after his marriage to Melania—he offered her $10,000 to spend the night with him in his hotel room. And so the undisciplined Trump began his remarks by threatening to sue each of his accusers after the election. (He’s separately threatened to sue The New York Times for its recent disclosure of two women who accused him of sexual aggression.) He also got off a complaint about the “rigged” election. Understandably, most news outlets led their stories about Trump at Gettysburg with his threats to sue his women accusers—after the election.
Republicans are now in a panic. There are signs of an impending “wave” election that would take down not only Trump but many Republican candidates for Congress. But irrespective of whether this happens (the 1980 wave didn’t develop until the last weekend before the election, when Ronald Reagan swamped Jimmy Carter, and many Democrats down-ballot also lost), it’s now increasingly likely the Democrats will take the Senate. If they pick up a net of four Senate seats, that produces a tie that a Vice President Kaine would break, but if current trends continue they may well pick up more than that. A Democratic takeover of the heavily gerrymandered House has seemed less likely, since it requires a net pickup of thirty seats, but if a wave truly develops it isn’t out of the question. However, in the 2018 midterms, the Democrats will be more vulnerable than the Republicans in the Senate, so Democratic control of the Senate, should it occur, might be short-lived.
Republican candidates who have separated themselves from Trump are on the whole doing better than those who have clung to him—it’s his supporters who have complicated matters. Incumbents who renounced Trump either because they were genuinely offended by his comments on the Access Hollywood tape or thought it would look better to take that position were assailed by his more rabid followers. Thus some Republicans went back and forth and ended up looking weak-kneed. Apparently both the Republican congressional leaders, Paul Ryan and Mitch McConnell, made the calculation to stay apart from Trump—Ryan made more noise about this than McConnell—but not to go so far that they’d rile his supporters.
This is not to suggest that, however the congressional elections turn out, a President Clinton will have anything resembling an easy time of it. Even if they’re in a minority, the Republicans will still be in a position to filibuster her Supreme Court nominees as well as any legislation she puts forth. Clinton would like the Senate to approve the long-pending nomination of Merrick Garland for the ninth Supreme Court seat—this would spare her a grueling, bitter Supreme Court nomination fight at the outset of her administration. But a spokesman for McConnell told me recently that the Republicans would continue to insist on the principle that a lame duck president shouldn’t make a Supreme Court appointment. This novel constitutional theory would spare Republicans from being vulnerable to the charge that they’d allowed the Court to turn in a more liberal direction following the death of Antonin Scalia.
Word coming out of Clinton’s headquarters is that she hopes to be able to govern in something of a bipartisan way since as a senator she worked successfully with various Republicans. But when it comes to the question of Republicans working with Clinton both the Trump base and old-line conservatives may pounce on any of them who cooperate. Also, Clinton elicits hostility on the part of many Republicans, and has done so from the time she came to Washington as first lady—and accrued powers that no one in that position had had since Eleanor Roosevelt. So the first female president won’t begin with bipartisan good will. Ryan might for his own reasons be tempted to work with her but he has to watch his right flank; his predecessor as speaker, John Boehner, was deposed for deigning to work with Obama.
If Clinton wants to work with Ryan or Senate Republicans, she’ll feel at least a tug from the left—the pull being administered by Bernie Sanders and Elizabeth Warren—which she will ignore at her peril. Clinton undoubtedly understands this. Warren, who has helped Clinton by campaigning for her, can be relied on to give a hard time to any nominees to positions dealing with financial matters who haven’t been cleared with her. Sanders, who has also made many appearances for Clinton, has his own leftward constituency to keep content. Clinton is a skilled legislator but the cross-currents she’s likely to encounter could be daunting.
For ammunition to use in forthcoming battles Clinton is now trying to roll up as large a victory as she can. In an almost predatory manner she’s now going for such states as Arizona and Georgia, long considered out of bounds for Democrats. Even Texas, reliably red since 1980, is nearly tied. As for the “swing states,” most of them appear to be lining up in her column. Ohio, with its large blue-collar constituency, has for much of this year been considered out of her reach, but as of this writing, polling suggests that the race is tied.
Clinton has dispatched her considerable force of surrogates, including, in addition to her husband, both the president and the first lady. Having had little use for politics when she came to Washington with considerable dread, Michelle Obama has unexpectedly become the star speaker of the campaign. Joe Biden, campaigning in rust-belt areas, has tried to reconnect blue-collar workers with their previous home in the Democratic Party. There are still enough days left for the unexpected to occur, but Trump is increasingly coming across as a bloated and broken figure.
Trump is without question the greatest whiner of any presidential candidate in memory: his mic was tampered with as part of a conspiracy; Clinton got the debate questions ahead of time; and, of course, the election is rigged against him. And so this sad, rasping heap of a man, a former formidable figure, shambles his way toward a humiliating defeat; his blustering and threatening, once instruments of strength, pitiable.
I am gratified that the NYR Daily chose to review Men Without Work: America’s Invisible Crisis, and that Jeff Madrick is the reviewer, since I have always enjoyed, if not always agreed with, his work.
However critical, the review is welcome because broad debate serves an essential purpose in addressing the crisis I described in my book. Even so, debate about the “Men Without Work” crisis cannot proceed without a well-informed readership. Several oversights and errors in Jeff’s depiction of my study require correction.
First: Jeff’s Trumpian insinuation notwithstanding, it is in no sense true that “Eberstadt’s analysis” focuses “particularly [on] white men.” To the contrary, Chapter 5 specifically demonstrates that the collapse of work for men has been disproportionate for African Americans. I write about all seven million prime-age men missing from the workforce—not just some of them.
Second: Jeff mischaracterizes my assessment of the role of structural economic forces in generating this crisis. Nowhere do I contest the obvious fact that the bad economy has taken a grim toll in the labor market—especially during the Bush-Obama era. I concur repeatedly with this proposition (pp. 100, 109, and 180). But I dissent from the narrative that practically all of the collapse of work for men over the past half century has been due to such factors. None of the reasons for my dissent (which takes up much of Chapter 7) are even mentioned. I hope curious readers will consult my text.
Third: I am puzzled by Jeff’s aside that “More heart-breaking, the truly disconnected are the growing proportion of men who are or have been in prison and often can’t get jobs when they are released,” without even mentioning that I devote an entire chapter (Chapter 9) to this question in my book.
Finally: Jeff misreads my argument about the role of disability benefits in the current male work crisis. I never state that our national social welfare and disability programs are “too-generous.” Rather, I highlight the fact that Europe’s welfare states are more expansive than ours, and yet somehow we are the affluent country with the most acute male flight from work (pp. 50-54).
While Jeff asserts that the expansion of disability rolls has had little impact on the collapse of work for men, Men Without Work shows (pp. 117-120) that 57 percent of men twenty-five to fifty-four years of age who are out of the labor force reported benefits from at least one government disability program in 2013. Isn’t that worth a mention? By the way: I never argue that disability programs caused the great male flight from work. My incontestable point: they helped finance it.
Our disability programs are perverse: they incentivize helplessness and dependence, and tether recipients to their often low-employment localities. They cry out for reform. NYR readers may be bemused to learn that this American Enterprise Institute researcher points favorably to Sweden as an exemplar of such possible reforms (p. 184).
Henry Wendt Chair in Political Economy
American Enterprise Institute
Jeff Madrick replies:
Nicholas Eberstadt has written a book about men out of work in contemporary America, but his main focus is what he views as the shirking of work by males—white males and also black males, young and middle-aged—enabled by too ample and ill-designed social programs. He talks consistently about “fleeing” from work, describing a new “caste” of men who can “scrape by in an employment-free existence, and membership in the caste is, in an important sense, voluntary.” To this veteran of the welfare wars, this has shades of welfare-queen rhetoric.
Economic vitality and weakness have a large part in determining participation in the labor market, as Eberstadt acknowledges. But the purpose of his Chapter 7, which I take on, is to diminish this economic relationship in order to suggest that the social programs are also a primary source of the problem. Perhaps aware of the ambiguity of his evidence, he feels obliged to draw a distinction between claiming that disability programs have “caused the great flight from work” and claiming that they have merely “helped finance it.” But this is a distinction without a difference. For example, later he writes, “We have seen the insidious role that disability programs play in sustaining the no-work lifestyle.”
In fact, the evidence I cited is overwhelmingly clear that worker participation ebbs and flows with the economy. Eberstadt and I agree that there is also a decided secular (noncyclical) trend in falling participation rates. But much of it is due to an aging work force, a high rate of poor health among those out of work (recently demonstrated by the Princeton economist Alan Krueger), and young people going back to school—not to social policies.
Eberstadt would be on more credible ground if he conceded that more men, even those with pain and disabilities, might seek work if jobs paid better wages. Good jobs are the issue. Eberstadt maintains he didn’t write that social programs in America are too generous, but his book recounts in great detail how much help Americans get from government.
In fact, only a handful of states offer disability insurance to workers. Moreover, a chart produced by the Council of Economic Advisers (CEA) shows that non-federal disability payments, including state and local disability, military retirement, civil service, workers’ compensation, and more, do not support Eberstadt’s story. They have been declining, according to the CEA, as a share of the men not in the work force.
Eberstadt argues that more small business creation as well as work-first policy reform will bring more men back to work. Yet we’ve been adopting work-first policies since Ronald Reagan, while participation in the labor force has steadily fallen. A strong economy, aided by fiscal stimulus, is the best lubricant for entrepreneurialism and better jobs.
Eberstadt wants me to credit him for mentioning Sweden as a welfare state with successful work-first programs. But it gets only a single sentence at the end of his book. There is no analysis of Sweden’s many differences from the US, including exceptionally low poverty and inequality rates and higher GDP growth. Sweden’s government disability policy is generous and multi-layered, and offers considerable help finding jobs.
I used the phrase “particularly white men” in the piece partly because I didn’t want to imply the book was yet another allegation about work-shirking black men. If the significant fall in participation rates for whites hadn’t occurred, however, Eberstadt’s book would have been much different.