Monthly Archives: January 2018

A Plate of Jellyfish

TASCHEN Köln/Niedersächsische Staats- und Universitätsbibliothek Göttingen. From Art Forms in Nature, plate 8, 1899–1904; Desmonema annasethe, the jellyfish that Ernst Haeckel named after his late wife Anna Sethe

In 1665, Robert Hooke published Micrographia, a small, illustrated treatise on the world seen through his simple compound microscope. Most of his subjects could be found on the kitchen table: the mold on a piece of cheese, the weave of the tablecloth, the cells in a sliver of cork. His most famous image is of a magnified flea. His engravings captivated and unnerved the public; the kitchen table was forever changed.

First Run/Icarus Films. Ernst Haeckel and his assistant Nicholas Miklouho-Maclay, posing as naturalists in the Canary Islands, 1866

Two hundred years later, twenty-five-year-old Ernst Haeckel paddled a rowboat through the Gulf of Messina, sieving zooplankton with a fine mesh net and writing home to his fiancée with as much excitement as if he were panning for gold. Armed with a 1,000x microscope, the best money could buy in 1859, he was cataloging radiolarians, taking up the study of his late mentor, Johannes Müller. Radiolarians are single-celled marine organisms named for their radial forms, between one and two tenths of a millimeter in diameter and armored by silicate skeletons as variable as snowflakes.

Study for a plate in Art Forms in Nature on human evolution, 1906; considered scandalous for its nudity and also for its evolutionary implications, Haeckel’s sketch (made with painter Gabriel Max) was never published as a print

Haeckel’s name has not endured as well as the words that he coined—among them, phylum, ecology, and stem cell. Yet he was a giant of nineteenth-century scientific thought, and responsible for the contagious spread of Darwin’s theory of evolution in Germany. An atheist iconoclast, he took up a crusade against religious dogma in scientific fields, tearing a rift between fundamentalist faith and Darwinism that rages today. His take on evolutionary theory, known as the biogenetic law or recapitulation, suggested that the evolution of a species was reflected in the individual’s embryonic development; the similarities between embryos of different species might point to their common ancestral origin. His research led him to speculate on human origins, and he was the first to draw out trees of evolutionary relationships between species. Recapitulation was a flawed concept, and so were Haeckel’s ideas about racial hierarchy, which the Nazis would readily co-opt. His evolutionary trees look like oaks, because Haeckel was also an artist, and the quality of the visuals was important to him.

TASCHEN Köln/Niedersächsische Staats- und Universitätsbibliothek Göttingen. Monograph on the Radiolaria, volume 1, plate 15, 1862

A new book by Taschen has gathered the art from Haeckel’s monographs on the taxonomy and anatomy of marine organisms. Both art book and textbook, The Art and Science of Ernst Haeckel is coauthored by Rainer Willmann and Julia Voss, specialists in phylogenetics and German art history, respectively. At fourteen pounds, it left my arms jelly after I hauled it home. But anything lighter would not do justice to the prolific output of Haeckel’s career.

From The Evolution of Man, plate 15, 1879; Haeckel was the first to draw an evolutionary tree. Darwin wrote to Haeckel after reading The History of Creation: “My dear Haeckel… your boldness sometimes makes me tremble, but… some one must be bold enough to make a beginning in drawing up tables of descent.”

It was while preparing his research on radiolaria that Haeckel read On the Origin of Species in translation. In the varying forms and complexities of the radiolarians, Haeckel saw evidence for the diversification of species. His first book, Monographie der Radiolarien, was published in 1862, complete with thirty-five sumptuously engraved plates based on his drawings of 144 new species he had observed on his trip to Italy; the speed and perfectionism of his renderings, made with one eye to the microscope and one hand sketching what he saw, are extraordinary. The book, published for a mass audience, was a great success in Germany and abroad. The anatomist Max Schultze wrote of Radiolarien, “I find myself undecided as to whether to feel more astonished by nature itself and its capacity to bring forth such diversity and beauty of forms, or by the hand of the draughtsman in his ability to capture such magnificence on paper.” The monograph earned Haeckel a professorship at the University of Jena, a post from which he would champion Darwinism through lectures and his bestselling 1868 book, Natürliche Schöpfungsgeschichte (The History of Creation).

TASCHEN Köln/Niedersächsische Staats- und Universitätsbibliothek Göttingen. Monograph on the Medusae, volume 1, plate 1, 1879

In the midst of his early triumphs, Haeckel was devastated by the premature death of his wife, Anna Sethe. In his grief, he turned to the observation of tide pools, christening two jellyfish after her: Mitrocoma annae and Desmonema annasethe, particularly elegant species whose tentacles looked to Haeckel like curling tendrils of blond hair. Annasethe was shipped to him from Africa, preserved in wine in a tin. From the crumpled brown specimen, he reconstructed the shimmering tropical flower that is arguably his most famous image. On the lithographic plate, he fudged the size ratios so that it would be the largest of the three species. Haeckel’s prioritization of style and symbolism over strict accuracy may have corrupted the scientific value of his diagrams, but it elevated them into art.

From Art Forms in Nature, plate 31, 1899–1904; in addition to jellyfish, Haeckel also named a radiolarian after his wife (Dictyocodon annasethe, bottom, center)

Haeckel nurtured a lifelong affinity for painting landscapes, though Voss remarks that they lack the originality of his later monographs. His realist approach to biodiversity was not so different from his approach to landscapes; it’s just that naturalistic landscapes are a cliché, while jellyfish are cool and weird. Despite their baroque stylings, Haeckel’s monographs are anatomically very accurate. His mastery of chiaroscuro brought visual clarity and dimensionality to his subjects that were impossible to achieve by photography, especially with its black-and-white limitations at the time. What makes his images so useful to artists is that they’re mostly untampered-with; Haeckel seems to have had no pretensions that he could make anything more beautiful than nature already had, modestly claiming to be “no accomplished artist, but only an enthusiastic dilettante whose moderate talent, through extensive practice and heartfelt dedication, has been directed usefully to nature.” Haeckel had wanted to illustrate a plate on the evolution of humans, with the likeness of his mistress, Frida von Uslar-Gleichen, surrounded by the greater apes, but he doubted his ability to do her artistic justice. Instead, he named a majestic jelly in her honor, Rhopilema frida, which since its publication has inspired a bloom of tentacled chandeliers from sculptors and craftsmen.

TASCHEN Köln/Niedersächsische Staats- und Universitätsbibliothek Göttingen. From Art Forms in Nature, plate 71, 1899–1904

In Kunstformen der Natur (Art Forms in Nature), 1899, science was put in the service of art. Haeckel believed that evolution would unite science with art and philosophy under one discipline, through which humans could reach a greater understanding of their world. His intention was to make the natural forms of elusive organisms accessible to artists, and supply them with a new visual vocabulary of protists, mollusks, trilobites, siphonophores, fungi, and echinoderms. Opening Art Forms, which is excerpted in The Art and Science, is like stepping into a cathedral, a place crafted by human hands that nonetheless inspires awe of the divine. Within are jellyfish that look like flowers, protists that resemble Fabergé eggs, presented like crown jewels on black velvet, the seeming cosmic vastness of the images belying their actual, microscopic size.

Gustav Klimt: Water Snakes II, 1907

Library of Congress Prints and Photographs Division. René Binet took his inspiration for the entrance gate at the Exposition Universelle in Paris, 1900, from Haeckel’s radiolarians

TASCHEN Köln/Niedersächsische Staats- und Universitätsbibliothek Göttingen. From Art Forms in Nature, plate 28, 1899–1904

Artists took heed. Art Nouveau is crowded with the natural arabesques and patterns that seduced Haeckel. The crystalline structures of radiolaria made their way into the architecture of Antoni Gaudí, René Binet, and visibly into Bruno Taut’s Alpine Architecture, which imagined radiant utopian cities built of crystal and glass (one of Haeckel’s weirder treatises postulated that crystals possessed souls). The fluid forms of jellyfish charmed Klimt, whose women seem to swim across the canvas. Ghostly embryos haunt the works of Munch. The Surrealists found inspiration not only in the beauty of Haeckel’s images but in the darkness underlying them; Max Ernst’s biological distortions invoked an evolution toward not beauty but perversion.

From Art Forms in Nature, plate 88, 1899–1904; Haeckel’s tribute to Frida von Uslar-Gleichen (Rhopilema frida, center) was used as the model for several chandeliers

Haeckel’s images are still referenced by contemporary artists like Philip Taaffe and Thomas Nozkowski. Still more artists have embraced his larger, cross-disciplinary project—such as Jen Bervin, who uses the structure of DNA in her poetry to explore the life cycles of silk worms; Lauren Redniss, whose illustrated books on science blend art, history, and lyric essay; and, of course, the musician Björk, whose album Biophilia makes emotional metaphors of plate tectonics and viruses, and who is rarely seen without a handcrafted Haeckelian tendril glued to her face. To see protists in our pop culture still strikes us as distinctively modern—although the organisms are ancient, only recently were we formally introduced.

The Art and Science of Ernst Haeckel is published by Taschen.


The Nunes Memo Kremlinology

Andrew Harrer/Bloomberg via Getty Images. Devin Nunes, chairman of the House Intelligence Committee, Washington, D.C., February 27, 2017

On the day before President Trump was to deliver his first State of the Union speech, an event his furious, humiliated wife reportedly would attend only grudgingly, the rest of us were reminded that a state of undeclared civil war now exists in America. On one side are the president, his Republican allies in Congress, scions of finance and commerce who are cashing in on the administration’s widening corruption, white nationalists and their enablers, the resentment-nursing, swindled “forgotten men and women,” and the gleeful Russians. On the other side are the rest of us, including longtime public servants at the Justice Department and the FBI, and congressional Democrats who have the facts but clearly not the power on their side.

It is a war that has been waged since the president took his oath of office with his “American Carnage” speech, designed to terrify and split an already-divided nation. For more than a year now, one side has relentlessly pressed forward thanks to congressional majorities in both chambers and control of both the judicial and executive branches. The other side has fought largely on the defensive, with rearguard actions and delaying tactics, in the hope that the tangible damage to the country, to the rule of law, and to democracy itself can be limited between now and next January, when the next Congress, perhaps a more Democratic Congress, is seated.

The war has taken two forms. It has featured the slow dissolution of legal and political norms, of common language even, but also sporadic days of great import, when the scope of what this president and his fellow travelers are capable of is laid bare. Yesterday was one of those days. No more can we say that the Trump administration is necessarily going to obey bipartisan congressional directives designed to punish our foreign adversaries, any more than we can say that the president’s lackeys on Capitol Hill are content to allow the investigation into his ties to Russia to proceed without direct, partisan interference. Let me put it this way: Monday was the day Congress obstructed justice to aid the president. And it was the day the White House obstructed justice to aid the Russians.

Monday began with the now-routine litany of reported anecdotes detailing the emotional and intellectual fragility of the president and his inability to behave in ways that diminish or slow the evolution of the obstruction case against him. We learned that he had had yet another temper tantrum when he was reminded recently that the Justice Department is not his personal police force or detective agency. And we were reminded of his streak of cruelty in the taunt—gratuitous even by his own standards—that he issued during a phone call some months ago to Andrew McCabe, the deputy director of the FBI, who announced Monday that he would be leaving his post six weeks early. For Trump, McCabe’s decades of public service, and his commendable record of federal law enforcement, were nullified by the lawman’s matrimonial choice: McCabe had had the temerity to marry a Democrat who ran for office. If McCabe wasn’t hostile to Trump before this episode, he surely should be now.

As afternoon turned into evening, we learned that Republicans on the House Intelligence Committee, led like lambs by Representative Devin Nunes, had voted to release a memo they have ginned up that purports to detail overzealousness on the part of FBI agents investigating Carter Page, a former Trump campaign adviser who met with Russian officials in the summer of 2016 (and who had denied those meetings for months). Page is a patsy here; on that, all agree. The partisan purpose of this memo, and of releasing it in this way, is to link the FBI to Christopher Steele, a former British spy whose work highlighting the Trump team’s extensive ties to Russia was funded by, among others, Hillary Clinton and the Democratic National Committee.

In the addled world of conservative media, of dark conspiracies, which is to say the addled world of a president who gets his news from Fox News, Steele’s link to Clinton taints whatever information the FBI gleaned from his “dossier.” That taint, the theory goes, in turn taints the FBI and the Justice Department and, by extension, the work of special counsel Robert Mueller, who was appointed, after all, by Rod Rosenstein, the deputy attorney general who reportedly is one of the targets of the memo’s GOP-crafted scorn. It’s a convenient theory—one that would neatly topple all the officials who threaten the president and his tenure like bowling pins—but one entirely untethered to law or fact or reality.

For the rest of us, the memo is nonsense. A distraction. A cheap stunt designed to give cover to the congressional Republicans while the president continues his assaults on the very structures of federal law enforcement. Actually, it’s not simply nonsense, it’s dangerous nonsense. It is a partisan summary of raw intelligence information whose release Trump’s own Justice Department last week said would be “extraordinarily reckless.” To believe that the memo undermines the work of the FBI, Justice Department, and Mueller is to disbelieve the mountain of independent evidence that corroborates, in whole or in part, material aspects of Steele’s work. The world no longer needs to rely on Steele or his dossier to understand how deep and abiding were the ties between Team Trump and the Russians. It need only note how many of that team are already under criminal indictment or are actively cooperating with Mueller and company.

As a hearsay summary of a written report, the Republican memo would never be allowed into evidence in a trial unless the judge allowed the other side to cross-examine the memo’s authors on the accuracy of the document. Under that scrutiny, by any competent lawyer, the credibility of its contents would soon be in tatters, and the ploy by House Republicans exposed for what it is. We will likely never see that cross-examination in a court of law. But we can and should see it in the coming days and weeks in the court of public opinion.

The last headline of the day came when we learned that the White House would not impose the sanctions against Russian officials that a bipartisan Congress demanded last year. Another win for Russia; another loss for those Americans who believe that foreign states that meddle in our elections should be discouraged from doing so again (including, for example, during the mid-term elections this November). By refusing to accede to congressional directive here, by refusing to fully punish Russia for improper interference in our democratic process, the Trump White House didn’t just veer from its “tough-on-crime” theme. It also showed us how emboldened it feels. 

The president evidently now sees no material risk in publicly ingratiating himself with the Kremlin even as Mueller and his investigators build their case that the Russians did the Trump team favors during the 2016 election. What the White House did on Monday, then—blowing off Congress on sanctions—is strong evidence of a quid in the quid pro quo, and it raises new questions about the extent of continuing undue Russian influence over the president and his administration. The White House tactic here is striking: rather than getting tough with the Russians to undermine Mueller’s case, they have intensified their Russian romance. There is no attempt at a cover-up, no remorse or regret; instead, the ploy is to discredit the tried and true cops who are investigating a potential crime.

On this black Monday, congressional Republicans undermined generations of legislative history and precedent to help a president who then, before the sun had set, undermined the will of Congress in its battle to rein in the Russians. Some will call this treason. Others, obstruction of justice. I’d rather call it giving aid and comfort to the enemy. The really bad news of the day was the inescapable conclusion that the real enemy America faces is not foreign, but domestic.


The Nonbinary Gender Trap

Musée Picasso, Paris/Bridgeman Images/Estate of Pablo Picasso/Artists Rights Society (ARS), New York. Pablo Picasso: Maya with a Doll, 1938

As California residents rang in 2018, they joined residents of Oregon and Washington, D.C., in having the option to revise their legal genders from either “male” or “female” to “nonbinary.” California’s enactment of the Gender Recognition Act, which was signed into law by Governor Jerry Brown last October and starts to take effect later this year, is but one more sign of a slow but steady sea change. Washington State has followed suit, and similar third-gender options are under consideration in New York and Vermont. Under California’s new law, legal gender is dramatically reframed. No longer is it based on which sex (male/female) one was assigned at birth. Instead, legal gender will be based on what California lawmakers deem “fundamentally personal”: gender identity.

A person’s gender identification encompasses identification with certain physical features as well as gender expressions, gender norms, and gendered language. This means that, regardless of one’s physical characteristics, a Californian who does not identify as a man or a woman may change their legal gender to “nonbinary.” The motivation for this shift is clear: according to California Senator Scott Wiener, the nonbinary option means that “transgender and non-binary people will now be able to identify themselves as they are, not as who society tells them they should be.” (California’s law goes further than other states’ measures, permitting citizens to opt for nonbinary on their birth certificates, as well as other official documents like driver’s licenses.)

California legislators, and progressive lawmakers in other states, may be acting from the best of motives, but this swath of new legislation rests on a dangerous mistake. I say this as a nonbinary person, one who identifies as genderqueer and uses the gender-neutral pronouns they and them. I’m also a philosophy professor and spend much of my time thinking about the ways in which gender categories are constructed and enforced. For me, adding “nonbinary” to the list of legal gender options does not address the core problem: any legal system that requires a person to record their gender perpetuates government control over our bodies and identities.

Even as it was signed into law, the Gender Recognition Act was condemned by the right as opposing nature, science, and even God. Conservative groups argued that legal gender should reflect the supposed fact that nature designates everyone as a “biological male” or “biological female.” The California Family Council dubbed the Act the “Change Your Gender Bill” and derided it as replacing “a physical description of reality” with “a description of feelings and self-identification.” According to such groups, the government should uphold an ideology that determines gender on the basis of whether, at birth, you have a penis or a vagina. Many on the left, by contrast, have hailed the Act as championing trans rights and progressive ideals.

Despite their apparent disagreement, both conservatives and liberals share a fundamental assumption. Conservatives insist that the state should record what genitals I have. Liberals insist the state should instead record my self-identity. But both assume that the state should be concerned with my gender, whatever they understand that to be. In so doing, each side—whether tacitly or intentionally—endorses the use of legal gender to reinforce its own preferred gender ideology. To fail to see that this appoints the state as final arbiter is naïve and dangerous.

Senator Wiener’s comments in support of the Act also betray a widespread illusion that our gender identities exist independently of legal frameworks. He seems to believe that the state’s work is merely to reflect the genders of individuals. But the legal institutions we navigate every day do not simply reflect gender. Laws entrench the language, concepts, rights, and obligations constructing gender categories. The society that Senator Wiener believes “tells us who we should be” cannot be divorced from the institutions that structure that society: laws, as much as persons around us, set up the gender categories we do or do not identify with.

I grant to progressive lawmakers that it is better to have a third, nonbinary option than to limit constituents to “male” and “female.” It is an achievement to resist a near-universal legal practice of marking (and policing) bodies according to a binary classification of reproductive features. The best solution, though, would be eliminating all gender markers on state-issued identification. Americans should not have to resign themselves to a choice between two legally classified genders based on genitals and three legally classified self-identities.

The new law forces Californians to identify within a ternary range of options even though lawmakers apparently recognize that the scope of gender identities outside of male and female is vast and effectively unlimited. The Act states: “Nonbinary is an umbrella term for people with gender identities that fall somewhere outside of the traditional conceptions of strictly either female or male… such as agender, genderqueer, gender fluid, Two Spirit, bigender, pangender, gender nonconforming, or gender variant.” In other words, California now lumps all identities other than “male” and “female” under the “nonbinary” label. This reduces alternative gender identities to “not a woman or a man.” Far from escaping the gender binary, this and any similar law will continue to define every gender identity with reference to the binary. It perpetuates the common prejudice that binary identities are somehow more legitimate than the multitudes of other identities. Rather than deconstruct gender binarism, lawmakers have, in effect, shored it up.

Admittedly, the current legal system encodes gender so deeply that there is no immediate route to gender-neutral jurisprudence. The only viable options presently available to lawmakers might be three legal genders or two. But we must begin to think beyond this dichotomy. There is no good reason why our bodies or our gender identities should be recorded by the state. These designations have been used to police heterosexual marriage (until 2015) and gender segregation in voting (until 1920), and to enforce females’ ineligibility to serve in the military (until 1948); they were also used to maintain females’ exclusion from universities (including my own, Yale, until 1969). History suggests that federal and state governments have encoded biological sex in the service of institutional sexism. Legal gender has, by and large, allowed the clean and easy exclusion of females from positions of social and political power.

I say “by and large” because these designations have also been used at times to correct sexism. Consider Title IX, the law that prohibits sex discrimination in education programs that receive federal funding. It is perhaps best known for requiring schools and universities to fund women’s sports, a measure that revolutionized women’s participation in athletics in the United States. Legal gender was part of enforcing Title IX: without the law’s ability to track male and female students, it might have been more difficult to ensure that females had equal access to education programs. Title IX is but one example of the ways in which legal gender has been used to secure certain rights for women in the United States.

But now Title IX faces problems. It was built on assumptions that did not take into account the complexity that intersex, trans, and nonbinary students introduce to education systems. The new landscape creates questions that Title IX is inadequate to answer: Can a school legally discriminate against nonbinary persons? Has a school sports league violated Title IX by forcing a transgender boy to wrestle against girls?

Perhaps these questions have clearer answers if legal sex is replaced by legal gender identity. But then, new worries arise. Consider, in this regard, California’s Gender Recognition Act. This Act unfortunately transgresses its own tenets by violating citizens’ rights to privacy. The legislation states in one breath that gender identity is “fundamentally personal” and that there ought to be “state-issued identification documents” that reflect gender identity.

Gender identities are complex and deeply individual. There is no single way to articulate the basis for gender identity, and not all citizens have stable and clearly defined identities. Any legal categorization of gender identity, which necessarily requires external criteria of identity (such as attesting to it “under pain of perjury”), is sure to be reductive and inadequate. I do not want one of my most personal understandings of myself recorded and fixed on my driver’s license any more than I would like my genitalia recorded on it.

I applaud California lawmakers for identifying and seeking to correct the negative impact that legal gender markers have on trans and nonbinary lives. But their solution takes us sideways, and other lawmakers would do well to carefully consider the costs. California’s solution fails to address the fundamental problem with any legal gender system: there is no defensible basis for legally recording people’s gender identity or sex, and we should not perpetuate the state’s assumed right to define citizens’ gender identities. The way forward is the way out.


The Worst of the Worst

Gerald Scarfe

On January 10, The Washington Post reported that Donald J. Trump passed a milestone that none of his predecessors is known to have attained: just short of the anniversary of his first year in office, he told his two thousandth lie.[1] It had happened sometime the day before, when the president was meeting with legislators to discuss immigration and tossed out a few of his old standbys—about how quickly the border wall could be built, about “the worst of the worst” gaining entry to the United States through a visa lottery, and about his wall’s ability to curtail the drug trade.

The path from the first lie to the two thousandth (and now beyond), a veritable Via Dolorosa of civic corruption, has been impossible for even the most resolute citizen to avoid. Trump is in our faces, and our brains, constantly. Yet the barrage is so unceasing that we can’t remember what he did and said last week, or sometimes even yesterday. Do you remember, for example, that first major lie? It was a doozy: the one about how his inaugural crowds were larger than Barack Obama’s, larger than anyone’s, the largest ever, despite the ample photographic evidence that rendered the claim laughable.

That was Day One. On Day Two, he sent his press secretary, Sean Spicer, out to meet the White House press corps for the first time. In that ill-fitting suit jacket that appeared to have been tailored for someone with a neck a good three inches thicker than his, Spicer insisted that the photographs were misleading and the press was wrong. Not just wrong—lying. “There’s been a lot of talk in the media about the responsibility to hold Donald Trump accountable,” he said, sputtering his words in terse reports as if they were issuing from a machine gun.

And I’m here to tell you that it goes two ways. We’re gonna hold the press accountable as well. The American people deserve better, and as long as he serves as the messenger for this incredible movement, he will take his message directly to the American people, where his focus will always be.

Here we are, a year later. From my reading and television viewing, the general assessment of most pundits seems to be that it’s been worse than we could have imagined (except on the Fox News Channel, where everything in Trump world is coming up roses and the gravest threat to democracy is still someone named Clinton). But honestly, who couldn’t have imagined any of this? To anyone who had the right read on Trump’s personality—the vanity, the insecurity, the contempt for knowledge, the addiction to chaos—nothing that’s happened has been surprising in the least.

I think most close observers of Trump understood his personality perfectly well. If that’s right, what, then, could explain the surprise? Maybe just a reasonable disbelief that a president would, for example, remark crudely on a female television anchor’s facelift surgery, or actually encourage the Boy Scouts—the Boy Scouts!—to boo his predecessor (remember that one?). But I think there has been some deeper collective refusal on the part of the political class to acknowledge what has happened here, and of course to own up to their part in it. No one (on this point I include myself) believed Trump could win. No one took his candidacy seriously enough. This is especially true of the press, which, in hammering away on Hillary Clinton’s e-mails, assumed itself to be in training to refight the wars of the 1990s once the Clintons moved back into the White House.

When we are forced to confront the reality of a shocking outcome that we never thought would happen, we start rationalizing: It wasn’t our fault. There’s nothing we could have done. Maybe it won’t be so bad. Maybe it’s what we deserve. And maybe, in some strange way, it will all work out.

To be fair to the press, the reporting on the Trump administration has been thorough and often unflinching, willing to call a lie a lie (witness the Post’s list) and even resolved, as we saw recently, to print the word “shithole” in news stories and headlines and say it on air. But the rules of journalism generally prevent news outlets from rendering larger judgments. Journalism tries to get the little stuff right but often gets the big stuff wrong.

Enter Michael Wolff.

For a good two decades, Wolff has chronicled the doings of important people—in media, mostly, but also in Silicon Valley and show business and politics. His subject is power and how men (and some women) use it. We were colleagues for a few years at New York magazine. We talked politics sometimes, but he was always rather sphinxlike on the topic. I suppose if one sat him down and asked him his positions on abortion rights and same-sex marriage and global warming and so on, he’d come out a liberal. But positions per se don’t interest him much. What interests him is how men to whom history has given the stage command it or fail to.

Wolff interviewed Trump in the spring of 2016 for The Hollywood Reporter, his current home base. When the resulting piece—not fawning, but by no means written with the acid pen for which he is often known—appeared, he e-mailed Trump press aide Hope Hicks. Her oh-so-very Trumpland verdict: “Great cover!” (It featured a grainy shot of Trump’s head floating in air, with images of Hillary Clinton and Bernie Sanders in his mirrored sunglasses.) Wolff then approached Trump after the election, as he wrote in the Reporter following Fire and Fury’s release, with an offer:

I proposed to him that I come to the White House and report an inside story for later publication—journalistically, as a fly on the wall—which he seemed to misconstrue as a request for a job. No, I said. I’d like to just watch and write a book. “A book?” he responded, losing interest. “I hear a lot of people want to write books,” he added, clearly not understanding why anybody would. “Do you know Ed Klein?”—author of several virulently anti-Hillary books. “Great guy. I think he should write a book about me.” But sure, Trump seemed to say, knock yourself out.[2]

In an in-house interview with a Reporter staffer timed to the book’s release, Wolff confirmed that politics and ideology weren’t exactly foremost in his mind. He was asked whether he had been “just a neutral observer” in the White House:

Completely. I would have been perfectly happy to have written a contrarian book about how interesting and potentially hopeful and novel Trump-as-president was. I would have written a positive Trump book. And I thought it would be a fun thing to do—an audacious way to look at the world.[3]

And so Wolff got his access, as he wrote in his own Reporter piece, because Trump’s “non-disapproval became a kind of passport for me to hang around.” He benefited, no doubt, from the unprecedented disorganization of the Trump White House, where no one quite had the authority to tell him to leave. He made his appointments, but basically, he says, he just plopped himself down on a couch in the entrance foyer of the West Wing. I know the anteroom, and the couch. The Cabinet Room is to the left as you face the sofa (with the Oval Office off of it), the Roosevelt Room to the right, and many offices nearby. Everyone walks by. It’s a grand place to sit and watch.

In no time, he was horrified at what he saw. You surely know by now the book’s big scoops, which are stunning. The biggest one is dropped very late in the book, in a section about Trump’s sons, Don Jr. and Eric, who “existed in an enforced infantile relationship to their father, a role that embarrassed them, but one that they also professionally embraced.” Wolff is discussing the infamous meeting on June 9, 2016, at Trump Tower that Don Jr. had with Paul Manafort—who would soon be named Trump’s campaign chairman—Jared Kushner, and the Kremlin-connected Russian attorney Natalia Veselnitskaya.[4] The New York Times broke the news of the meeting last July, and sources told the paper that Don Jr. was looking for dirt on Clinton.

Wolff writes that “not long after the meeting was revealed,” he was talking with Steve Bannon, then still the president’s chief political strategist, who said: “Even if you thought that this was not treasonous, or unpatriotic, or bad shit, and I happen to think it’s all of that, you should have called the FBI immediately.” It was this quote—far too frank, and given to “the enemy,” i.e., the nonconservative media—that got Bannon fired from Breitbart News, to which he had returned after he left the White House (and where, incidentally, he was earning $750,000 a year in 2013, which was at least $200,000 more than Jill Abramson, the editor of The New York Times at the time, was making). The quote has landed Bannon in the conservative movement doghouse, but we all know these things can change; it’s not hard to imagine the mercurial Trump deciding someday that Bannon is of use to him again.

That quote was hard news, but the most breathtaking scoops in Fire and Fury are about Trump himself. “What a fucking idiot,” Rupert Murdoch is reported to have muttered as he put down the phone after talking to Trump, then president-elect, about immigration. Trump’s complete incuriosity and resistance to learning anything is a constant theme. “Here’s the deal,” one of Trump’s close associates told Reince Priebus, when the former Republican Party chairman was named the president’s chief of staff. “In an hour meeting with him you’re going to hear fifty-four minutes of stories and they’re going to be the same stories over and over again. So you have to have one point to make and you have to pepper it in whenever you can.”

The White House and the GOP, of course, have tried to attack Wolff’s credibility. Trump himself, on January 13, called Wolff “mentally deranged” in a tweet (his first public remarks that day, after he finished the round of golf he was playing when Hawaii received a false missile attack alert). Even presumably sympathetic interlocutors like Stephen Colbert have pressed Wolff on why he doesn’t make public the recordings of these conversations he says he has. And some journalists have spotted small mistakes, like a Washington Post reporter placed at a meeting he apparently did not attend.

But by and large, Wolff’s reporting has held up. It makes perfect sense that Trump not only didn’t expect to win, he didn’t want to win. By the end he campaigned feverishly, because he’s a competitive man, and he surely hated the thought of losing, as he would have put it, to a girl. But he did not want to be the president:

He would come out of this campaign, Trump assured [Roger] Ailes, with a far more powerful brand and untold opportunities. “This is bigger than I ever dreamed of,” he told Ailes in a conversation a week before the election. “I don’t think about losing because it isn’t losing. We’ve totally won.” What’s more, he was already laying down his public response to losing the election: It was stolen!

On election night, when it became clear that he had won, Trump, Wolff writes, seemed paralyzed. Melania, his wife, was crying, and not tears of joy.

The attacks on Wolff haven’t stuck partly because it all rings so true. But I think there is also another reason. Some critics have tried to accuse Wolff of not playing by the standard rules of journalism, by which they mean to insinuate that he’s taken off-the-record material and put it on the record. But no one has produced evidence of this. And in fact, outside of eight or ten salacious quotes, nothing in Fire and Fury seems out of the ordinary. Indeed, for quite long stretches, it reads like any conventional work in the genre. Trump himself disappears for pages at a time. The running theme is the feud between Bannon and “Jarvanka,” Wolff’s portmanteau for Jared Kushner and Ivanka Trump, which is no more inherently interesting than the feuds inside any other White House. At times one can feel Wolff himself getting a little bored.

However, there is one sense in which he doesn’t play by the usual rules. Wolff doesn’t do “fairness.” He comes to his conclusions, and he lets you know them. He doesn’t tell the other side. No New York Times or Washington Post reporter could have written this book. They follow rules that demand more “balance,” rules under which they might have been more likely to get all the small things absolutely right but would have diluted the larger truth. And so, free from that stricture of straight news reporting, Fire and Fury has performed a great public service: it has forced mainstream Washington to confront and discuss the core issue of this presidency, which is the president’s fitness for office.

Alex Wong/Getty Images. Press Secretary Sarah Huckabee Sanders listening to President Trump speaking on video during a news briefing at the White House, January 2018

While Wolff doesn’t emphasize policy positions, they are of keen interest to David Frum, the conservative commentator who writes for The Atlantic and graces cable-news television screens frequently, often these days in need of a shave. For many years, Frum has been an astute observer of the conservative movement. It is a measure of how conservatism has changed—and possibly of how Frum has changed, but I think more the former—that he used to write books and articles criticizing the Republican Party from the right. His 1994 book, Dead Right, argued that Republicans had drifted from their ideological roots of minimal government, fiscal responsibility, and personal probity. From today’s vantage point, of course, the concept of drift circa 1994 seems quaint.

Today Frum—also a onetime colleague of mine, at The Daily Beast—is part of a group of (former?) neoconservatives who have emerged as some of Trump and Trumpism’s most forceful critics. It seems likely, in the first instance, that their objections to Trump had to do with his Lindberghian foreign-policy views, which offend everything they have stood for. Back in late 2015, neoconservative criticisms of Trump tended to be focused on his isolationism. But in fairness, their attacks on him have expanded far afield from foreign policy, to his rhetoric and behavior and destruction of norms and all the things liberals care about.

Weekly Standard editor Bill Kristol was a neoconservative writer, organizer, and theorist for a quarter-century, at the barricades on controversies from health care reform to the Iraq War (he was also the most important promoter of Sarah Palin, who embodied Trumpism before Trump became Trump). Now he regularly issues withering tweets about Trump and is a fixture on the liberal-leaning MSNBC. The foreign policy writer Max Boot was a vocal and at times strident champion of the Bush Doctrine. These days he’s a ferocious and shrewd critic of the president. Washington Post blogger-columnist Jennifer Rubin was, among prominent conservative pundits, probably Mitt Romney’s most aggressive defender in 2012 and aside from that was known for her hard-line foreign policy views, particularly on matters relating to Israel. Now, her columns often read as if they could have been written by the late Molly Ivins. (Two recent Rubin headlines: “Trump Retreats on Iran, and He Will Need to Do So Again”; “The Enablers of the Racist President Are Back at It.”)

Observing the extent to which the Trump era has forced these and other conservative writers into a thoroughgoing reassessment of their movement, their party, and themselves—and wondering who among them will do a complete ideological volte-face—has become a parlor game in Washington journalism and intellectual circles. Rubin seems farthest along that road. But Boot turned quite a few heads at year’s end with a column in Foreign Policy headlined “2017 Was the Year I Learned About My White Privilege,” in which he wrote:

It has become impossible for me to deny the reality of discrimination, harassment, even violence that people of color and women continue to experience in modern-day America from a power structure that remains for the most part in the hands of straight, white males. People like me, in other words.

It’s hard to imagine these folks becoming liberals, but it’s also pretty difficult to picture someone staying a conservative after experiencing an epiphany like that.

Frum insists that he is still a conservative and writes in Trumpocracy that he wants “a conservatism that can not only win elections but also govern responsibly, a conservatism that is culturally modern, economically inclusive, and environmentally responsible.” His is a far more polemical book than Wolff’s, and Frum is a skilled polemicist, capable of producing lines that carry rhetorical precision and force but stop short of screaming for attention:

Trump has contaminated thousands of careers and millions of minds. He has ripped the conscience out of half of the political spectrum and left a moral void where American conservatism used to be.

If journalism is the first draft of history, Trumpocracy reveals Frum’s intent that he be one of the first out of the gate offering a second draft. He acknowledges in his introduction that there is a risk of events overtaking his arguments and proving some of them wrong; however, he adds, “if it’s potentially embarrassing to speak too soon, it can also be dangerous to wait too long.”

Frum’s warning is expressed in his subtitle—that Trump’s rule is a very real threat to the republic. At other times in his text he calls Trump a threat to democracy. Those are two different things—“republic” refers to a body of laws, “democracy” to majority rule—and while it’s true that Trump is a threat to both, it would have been helpful to readers unsure about that distinction if Frum had explained both threats in more detail and laid out why they’re different. Nevertheless, his broader warning about what the alert citizen should be on the lookout for is on point:

The thing to fear from the Trump presidency is not the bold overthrow of the Constitution, but the stealthy paralysis of governance; not the open defiance of law, but an accumulating subversion of norms; not the deployment of state power to intimidate dissidents, but the incitement of private violence to radicalize supporters.

The book is organized into chapters with dramatic titles: “Enablers,” “Appeasers,” “Plunder,” “Betrayals.” “Enablers” discusses WikiLeaks and various fake-news purveyors and includes a sobering anecdote about “the single most circulated fake story of the election”—the news (false) that Pope Francis had endorsed Trump, which popped up first on a website made to look like that of a local TV news station and then on a website called Ending the Fed (both fake). The story was seen by more than one million people. “Appeasers” is about how establishment Republicans went from saying “Never!” to their current state of servility.

“Plunder,” my favorite chapter, provides a rich catalog of Trump family pelf that may be useful one day to Democratic impeachment committee staff. As with presidential lies, these episodes have been so numerous and so shocking—yet simultaneously rendered so pedestrian by their repetition and the casual attitude with which every Trump family member advances them—that we can’t begin to remember them all.

If you squint hard, back through time’s mists, you may recall the phone call Trump placed to Argentine president Mauricio Macri six days after his victory. Why this relatively obscure head of state, alone among the leaders of South America? We don’t know. But we do know that at the time, a Trump-licensed building in Buenos Aires was stalled. Miraculously, the next day, someone cut through the red tape, and the project was moving forward. Trump also put his daughter Ivanka on this call. She has known Macri since she was young, but she is also still involved in her father’s business. We learned of all this only through the Argentine media.

Frum’s criticisms are not limited to Trump. He devotes several pages to an attack on recent Republican efforts to suppress the votes of Democratic-leaning constituencies, advancing the argument, which many conservatives are still loath to make, that Trump, far from being an aberration of modern Republicanism, is in fact its logical endpoint:

It was not out of the ether that Donald Trump confected his postelection claim that he lost the popular vote only because “millions” voted illegally. Such claims have been circulating in the Republican world for some time, based in some cases on purported statistical evidence. Beyond the evidence, however, was fear: fear that the time would soon come, and maybe already had come, when democracy would be turned against those who regarded themselves as its rightful winners and proper custodians.

Conservatives, he writes later, will never abandon conservatism. If the day comes when they conclude that their side can’t win elections democratically, “they will reject democracy.” Trumpocracy warns that the day of reckoning is upon us—that the liberal democracy that is our heritage “imposes limits and requires compromises,” and that Trumpism is its mortal enemy. As the lies mount, questions that once seemed overwrought can no longer be put to the side. We probably have three years of this—at least—to go.

—January 25, 2018

[1] Glenn Kessler and Meg Kelly, “President Trump Has Made More Than 2,000 False or Misleading Claims over 355 Days,” The Washington Post, January 10, 2018.

[2] Michael Wolff, “‘You Can’t Make This S— Up’: My Year Inside Trump’s Insane White House,” The Hollywood Reporter, January 4, 2018.

[3] Seth Abramovitch, “Michael Wolff Reflects on a Wild Week and Trump’s Anger: ‘I Have No Side Here’ (Q&A),” The Hollywood Reporter, January 6, 2018.

[4] See also Amy Knight’s “The Magnitsky Affair” in this issue.


Consciousness and the World

Herbert List/Magnum Photos. Herbert List: Spirit of Lycabettus XIII, Athens, Mount Lycabettus, 1937

We are all used to thinking of ourselves as a single entity that enjoys a certain continuity from birth to death, if not beyond. We confidently say, “Ten years ago, I did this; next week, I will do that.” But what exactly is this entity? Traditional Christianity posits a non-material soul housed in a material body. At the beginning of the scientific age, Descartes formalized this concept with an idea that came to be known as dualism: the soul in the skull was spiritual and interfaced with the material world of the body through the pineal gland between brain and spinal cord. Modern materialism jettisoned this idea, yet struggles to replace it; even the most serious philosophical and neuroscience publications often lapse into an implicit dualism, where, for example, the word “mental” is opposed to “material” without our being quite sure what we mean by the two, or where consciousness and, with it, the self are understood as “supervening” on vast quantities of “information” moved back and forth in the brain. It is very much as if the brain were made equivalent with the self, and information afforded some non-material, “mental” existence. “You are your brain,” claims the neuroscientist David Eagleman.

—Tim Parks

This is the fifteenth, and final, conversation in a series of dialogues on consciousness between Riccardo Manzotti and Tim Parks.

Tim Parks: Riccardo, in these conversations you have been elaborating a radically externalist theory of consciousness; rather than representations in some inner theater of the brain, experience is understood as united with the world that comes into contact with our bodies, allowing no separation between subject and object. In this view, nothing is stored in the head, nothing is “mental.” Simply, we are our experience, and it is all out there. But if we are no more, no less, than our experience, what about the self? How can I be someone, if I don’t have a self in my head?

Riccardo Manzotti: If you had a self, the self would not be you, right? But let me ask you, have you ever seen, or in any way observed, or, even better, pinned down someone’s self? Or your own self? Of course not. No one has.

Parks: I observe behavior I’m familiar with in people I know, alongside facial expressions and gestures I’ve seen a thousand times. I know when a friend is acting himself, or herself, as they say. Similarly, I know my own way of living and thinking, all too well.

Manzotti: That makes more sense. Let’s focus on a concrete example. Yesterday evening, I was watching my two boys preparing for a party. One is eleven and the other is eighteen. They were arguing as usual, the younger accusing the older of wearing his pants in a stupid way, sagging off his butt. I must say I rather agreed with him, but no doubt my eldest is aping his friends’ way of dressing. It’s a fashion. So, was I seeing the boys’ “selves”?

Parks: A thing doesn’t have to be visible to exist. We know a black hole is there because of its gravitational pull on the area around it, not because we can see it. Having met your kids a couple of times, I can only say they both radiate very distinct and attractive identities.

Manzotti: From selves to black holes! But yes, there are cases where certain goings-on require us to posit a physical entity we cannot see; in those cases, we make a hypothesis and set to work with all our scientific instruments to pin this entity down. But does this behavior of my children really require me to suppose there’s an invisible “self” around that we should start looking for? Aren’t there any number of entirely visible causes to look at before positing something invisible?

Parks: Like what? And remember, it’s not just the behavior, but the awareness and sense of purposefulness that accompanies it. Or are you just going to say what Daniel Dennett said about “intentional stance”—his term for selfhood, I think—that it is only in the eye of the beholder? A descriptive tool?

Manzotti: Not exactly, no. The notion of an internal self is akin to that of the hypothetical center of mass they use for calculations in mechanics. It’s a form of shorthand. It doesn’t exist as such. But the cause of the ongoing behavior pattern, as you put it, coupled with awareness, etc.–the person even, if you like–certainly does.

Parks: So, what is it, physically, scientifically?

Manzotti: You could say that, at any moment, a person is the collection of all those objects that are presently active thanks to a particular body. Or, again, it is the world that exists, now, relative to a certain body, an immensely complex amalgam of things spread in space and time, but all, at one moment or another, perfectly available to the senses.

Parks: I’m losing you now. How can I possibly think of a person in this way?

Manzotti: Let’s get back to the example we began with: my boys arguing over the way one of them wears his pants. We’ve agreed in previous conversations that since there is nothing in the head but neurons and chemicals, the brain can’t constitute our experience; the only thing identical with a person’s experience is experience itself, which is outside our nervous system. When I see an apple, my experience is the apple I see and I am my experience. This is why objects of whatever kind—cars, songs, other people, pets, food—are so important to us; they constitute our experience.

Parks: And so, your boys?

Manzotti: Giulio, my older son, is eighteen. For years and years, every waking moment, inescapably, his body has been in contact with the world that has become his experience and that continues to produce effects by means of his body and brain.

Parks: That’s an expression you keep using; what exactly do you mean by it?

Manzotti: You hear a song yesterday, and you sing it today. You watch an ad for a brand of sunglasses and, six weeks later, you buy them. You remember your mother’s voice from twenty years ago. You dance a step your friend taught you in your teens. You…

Parks: Enough, I’ve got it. These were all originally objects in the world, experiences my body is still experiencing.

Manzotti: Think of the body as a kind of hyperactive proxy that allows objects in the world to manifest themselves, much the way your phone allows a friend’s voice to be heard. The cause of the voice is not in the phone, but the phone is necessary for the voice to be there.

Parks: Except the body would have to be a proxy and go on being a proxy for any number of things at the same time.

Manzotti: Right. A whole world speaks through your body and that world is consciousness, the consciousness you think of as yourself.

Parks: Back to Giulio.

Manzotti: Okay. Giulio’s multitudinous experiences include his little brother, Emilio, his school friends who wear their jeans sagging ridiculously low, the jeans themselves, of course, and his own body, which is a constant object of his own experience. If we want to use the word self, we could say that rather than some ghostly entity imprisoned inside Giulio, self is a shorthand for all these things acting together through his body. It is his life.

Parks: I appreciate your desire to avoid the mystification that lurks behind words like self and soul. But this description seems hopelessly incomplete. If I am nothing other than the myriad objects of my experience, why would I be different from anyone else who has come into contact with the same objects, more or less? Why would we have this powerful sense of distinction between different people, and this awareness of a continuity of behavior in our own lives? How can we have character, or even temperament, in your vision of the world? It seems to me it’s this awareness of individuality that leads people to posit the existence of an internal entity, self or soul.

Manzotti: First, each of us has a distinctive body that comes into the world in distinctive circumstances setting us up for a unique set of experiences—foremost, of course, of the body itself, which is not only the agent of experience, but also the prime object constituting experience, libido, appetite, pain, pleasure, illness, and so on; then, of all the other objects relative to that particular body, including parents, aunts, uncles, brothers, sisters, physical and social environment, school, etc. Then, day by day, we are constantly performing, repeating, and subtly altering areas of our experience—language, learned behavior, and so on—while all our past experience conditions each new experience.

As a result, the apple I grab is different from the apple you grab—even if, from an eating perspective, there is just one apple—because my apple is relative to my body and experience, and yours is so to yours. If you were blind, your apple would lack color. The poem you read is relative to your life experience; the poem I read is relative to mine. Maybe my moon (or the moon that is me, at that moment) is pretty much the same as your moon, visually, but who knows what past moons and experiences make the two quite different? If we look at life this way, it is actually far easier to explain distinctions between people than if we posit some soul or self already there at birth.

Parks: So, both Giulio and Emilio, thanks to their bodies and brains, bring into existence a world of relative objects, experience, and habitual patterns of juggling those objects and having them interact with new objects. And that is what you call the self? Have I got it?

Manzotti: Right. Though I prefer the word “person,” or just “life.” In any event, this amalgam of experience is relative to the body, but not inside it and not identical with it. Above all, it’s not “mental,” and it’s not a fixed entity. It has a certain continuity, or inertia, but it is also open to change.

Parks: Help me with the “not mental” side. Let’s say that as the two boys argue about these sagging pants, Giulio is thinking, “Emilio is only saying this to please Dad.” And you are thinking, “How can I show my agreement with Emilio without seeming a boring old conservative?” And Emilio is thinking, “I’m going to be really embarrassed turning up at this party with my brother wearing his pants down to the floor.” And at the same time, all three of you are looking at the pants and one another in the mirror of the wardrobe. Don’t we have an awful lot of mental stuff going on here? Then, at a certain point, Giulio turns to Emilio and says, “You’re just jealous!” What was it that decided him to do that? What orchestrates this amalgam of experience as you’ve called it? Don’t we have to posit some unifying entity to explain this?

Manzotti: Nice drama. Let’s observe two things: first, communication; then, the action that you say demands a “mental self.” The common notion of communication supposes that people exchange packets of symbols—words, gestures—that their inner selves then interpret by associating each symbol with the correct meaning. This view is a myth. Curious as it may seem, our current models of communication are derived from the work of mathematicians like Claude Shannon and Alan Turing who, during World War II, were involved in such things as deciphering coded messages between enemy submarines and their naval commanders. People do not communicate like navy submarines.

Parks: You’re telling me we don’t exchange messages?

Manzotti: Of course, we send things back and forth. But communication is not an exchange of messages to be decoded. Communication is sharing. It is the activity through which different people—by which I don’t mean people’s bodies, remember, but their experiences—become composed of the same stuff.

Parks: So, as you all look in the mirror, contemplating one another and Giulio’s sagging pants, you are communicating.

Manzotti: Absolutely. We are our experience and those sagging pants are shared between us. But people do this all the time when they point things out to each other. Look at this, look at that. A seaplane landing on a lake. A strange orange spider. Or listen to this, a new song by a favorite band. Touch this, smell that. This is communication. The airplane or spider, relative to our bodies, becomes our shared experience. Likewise the TV, the radio, the new Star Wars film.

Parks: But we could communicate these things in words, speaking or writing.

Manzotti: If someone were blind from birth, no amount of words, or bundles of bits in a computer processor, could convey the orange of the orange spider. Communication is shared experience, being literally made of the same world relative to our bodies. With words, we share the experience of words and their appeal to previous experience. “Orange spider” in words will activate a previous experience of orange, but not maybe the exact hue we share when we look together.

Parks: But is it communication even when you disagree? About the pants, I mean. Doesn’t that suggest the experience isn’t entirely shared, in the sense of being the same for each?

Manzotti: The new object, as I said, is relative to me, my life, my experience. Maybe I had a bad experience with a spider as a child, so my body doesn’t react to the orange spider as yours did. Maybe I have a long experience of jeans worn in a certain way and don’t have friends I’m eager to impress who wear theirs sagging to their knees. Still, this is communication. We share the same world and declare our difference by our different, sometimes contrary, responses. That response can then modify the other’s experience. I see the sagging pants and laugh, and maybe Giulio now experiences the pants differently. I jump seeing the spider, and maybe you now experience it differently. A critic pours scorn on a book you like and you begin to see it through his eyes and have your doubts. The book you now see is a new book relative to you. So, as well as sharing, communication is change, experience is accumulated, shifted. Which is why, as well as continuity of identity, we also have discontinuity.

Parks: What about decisions? I decide to focus my eyes on this, rather than that. To listen to my friends, rather than to my father. To suddenly say to my brother, “You’re just jealous.” How does that fit in?

Manzotti: We can’t stop acting and being, moment by moment, and the amalgam of experience that we are reacts constantly to the new experience of the moment. Emilio laughs at the sagging pants; Giulio reacts with the old card of accusing the younger boy of jealousy. There are any number of stories going on at once, worlds manifesting themselves through the proxy our bodies offer.

Parks: It seems you always see action as reaction, you always explain decisions in terms of the influence of the relative objects, experiences, constituted by our body’s meeting with the world. But doesn’t this seriously diminish us? Who are we, finally? Who am I? Who are you? Just a bundle of reactions?

Manzotti: We have an interesting linguistic trap here, one created by centuries of human self-regard. By using a different pronoun to enquire about the identity of people rather than of things—who, instead of what—we introduce an imaginary metaphysical difference. Why not ask “What are we? What am I?”

Parks: You mean, essentially, that we are objects, and objects “take place,” rather than act.

Manzotti: We are part of the physical world, hence objects. What else could we be—immaterial souls?

As for identity, we are what we are because we are identical with a portion of the world that has come together over the years in a certain way. The traditional separation of subject and object that underpins all standard thinking on consciousness and identity lies at the heart of our troubles as individuals and as a society. Convinced that we are separate from the world, we feel we have been expelled from the Garden of Eden, and we yearn to return, maybe after death. But however useful the subject-object divide may be for all kinds of practical matters, it is plain wrong.

Parks: So, a subject is never in relation with anything but what that subject is?

Manzotti: I would say something like “Thou shalt have no other relations but identity,” and, no, I’m not starting a new religion; it’s just the most elementary fact of physical reality. In nature, everything is what it is and only what it is. A rock is a rock. A planet is a planet. A neuron is a neuron. A brain is a brain. A brain cannot “partake of” a spider. A spider is a spider. My experiencing a spider cannot be a relation between a mysterious self and an extraneous spider. My experiencing a spider is an identity between the spider and what I am, experience. I am the spider; it’s the only thing that has the right properties to be my experience.

Parks: The spider and simultaneously all the other object-experiences of your life. So, identity is constantly expanding.

Manzotti: Western thought has always sought to describe consciousness as identical with something, whether that be ideas, in the idealist view, or neural activity, according to the contemporary materialist position. When I propose an identity between consciousness and the world, I am following the same explanatory strategy scientists and philosophers have always adopted. I have simply settled on a new candidate for identity: not ideas, not neurons, but the world itself.

Parks: Then consciousness has always been, as they say, hidden in plain sight.

Manzotti: Exactly. We are the world that surrounds our body and the body itself as known; we are the objects we see, hear, smell, taste, touch. The rest is hot air.


Art in Free Fall

Laura Owens

an exhibition at the Whitney Museum of American Art, New York City, November 10, 2017–February 4, 2018; the Dallas Museum of Art, March 25–July 29, 2018; and the Museum of Contemporary Art, Los Angeles, November 4, 2018–March 25, 2019

Laura Owens/Whitney Museum of American Art
Laura Owens: Untitled (detail), 138 1/8 x 106 ½ x 2 5/8 inches overall, 2014

The Los Angeles artist Laura Owens brings a light touch and a tough mind to a new kind of synthetic painting. Her exuberant, bracing midcareer survey at the Whitney beams a positive, can-do energy. As a stylist and culture critic, Owens is neither a stone-cold killer nor a gleeful nihilist, traits embraced by some of her peers. She’s an art lover, an enthusiast who approaches the problem of what to paint, and how to paint it, with an open, pragmatic mind. Her style can appear to be all over the place, but we always recognize the work as hers. Her principal theme may be her own aesthetic malleability.

Owens bends the conceits of art theory so that her own personality can flourish. She is not afraid of wit. Enchantment has its place too. Walking through her show, I was reminded of something Fairfield Porter once wrote about Pierre Bonnard: “He was an individualist without revolt, and his form…comes from his tenderness.”

For decades, and especially in the mid-twentieth century, a persuasive reading of modern painting revolved around the idea of the gestalt—the way every element in a painting coalesced into one totality, one essence that blotted out ambiguity. A painting isn’t a thing about another thing—it just is. This gestalt theory of painting was especially alluring during abstraction’s dominance; it put a brake on the drive for narrative, and helped to establish painting’s autonomy from literalist interpretations.

But a funny thing happened to the gestalt: life intruded. What if the whole is not more than the sum of its various parts, but more like a shopping list? What if all the various elements used to make a painting are just left out on the floor like pieces from a puzzle that no one bothered to finish? In a recent New Yorker profile, Owens thoughtfully implies that the time for gestalts is over, that collage—i.e., something made out of parts or layers—is simply a feature of the life we all lead. Indeed, a big part of our culture is involved with putting things together, with little distinction made between the invented and the found, and even less between the past and the present. The fragmentary, the deconstructed, even the deliberately mismatched—that is our reality. We are all collage artists now.

As someone who holds more or less the same view I can hardly fault Owens for believing this, but it seems to me that her paintings are very much gestalts anyway, though perhaps of a new kind, something closer in their effect to imagist poetry, and it’s their sometimes surprising gestaltness that holds our attention. Owens has interesting ideas, but it is her ability to give them form, often in unexpected ways sourced from unlikely corners of the visual world, that makes her art exciting.

Owens’s paintings are squarely in the middle of a postmodern aesthetic that’s been gaining momentum for the last ten or fifteen years. It is not the world of Luc Tuymans via Gerhard Richter, in which the painting’s photographic source is like a radioactive isotope that you could never touch but that, in its absence, is what really matters. The new attitude is not much interested in photography at all. It wants to rough an image up, put it through a digital sieve, and decorate the hell out of it.

A tree imported from Japanese painting anchors a wispy, airy composition. The tree shelters a monkey, or an owl, or a cheetah, perhaps borrowed from Persian miniatures—brown and khaki on a cream-colored ground, accentuated here and there by swatches of painted grids and colored dots, or beads or bits of yarn, or shapes cut from colored felt. An Owens monkey in a tree (or a Peter Doig canoe) is imagery augmented, repurposed. This is composite painting. It coheres, but maybe not in the way we’re used to.

Owens has a big formal range at her disposal—her quiver is full. Odd color harmonies: teal, hot pink, raw sienna, fuchsia, manganese blue, cream. The colors of the decorator shop. Art taste that’s been knocked down a notch or two. The paint is applied with a pleasing paint-by-numbers quality. You can feel the hobby store just around the corner.

Laura Owens/Collection of Nina Moore
Laura Owens: Untitled, 66 x 66 inches, 2004

Paintings like illustrations in a children’s book. They feel liberated, unafraid to be garish. Saturated color and loose, agitated brushwork. Images that kids find appealing: animals in the forest, princesses, wild-haired children—fairy-tale stuff. The spirit of Magritte hovers over these paintings—his vache paintings from the late 1940s, the ones that nobody wanted.

A chunky white horse with a fanciful tail like a philodendron frolics against a loosely painted blue background (see illustration above). The horse is happy, energized; it lifts its head and kicks up a front hoof, straining at the confines of the rectangle. Everything about this painting from 2004 is right: the brushwork like china painting, the scale, color, and image all aligned. This quality feels instinctual—not something Owens was ever taught.

The paintings after 2005—a collage of grids in different scales colliding at different angles; a counterpoint of impasto freestyle painting, or silk-screened commercial imagery, or an expanse of text with the deadpan look of an old phone book. The compositions are retinal, assertive, like novel forms of candy seen in a glass jar. Their look is closer to greeting cards than to Franz Kline. This is strangely reassuring; they carry you along without protest, arbitrary, clownish, and weird.

The reciprocity of silk-screen and digital printing, the computer’s replica of the hand-made, are products of the coding mind of twenty-first-century painting. For artists of Owens’s generation (she’s forty-seven), the easy back and forth between found and made forms, and between painting and printing, is a given.

Sometimes the paintings are so casual-looking they can trip you up. In 2013, Owens showed twelve large (roughly twelve-by-ten-foot) paintings of this new postmodern composite type in the gallery space that occupies the ground floor of her Los Angeles studio, and people have been talking about them ever since. Paintings that leave the impression they could be, or do, just about anything. Perhaps their most salient quality is confidence. The digital enhancement of a sketch or doodle enlarged to twelve feet gives a vertiginous, Alice-in-Wonderland feeling to some of the paintings. Small grids on top of larger ones, hard-edged curlicues, computer-assisted drawings of cartoony Spanish galleons, red hearts and splashy arabesques, Photoshop lozenges and fat zucchinis of impasto thick as cake frosting, throbbing pinks and hot greens—all of it and much more is easily dispersed around the canvases, which were hung just inches apart, lest anything get too contemplative. The paintings don’t so much violate notions of good taste as ignore them.

Owens’s paintings sometimes seem to have been made by another kind of intelligence altogether, one tuned to a frequency similar to our own, yet different, as if a space alien, stranded here on a mission from a distant galaxy, had been receiving weak radio signals from Planet X. You can just about pick out the command from the static: unintelligible…static…static… Put raspberry-colored grid on aqua-colored rectangle. Add cartoon figures. Add black squiggles. Put drop shadows here and there. Don’t worry about placement—just put somewhere. Sign on back with made-up generic name. Try to pass as earthling until we can send rescue ship.

The critic most relevant to Owens’s work might be an Englishman who’s been dead for nearly forty years and never wrote about contemporary art. In his Seven Types of Ambiguity (1930), William Empson concerned himself with the ways in which poetical language—motifs, figures of speech, even individual words—can mean more than one thing at a time. He wanted to know how a poet or dramatist uses linguistic constructions to convey both the complexities of character and a setting for interpreting their actions. A figure of speech inserted into the right narrative structure can evoke the unstable experience of a protagonist who chooses a course of action, but who retains an awareness of the things not chosen. In literature, certain words, certain figures allow the reader to feel that psychic rub: this and not that—but with a bit of that still present; the memory of what was not chosen hangs over the action. Empson also made an important observation about an author’s intention. He believed that an author could say something in the work that could probably not be said apart from it. This type of ambiguity in particular resonates with visual art.

When people talk about irony in painting, which they do quite a lot, what they usually mean is ambiguity. Irony is saying one thing and meaning its opposite, while ambiguity is the ability for a form to hold two or more meanings at the same time. Painting that trades only on irony can have a short shelf life. Ambiguity keeps on giving; it rewards prolonged looking. In painting, ambiguity is most often present in the imagery, in its references and connotations, as well as how that imagery is handled. The style, of course, inflects the feeling. This is true for Owens as well. What feels invigorating about her work is that the painting’s structure itself is also a marker of ambiguity. But even such painterly sophistication is not so rare; what really sets Owens apart is her dexterity at both kinds of ambiguity.

Take, for example, one of her works from 2013 (not pictured): a dense field of hot pink slashes on a pale lavender ground, overprinted with fragments of differently scaled grids in cadmium green, turquoise, or black—and that’s just the background. This eye-dazzler is covered with lots of wheels (eighteen!) of different sizes. Rubber tires on metal hubs or spokes—the kind of wheels found on tricycles, wagons, grocery carts, some brightly colored, Day-Glo even—are mounted on the canvas, parallel to its surface with just enough clearance to freely turn. Wheels punctuate the composition in a jaunty, syncopated rhythm—a chariot race without the chariots, a riot of implied motion on top of an already pushy abstract painting, not going anywhere, in perpetuity. Hilarious, breathtaking, circus glamour. The ghosts of Jean Tinguely and Charles Demuth, the Dada mind of Francis Picabia.

Then there are, across a narrow corridor from each other, a pair of pale robin’s-egg blue paintings of medium size, both covered with clusters of what look like hand-drawn, variously sized, random numbers, which, on a closer look, are made with thin, raised ribbons of black acrylic paint. The numbers on one painting exactly match those on the other in size and position, but reversed, as if one painting is looking at itself in a mirror. A hand-drawn mirror image of a random jumble of numbers. The conceit is witty and cartoony-weird. I have no idea what it means. It’s engaging and fun to look at. But the color! Other artists might have an idea for mirror writing in painting, but I doubt they would have expressed it with the shade of blue found in a tea shop or a girl’s bedroom. The surprising color choice gives the paintings an identity separate from whatever idea generated them.

One of Owens’s many strengths is her use of scale—the big painting and the internal relationship of shapes to the whole. She’s at ease with the large New York School canvas. Even though a lot has happened since, and no matter how anachronistic it may seem, our yardstick for serious painting is still shadowed by the achievements of the New York School. One of the hallmarks of that type of abstract painting is the “all-over” composition, in which the paint reaches all four edges of the canvas equally, and the eye roams through the picture in a nonhierarchical way. Owens extends and reinvigorates that tradition when she brings her affinity for textiles to the all-over, large-scale work.

Laura Owens/Ringier Collection
Laura Owens: Untitled, 137 1/2 x 120 inches, 2013

Textiles, weaving, 2-D design—all are full-fledged high art now, and Owens has no problem letting herself be influenced by them. Another of her works from 2013 is an almost twelve-foot-tall painting with line drawings of cats playing with balls of yarn dispersed over its white ground (see illustration above). Some drawings are carefully executed and others more slapdash; some are in plaid, others in the grid patterns that Owens is so fond of. Here and there are touches of spray paint in raspberry, yellow, and blue. It’s like a motif that one might find on a young girl’s flannel pajamas, something a sophisticated seven-year-old would find amusing and a bit arch. This is a painting that says: You want all-over? OK, how about this? This is the way to be ambitious now. You don’t always have to throw your Sunday punch.

The installation at the Whitney, overseen by Owens and Scott Rothkopf, the museum’s chief curator and deputy director for programs, restages exhibitions from Owens’s principal galleries in Los Angeles, New York, London, and Cologne. Walls were built, hanging plans copied—and the result resembles a Laura Owens theme park instead of a traditional retrospective that aims to situate works in relation to one another as well as to deliver the greatest hits. The reason is that much of Owens’s thinking about her work, its generating impulse, is tied up with the notion of site specificity. There is a continuous run of invention and forward-thinking bravado; painting ideas ricochet around the rooms. You can either run alongside and try to hop on, or just get out of the way.

I especially admire the way that Owens integrates her various sources and influences into her own pictorial vocabulary. The ability to be influenced in a productive way, which includes making one’s influences legible to the audience, might be essential to success in today’s art world—so much so that art schools should have a class in how to identify and use the myriad external points of reference from the visual world with which we make up our hybrid, signified selves.

Let’s recognize too the generative power of folk art and all of its derivations, including gift shop, decorator store, calendar, and greeting-card art, as well as technical drafting, computer graphics, and the look of giveaway supermarket magazines. Owens can mash or stack up all of the referents, compass points, familial overlaps, and personal curiosities into one tightly compressed, orderly bundle. Hers is the painting equivalent of the machine that turns cars into a solid, dense cube of crushed metal. Owens’s work doesn’t look squashed—quite the opposite: its surface is open and inviting, but the structural components appear indivisible.

As an image gatherer, Owens is peripatetic and astute. An estate sale yielded the source for an especially winning group of paintings from 1998: a crewel-work pillow enlivened with a swarm of honeybees arrayed around an orange-hued, dome-shaped hive (see illustration below). As much as anything else in the show, the sure touch involved in successfully translating that image of wishful good vibes to a painting convinced me of Owens’s superior pictorial instincts. Her bees are made with thick blobs of black and yellow acrylic paint, their wings rendered as delicate black lines with veins of iridescent white, and the dozen or so insects hover and float on an expanse of unpainted cotton canvas. The four tones—coral orange, cadmium yellow, deep burnt sienna, and umber, which together create an illusion of volume—form a perfect chord of color harmonies. The beehive paintings are secure in their directness and shorthand styling. They have what in the theater is called good stage manners: every decision is bold, clear, and appropriate.

Laura Owens/Gavin Brown’s enterprise, New York and Rome; Sadie Coles HQ, London; and Galerie Gisela Capitain, Cologne
Laura Owens: Untitled, 66 x 72 inches, 1998

Owens’s work is part of the American pragmatic tradition. The intellectuality in her work feels new to me. As a thinker, Owens is self-reflexive, curious, and matter-of-fact. As a stylist she’s resourceful and fearless. She’s braided into the same rope with a few older painters, such as Albert Oehlen or Charline von Heyl, or, closer to her own age, Wade Guyton or Seth Price, but she doesn’t share their angst or spiky black humor. In fact, Owens doesn’t seem to have a nihilistic bone in her body. Her work incorporates ambiguity as part of its structure, but the paintings are not difficult or withholding. Like a character in a screwball comedy, they wear an expression that says, I don’t know what happened, Officer; one minute I was just standing here, and then…

The 663-page catalog that accompanies the exhibition is generous and full of anecdotes. Many voices are heard: the artist’s mother, her fellow artists, dealers, novelists, curators, and critics all contribute their shadings to Owens’s rise. The book is a hybrid literary form, a bildungsroman with pictures: the story of an earnest young woman from the provinces with an appealing, straightforward manner who, through luck and pluck and the benevolent intervention of some well-placed patrons, grows up to see her dream of being an artist realized.

A blizzard of faxes, letters, clippings, photographs, and invoices to and from Owens and her dealers and peers shines a light on the world of professional art schools and the international network of galleries and alternative spaces that are the mechanisms of generational renewal. Owens approaches art-making with a democratic spirit, in the naturally collegial way that comes easily when you’re young. The catalog represents a welcome acknowledgment that any career of real substance is also a group project. That Owens so readily embraces that reality may be a gender thing; her male counterparts still cling for the most part to the “prickliest cactus in the desert” mode, tiresome though it has become.

It has been terribly important to Owens that her paintings call attention not only to the conditions of their own making, but also to the social nexus in which they participate. The work of art is one link in a chain that includes gallerists, curators and critics, her fellow artists, and of course the viewer. This focus on the social system of art is, in part, the legacy of conceptual art as it has been filtered through the language of painting. Owens makes being a good citizen into an aesthetic.

Probably many people can identify with the trajectory of Owens’s life. I know I can. Midwestern and middle-class, Owens as a teenager looked at paintings, noting which ones held her attention and why. She developed her skills at summer art camp, went off to college at RISD, and then on to CalArts for grad school. This is how artists today are made: from avid teenaged drawer and painter to RISD adept who then finds herself questioned by the hard-liners for whom painting was a lost cause. It’s a great recipe when combined with talent and drive. By the time Owens got to grad school, she had sufficient self-confidence to survive the ritual hazing known as the group critique. Although Owens put in her time at CalArts, that hotbed of conceptualism with the arch-enforcer Michael Asher, it doesn’t seem to have done her any harm, possibly because she was so clear about her vocation from an early age.

What a relief to have a painter who didn’t get the stuffing knocked out of her at the Whitney Independent Study Program or its equivalent. A strong design sense, internalized early on and reinforced at critical junctures by encounters with Chinese painting, or Matisse, or Bauhaus textiles, can carry you through a whole career. You can add the conceptual icing, which is what the tuition buys at schools like CalArts, but you’d better bring your own cake.

Owens takes the world of design, especially children’s books and child-friendly graphics, and teases out the forms that can be recast as art. Put another way, she has an instinct for choosing the right thing and knowing what to do with it. Her work has no anxiety about being nerdy, or not much anxiety period. This is art that’s comfortable wearing fuzzy slippers.

Owens’s work is the apotheosis of painting in the digital age. The defining feature of digital art—of digital information generally—is its weightlessness. Images, colors, marks, text are essentially decals in a nondimensional electronic space. They exist, but only up to a point. They can excite the mind, but you can’t touch them. An air of weightlessness remains even when they are transferred to the physical surface of a painting. If these images were to fall, nothing would catch them. They’re like Wile E. Coyote running off a cliff, just before he realizes he’s churning air.

Owens is part of the ongoing loosening of the rules governing how a painting acquires meaning, a process that began with the young painters of the late 1970s and early 1980s (of whom I was one). The general idea was to dissolve the gravitational force field that held the disparate elements of a painting together, like atoms bundled into a molecule. This “glue” was, and is, invisible to most viewers, just as it is in life generally. These artists wanted to make it visible, as though shining a black light on a painting to reveal the cracks in its surface. And they wanted to move painting out from under the infuriating drone of high-culture pieties, which had lost much of their credibility. There was a low-grade, cheerful nihilism in much of that work, but it didn’t go very far on the rebuilding side.

The whole project eventually got absorbed into criticism as the original artists moved on to other things. The field lay fallow for some years. Eventually, it fell to artists of Owens’s generation to replant. Owens, whether using embroidery, the computer, painted forms, or screen printing, stumbled on a hidden truth that has been more or less obscured since the late 1980s: the relationship between form, visual logic, and emotional catharsis is itself ambiguous.

A strange thing happens after you spend some time at the exhibition. Once you become acclimated to the endless malleability of the prosaic that is at the heart of Owens’s visual syntax—Oh, this is actually not something found at the mall—what follows is like a tiny paint bomb that detonates in the mind’s eye, which leads in turn to a strange and unexpected hollow sensation. It’s the tart pinch of a correction you feel after the cheering stops. The effect of Owens’s work, with its ebullient leap-frogging into worry-free zones of pictorial busyness, can sometimes feel, to paraphrase John Haskell, like waiting for the happiness to arrive.

Laura Owens/Collection of the Artist
Laura Owens: Untitled, 108 x 84 inches, 2016

I can’t fully explain why, but walking through the show I had the feeling, as I rounded a corner, of a dream of falling, one that was deprived of its conclusive ending when you hit the floor and wake. Pictorial free-fall—it’s thrilling, and quite unnerving. But no matter, just let it go. I was reminded of what the iconoclastic film critic and painter Manny Farber wrote in 1968 at the end of his review of Jean-Luc Godard’s La Chinoise: “No other filmmaker has so consistently made me feel like a stupid ass.”

The last room at the Whitney contains three large paintings that were part of a complex installation made for the CCA Wattis Institute in San Francisco in 2016. Though clearly related to other paintings in the show, they have a different gravitas. Their terse, shimmering surfaces are made from black-and-white fields of digital static. Amorphous sprays of x’s and o’s and other pixelated data are the result of various objects put through a scanner, digitally manipulated, enlarged or reduced, allowed to play out in large swaths, and sometimes corralled into shapes with drop-shadow edges. These compositions are then printed on paper and glued to aluminum panels. The silvery-gray tones and irregular patterns recall barren terrain seen from the air. The paintings are bisected near their edges by vertical and horizontal bands of white, and here and there with bits of color (see illustration above). They suffer in reproduction—there is a sound component to them as well—but their austerity, resourcefulness, and sense of resolve are impressive. I found them, and much else in the show, beautiful, dramatic, and moving, and a strong case for painting’s digitally assisted future.


Stephen Shore, Seer of the Everyday

Stephen Shore/The Museum of Modern Art, New York
Stephen Shore: Breakfast, Trail’s End Restaurant, Kanab, Utah, August 10, 1973

Most photographers use the camera as a tool of memorialization. They choose which moments to rescue from the enormous trash heap of everyday existence by referencing some sort of defined visual hierarchy, a scaling of what scenes deserve to be immortalized. A “good” photograph happens when reality lines up in a way that is more valuable than other arrangements (see Cartier-Bresson’s “decisive moment”), and a good photographer, therefore, is someone with a knack for recognizing that value and clicking the shutter.

That is not how Stephen Shore uses a camera, and his retrospective at the Museum of Modern Art makes this apparent. The exhibition casts a wide net, including the anomalous periods when Shore worked abroad, but its main focus is his many photographs of hyper-quotidian America, our stalest shades of red, white, and blue. These quiet and straightforward pictures—of food, buildings, cars, and toilets—show that Shore is best understood as a photographer uninterested in photographing what is agreed to be worthy of capture.

But the true subject of Shore’s work cannot be found in the contents of his photographs alone. When Shore frames a very American breakfast spread at a restaurant in Utah, we may ponder the scene, and what our country’s particular styles of consumption say about us, but to focus entirely on the reality represented would be to miss what makes this extraordinarily ordinary photograph profound. Shore shoots the table from above, from the familiar perspective of someone who has walked away, maybe to go to the bathroom, and returned to find their food served. The silverware is haphazardly placed, the cantaloupe sits askew, the edges of the frame seem to include and exclude details at random. The photograph, a kind of highly intentional accident, is a reminder that when it comes to Shore, it’s not always about what we are seeing, but how we are seeing it.

Stephen Shore
Stephen Shore’s series “Circle No. 1,” 1969

Shore has had a half-century-long preoccupation with the difference between the way we see the world when we move through it, and the way we see it in images. His hope, he has often said, is to make photographs that look “natural.” For Shore, this means an erasure of mediation, a lightening of his hand in the creation of an image. Writer Lynne Tillman called it Shore’s attempt at “willed objectivity.” Photographer Joel Sternfeld described it as Shore’s “Zen-like, awakened unconsciousness.” Shore simply says he wants his photographs to “feel like seeing.”

With this in mind, the many phases of Shore’s career, which unfold chronologically in MoMA curator Quentin Bajac’s sprawling layout, begin to look like distinct strategies employed to achieve this effect. At first, he experimented with blunter instruments, making images within tight frameworks reminiscent of those designed by the artist Ed Ruscha. In 1969 Shore made “Circle No. 1,” a series composed of photographs of a friend taken from the perspective of all eight points on a compass. For “Avenue of the Americas” (1970) he took a picture facing north on every corner of New York City’s 6th Avenue between 42nd and 59th streets. These photographs—plain, repetitive and sometimes even poorly exposed—may not be very interesting when examined individually, but that’s not the point. Systems like these allowed Shore to remove himself from the equation, to make an image without entirely dictating its contents.

Stephen Shore
A detail from Stephen Shore’s “Avenue of the Americas,” 1970

Unlike Shore’s friend and mentor, Andy Warhol, though, Shore did not want “to be a machine.” He wanted to retain some sense of authorship. He wanted to bring the conceptual spirit of Warholian pop to the documentary tradition of Walker Evans, Robert Frank, and the other great photographic chroniclers of the American condition. So Shore eventually left the systems behind. In 1971 he experimented with a toy camera called the Mick-a-Matic (a Mickey-Mouse-shaped device that is on display at the exhibition) in an attempt to amateurize his work, to remove traces of sophistication. It was around this same time that Shore made his controversial switch from black and white to color. (The renowned modernist photographer Paul Strand famously told Shore that this was a disastrous career move.) Until that point, black and white film served as an aesthetic marker that differentiated the medium from the way we see reality; it elevated a photograph from banality to a work of art. Contemporaries like William Eggleston and Joel Sternfeld were also pioneers of color, but their use of it often added dynamism to their images. Shore, on the other hand, used color to bring photography back down to realism.

It makes sense, then, that Shore has been so drawn to Instagram recently, and the photographs he has made while using the app are presented in the exhibition via iPads. Instagram is, after all, definitively diaristic. It might not always be used in that mode these days—when aspirational filters and curation seem to dominate—but the original premise called for unmediated snapshots of everyday life. Shore seems to use the platform in this manner, posting photos of nature, his friends, and pets, creating a large visual repository of sorts. Scrolling through his feed, it’s clear that his artistry is often a strategic stripping away of artfulness. In that respect, Instagram can seem a more natural home for his photographs than the white walls of a prestigious museum.

It’s telling, then, that Shore has expressed mixed feelings about one of his most famous photographs: U.S. 97, South of Klamath Falls, Oregon (1973)—a billboard with a mountain range painted on it in the foreground, a real-life mountain range in the background. The picture, clever, beautiful, and a little sad, is a “good” photograph, as close to a “decisive moment” as Shore gets. But it’s when he uses the mundane the way a sculptor might use clay, manipulating a material that seemingly has no inherent character or value, that Shore’s mastery is on full display. The pictures that might have visitors to the museum scratching their heads, thinking, “I could do that,” reveal Shore not just as one of his generation’s great photographers, but as a rare kind of seer—the sort that, with his eyes, can change the way we use ours.

Stephen Shore: U.S. 97, South of Klamath Falls, Oregon, July 21, 1973

“Stephen Shore” is at the Museum of Modern Art through May 28.


The Folk ‘Jews’ of Spain

Manuel Balles/NurPhoto/Getty Images
A Vaquilla member from Fresnedillas de la Oliva in costume, at a parade in Zamora, Spain, 2015

Forty-odd bachelors in flowered pajamas, tubular bells extending from their rears, chase another, rigged up as a cow, around a Spanish village. Others, playing a seedy official and his libertine wife, roam the streets flashing pornographic sketches at townspeople, blackmailing them for alms under threat of revealing naked pictures of them, as well.

The runners are known, interchangeably, as Judíos and Motilones. The first means Jews; the meaning of the second is unclear. The mêlée breaks for Mass, at the height of which the runners kiss the priest’s stole and spit gold coins from their mouths. The pursuit then resumes, capped by a gunshot in the air to signal the cow’s defeat. The day’s festivities conclude with a grand feast, the “slaughtered” cow cheerfully making table rounds to fill guests’ cups with its own “blood.”

The Fiesta de la Vaquilla, or Heifer Festival, takes place every January 20 in Fresnedillas de la Oliva, a village west of Madrid in central Spain with a population of approximately 1,500. It commemorates the martyrdom of San Sebastián, patron saint of cattle health, but its roots lie in ancient Vetton pagan rituals in which men performed feats of endurance and strength to attract mates—with Celtic, Roman-era, and Catholic motifs having worked their way in later. While most components of the festival are symbolic in nature—as in the Eucharist, the blood is, in fact, wine—others are pragmatic: the villagers were once so poor, it is said, that they were forced to sew their curtains, which were floral-patterned, into costumes, then unstitch them and re-hang them when the festival was over. The flower part stuck.

What actual Judíos have to do with any of this is unknown. Jews arrived in what is now Spain at least as early as the third century BCE. By 1492, there were about a quarter of a million Spanish Jews. The cow’s escape attempt may reflect time-honored strategies of hiding cattle from the tax collector—a thankless job into which Jews were periodically pressed. Or it might jibe with local yarns about a bull incited by outsiders to break out of its pen and tear through town. In Europe, Jews were long believed to pollute wells, spread disease, and kill Christian children. Even minor misfortunes were often seen as the Jews’ fault.

In any case, a Jewish legacy is felt. Legends abound of hauntings by Jews massacred nearby. Other tales link Jews to the local seat of King Philip II, El Escorial. Though the palace came after the Inquisition, by which time Spain’s Jews had already been murdered or forced to convert, or had fled, it is thought to resemble Solomon’s temple, and prominently features statues of Philip’s regal models: Solomon the Wise himself, and the Biblical warrior-king David.

In Spain, as in Poland, Jewishness is a protean concept. Some Fresnedillans insist that Judíos are not based on real Jews at all, but rather on commoners who were smeared by aristocrats as marranos, or “Jew-pigs”—in the same manner that some European soccer fans trade the epithet “Jew” as a slur today.

Motilone is an antiquated word that translates as “the Shorn One,” a fact that has led some to link the character to the Catholic monks known for short tonsures. If so, these clergy have perhaps fallen prey to the inversion typical of pre-Lenten carnival, in which piety and social hierarchies are overturned, while sin and chaos reign. Yet Motilone has another connotation, as the name visited by the Conquistadors upon the close-cropped Bari and Yukpa peoples of South America.

The coin-spitting may reflect medieval associations of Jews with commerce; or it may allude to the influx of wealth to Spain from that colonized New World. Indeed, the invasion of Bari and Yukpa territory was spurred on by notions that lightning strikes there turned stones to gold. Or the ritual might, again, recall tax collection: both Jews and colonial subjects were perpetually, ruinously taxed. In the Vaquilla, what might further unite the two groups is the kissing of the priest: Jews and indigenous peoples share a history of coerced conversion.

Transmitted orally through countless generations, Vaquilla practices and lore tend to be shrugged off, unquestioned, as givens. Of more interest to villagers than the origins of ethnic tropes is pulling off the endlessly complex event itself. Festival talk and planning consume village life throughout the year, with highlights chuckled over, cows’ performances critiqued, costumes fashioned, menus debated, youths initiated into bachelor roles as coming-of-age rite, and funds raised for both festival and town-improvement needs. Neighboring villages host modest Vaquillas of their own, but the one in Fresnedillas is by far the most elaborate.

vaquillafresnedillas.es
The Cow running with Motilones and Judíos in the Fiesta de la Vaquilla at Fresnedillas de la Oliva, Spain, 2011

While local tourists do attend, the Vaquilla’s function seems mainly to be sustaining community cohesion through time, discord, and change. The town has long been politically divided—literally so during the Civil War, when the battlefront ran along its main street (the only time in history, townspeople proudly say, the Vaquilla was canceled). The grave of Francisco Franco, Spain’s dictator of thirty-six years, lies in the area. Some elders still, it is said, pay homage there, passing lifetime hard-line communist neighbors, to whom they barely speak, en route. Struggles remain today between right-wing former town leaders accused of embezzlement and a socialist group now in power. But the festival must go on. The Vaquilla forces opposing sides to work together.

Jobs in Spain’s small villages have always been scarce, obliging young people to move elsewhere. Fresnedillas is especially imperiled; with new highways, Madrid is only an hour away. But the Vaquilla keeps its sons and daughters coming back. One heavily tattooed young man I met had left town as a disaffected teen, but then—finding big-city life lonely—returned to vie for roles in the festival he had once scorned. Others go back and forth between the ancient and the modern worlds. The Vaquilla’s chief organizer is, by trade, an astrophysicist at a NASA Apollo station, incongruously near the crumbling stone village.

In 1994, officials from the Simon Wiesenthal Center, the Jewish institute that runs Los Angeles’s Museum of Tolerance, were flying to Spain to present an award to Queen Sofía when they noticed a feature on the Vaquilla in an inflight magazine. In uproar over what they regarded as a shocking anti-Semitic display, the Americans appealed to the European Commission to intervene. On the basis of tradition, the festival was left unchanged. The following year, however, an ABC television network program intercut images of Auschwitz and a German neo-Nazi march with footage of Fresnedillas’s Vaquilla in order to illustrate the continuity of European hatred of Jews. When the show was broadcast in Spain, it was hotly debated around the nation. In an effort to clear its name, the village brought a lawsuit. The US ambassador and the Spanish station apologized, the suit was dropped, and finally the public scrutiny of the town abated. Fresnedillans felt injured, and still do, twenty-three years later.

Beyond the Vaquilla, Jews often appear as minor figures in other traditional Spanish settings, such as Moros y Cristianos (Moors and Christians) pageants, which celebrate the fifteenth-century Catholic rout of Muslim rule. In one of these some years ago, the Judío contingent was accompanied by a marching band playing the theme song from Exodus, Otto Preminger’s 1960 blockbuster that starred Paul Newman as a fighter for Zionist independence from the British. While this conflation of Spain’s long, proud, and tragic Jewish history with Hollywood kitsch could cause offense, as a Jew and a scholar myself, I simply find the music choice intriguing.

In recent years, curiosity about Europe’s Jewish past has led to the invention of new festivals, in Spain and beyond, striving in Disneyland-like forms to reenact it. These festivals serve very different purposes. Some, which tend to be well-informed and respectful, seek to revive lost Jewish artistic, cultural, and religious traditions and to educate the public about them. Others play upon philo-Semitic notions of the Jew as an exotic, soulful, and mystic figure. Still others are merely spectacles cooked up to draw national heritage grants and tourist dollars. One, in the village of Hervás, I found especially dispiriting. Its central event, a late-night play about the lead-up to the Inquisition, has gradually been edited down into a Romeo and Juliet-style Catholic-Jewish love story. By day, activity in the village revolves around a “Jewish” market selling pork-based snacks and cheap trinkets with no relation to the festival’s theme. Fresnedillas’s genuinely folkloric Vaquilla lies in a completely different category.

As a cultural fixture of probably more than two millennia, the Vaquilla is a rare find—worth preserving in as much of its totality as possible. But would it hurt to drop “Judíos” as the name of one of the bachelor groups? Hardly. As living history—not untouched, but embraced by time—what the Vaquilla shows is that traditions at once endure and evolve.

Girls now participate in children’s Vaquilla events that were once reserved for boys. Women have not yet joined the bachelor run, but are invited to do so. And with an influx of immigrants from Ecuador, Morocco, and twenty-one other countries that has made Fresnedillas one of the most diverse towns in the country, the Vaquilla has gradually grown multicultural. Among those supporting it are the village’s Spanish Jews. In meaning so much that no one can quite define them, “Judíos” and “Motilones” come to mean not much at all.


The Great British Empire Debate

Leeds Museums and Galleries (Leeds Art Gallery)/Bridgeman Images
George William Joy: General Gordon’s Last Stand, circa 1893

The sun may have long ago set on the British Empire (or on all but a few tattered shreds of it), but it never seems to set on the debate about the merits of empire. The latest controversy began when the Third World Quarterly, an academic journal known for its radical stance, published a paper by Bruce Gilley, an associate professor of political science at Portland State University in Oregon, called “The Case for Colonialism.” Fifteen of the thirty-four members on the journal’s editorial board resigned in protest, while a petition, with more than 10,000 signatories, called for the paper to be retracted. It was eventually withdrawn after the editor “received serious and credible threats of personal violence.”

Then, in November, Nigel Biggar, regius professor of theology at Oxford University, wrote an article in the London Times defending Gilley. Biggar saw Gilley’s “balanced reappraisal of the colonial past” as “courageous,” and called for “us British to moderate our post-imperial guilt.”

Biggar also revealed that he was launching a five-year academic project, under the auspices of Oxford University’s McDonald Centre for Theology, Ethics and Public Life, called “Ethics and Empire.” The project aims to question the notion prevalent “in most reaches of academic discourse,” that “imperialism is wicked; and empire is therefore unethical” and to develop “a Christian ethic of empire.” Fifty-eight Oxford scholars working on “histories of empire and colonialism” wrote an open letter condemning the project as asking “the wrong questions, using the wrong terms, and for the wrong purposes.” A second open letter with nearly two hundred signatures from academics across the globe expressed “alarm that the University of Oxford should invest resources in this project.” Another Oxford historian of empire, Alexander Morrison, denounced these open letters as being “deeply corrosive of normal academic exchange” and encouraging “online mobbing, public shaming and political polarization.”

Like all such debates, this latest controversy comprises many threads. Was colonialism good or bad, and for whom, and in what ways? How should one debate these questions in academia, and in politics? And why has this debate erupted now?

Apologists for colonialism argue that Western powers brought economic development, the rule of law, and liberties to their colonies. According to Gilley, colonialism stressed “the primacy of human lives, universal values, and shared responsibilities” and constituted a “civilizing mission” that “led to improvements in living conditions for most Third World peoples.” For Biggar, it introduced “order” to the non-Western world. And for many British historians, the British Empire was preeminent in achieving all this. As Niall Ferguson put it in his 2003 book Empire, “no organization in history has done more to promote the free movement of goods, capital and labour than the British Empire.… And no organization has done more to impose Western norms of law, order and governance around the world.”

It is an argument that confuses economic development and political liberalization, on the one hand, with colonialism, on the other. The British Empire began to take shape during the early seventeenth century, with the English settlement of North America and Caribbean islands, and the creation of corporations, such as the East India Company, to administer colonies and overseas trade. The origins of colonialism lie, in other words, in a time when Britain was still a feudal kingdom, with a parliament but little democracy, and when manufacture was dominated by the handloom rather than the factory.

If Britain could, over the next 250 years, transform itself from a backward, undemocratic state into a modern industrial power, why could not any of the nations it colonized have done so, too? Why assume that it was only colonization that allowed India or Ghana to develop?

One answer might be that the countries that Britain colonized were even more backward than Britain was at the time, and lacked the social and intellectual resources to transform themselves as Britain did. But the reality, at least in some of its colonies, was the opposite. Consider India. At the beginning of the eighteenth century, India’s share of the world economy was 23 percent, as large as all of Europe put together. By the time Britain left India, it had dropped to less than 4 percent. “The reason was simple,” argues Shashi Tharoor in his book Inglorious Empire. “India was governed for the benefit of Britain. Britain’s rise for two hundred years was financed by its depredations in India.” Britain, Tharoor argues, deliberately deindustrialized India, both through the physical destruction of workshops and machinery and the use of tariffs to promote British manufacture and strangle Indian industries.

William Vandivert/The LIFE Picture Collection/Getty Images: An old woman starving in the street during the Bengal Famine, India, 1943

It was not just India from which resources flowed back to Britain, though in different countries it happened in different ways. Britain’s West Indian colonies were at the heart of the “triangular trade” by which goods from Britain were used to purchase slaves from West Africa who were taken to the Caribbean, and from whose labor great riches flowed back to British merchants in Bristol, Liverpool, and London.

The historian Robin Blackburn notes that around 1770, total investments in the domestic British economy amounted to £4 million (about £500 million, or $700 million, in modern values). This investment “included the building of roads and canals, of wharves and harbors, of all new equipment needed by farmers and manufacturers, and of all the new ships sold to merchants in a period of one year.” Around the same time, profits from the slave trade and slave labor came to £3.8 million. Not all profit was reinvested but, suggests Blackburn, “slave-generated profits were large enough to have covered a quarter to a third of Britain’s overall investment needs.” Without the slave plantations, it is unlikely that Britain would have been able to industrialize, or to forge an empire, as it did.

What of democracy and liberalism? The Enlightenment helped transform the intellectual and moral culture of Europe in the eighteenth century, and laid the ground for modern ideas of equality and liberty. “All progressive, rationalist and humanist ideologies,” as the late Marxist historian Eric Hobsbawm put it, “are implicit in it, and indeed come out of it.”

But if the European Enlightenment was crucial to the development of progressive social ideals, European colonialism as a practice denied those ideals to the majority of people. It maintained slavery, suppressed democracy, and was rooted in a racialized view of the world. It was not colonialism but anticolonial movements that truly developed Enlightenment ideals. From the Haitian revolution of 1791, the first successful slave revolt in history, to the Quit India movement, to the liberation struggles of Southern Africa, the opponents of empire demanded that equality and liberty apply to them, too.

As the Martinique-born Algerian revolutionary Frantz Fanon wrote in his 1961 book The Wretched of the Earth, “All the elements of a solution to the great problems of humanity have, at different times, existed in European thought.” The problem was that “Europeans have not carried out in practice the mission which fell to them.” So, it was left to the anticolonial struggles to “start a new history of Man.”

But, respond defenders of empire, the “new history” created by anticolonial struggles has been disastrous. There is no gainsaying that, in the decades following independence, many former colonies descended into chaos and worse. The reasons are manifold, and partly lie, as Tharoor’s Inglorious Empire shows, in the policies enacted by the colonial powers themselves before independence and in the economic and political conditions imposed by Western powers after. The horrors of the postcolonial world seem, however, to have created an amnesia about the horrors of colonialism. Gilley, for instance, commenting on the current disorder in the Democratic Republic of Congo, suggests that “Maybe the Belgians should come back.”

Belgian colonialism instituted an almost unimaginable reign of barbarism and terror. King Leopold II laid claim to the “Congo Free State” as part of the European “Scramble for Africa” at the end of the nineteenth century. He was, he insisted, acting on humanitarian motives—to abolish the slave trade. In reality, Belgium waged a war of enslavement. Congo was transformed into a mass labor camp, in which the most brutal of punishments were inflicted for the most trivial of offenses. If villagers did not meet their rubber quota, their children’s hands and feet were sometimes chopped off. Between the 1870s and the 1920s, the population of the Congo fell by half—an estimated ten million people lost their lives to Leopold’s brutal regime. It was, observes the writer and historian Adam Hochschild in his book King Leopold’s Ghost, “a death toll of Holocaust dimensions.”

Supporters of the British Empire argue that its rule was far more benign than the terror of the Belgian Congo. That is not to place the moral bar very high. But here, too, there is considerable historical amnesia. From Tasmania, where a whole people were virtually wiped out for resisting British rule in the “Black War” of the 1820s and 1830s; to Jamaica, where the Morant Bay rebellion of 1865 led to a six-week rampage by British troops, during which more than four hundred people were killed and almost the same number summarily hanged; to Ireland and a history of bloody terror from Oliver Cromwell’s savage war of conquest in the seventeenth century to what the Irish historian Thomas Bartlett called the “universal rape, plunder and murder” wrought by British troops after the 1798 Irish rebellion, to the brutal acts of revenge exacted during the Irish War of Independence by the Black and Tans, a British-controlled paramilitary police force, in the 1920s; to India, where some three million died in the Bengal famine of 1942–1943, caused by the British decision to export rice, for use in the war theaters and for consumption in Britain, from a state that usually imported rice, and at a time of great local shortage—the experience of the “order” of the British Empire was cruel and ferocious. Even Niall Ferguson, in his paean to the British Empire, acknowledges that “when imperial authority was challenged… the British response was brutal.”

Hulton-Deutsch Collection/Corbis via Getty Images: British police, known as Black and Tans, searching a suspected Irish republican, Ireland, 1923

Perhaps the most egregious claim of the apologists is that the British Empire should be lauded because it helped end slavery; this was “one of the undoubted benefits of colonialism,” as Gilley puts it. Imagine an arsonist who burns down a building, killing many of its inhabitants. When the people inside try to flee, he forces them back. Eventually, after many hours of this, he decides to help the people, both inside and outside the building, who are trying to put the fire out. Would we say of this arsonist, “Yes, he may have burned the building down and killed dozens of people, but what really matters is that he threw some water over the fire at the end?” That is akin to the argument of the apologists for empire. 

In 1807, Britain passed a law banning the slave trade. But for three centuries, that trade had been dominated by Britain; three centuries of savage enslavement, pitiless brutality, and casual mass murder. Twelve million Africans are thought to have been transported to the Americas, half of them in the peak years of the Atlantic slave trade between 1690 and 1807. In those peak years, about half of these slaves were taken on British ships. Historians estimate that at least one in ten, and possibly one in five slaves, died on the Middle Passage, the journey from Africa to the New World. This suggests that half a million Africans may have lost their lives while being transported on British ships.

“It is important to remember,” the historian David Olusoga observes in his recent book Black and British: A Forgotten History, “how few voices were raised against slavery in Britain until the last quarter of the eighteenth century. The Church of England was largely silent on the issue, as were most of the politicians.” This was inevitable, he adds, because “too much money, too many livelihoods and too much political power were invested; millions of British people lived lives that were intimately connected to the economics of slavery and the sugar business.”

It was not the British Empire that began the struggle against enslavement, but slaves themselves, and radicals in Europe. When slaves rose up, the British response was savage, and not just in British colonies. In Haiti, Britain sent more than 20,000 men to try to take the island from the revolutionaries and make it a British colony. They, too, were humiliatingly defeated by the army of former slaves.

The reasons that led Britain to eventually ban the slave trade in 1807 are still debated by historians. There are many threads to the story: the growing social influence of both working-class radicalism and Christian moral evangelism; the decline in the political power of West Indian planters; the entrenchment of Enlightenment ideals of equality; the changing economics of plantation production; the strategic advantage that Britain now had in being able to police its imperial rivals, such as France and Spain, which were still involved in the slave trade.

Whatever the causes of the ban, the historian of slavery James Walvin has observed, “the discussion about British morality and sensibility in 1807 has served to obscure what went before. And what went before was not only important to Britain, but it was brutal on a scale which, even now, is scarcely credible.”

As Marika Sherwood shows in her book After Abolition, Britain continued to profit from the slave trade even after 1807. Britain, she writes, “not only continued to build slaving vessels, but it financed the trade, insured it, crewed some of it and probably even created the many national flags carried by the vessels to avoid condemnation.” Slavery flourished unhindered. It took another three decades before Britain ended slavery itself with the 1833 Abolition of Slavery Act; and even then, not throughout the empire. Not until the twentieth century was slavery legally abolished in colonies such as Burma, Nigeria, and Sierra Leone.

Even where the Abolition of Slavery Act did end slavery, it did not end slave-like conditions. Slave owners were handsomely compensated for the loss of their “property.” Some £20 million (about £16 billion, or $22 billion, in today’s values) was set aside by the British government to recompense 46,000 slave owners. Not only did the slaves themselves receive no reparation, but, under the Act, they were compelled to provide forty-five hours of unpaid labor each week for their former masters, for a further four to six years after their supposed liberation. “In effect,” writes David Olusoga, “the enslaved paid part of the bill for their own manumission.”

The suggestion that the British Empire was good because it ended slavery reflects historical amnesia of the most decadent kind. And yet, as the historian Katie Donington puts it, “the ‘moral capital’ of abolitionism” has continually provided “a means of redeeming Britain’s troubling colonial past.”

The arguments for the moral good of colonialism are, then, threadbare. Many academics are, however, concerned by demands for the retraction of Gilley’s paper, or for Oxford University to reassess Biggar’s Ethics and Empire project. A letter signed by a number of leading scholars, most of whom politically and intellectually disagree with Gilley and Biggar, expressed alarm at the “censorious attitudes and campaigns directed at Third World Quarterly.” The signatories saw it as part of a wider problem: that of “a rising tide of intolerance on campuses and in the academic profession, with certain scholars and students seeking to close down perspectives with which they disagree, rather than debating them openly.”

Critics of Gilley and Biggar protest that they are not calling “for the curtailing of the writer’s freedom of speech” but simply want to maintain academic “standards.” The question of academic rigor is important; nevertheless, the distinction between maintaining standards and demanding the suppression of what are regarded as morally or politically unacceptable opinions is a fine one. “Our default reaction to cases like this,” argues Justin Weinberg, an associate professor of philosophy at the University of South Carolina and the editor of Daily Nous, a popular philosophy blog, “should not be ‘retract!’ but rather, ‘rebut!’” If Gilley’s article is full of mistakes, he observes, “then the job of the experts is to point this out and help us learn from them, so people are less likely to make them again.”

The “rebut rather than retract” advice is particularly important because the issue of colonialism is a matter not just for academia but for the wider political culture. According to opinion polling, some 43 percent of Britons think that the British Empire was a “good thing” and 44 percent that British colonialism is “something to be proud of” (compared to 19 percent who think the empire was bad, and 21 percent who believe that colonialism is a matter for “regret”). Other polls have shown even greater support for British colonialism. The public support for colonialism reflects, at least in part, the lack of a full and proper debate on the issue. Against this background, the arguments of Gilley and Biggar may best be seen as an opportunity to have that debate, and to change public opinion, rather than to dismiss their claims as “shoddy” and “distorted,” even though they are.

Why has the debate erupted now? For many, the obvious answer seems to be Brexit. The desire to leave the European Union, critics insist, is little more than a form of nostalgia for a Britain that has gone, and for an empire that is no more. The “need for nostalgia” argument is, however, a recurring theme throughout postcolonial British history. “The continuing decline… and the meanness of spirit” of Thatcherite Britain, Salman Rushdie wrote in 1984, led many to yearn “nostalgically” for the days of empire, and to a “recrudescence of imperialist ideology and the popularity of Raj fictions,” such as the TV adaptations of Paul Scott’s The Jewel in the Crown and M.M. Kaye’s The Far Pavilions. More than thirty years on, the Indian news website The Wire, in a review of Bengal Shadows, a new documentary about the Bengal famine, writes similarly of “a Brexit-scarred Britain… increasingly showing signs of a growing nostalgia for its colonial past.”

There is certainly a strand of imperial nostalgia that has never quite disappeared in the national consciousness, and that keeps resurfacing, especially at times, as now, when Britain is searching for a sense of identity. It is one strand of the Brexit discussion, but hardly one that dominates the debate.

Today’s apologists for colonialism are driven as much by present needs as by past glories. In his Times article, Nigel Biggar referenced not empire nostalgia, but the lessons of empire for contemporary Western foreign intervention. “If we believe what strident anti-colonialists tell us,” he argued, “it will confirm us in the belief that the best way we can serve the world is by leaving it well alone.” A more “balanced view of empire,” on the other hand, would allow us to “think with care about how to intervene abroad successfully” and ensure that “we won’t simply abandon the world to its own devices.” What the British Empire tells us about Western interventions in Iraq and Afghanistan, Biggar suggested, is not that the interventions were wrong but that “successful intervention requires more, earlier.”

Similarly, Gilley uses his claims about colonialism to argue for the “abandonment of the myth of self-governing capacity.” He calls for Western institutions, international bodies, and multinational corporations to help create “good governance.” It is, in other words, a call for overriding democracy and installing technocratic forms of rule.

Foreign intervention and technocratic governance: these are very contemporary issues, and ones with which liberals wrestle as much as reactionaries. Liberals may despise empire nostalgia, but many promote arguments about intervention and governance that have their roots in an imperial worldview. We should not imagine that apologists for empire are simply living in the past. They seek, rather, to rewrite the past as a way of shaping current debates. That makes it even more important that their ideas and arguments are challenged openly and robustly.


Controlling the Chief

Mark Wilson/Getty Images: President Donald Trump and Defense Secretary James Mattis outside the White House on Inauguration Day, January 2017

It was August 2004, and the Iraqi insurgency was raging in Anbar province. Major General James “Mad Dog” Mattis of the Marines, who is now the Trump administration’s defense secretary, called a meeting with a group of religious leaders outside Fallujah. His division was coming under daily fire from both local militants and foreign terrorists associated with al-Qaeda’s affiliate in Iraq, and he hoped to persuade the leaders that it was misguided of them to encourage local young men to pick up rifles and shoot at American forces rather than trying to throw out al-Qaeda, whose bombings and beheadings were transforming their province into a hellscape.

“How could you send your worshipers, some of them young boys, against us when their real enemy is al-Qaeda?” Mattis asked them, according to the military analyst Mark Perry in his new book, The Pentagon’s Wars: The Military’s Undeclared War Against America’s Presidents, a history of high-level Pentagon decision-making and of relations between uniformed and civilian executive branch officials over the past quarter-century. Perry goes on to repeat further details about this meeting drawn from Bing West’s The Strongest Tribe: War, Politics, and the Endgame in Iraq (2008). When the religious leaders continued sipping their tea, Mattis shouted: “They’re kids…. Untrained, undisciplined teenagers. They don’t stand a chance.” Later, Mattis told West that the tribes at the time “only saw us as the enemy”; what was needed was for al-Qaeda’s militants to make mistakes and “expose themselves for what they were.”

Al-Qaeda’s tactics did eventually repulse Anbar’s tribes enough for them to band together in the so-called Sunni Awakening and drive the foreign terrorists out. The reprieve was temporary; the remnant of al-Qaeda’s Iraq affiliate would regenerate across the border in Syria, rebrand itself as the Islamic State, and sweep back into Iraq in 2014. But for a few years, Anbar became a somewhat less dysfunctional and dangerous place, as Mattis had thought it could be.

More than a decade later, Mattis, now in a civilian position, is once again trying to navigate a tricky and dangerous situation. Widely regarded as one of the “grown-ups” in the idiosyncratic Trump administration, he is among the striking number of military men with whom Trump has chosen to surround himself. Trump appointed another retired four-star general, John Kelly, as his homeland security secretary, then elevated him to White House chief of staff. He made an active-duty three-star general, H.R. McMaster, his national security adviser. McMaster’s chief of staff is a retired three-star general, Keith Kellogg, and many of the lower-level policy specialists gradually succeeding Obama-era holdovers at the National Security Council also have military backgrounds. As a result, the upper reaches of the executive branch, which felt at times like a law firm under Obama, are coming to resemble a command post.

This spreading militarization of the executive branch makes it timely to think about the experiences that have shaped the past generation of top Pentagon brass. The most important of Trump’s military men—along with General Joseph Dunford, the chairman of the Joint Chiefs of Staff, who was appointed to that position by Obama—share a remarkably narrow range of experience. Mattis, Kelly, and Dunford are all Marines, the smallest of the military services and the one that has the greatest reputation for cultural conservatism and for a warrior identity. They have also known one another and worked together for a long time: Kelly and Dunford even served in the unit that Mattis commanded in Anbar. While McMaster is an Army officer, he also came up fighting insurgents in the post–September 11 Muslim world.

Some of The Pentagon’s Wars’ most interesting passages focus on these men. Perry writes that observers might like to believe that their familiarity “with the terrible costs of war” would make them “unlikely to support the military interventions that had marred the terms of the four previous presidents.” But he is skeptical about this, pointing to various episodes in their backgrounds that suggest that “each of these four officers believed deeply in American military power—and in its ability to shape the international environment.” He adds:

Senior military commanders who knew and had served with Mattis, Kelly, Dunford, and McMaster now regularly reassured the press that the election of Trump would not lead to an upending of America’s traditional role as the enforcer of global stability. The war on terrorism would continue, the defense budget would be increased, the US military would be strengthened, and, as Donald Trump reassured the public, the United States “would start winning wars again.”

There is already a growing disparity between Trump’s rhetoric as a candidate and his foreign policy decisions. For all his campaign talk about being eager to “bomb the hell out of” the Islamic State, he ran for president as something of an isolationist—an opponent of military missions that did not put “America first”—and at times even portrayed himself as more dovish than Hillary Clinton. For instance, he criticized George W. Bush’s invasion of Iraq (ignoring the inconvenient truth that at the time, he had told Howard Stern that he supported it), for which Clinton had famously voted, and he denounced her advocacy of Obama’s ill-fated 2011 intervention in Libya. In a major foreign policy speech in April 2016, Trump said: “Unlike other candidates for the presidency, war and aggression will not be my first instinct. You cannot have a foreign policy without diplomacy. A superpower understands that caution and restraint are really truly signs of strength.”

In a series of moves starting just days after he took office, Trump has approved Mattis’s requests to give the military more freedom to attack Islamist militants at its own discretion, removing Obama-era constraints on drone strikes and commando raids in places like Yemen and Somalia. In April, after the Syrian government used chemical weapons against civilians, Trump ordered a punitive missile strike on an Assad regime air base without offering any public explanation of his rationale for how it complied with the constraints imposed by the international laws of war.

In August, Trump answered the military’s request for more troops in Afghanistan by authorizing a new surge of thousands of additional American forces there, winding back up a sixteen-year-old war that Obama had tried to wind down. In explaining his decision in televised remarks, Trump denied that the United States was resuming a nation-building mission in Afghanistan, saying instead that the forces were there only to kill terrorists. But he conceded that he had been talked into changing his position on troop levels out of fears that terrorists would flow into the vacuum left behind if American forces departed. “My original instinct was to pull out—and, historically, I like following my instincts,” he said. “But all my life I’ve heard that decisions are much different when you sit behind the desk in the Oval Office.”

In October, after the deaths of four American army soldiers in an ambush in Niger focused attention on a build-up of hundreds of American forces there, a deployment that began under Obama and expanded under Trump, Mattis testified that the administration had sent the troops to be trainers and advisers who would help that nation resist incursions by Islamic State fighters: “Why did President Obama send troops there? Why did President Trump send troops there? It’s because we sensed that as the physical caliphate is collapsing, the enemy is trying to move somewhere.” Mattis added, “We’re trying to build up the internal defenses of another country so they can do this job on their own.”

Perry explains that The Pentagon’s Wars “is not a recounting of America’s recent wars, but a narrative account of the politics of war—the story of civil-military relations from Operation Desert Storm to the rise of the Islamic State.” For those hoping that a skeptical, assertive military mindset is exactly what is needed to restrain the potential excesses of the Trump administration, Perry’s portrayal of many of the most important Pentagon leaders of the past generation may be disturbing. He deplores the “professional, and inbred, military establishment” of the post–cold war era, writing that all too often three- and four-star officers have been weak, ego-driven, and self-promoting, while rarely independent and outspoken enough to stand up to presidents who advance bad ideas.

The bad ideas Perry has in mind are protracted nation-building missions. He is critical of many of the United States’ military interventions abroad in places like Somalia, Haiti, Kosovo, Iraq, Afghanistan, and Libya, believing that they squandered the US’s position of strength at the end of the cold war. He acknowledges that civilian control of the military is a hallmark of our democracy. But he suggests that—confronted by a series of “political leaders whose vision of a world made safe by American arms, with nations rebuilt according to our ideals” was doomed to fail—senior military officers have too often fallen into line instead of offering unvarnished advice: they should have more aggressively “insisted that our civilian leaders question their assumptions or rethink their options.”

This sounds more like a tale of unduly supine generals than of a Pentagon that has been engaged in an undeclared war against America’s presidents. Indeed, The Pentagon’s Wars contains only a few episodes that live up to its provocative subtitle. In Perry’s account, the most striking instance in the last twenty-eight years of a senior military officer telling a president “no” came when General Colin Powell, then the chairman of the Joint Chiefs, rebelled against President Clinton’s attempt in 1993 to let gays and lesbians serve openly in the military. Powell’s revolt, resulting in the “Don’t Ask, Don’t Tell” compromise, was a “triumph” of sorts, Perry writes, because henceforth, “for the first time in history, the head of the military had a veto: Clinton believed he couldn’t successfully promote a military policy decision without his concurrence.” Yet Powell’s victory for the institutional clout of the military may have been pyrrhic: in Perry’s telling, the nation’s top civilian officials since then have “purposely named military officers they believed they could control” to influential positions, rewarding those who salute and agree and stifling those willing to express dissent.

Whatever one makes of Perry’s arguments, the analytical portions are his book’s most interesting and valuable component. But its factual portions have certain limitations. First, The Pentagon’s Wars contains very little new information. As the endnotes make clear, the essential details of most of the meetings and events it recounts are taken from previously published books and articles by journalists like Michael Gordon, Thomas Ricks, David Halberstam, and Bob Woodward, and from memoirs by retired generals and other former national security leaders. Although Perry conducted many interviews, a typical passage of his book will lay out a well-established episode based on material already put forward by others and then append a new minor detail or observation about it.

Second, the contributions of his Greek chorus of mostly anonymous sources are often jarringly gossipy. We are told, for instance, that Army General Norman Schwarzkopf, the leader of the first Gulf War, under pressure just before the start of the air campaign, was “almost whimpering” and “acting like a baby.” General Wesley Clark, who led the Clinton-era interventions in the Balkans as the head of European Command, was a “tireless self-promoter…who’d gotten ahead by rubbing shoulders with the right people, endlessly polishing his own credentials—and by his singular focus on himself.” Air Force General Richard Myers, the chairman of the Joint Chiefs during the Bush administration’s invasions of Afghanistan and Iraq, “never disagreed with his boss,” to the point that he was supposedly called “limp Dick” behind his back. While Perry writes that Mattis was a “fearless fighter and plain talker” as a battlefield commander, he also quotes an unnamed former subordinate Marine who describes Mattis as a back-slapper who displayed a need to “parade his masculinity” among lower-ranking troops by swapping stories about women they knew.

Whatever the value of this kind of snark, it’s not clear how much credence to give it. I took a closer look at one of the few bits of original reporting about one of the generals who went on to work for Trump and came away unconvinced that it was true. Perry recounts an acrimonious phone call between Mattis and Tom Donilon, then Obama’s national security adviser, in December 2012. At the time, Mattis was in charge of the US Central Command, which oversees military operations in the Middle East, and he was developing a reputation for favoring a more aggressive approach to curbing Iranian misbehavior than some of Obama’s civilian advisers. According to the book, Mattis unilaterally moved a carrier group closer to Iran, and Donilon, when he noticed, told Mattis to pull it back. But Donilon told me that neither the phone call nor the carrier group incident ever happened. I then talked to half a dozen former White House and Pentagon officials who were in a position to know about it, and none remembered the incident either.*

Finally, Perry made some puzzling choices about what to omit. The book largely ignores the unusual policy issues that arose after September 11 and led to recurring battles between civilian and uniformed officials in both the Bush and the Obama administrations. There is no discussion of Guantánamo or trying terrorists before military commissions, for example, and only a fleeting reference to torture amid a brief mention of the Abu Ghraib scandal.

The absence of these crucial issues in a history of recent civilian–military relations is glaring, and is all the more unfortunate because their inclusion would have helped to illuminate significant moves by Trump’s generals. For example, one of Mattis’s early accomplishments as Trump’s secretary of defense was to persuade the president to abandon his campaign promise to sanction the use of torture for interrogating suspected terrorists. Another example is Kelly’s decision, on his first day as Trump’s chief of staff, to send a draft executive order on detainee policy, which had been nearly ready for Trump’s signature, back to lower-level staffers across the various security agencies for reworking. The aim was that its final language might better lay the groundwork for putting Guantánamo, which Kelly had overseen during the Obama years as leader of the US Southern Command, to wider use.

In making sense of those moves, it helps to know that during the Bush administration, many (though not all) uniformed military leaders pushed back hard against civilian officials’ desire to bypass the Geneva Conventions when it came to abusive interrogations of wartime detainees. And it helps to know that during the Obama administration, foot-dragging Defense Department officials sometimes put up passive resistance to the White House’s policy of trying to empty Guantánamo. Including those civilian–military fights over policy during the war on terror would have better supported the book’s subtitle—and made it even more timely for the Trump era.

But there is ample material in The Pentagon’s Wars to raise an important question about the Trump administration: Are Trump’s generals cut from the same mold as their recent Pentagon colleagues, whom Perry sees as having been yes-men? There is some evidence to believe they are not. Even if the purported carrier group incident is dubious, it is widely agreed that Mattis’s hawkish views on Iran created tensions with the Obama administration. When he ran Southern Command, Kelly was also known for candidly saying what he believed even when it put him at odds with the White House, as when he blamed a major 2013 hunger strike at Guantánamo on Obama’s de facto abandonment of his effort to close the prison in his administration’s middle years. And McMaster, as Perry reminds us, wrote a book that became influential in military circles called Dereliction of Duty: Lyndon Johnson, Robert McNamara, the Joint Chiefs of Staff, and the Lies That Led to Vietnam (1997). In it, he condemned the Joint Chiefs for failing to forcefully question (or resign in protest against) Johnson’s disastrous escalation of the Vietnam War.

Brendan Smialowski/AFP/Getty Images: White House Chief of Staff John Kelly in the Oval Office, October 2017

Against that backdrop, one of the striking patterns during the first year of the Trump administration has been the repeated spectacle of these current and former military men contradicting or brushing off the commander-in-chief. Some of this has come in culture-war episodes, such as the debate over transgender troops. After Trump abruptly declared on Twitter that transgender people would no longer be permitted to serve in the American military “in any capacity,” Dunford said that transgender troops could remain unless and until some more formal directive arrived from the White House. When the Trump White House finally produced such a document, Mattis launched a lengthy study during which transgender troops would be permitted to keep serving. And Dunford later testified that his advice was that currently serving transgender troops should be permitted to stay in the military after all, raising the possibility that the president’s edict may fade away.

Another example came after the racially tinged violence surrounding the marches in Charlottesville sparked by the city’s plan to remove a statue of Robert E. Lee. After Trump prompted widespread outrage by equating the white supremacist protesters with anti-racist counterprotesters, Mattis gave a speech to troops that took a starkly different tone, urging the military to “just hold the line until our country gets back to understanding and respecting each other.”

Kelly complicates this pattern. In some ways he has looked like a restraining force, using his power as chief of staff to impose order on the chaotic White House by regimenting the flow of information to Trump and supporting the ouster of several of the most unconventional staffers of his administration, such as Anthony Scaramucci, the foulmouthed and short-lived White House communications director, and Stephen Bannon, the head of Breitbart News whom Trump had made his chief strategist. Those moves echoed a similar overhaul and purge at the National Security Council launched by McMaster last spring and summer, after he took over as Trump’s national security adviser.

But Kelly has also done things that looked more accommodating to his boss—and not just carrying out, as homeland security secretary, Trump’s first, poorly designed travel ban, which caused chaos before courts blocked it. In October, after a Democratic congresswoman criticized Trump over his apparent botching of a consolation call to the widow of one of the soldiers killed in Niger, Kelly came to the White House podium to invoke the combat death of his own son, while attacking the lawmaker as an “empty barrel” whom he had once witnessed crassly bragging, at the dedication ceremony for an FBI building in Miami named after two slain agents, about how she had used her access to President Obama to secure funding for the project. When video of the ceremony surfaced showing that Kelly’s accusation was false, he refused to apologize. His performance drew a rebuke from retired Admiral Mike Mullen, who served as chairman of the Joint Chiefs from 2007 to 2011, and who agreed with an interviewer in November that Kelly now appeared to be “all-in” on supporting Trump as a policy matter:

Certainly what happened very sadly a few weeks ago, when he was in a position to both defend the president in terms of what happened with the gold star family and then he ends up—and John ends up politicizing the death of his own son in the wars. It is indicative of the fact that he clearly is very supportive of the president no matter what. And that, that was really a sad moment for me.

Still, for all the turbulence of these culture-war moments, and for all the slow-burn escalation of counterterrorism deployments, the military men’s most important contribution may be in restraining full-scale war. This past summer, for example, in response to North Korea’s provocative testing of more powerful nuclear weapons and longer-range missiles, Trump declared that “talking is not the answer.” But Mattis contradicted him just a few minutes later, telling reporters: “We’re never out of diplomatic solutions.” Similarly, when Trump told reporters he might use the American military to intervene in Venezuela amid deepening unrest there, McMaster swiftly reassured the world that “no military options are anticipated in the near future.”

Mattis has also repeatedly reaffirmed America’s unconditional support of NATO after Trump called it into question, and said publicly that it was in America’s interest to stay in the Iran nuclear deal, even as Trump threatened to abrogate it. In November, Air Force General John Hyten, the commander of the US Strategic Command, which controls the United States’ nuclear weapons, told an audience at a security forum that if Trump gave him an “illegal” nuclear attack order, he would reject it and try to steer the commander-in-chief toward lawful options.

In short, Trump’s generals—some still in uniform, some now civilians—are clearly trying to mitigate turmoil and curb potential dangers. That may be at once reassuring and disturbing. In the United States, the armed forces are supposed to be apolitical. While the nation should be grateful in these troubled times that the military as an institution has remained loyal to constitutional values, Ned Price, a former CIA officer who served on the National Security Council under Obama, wrote in an essay in Lawfare that the military’s very act of contradicting or distancing itself from the president, even subtly, “goes against the grain of our democratic system and should engender at least fleeting discomfort among even the most virulent administration critics.” Thus, even if it is a good thing for now that the line between “civil and military affairs in American society” is getting a bit blurred, in the long run, Price warned, “that line must again become inviolable when our political class returns to its senses.” Or as Mullen, in a speech in October at the US Naval Institute, put it:

How did we get here to a point where we are depending on retired generals for the stability of our system? And what happens if that bulwark breaks, first of all? I have been in too many countries globally where the generals, if you will, gave great comfort to their citizens. That is not the United States of America.

The more immediate question, however, may be whether Trump’s generals will last through the Trump presidency. McMaster is a regular target of far-right media outlets that see him as a threat to Trump’s nationalist agenda and have urged the president to fire him. And in late August, Trump, who hates being managed, lashed out at Kelly, who later reportedly told other White House staffers that he had never been spoken to like that in his thirty-five years of public service and that he would not abide such treatment in the future. While Mattis so far has escaped Trump’s penchant for abusing his subordinates, it seems safe to predict that he too will eventually get on the mercurial president’s wrong side. In the meantime, it looks increasingly as though Mattis’s long-ago meeting outside Fallujah, asking over tea that the religious leaders of Anbar province see reason, foreshadowed his final mission: trying, on a far larger scale, to keep things from spinning out of control at home.

* Others who told me they thought this story was false included Jeremy Bash, the chief of staff to former secretary of defense Leon Panetta; James Miller, the former undersecretary of defense for policy; and John Allen, a now-retired Marine general who was formerly Mattis’s deputy commander at Centcom. I received no answer when I reached out to Mattis through the Pentagon press office.
