Monthly archives: April 2016

A Revolutionary Discovery in China

An eighteenth-century painting showing Emperor Qin Shi Huang of the Qin dynasty ‘burning all the books and throwing scholars into a ravine’ in order to stamp out ideological nonconformity after the unification of China in 221 BCE. ‘For over two millennia,’ Ian Johnson writes, ‘all our knowledge of China’s great philosophical schools was limited to texts revised after the Qin unification.’ Now a trove of recently discovered ancient documents, written on strips of bamboo, ‘is helping to reshape our understanding of China’s contentious past.’ Illustration from Henri Bertin’s album The History of the Lives of the Chinese Emperors.
Bibliothèque Nationale de France/RMN-Grand Palais/Art Resource


As Beijing prepared to host the 2008 Olympics, a small drama was unfolding in Hong Kong. Two years earlier, middlemen had come into possession of a batch of waterlogged manuscripts that had been unearthed by tomb robbers in south-central China. The documents had been smuggled to Hong Kong and were lying in a vault, waiting for a buyer.

Universities and museums around the Chinese world were interested but reluctant to buy. The documents were written on hundreds of strips of bamboo, about the size of chopsticks, that seemed to date from 2,500 years ago, a time of intense intellectual ferment that gave rise to China’s greatest schools of thought. But their authenticity was in doubt, as were the ethics of buying looted goods. Then, in July, an anonymous graduate of Tsinghua University stepped in, bought the soggy stack, and shipped it back to his alma mater in Beijing.

University administrators acted boldly. They appointed China’s most famous historian, seventy-five-year-old Li Xueqin, to head a team of experts to study the strips. On July 17, the researchers gingerly placed the slips in enamel basins filled with water, hoping to duplicate the environment that had allowed the fibrous material to survive so long.

The next day, disaster struck. Horrified team members noticed that the thin strips had already started developing black spots—fungus that within a day could eat a hole through the bamboo. Administrators convened a crisis meeting, and ordered the school’s top chemistry professors to save the slips.

Over the following weeks, the scientists worked nonstop through the eerily empty campus—the students were on vacation, and everyone else was focused on the Olympic Green just a few miles east. With the nation on high alert for the games, security officers blocked the scientists from bringing stabilizing chemicals into the locked-down capital. But the university again put its weight behind the project, convincing leaders that the strips were a national priority. By the end of the summer, Professor Li and his team had won their prize: a trove of documents that is helping to reshape our understanding of China’s contentious past.

Until now, this revolution has largely been confined to paleography—the deciphering of ancient texts. But its importance is slowly spilling into other disciplines, thanks to the bamboo slips now at Tsinghua University and to several other caches excavated in the past two decades. Although their significance is not widely understood in the West, in China these texts have already aroused popular interest, with newspapers and television reporting on them as they are edited and published.1 And yet their implications are so profound that they will take decades to digest.

The manuscripts’ importance stems from their particular antiquity. Carbon dating places their burial at about 300 BCE. This was the height of the Warring States Period, an era of turmoil that ran from the fifth to the third centuries BCE. During this time, the Hundred Schools of Thought arose, including Confucianism, which concerns hierarchical relationships and obligations in society; Daoism (or Taoism), and its search to unify with the primordial force called Dao (or Tao); Legalism, which advocated strict adherence to laws; and Mohism, and its egalitarian ideas of impartiality. These ideas underpinned Chinese society and politics for two thousand years, and even now are touted by the government of Xi Jinping as pillars of the one-party state.2

The newly discovered texts challenge long-held certainties about this era. Chinese political thought as exemplified by Confucius allowed for meritocracy among officials, eventually leading to the famous examination system on which China’s imperial bureaucracy was founded. But the texts show that some philosophers believed that rulers should also be chosen on merit, not birth—radically different from the hereditary dynasties that came to dominate Chinese history. The texts also show a world in which magic and divination, even in the supposedly secular world of Confucius, played a much larger part than has been realized. And instead of an age in which sages neatly espoused discrete schools of philosophy, we now see a more fluid, dynamic world of vigorously competing views—the sort of robust exchange of ideas rarely prominent in subsequent eras.

These competing ideas were lost after China was unified in 221 BCE under the Qin, China’s first dynasty. In one of the most traumatic episodes from China’s past, the first Qin emperor tried to stamp out ideological nonconformity by burning books (see illustration on this page). Modern historians question how many books really were burned. (More works probably were lost to imperial editing projects that recopied the bamboo texts onto newer technologies like silk and, later, paper in a newly standardized form of Chinese writing.) But the fact is that for over two millennia all our knowledge of China’s great philosophical schools was limited to texts revised after the Qin unification. Earlier versions and competing ideas were lost—until now.


When China’s imperial system collapsed in the early twentieth century, iconoclasts used the lack of ancient texts to question everything about China’s past. Led by one of China’s most influential historians of the twentieth century, Gu Jiegang, this “doubt antiquity” (yigu) movement cast aspersions on the received history that Chinese had learned for millennia, from the existence of its first dynasties to the uniformity of the great philosophical texts.3 For Gu Jiegang and his allies, Chinese history was much like the West’s, founded in myth and oral traditions that only slowly evolved into written works at a much later date. These were plausible theses, but Gu had no archaeological evidence to back his ideas, instead relying on close readings of the transmitted texts to find inconsistencies.

In China, this skeptical approach to the past was displaced by the Communist victory in 1949. History was interpreted through another unbending lens: Marxism’s rigid eras of primitive, slave, feudal, and capitalist societies that would culminate in communism. Although this schema can still be seen in some Chinese museums, few people have truly believed in it since the disasters of the Mao era discredited Communist ideology. But over the years since then, as China struggles to create a new identity, a “believe in antiquity” (xingu) movement has slowly taken hold—one promoted today by the Communist Party, which idealizes a neat past of filial piety and harmony, a harmless, fairy-tale world, anesthetized and dull.4

The discovery of ancient texts has begun to challenge these simplistic positions. In 1993, tomb robbers were thwarted in the village of Guodian, in central China’s Hubei province. Archaeologists stepped in and found eight hundred bamboo slips. The next year, 1,200 slips were smuggled to Hong Kong and bought by the Shanghai Museum. The Tsinghua strips followed in 2008, numbering nearly two thousand full slips (the final number is in flux as fragments are being pieced together). All three finds likely came from the same region of China near the Yangtze that used to be occupied by the state of Chu. Carbon dating shows that all three were buried around 300 BCE, right around the time that Mencius, Confucius’s most important successor, died.

These are not China’s oldest writings. Chinese characters first appeared on “oracle bones”—tortoise shells that were used for divination, mainly in the Shang dynasty (circa 1600–1050 BCE). They are useful for understanding that era, but the core texts of Chinese civilization came later. They were written on bamboo or wood strips that could be bound with string and rolled up, allowing for the creation of complex works of legend, philosophy, and history.

These are not easy manuscripts to decipher. They contain many irregular characters, leading paleographers to debate the exact meaning of important passages. The Tsinghua texts, for example, are being issued in volumes with a version agreed upon by Professor Li’s team but also with dissenting views. (Only about a third of the Tsinghua slips have been published, with one volume released each year. Another ten are projected.)

Academics in China have responded with thousands of books and articles, discussing every detail of the new texts. Western scholars have joined in a bit more slowly. But, perhaps with the benefit of distance, they are drawing broader and more provocative conclusions.5 One example is The Bamboo Texts of Guodian, an epic, 1,200-page annotation and translation of all eight hundred slips from Guodian by Scott Cook of Yale-NUS College in Singapore.6 This is the most complete rendering of the Guodian discovery in any language, including Chinese, and is an example of the sort of cross-cultural work now possible among paleographers who share their ideas and views on blogs and in chatrooms.

Most notable among the Guodian texts is a version of the Daoist classic, Laozi’s Daodejing (better known in the West under its older romanization as Lao-tzu’s Tao Te Ching, or “The Way and Its Power”). Cook writes that the discoveries at least partly confirm traditional views of the antiquity of the Daodejing, a hotly debated subject for the past century, especially in the West.

This is because antiquity doubters like Gu influenced many of the West’s most important sinologists of the twentieth century. In his highly influential 1963 Penguin translation of the Daodejing, for example, D.C. Lau arbitrarily adds 196 subheadings to the text, arguing that these were independent sayings with only a “tenuous” connection to each other, and were collected only much later, in a haphazard manner. The newly excavated texts, however, show that at least large chunks of the Daodejing were circulating in China in the Warring States Period. Some scholars, like Cook and Li, believe that the full text existed at that point.

Bamboo slips from the Tang Yu Zhi Dao, an ancient text from the Guodian excavation advocating ‘rule by the most meritorious’ and ‘abdication as a means of succession.’ According to Sarah Allan in Buried Ideas, it ‘reflects ideas that were current in the late fifth and early fourth centuries BCE.’

But the most revolutionary implications of these texts are political. In her 1981 book The Heir and the Sage, the Dartmouth paleographer Sarah Allan presciently described the central part that abdication played in ancient Chinese thought. As elsewhere in the world at this time, philosophers were grappling with the best way to organize and lead states. Should one be loyal to one’s family (and thus support hereditary rule) or should one put the best interests of the state (and the people) first, and accept that the best person should run the land?

One school of Chinese thought, Mohism, advocated meritocracy in the appointment of officials, while other schools referred to ancient kings who handed rule to sagacious ministers rather than their sons. These legends of abdication were also incorporated in the work of Mencius. He accepted hereditary rule, but with the caveat that if a ruler was very bad, then the people could abandon him and the “mandate of heaven” could pass to a better ruler. But overall the transmitted texts support hereditary rule; revolt was meant to be a measure of last resort to depose a tyrant or a grossly incompetent ruler. Abdication was consigned to the primordial past.

The new texts reveal that other philosophers took much more radical positions. Some contend that every ruler should eventually abdicate in favor of the most talented person in the land. What is more, at least one of the new texts is explicitly Confucian in origin, forcing us to revise our view of that school of thought. Sarah Allan makes the case for this in her bold new book, Buried Ideas: Legends of Abdication and Ideal Government in Early Chinese Bamboo-Slip Manuscripts. This is a translation of four texts, all previously unknown, all firmly in the intellectual world of the followers of Confucius, but all arguing strenuously in favor of meritocracy—even for rulers.

One of the texts, Tang Yu Zhi Dao, or The Way of Tang Yao and Yu Shun, from the Guodian excavation, is a philosophical discourse based on a well-known myth of King Yao yielding power not to his evil son, but to his wise minister, Shun. But instead of being presented as an exception, as it is in the previously known transmitted texts, the story is told as a model for all rulers in every era: “To abdicate and not monopolize is the fullest expression of sagehood.”

Allan also analyzes two texts from the collection in the Shanghai Museum. One is a meritocratic discussion in the form of questions and answers between Confucius and his disciple Zigao. In the known Confucian texts that contain stories of Confucius’s life—primarily the Analects and the Mencius—Zigao is described as a ne’er-do-well and a marginal figure. But here he is a welcome interlocutor of the sage and asks him about the issue of abdication, which Confucius supports—a view never attributed to Confucius before. This text also shows Confucius discussing esoteric issues such as divine insemination and miraculous birth—a direct challenge to the conventional view of Confucius as a secularist who, as the Analects put it, “did not talk about uncanny events, feats of strength, disorders, or spirits.”

The other text from Shanghai, the Rongchengshi, is a long narrative describing an idealized time when all served according to their ability, not according to their birth. The final text, the Bao xun from the Tsinghua collection, is an instruction from a king to his son, reminding him that abdication is a high ideal.

In her lucid introduction and conclusion, Allan cautions that these texts do not form a coherent philosophical school.7 But they do make references to people and arguments found in the extant canon, making it plausible that Mencius’s “mandate of heaven” was a direct response to these kinds of writings, a kind of compromise that protected hereditary rule by incorporating some of the meritocrats’ views. Such stories, Allan writes,

served to promote abdication as an alternative to hereditary rule. This paradigm of abdication is the only alternative to the idea of dynastic cycle found in the Chinese tradition and it did not survive the Qin and Han dynasties as an idea for an alternative form of succession.

Do these old texts matter today? They do in several ways. One has to do with the antiquity of China’s written culture. In the West, many classic texts, for example the poems of Homer or stories in the Bible, are widely accepted as having been oral works that were later written down—a view of history picked up by Gu and the antiquity-doubters of the early twentieth century. Even though Gu was sidelined in China, his heirs in the West have tended to dismiss traditional views that important works in China were written down early on, or even composed as written texts. For many of these skeptical Westerners, Chinese efforts to prove the antiquity of their culture are closet chauvinism, or part of a project to glorify the Chinese state by exaggerating the antiquity of Chinese civilization.

But the new discoveries should give pause to this skepticism. Allan argues that the texts were indeed composed in writing, not transcribed from oral tales. Besides the Daodejing, only a few of the texts excavated over the past twenty years have mnemonic devices or rhyme. She writes that even the texts that claim to be speeches of ancient kings originated as literary compositions. And as the Guodian texts show, works like the Daodejing took written form earlier than skeptics believed, possibly even as early as the traditionalists have always claimed.

Another point of contention is the scope of ancient Chinese civilization. In traditional history, China was seen as stretching from Beijing in the north to Guangdong in the south, and from its coastline to today’s Sichuan province. Some Westerners have dismissed this, saying that it did not make sense to speak of “China” before the Qin unified China in the third century BCE. Instead, they argue, the states that existed before the Qin should be viewed as separate cultures. But the discovery of these manuscripts at least partially backs the traditionalists. They are almost certainly from the southern state of Chu, which had one of the most distinctive local cultures. The implication is that the various kingdoms engaged in the same philosophical debates and discussions, perhaps making it defensible to speak of one greater cultural area.

‘Emperor Qin Shi Huang traveling in a palanquin,’ late third century BCE; eighteenth-century painting from The History of the Lives of the Chinese Emperors
Bibliothèque Nationale de France/Bridgeman-Giraudon/Art Resource

And yet the texts also challenge the traditionalists. Even today, Mencius’s “mandate of heaven” is essentially the argument used by the Communist Party to justify its rule: the Kuomintang had become corrupt and ineffective, thus the Communists were justified in usurping power. The Party’s continued rule is likewise justified by China’s economic development, which proves heaven’s support (“history’s judgment,” in Communist parlance). But true to Chinese tradition, the Party makes clear that its rule is hereditary. This is true not only broadly in the sense that other parties cannot take power, but narrowly in the creation of a quasi-hereditary class that has coalesced around “red” families that helped found the Communist state. The old texts, however, show that even in ancient China, a significant group of writers disapproved of such practices, arguing for rule based purely on merit rather than membership in a group.

Without viewing the past too much through today’s lens, one can also see other intriguing parallels to contemporary society. Back in the Warring States Period, rising literacy and urbanization gave rise to a class of gentlemen scholars, or shi, who advised kings; some thought that they might be better qualified than the person born to the throne—the origins of the meritocracy argument. Today, similar trends are at work, but on a much broader scale. Now, instead of a scholar class that wants a say, it is the entire population.

One might even say that the excavated texts show a more freewheeling society than today’s. Here we encounter a past that was home to vigorous debates—a place where Confucians approved of kings abdicating, and might even have fancied themselves capable of ruling. Today’s China also has such ideas, but like the bamboo slips before their discovery, they are buried and their excavation taboo.


The main offices of China’s National Archives are located just north of the Forbidden City in a small campus of 1950s-era buildings. They date from a brief period when the new People’s Republic was trying to develop its own architectural style, and several famous architects came up with a hybrid form: the main structure made of brick, not wood, allowing it to be much bigger and higher, but the roofs tiled and the eaves curved. At the time, many condemned the style as a pastiche, but these are among the few buildings in the capital that bridge the past with the present.

I went there in 2014 to hear a public talk by Liu Guozhong, one of Professor Li’s chief deputies. Almost every year, the team gives a public update on its work; as one of its younger and more dynamic members, Liu is often given the task. A southerner with a heavy Hunan accent, Professor Liu is humorous and engaging, and spoke for ninety minutes without notes.

Liu told a crowd of about one hundred the story of how he and other members of the team saved the strips from rot in 2008. He showed pictures of how the strips were now being held in trays in a dark room and how the university was building a museum and research center. Then he outlined some of the new texts soon to be released: a chart for multiplying and dividing large numbers, as well as new books of divination.

Liu spoke carefully and avoided grand conclusions. In person at their offices or at international conferences that they organized, team members speak freely, but their writings and comments are focused on very specific issues. At one conference, I sat next to Sarah Allan, who noticed the same thing.

“I don’t know if it’s especially Chinese, or a result of the past decades [of political turmoil], but people often don’t try to make bigger conclusions,” she told me. “They write the papers and do the research with the big picture in their head, but rarely write it down.”

And yet many people do seem to get the implications. Paleography is a popular field, attracting some of the best young Chinese academics. When I asked Professor Liu about this, he told me that up until the 1970s, “We had these classics like the Shangshu [the Ancient Documents], and for two thousand years they didn’t change. Now we can see them before that and the texts are different!”

At his lecture, Professor Liu said the work will keep him busy until he retires, adding: “But then you and others will be debating this for the rest of this century.” He then concluded and bowed to the audience. People rushed the stage, peppering the young academic with questions. There was a man from the I Ching Research Society asking how they should treat the new texts on divination. A journalist asked about a chart that could be used as a calculator. A graduate student from Peking University eagerly asked about the political implications of abdication. Professor Liu answered them all, while handing out name cards. When the last of his stack was gone, people began to pass them around, snapping photos of his card with cell phones.

The room was lit now only by the dim winter sun. The guards at the back waited to lock the door, but the crowd wouldn’t let Professor Liu leave. For them, he held a key to the present: the past.

1. Two exceptions are Francesco Sisci, “Ancient Texts Uncover Meritocracy Debates,” Asia Times, June 25, 2013; and Didi Kirsten Tatlow, “Rare Record of Chinese Classics Discovered,” The New York Times, July 10, 2013.

2. See, for example, Jeremy Page, “Why China Is Turning Back to Confucius,” The Wall Street Journal, September 20, 2015.

3. The best Western treatment of Gu’s life remains Laurence A. Schneider’s Ku Chieh-kang and China’s New History: Nationalism and the Quest for Alternative Traditions (University of California Press, 1971), although another version is badly needed to include Gu’s persecution in the Cultural Revolution.

4. See my article “Old Dream for a New China,” with photos of propaganda posters, NYR Daily, October 15, 2013.

5. See Dirk Meyer, Philosophy on Bamboo: Text and the Production of Meaning in Early China (Brill, 2011); Matthias L. Richter, The Embodied Text: Establishing Textual Identity in Early Chinese Manuscripts (Brill, 2013); Edward L. Shaughnessy, Unearthing the Changes: Recently Discovered Manuscripts of the Yi Jing (I Ching) and Related Texts (Columbia University Press, 2014); and Wang Zhongjiang, Daoism Excavated: Cosmos and Humanity in Early Manuscripts (Three Pines Press, 2015).

6. Cornell East Asia Program, 2012.

7. A view shared by Dirk Meyer in his Philosophy on Bamboo. Meyer uses the bamboo strips to posit that writers on bamboo created two styles of writing: argument-based and context-based. The former argued in favor of very specific points but could not survive because they clashed with subsequent dynasties’ views. Context-based writing was more open to interpretation and survived in the canon.


Has Obama Upheld the Law?

President Obama and John Brennan, then assistant to the president for homeland security and counterterrorism, on a conference call about the situation in Libya, Martha’s Vineyard, August 2011
Pete Souza/White House


Over the past decade, Charlie Savage has become an indispensable reporter of US counterterrorism. He won the 2007 Pulitzer Prize for his articles on presidential signing statements, an obscure legal device the George W. Bush administration frequently used to object to provisions of legislation that in its eyes encroached on the president’s power. Savage’s first book, Takeover, was a broad study of executive overreach by the Bush administration.1 In Power Wars: Inside Obama’s Post-9/11 Presidency, Savage pursues the same themes in the Obama administration. Power Wars is a long and comprehensive book, covering in intricate detail nearly every major issue in Obama’s national security policy: detainees, military commissions, torture, surveillance, secrecy, targeted killings, and war powers. Its behind-the-scenes story will likely stand as the definitive record of Obama’s approach to law and national security.

Savage offers a distinctly lawyer’s-eye view of his topics. The major participants in his story are not politicians and planners, but top lawyers in the White House, the Justice Department, and the security apparatus—what Savage calls the “national security legal-policy team.” They work mostly behind the scenes, veiled by lawyer–client confidentiality; no more than half a dozen of the more than fifty lawyers described in the book have names that readers are likely to recognize. Their domain is the arcane network of laws that constrain the president as he wages what continues to be, at least in US eyes, an endless war against al-Qaeda and its offshoots.

One reviewer has complained that much of Power Wars will interest national security professionals and law students but not a broader audience.2 That misses the real importance of Savage’s work. His main interest is presidential power in its perennial struggle with Congress and the courts. Ultimately, the stakes are high: whether we will continue to have, in John Adams’s words, “government of laws, and not of men.”3

Savage focuses on executive branch lawyers because they play a central part in restraining the presidency—or not restraining it, as the case may be. Lawyers are trained to pose two questions whenever they interpret laws: What did the legislature mean, and what would the courts say? Metaphorically speaking, Congress and the courts have automatic stakes in the deliberations of any competent lawyer, including lawyers for the executive branch. And if the president’s lawyers tell him that a policy is illegal, he will have a hard time carrying it out. Even an adventurous president willing to say “Do it anyway!” would find too many other officials who won’t sign off on an illegal action. If, on the other hand, the lawyers are yes-men and women, the result may be an “executive unbound.”4 John Ashcroft nicknamed Bush administration lawyer John Yoo, of torture memo fame, “Dr. Yes.” It was not a compliment.

Savage delves deeply into the administration’s legal debates about the War Powers Act during the Libya intervention in 2011, as the campaign approached the sixty-day deadline after which the act requires presidents to get approval from Congress. But he never touches on the decisions to launch the intervention in the first place and then escalate it into a de facto (even if unacknowledged) war to bring about regime change. This was surely one of the biggest blunders of Obama’s foreign policy: the campaign left Libya a fractured and chaotic haven for terrorists, and the escalation probably doomed any future UN-backed humanitarian interventions. Here, focusing on the lawyers rather than the planners seems mistaken.

Even in the Libya case, though, the legal story matters. Savage depicts lawyers scrambling for a legal theory to avoid bringing Libya before a skeptical Congress at a moment when France and the UK were counting on US military help. Tellingly, the lawyers didn’t think the solution they eventually came up with was the best reading of the law, merely that it was “legally available”—a dubious category that apparently means little more than “not laughably off the wall.” It was the closest they ever came to acting as Dr. Yes. Readers might well wonder why legal arguments that are not right but merely “available” deserve any of the respect we accord to law. And one can’t help wondering whether some congressional skepticism was exactly what was needed.

On other issues the legal story clearly deserves emphasis, because the contours of the law critically shaped the policy. This is most notable in decisions about who can or cannot be targeted for drone strikes. Lethal drones are weapons of war, not of law enforcement, so the questions of where and with whom the United States is at war are decisive. Obama’s lawyers interpreted Congress’s 2001 authorization to use military force to apply not only to al-Qaeda but also to “associated forces”; but not every jihadist group is associated with al-Qaeda. In 2010, Jeh Johnson, then general counsel of the Defense Department, concluded that al-Shabaab in Somalia was not an associated force. He “stunned his Pentagon colleagues” by countermanding a strike that Special Operations Forces wished to launch against Shabaab militants. This was an example where the law made a real difference.


Barack Obama took office promising to end the perceived lawlessness of the Bush administration, manifested most vividly in the torture of detainees at CIA black sites. True to his promise, on his third day in office Obama issued an executive order ending torture and revoking all the Bush administration legal opinions that authorized it. His supporters were elated.

Three weeks later, though, Obama’s lawyers entered a California courtroom and, to the surprise of the judges, defended one of the Bush administration’s most aggressive tactics, the invocation of state secrets, in order to dismiss a lawsuit that might expose ugly facts about CIA cooperation with other nations in the program of rendition and torture. Much of the cheering stopped. As it happened, Savage tells us, nobody had informed the president about the Justice Department’s decision to maintain the Bush legal position, and Obama was furious. But evidently he got over his fury, for the Justice Department continued to assert the state secrets defense in that case and all other pending cases where the Bush administration had invoked it.

Gradually, other points of continuity between the national security policies of Bush’s second term and the Obama administration became evident. Obama revived the military commissions charged with conducting trials of detainees. He continued to classify the struggle against al-Qaeda as a war to be fought under military rules. He also increased drone strikes, maintained the NSA’s secret surveillance programs, and prosecuted whistleblowers with greater zeal than any administration in history. What happened?

Savage offers an explanation. Obama’s supporters on the left thought he was a civil libertarian, but they were wrong. What Obama cares about, Savage argues, is not civil liberties, but the rule of law. Confusing the two is understandable, for both are central to American constitutionalism. But they are fundamentally different. Civil liberties are substantive rights; the rule of law is about legal legitimacy. What Obama’s team aimed to do was provide a firm legal foundation for his policies, including those that civil libertarians oppose—policies like preventive detention, targeted killings, and extensive surveillance.

Savage’s diagnosis rings true in many ways. Again and again, we read about the Obama lawyers agonizing over legal arguments and sweating to get them right. That’s the rule of law at work. Obama and his team wanted, as a matter of principle, to avoid Bush’s swashbuckling use of the commander in chief’s authority, and they sometimes bent over backward to find well-grounded alternatives.

Mostly they found them. Savage quotes CIA Director (and former Deputy Homeland Security Adviser) John Brennan: “I have never found a case that our legal authorities, or legal interpretations that came out from that lawyers group, prevented us from doing something that we thought was in the best interest of the United States to do.” Savage raises a very good question, though:

If the end result was often the same—the president can do something specific he wants to do—does it make a difference if his lawyers got there by tossing off a five-page memo or by agonizing through a hundred-page memo?

The answer to this seemingly rhetorical question is yes. Taking careful account of legal process in making decisions rules out the most extreme and patently illegal options long before officials ever get to the stage of formally asking for a legal memo. Meeting legal standards disciplines and shapes official debates over what actually is in the national interest. That makes it less surprising that the option ultimately chosen usually gets the approval of legal advisers. The more cynical answer that the legal process makes only a fig leaf’s difference is largely untrue.

However, Savage himself reports plenty of cases that don’t fit his neat dichotomy between civil liberties and the rule of law. For one thing, civil liberties concerns did drive some important decisions. The administration took enormous political flak for granting Miranda rights to Umar Abdulmutallab, the “underwear bomber”—a civil liberties victory that also turned out to be an intelligence success, since Abdulmutallab continued volunteering information after being informed of his right to remain silent. The administration’s unsuccessful efforts to try the September 11 suspects in civilian court rather than in the shambolic military commissions were at least partly based on respect for civil liberties.

In a few striking cases, however, the administration took positions that violated both civil liberties and the rule of law. Chief among these are Obama’s refusal to hold torturers accountable, his plans for Guantánamo, and his administration’s use of secret law, coupled with unprecedented vigilance in going after leakers. These may be exceptions rather than the rule, but they are exceptions that matter.


The first and most obvious deviation from the rule of law was Obama’s decision not to pursue accountability for the Bush administration’s adventures in torture. It’s understandable that Obama would shy away from even the thought of investigating his predecessors. The investigations would have derailed everything else on the president’s agenda in order to pursue a matter on which many Americans believe Bush was right. With Bush boasting in his memoirs about enhanced interrogation, and Dick Cheney accusing Obama of making America less safe, investigations would have reinforced the self-fulfilling Republican claims that the torture debate is a partisan issue, rather than what it really is: a rule of law and human rights issue.

But what about accountability for those below the top leaders? It came to nothing. Attorney General Eric Holder appointed a prosecutor to look into the hundred cases where interrogators went even further than the torture memos permitted, but in the end he dropped them all. And when the Justice Department’s internal ethics watchdog recommended professional discipline for two lawyers who wrote the torture memos, a DOJ official downgraded the finding of unethical behavior to mere bad judgment. Lawsuits against officials and contractors involved in the torture program were either thrown out because of the state secrets defense or dismissed on grounds of official immunity.

Obama avoided even lesser forms of accountability. No CIA official was fired or demoted for participating in the rendition and torture programs; whatever his own views on accountability, Obama did not want to “lose” the CIA. Early in the Obama administration, some voices called for a truth commission to investigate Bush-era torture. Among others, Admiral Mike Mullen, then chairman of the Joint Chiefs of Staff, favored such an “after-action report” (the phrase “truth commission” was toxic); and after a White House lawyer drafted a proposal to establish a bipartisan commission, most members of Obama’s national security cabinet seemed ready to support it. But the president rejected the proposal, and “one by one, everyone who had liked the idea ten minutes earlier reversed course and told Obama he was right.” That, Savage shows, was the end of accountability.5

Obama’s decision to “look forward, not back” was politically understandable, but it is an obvious blow to civil liberties and human rights: without accountability for past abuses, deterrence of future abuse is undermined. The decision is equally an affront to the rule of law—official impunity is, always and everywhere, the archenemy of the rule of law.

Camp X-Ray, where the first prisoners from the war in Afghanistan were held, Guantánamo Bay, Cuba, March 2002
Lori Grinker/Contact Press Images


At one point, Guantánamo held 780 prisoners, but the Bush administration eventually transferred more than five hundred to other countries, including their home countries. When Obama took office, 242 prisoners remained. Some were already cleared for release, or likely to be cleared. Others, like the September 11 suspects, were slated for criminal trials. That left a third group who were deemed too dangerous to release but could not be tried, either because they had committed no crime or because the evidence against them was too thin or too tainted by torture. All were labeled “law-of-war” detainees, on the theory that they are analogous to POWs: captured enemy fighters who can be held for the duration of hostilities. (They do not have formal POW status, which is reserved for regular members of military forces and militias associated with them.)

Releasing those cleared by review boards or courts would be an obvious victory for civil liberties and human rights, most strikingly in the case of the Uighur prisoners—Chinese Muslims who had never been anti-American but who could not be returned to China for fear of torture and other punishment. Early on, the Obama administration hoped to resettle some of the Uighurs in the US, but it retreated in the face of Republican outrage at the very thought of the government importing “terrorists” to US territory. Savage criticizes the administration’s spineless decision, which gave a signal to congressional Republicans that Obama “could be pushed around on Guantánamo issues.” (The Uighurs were eventually resettled elsewhere.)

In fact, though, there had been no intention of closing Guantánamo on grounds of civil liberties. All Obama proposed was moving the detainees from Guantánamo to the Thomson Correctional Center in Illinois. Literally, that would “close Guantánamo,” but with no effect on the liberties of its inmates other than moving them to the harsh conditions of a maximum security prison. Even that dubious plan was derailed by the wave of terrorism panic following the failed attack on an airplane by the passenger Abdulmutallab.

The plan also had nothing to do with defending the rule of law. On the contrary: it would violate the principle of “no punishment without a crime,” the most basic of all the principles of rule of law. The Thomson plan would have moved men never convicted of a crime from a prison camp to a punishment facility. Guantánamo is still a grim place, but over the years conditions for its inmates have improved greatly; it seems likely that Thomson would be worse. Sanctimoniously, Obama insisted that Guantánamo is “not who we are as a country.” But he never explained why locking the detainees away in prison cells qualifies as who we are as a country. Under the laws of war, among them the Geneva Convention, POWs are to be detained in nonpunitive conditions. If these men are truly “law-of-war detainees,” the same should be true for them.

Of course, the real motive for the proposed move was that the prison at Guantánamo is a serious embarrassment for US foreign policy, causing difficulties with our allies and used by enemies to recruit anti-American militants. Yet nobody has ever explained how moving the men from Guantánamo to Thomson would have solved this public relations problem. Did the administration think jihadi recruiters are too thick-headed to notice that the prisoners would still be prisoners? Social media would have quickly turned “Thomson” into a rallying cry just as potent as “Guantánamo.”

To be sure, Obama was assuming that by the time the Thomson prison was ready only a handful of detainees would be left; he had not counted on Congress clamping down on any effort to release Guantánamo prisoners. There have been other delays in the continued effort to transfer the prisoners—some caused by a defiantly insubordinate Pentagon,6 some by Justice Department litigators who continue to fight with undiminished zeal every case for habeas corpus of a detainee, and some by instability in Yemen, where it had been hoped some prisoners could be repatriated.

Recently, after a long hiatus, nine were released in January and eight more are slated for release; if that goes through, Obama will have transferred 145 Guantánamo inmates. At its present rate of clearance (fifteen clearances out of eighteen cases decided), the Guantánamo Periodic Review Board might be expected to clear thirty or more of the remaining detainees. That would leave the ten defendants to be tried by a military commission, seven “high value” detainees, and perhaps a dozen additional prisoners as the irreducible Guantánamo core.7 They will remain prisoners under the law of war in what still appears to be a forever war.


In a 2013 congressional hearing, a representative asked Robert Litt, the top lawyer for the Office of the Director of National Intelligence, whether he really thought the NSA’s program of bulk surveillance of telephone calls in the US “could be indefinitely kept secret from the American people.” Litt responded, “Well, we tried.”

Obama promised to run the most transparent administration in history, and in one way he kept his promise. In 2014 the government classified only one fourth as many secrets as the yearly average under the Bush administration. But in other important respects, Obama perpetuated the Bush administration’s desire for secrecy. Notably, the bulk collection of metadata by the NSA was approved under secret legal interpretations that were reluctantly made public following the Snowden revelations. These interpretations were not merely secret; they were flatly contrary to what many people supposed the law meant.

Secret law undermines the rule of law, as legal theorists beginning with Kant have insisted: citizens must be able to know the law under which they are being governed. This is a hard lesson for officials to accept. When journalists won a Freedom of Information Act case to gain the release of secret white papers about the law governing targeted killings, Obama’s White House Counsel Neil Eggleston “swiftly issued instructions to the Obama legal team:…no more white papers.”

Furthermore, the Obama Justice Department prosecuted, in Savage’s words, “three times as many leak-related cases as all previous presidents combined.” Some, such as the cases of Bradley (now Chelsea) Manning and Edward Snowden, involved major leaks. Others, however, did not. Former NSA official Thomas Drake was prosecuted because he was a source for news exposés of waste and mismanagement. After years, the government acknowledged that it had no case against Drake; but before admitting that, prosecutors wishing to save face unethically got Drake to accept a last-minute plea bargain for a misdemeanor. At the sentencing hearing, the judge ripped into the prosecutors:

I think the average American citizen would take great caution to say, okay, let me get this straight, my home is searched, and three years later I’m finally indicted, and then a year after that the government drops the whole case. That’s four years of hell that a citizen goes through…. I find that unconscionable.8

Eric Holder defended the leak indictments by arguing that each was justified on the merits, which was plainly untrue in Drake’s case. But Holder’s defense also ducks the larger question. Previous administrations had refrained from cracking down on leakers, even when the government might have won on the merits; why not the Obama administration as well? Savage reports that there was no conscious policy driving the prosecutions for leaking, merely a set of unconnected decisions. I find this a bit hard to credit, because the early prosecutions were well publicized and controversial; not to back off in later ones would have been a policy decision. A 2012 Defense Department document made it clear that, for at least some in government, the point was harsh deterrence: “Hammer this fact home,” the document said. “Leaking is tantamount to aiding the enemies of the United States.” That is a chilling mind-set: anyone who reveals a government secret is a traitor. It bespeaks a sense of entitlement to operate in absolute secrecy that is as foreign to the rule of law as it is to civil liberties.

At the same time, high-level officials who leaked information were not investigated. That includes CIA and Pentagon officials who cooperated in the film Zero Dark Thirty and were criticized by the Pentagon’s inspector general for unauthorized disclosures, and a former vice-chair of the Joint Chiefs of Staff who “became a prime suspect as a source for…reporting about the cyberattack on Iranian nuclear equipment.” It need hardly be said that such double standards are antithetical to the rule of law.


Following the attacks in Paris and San Bernardino, Americans are reportedly more fearful than at any time since September 11, and fear inevitably makes both law and liberty seem insignificant compared with security. The level of fear now seems palpably greater than after comparable attacks: the 2004 Madrid train bombings, which killed 191 people and wounded nearly a thousand, and the 2013 Boston Marathon bombing. Perhaps that is because during an election season politicians deliberately amplify the fear; or perhaps the flamboyant atrocities of ISIS have become so magnified in our imagination that we see it as an existential threat to the West, just as it fancies itself.

Of course, this fear is not only American. After the Paris massacre, according to Human Rights Watch, the French government expanded emergency powers, allowing its officials

to impose house arrest without authorization from a judge, conduct searches without a judicial warrant and seize any computer files it finds, and block websites deemed to glorify terrorism without prior judicial authorization. These powers interfere with the rights to liberty, security, freedom of movement, privacy, and freedoms of association and expression.

French police promptly used their new powers against demonstrators during the conference on climate change. Such are the dangers of allowing reactions against terrorism to weaken commitment to rights.

Savage begins his book by arguing that the underwear bomber’s 2009 attack, which came close to succeeding, produced comparable fear in the Obama administration, causing it to harden its positions on terrorism-related issues. Fear can quickly undermine civil liberties and the rule of law; that is something even the most conscientious lawyers cannot change.

1. Takeover: The Return of the Imperial Presidency and the Subversion of American Democracy (Little, Brown, 2007); reviewed by David Bromwich in these pages, November 20, 2008.

2. James Mann, “Charlie Savage’s ‘Power Wars’ Dissects Obama’s Evolution on National Security,” The New York Times, October 30, 2015.

3. John Adams, “Novanglus Paper” (no. 7), The Works of John Adams, edited by Charles Francis Adams (Books for Libraries Press, 1969), Vol. 4, p. 106.

4. See Eric Posner and Adrian Vermeule, The Executive Unbound: After the Madisonian Republic (Oxford University Press, 2011). The authors are proponents of the “unbound” executive.

5. See my detailed discussion of the Obama administration’s failures on torture accountability in “An Affair to Remember,” in my Torture, Power, and Law (Cambridge University Press, 2014), pp. 271–306.

6. See Charles Levinson and David Rohde, “Special Report: Pentagon thwarts Obama’s effort to close Guantánamo,” Reuters, December 29, 2015.

7. I am grateful to David Remes for these figures.

8. Transcript of Proceedings, Sentencing, Before the Honorable Richard D. Bennett, United States District Judge, July 15, 2011, pp. 28–29, 42.


Europe: A Better Plan for Refugees

Migrants wait in line for food in a makeshift camp at the Greek-Macedonian border, March 31, 2016
Bulent Kilic/AFP/Getty Images

The asylum policy that emerged from last month’s EU-Turkey negotiations—and that has already resulted in the deportation of hundreds of asylum seekers from Greece to Turkey—has four fundamental flaws. First, the policy is not truly European; it was negotiated with Turkey and imposed on the EU by German Chancellor Angela Merkel. Second, it is severely underfunded. Third, it is not voluntary. It imposes quotas that many member states oppose and requires refugees to take up residence in countries where they don’t want to live, while forcing others who have reached Europe to be sent back. Finally, it transforms Greece into a de facto holding pen without sufficient facilities for the number of asylum seekers already there.

All these deficiencies can be corrected. The European Commission implicitly acknowledged some of them this week when it announced a new plan to reform Europe’s asylum system. But the Commission’s proposals still rely on compulsory quotas that serve neither refugees nor member states. That will never work. European Commission Vice President Frans Timmermans is inviting an open debate. Here is my contribution. 

A humanitarian catastrophe is in the making in Greece. The asylum seekers are desperate. Legitimate refugees must be offered a reasonable chance to reach their destinations in Europe. It is clear that the EU must undergo a paradigm shift. EU leaders need to embrace the idea that effectively addressing the crisis will require “surge” funding, rather than scraping together insufficient funds year after year. Spending a large amount at the outset would allow the EU to respond more effectively to some of the most dangerous consequences of the refugee crisis—including anti-immigrant sentiment in its member states that has fueled support for authoritarian political parties, and despondency among those seeking refuge in Europe who now find themselves marginalized in Middle East host countries or stuck in transit in Greece.

Most of the building blocks for an effective asylum system are available; they only need to be assembled into a comprehensive and coherent policy. Critically, refugees and the countries that contain them in the Middle East must receive enough financial support to make their lives there viable, allowing them to work and to send their children to school. That would help to keep the inflow of refugees to a level that Europe can absorb. This can be accomplished by establishing a firm and reliable target for the number of refugee arrivals: between 300,000 and 500,000 per year. This number is large enough to give refugees the assurance that many of them can eventually seek refuge in Europe, yet small enough to be accommodated by European governments even in the current unfavorable political climate.

There are established techniques for the voluntary balancing of supply and demand in other fields, such as matching students to schools and junior doctors to hospitals; one such procedure is sketched below. In this case, people determined to go to a particular destination would have to wait longer than those who accept the destination allotted to them. The asylum seekers could then be required to await their turn where they are currently located. This would be much cheaper and less painful than the current chaos, in which the migrants are the main victims. Those who jump the line would lose their place and have to start all over again. This should be sufficient inducement to obey the rules.
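
The best-known such technique is the Gale-Shapley “deferred acceptance” algorithm, which underlies the school-choice and medical-residency matching systems just mentioned. What follows is a minimal sketch in Python; the applicants, countries, rankings, and capacities are entirely hypothetical, and the code illustrates the general technique rather than any actual or proposed EU mechanism.

    # Deferred acceptance (Gale-Shapley), with applicants proposing.
    # All names and preference data below are hypothetical.
    def deferred_acceptance(applicant_prefs, country_rank, capacity):
        """Applicants propose in order of preference; each country keeps
        its best-ranked provisional admits up to capacity and releases
        the rest, who then propose to their next choice."""
        matched = {c: [] for c in capacity}            # country -> provisional admits
        next_choice = {a: 0 for a in applicant_prefs}  # next list position to try
        free = list(applicant_prefs)                   # applicants still unplaced
        while free:
            a = free.pop()
            if next_choice[a] >= len(applicant_prefs[a]):
                continue                               # preferences exhausted; unmatched
            c = applicant_prefs[a][next_choice[a]]
            next_choice[a] += 1
            matched[c].append(a)
            if len(matched[c]) > capacity[c]:
                matched[c].sort(key=lambda x: country_rank[c][x])
                free.append(matched[c].pop())          # lowest-ranked admit is bumped
        return matched

    # Hypothetical example: three applicants, two destination countries.
    prefs = {"A": ["Germany", "Sweden"],
             "B": ["Germany", "Sweden"],
             "C": ["Sweden", "Germany"]}
    rank = {"Germany": {"A": 1, "B": 2, "C": 3},       # 1 = most preferred
            "Sweden": {"A": 2, "B": 3, "C": 1}}
    cap = {"Germany": 1, "Sweden": 2}
    print(deferred_acceptance(prefs, rank, cap))
    # -> {'Germany': ['A'], 'Sweden': ['C', 'B']}

The attraction of deferred acceptance is that the resulting match is stable: no applicant and country would both prefer each other to what they were assigned. That stability is what makes a “wait your turn” rule of the kind described above enforceable, since jumping the queue offers little advantage.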

At least €30 billion ($34 billion) a year will be needed for the EU to carry out such a comprehensive plan. This includes providing Turkey and other “frontline” countries with adequate funding to maintain their very large refugee populations, creating a common EU asylum agency and security force for the EU’s external borders, addressing the humanitarian chaos in Greece, and establishing common standards across the Union for receiving and integrating refugees.

Thirty billion euros might sound like an enormous sum, but it is not when viewed in proper perspective. First, we must recognize that a failure to provide the necessary funds would cost the EU even more. There is a real threat that the refugee crisis could cause the collapse of Europe’s Schengen system of open internal borders among twenty-six European states. The Bertelsmann Foundation has estimated that abandoning Schengen would cost the EU between €47 billion ($53.5 billion) and €140 billion ($160 billion) in lost GDP each year; the French Commissioner for Policy Planning has estimated the losses at €100 billion ($114 billion) annually.

Moreover, there is no doubt that Europe has the financial and economic capacity to raise €30 billion a year. This amount is less than one-quarter of one percent of the EU’s combined annual GDP of €14.9 trillion, and less than one-half of one percent of total spending by its twenty-eight member governments.
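
As a quick check of the first of these proportions, using the figures given in the text:

    \[
      \frac{\text{€}30\ \text{billion}}{\text{€}14.9\ \text{trillion}}
        = \frac{30}{14{,}900} \approx 0.002 = 0.2\% < 0.25\%.
    \]

The second claim works out similarly if total spending by the twenty-eight member governments is taken to be a bit under half of EU GDP, roughly €7 trillion (an outside estimate, not a figure from the text): €30 billion is then about 0.43 percent of spending, below one half of one percent.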

It is Europe’s political capacity that is lacking, at least at the moment—its ability to make effective unified decisions about such an urgent matter. Most member states are restricted by the EU’s fiscal rules from running larger deficits and financing them by issuing new debt in the capital markets. Even though German Finance Minister Wolfgang Schäuble lifted hopes in Davos in January when he spoke of a European Marshall Plan to deal with the migration crisis, he also insisted that any spending should be financed out of revenues rather than by adding to the existing government debt.

Taking on new common European debt, backed by the joint and several guarantee of the EU’s members, would raise strong objections, particularly in Germany. Even if the debt were restricted to addressing the migration crisis, Germany and others would see it as a dangerous precedent toward creating debt backed by EU members collectively, with Germany responsible to step in if other countries fail to repay their share of the debt. Berlin has diligently avoided providing such a precedent throughout the euro crisis. That is why the question has not even been raised, let alone seriously considered. But there are other ways to raise the necessary funds using existing EU structures.

Member states could raise new tax revenue in order to fund what is needed. However, Europe does not have the political capacity to raise the necessary sums in time to contain the crisis. For a new tax to be perceived as fair, it would have to be imposed equitably across the EU. The proper route for such a tax increase would be for the European Commission to propose new legislation to be adopted with the unanimous support of all members. This would likely fail, since it would give every country the right to veto the tax. If a “coalition of the willing” of at least nine countries could be assembled, the Commission could instead opt for “enhanced cooperation,” the approach used for the proposed European financial transaction tax (FTT). If the recent experience with the FTT is any guide, this process would take months to conclude.

A more promising alternative would be to re-open the European Commission’s Multiannual Financial Framework, which establishes the EU’s broad budgetary parameters, including the maximum amounts the EU may spend in different areas. The forthcoming mid-term review of this EU budget offers an opportunity to increase the VAT contribution of member states, and designate that some of the new funds raised should go to a refugee crisis fund. This would also be difficult but offers the most realistic path forward.

It will be crucial, however, to make a large part of the funding available very quickly. Making large initial investments will help tip the economic, political, and social dynamics away from xenophobia and disaffection toward constructive outcomes that benefit refugees and countries alike. In the long run, this will reduce the total amount of money that Europe will have to spend to contain and recover from the refugee crisis. This is why I call it “surge” funding.

Where will the necessary funds come from? There is a strong case to be made for using the EU’s balance sheet itself. The EU presently enjoys a triple-A credit rating that is underused and that allows it to borrow in the capital markets on very attractive terms. And with global interest rates at near historic lows, now is a particularly favorable moment to take on such debt.

Tapping into the triple-A credit of the EU has the additional advantage of providing a much-needed economic stimulus for Europe. The amounts involved are large enough to be of macroeconomic significance, especially as they would be spent almost immediately and exercise a multiplier effect. A growing economy would make it much easier to absorb immigrants, whether they are refugees or economic migrants—a win-win initiative.

The question is: How to use the EU’s triple-A credit without arousing opposition, particularly in Germany? The first response is to recognize that the EU is already a triple-A borrower in the global bond markets, through facilities created to deal with the Eurozone crisis. Indeed, it was during the financial crisis that the EU repeatedly put its borrowing capacity on display, establishing financial instruments (such as the European Financial Stabilization Mechanism, or EFSM, and the European Stability Mechanism, or ESM) capable of borrowing tens of billions of euros on attractive terms in very short order. Once Europe’s leaders made a political decision to act, they were capable of doing so very quickly.

Some of these European financial entities, which still have considerable borrowing capacity, could be redirected to the refugee crisis. This would be far more efficient and faster than creating a new borrowing mechanism for the purpose. And such a redirection would require only a political decision—one that can be taken at short notice if the political will can be generated.

Two sources of money in particular—the EFSM and the Balance of Payments Assistance Facility—should be put to the task. These sources complement each other: the EFSM was designed for loans to euro-area members, whereas the balance of payments facility is for EU members that do not belong to the Eurozone. Both kinds of loans will be necessary for a comprehensive approach to the crisis. Both also have very similar institutional structures, and they are both backed entirely by the EU budget—and therefore do not require national guarantees or national parliamentary approval.

The combined gross borrowing capacity of the EFSM and the Balance of Payments facility is €110 billion ($125 billion), a ceiling meant to coincide with the annual revenue ceiling of the EU budget: the amounts were set so that the EU never has more debt outstanding than its annual budget. The Balance of Payments Assistance Facility’s €50 billion of borrowing power is almost completely unused. The EFSM has made some €46.8 billion worth of loans to Portugal and Ireland but retains substantial spare capacity. Together, the two facilities have well over €60 billion of capacity, and this capacity grows each year as the loans to Portugal and Ireland are repaid.
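Working only from the figures above (a €110 billion combined ceiling, of which the Balance of Payments facility accounts for €50 billion, implying roughly €60 billion for the EFSM; that €60 billion is an inference from the text, not a number the author states), the joint spare capacity can be checked directly:

\[
\underbrace{€50\ \text{billion}}_{\text{BoP facility, unused}} \;+\; \underbrace{\left(€60\ \text{billion} - €46.8\ \text{billion}\right)}_{\text{EFSM spare}} \;\approx\; €63\ \text{billion},
\]

consistent with the claim of “well over €60 billion” of remaining capacity.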

The EFSM, the ESM, and its precursor, the EFSF, were all established in response to the euro crisis. The task back then was to provide cheap credit to countries like Ireland, Portugal, Spain, and Greece that had otherwise been frozen out of the credit markets. The expectation was that these countries would repay their loans from the EU once they had been restored to financial health.

Now the task is fundamentally different. As with the euro crisis, the refugee situation is at a critical point and requires a very quick response. But it differs from the euro crisis in that the countries to which the funds would be aimed—like Jordan, Lebanon, Turkey, and Greece—are merely on the frontlines of what must be a collective European undertaking; they are entitled to grants, rather than loans, and should not be obliged to repay the monies they receive.

If we accept this reality, how then will the surge funding get repaid? The answer is that the EU and its member states must find new sources of tax revenue, and do so in a way that spreads the repayment obligation as widely as possible. This could be done by levying special EU-wide taxes. The new tax revenue could come from a variety of sources, including the EU-wide VAT, which already provides revenue to the EU; a special tax on gasoline, as Minister Schäuble has suggested; or a new tax on travel into the EU and on visa applications, which would shift some of the burden onto non-EU citizens wishing to travel to the EU.

As noted above, levying new taxes inside the EU is a process that will take a long time to complete. However, those looking after the finances of the EFSM and the Balance of Payments facility will want to know that the loans they make have a sure source of repayment. That is why the EU must guarantee that it will find this new tax revenue by the time it is needed, even if the exact source of the new revenue has yet to be determined.

The question remains, how can the necessary political will be generated? The European Union is built on democratic principles. I believe there is a silent majority that wants to preserve the European Union even if it is currently not a well-functioning institution. The leaders will listen if this silent majority makes its voice heard.

The refugee crisis poses an existential threat to Europe. It would be irresponsible to allow the EU to disintegrate without utilizing all the resources it has at its disposal. The lack of adequate financing is the main obstacle standing in the way of successful programs in the frontline countries. Throughout history, governments have issued bonds in response to national emergencies. That is the case in Europe today. When should the triple-A credit of the EU be mobilized if not at a moment when the European Union is in mortal danger?


Caught in the Act

A bank robber aiming at a security camera, Cleveland, March 8, 1975
The Metropolitan Museum of Art

Consider the lowly mug shot, artless, indifferent, striving for nothing more than accuracy of likeness for the purpose of criminal identification. Mug shots are the cruelest of posed pictures; punitive, shameful, they make even the young and beautiful look plain. Yet a mug shot is also a kind of formal portrait: you sit (or stand) for it and project whatever the setting and the photographer draw out of you. Flip through the pages of a police booking file and you will see a taxonomy of human expressions, from fear to impotence, haughtiness, fury, grief, and stress. In my own mug shot, taken several years ago after a wrongful arrest, I obligingly present the image my capturers want: puffy and defeated, the pictorial equivalent of a confession of guilt.

Marius Bourotte, 1929
The Metropolitan Museum of Art

In the 1870s, a Parisian policeman named Alphonse Bertillon pioneered the mug shot as part of his “anthropometric” system of criminal identification based on minute physical measurements. For no pay, he spent his free hours examining inmates at La Santé prison, using calipers and rulers to record the length and width of prisoners’ fingers, noses, foreheads, and mouths. Perhaps he was seeking to demonstrate the theory that political “agitators” and other “moral degenerates” tended to have similar physiognomies, a popular fixation in the late 1800s. (Bertillon had no compunctions about violating his own rules of exactitude when it came to imprisoning accused enemies of France. He had no expertise whatsoever as a handwriting specialist, for instance, yet testified as one in 1894 and again in 1899 to help convict Alfred Dreyfus of treason.)

A sampling of Bertillon’s meticulous mug shots, along with about forty crime-related images from American tabloids, police files, security cameras, and photographers both anonymous and widely known, makes up the fascinating exhibition “Crime Stories: Photography and Foul Play,” currently at the Metropolitan Museum of Art.

The images on display date back as far as 1865, when Alexander Gardner shot a full-body portrait of Lewis Powell after Powell had unsuccessfully tried to assassinate Secretary of State William Seward as part of John Wilkes Booth’s conspiracy to overthrow President Lincoln’s administration. Powell, young, pleasantly chubby-faced, with rich black hair combed low across his forehead, is wearing a white canvas trench coat. You sense the importance he places on this photograph, his effort to transmit himself as a captured hero, unrepentant and armored with a kind of calmly fanatical conviction. If it weren’t for the Union soldier guarding him with a bayonet, the picture could pass today for a Ralph Lauren ad.

Portrait of Lewis Powell by Alexander Gardner, April 27, 1865
The Metropolitan Museum of Art

The curators have set up an interesting dialogue between celebrated victims and assassins, on the one hand, and the unknown on the other. A Los Angeles Times photograph of Robert F. Kennedy seconds after he was shot on June 5, 1968, still alert and trying to convey something urgent, it seems, to the person kneeling beside him, provokes a predictable ache as the viewer runs through his knowledge of what followed the assassination and what might have been. The image engages the news gossip in us, the gawker at historical artifacts. By contrast, an anonymous, plebeian bank robber, ferociously trying to shoot out a security camera in a burst of smoke and light, invites us to imagine the crime and, by doing so, to insert ourselves into the scene.

If one measure of a photograph’s power is the extent to which it inspires us to fill in the circumstances around it, then the mug shot of a 1930s Baltimore shoplifter is a small masterpiece of portraitist art. A woman in her late forties, with whitening blonde hair, turns slightly away from the police photographer’s camera with a mix of melancholia and trapped defiance. The flesh around her left eye is badly bruised, a messy black puddle that spills along her cheek and temple. Who slugged her—the department store security guard, the arresting cop, the shopkeeper himself, or an intimate friend? Her lips are thin and subtly crooked, her jawline is just beginning to sag. The life before (and after) the picture rushes in on you in an imagined story of filled-in time. You note the shabbiness that at a quick glance could pass for respectability, the probable alcoholism or drug addiction, the stickpin in her wrinkled cloth hat. You admire what you take to be her underworld intelligence, her survival in the face of obvious hardship and abuse, all the while mulling over the ethical nature of your own coldly scrutinizing eye.

Automobile murder scene, circa 1935
The Metropolitan Museum of Art

Hanging near a photograph of the stockinged legs of a murdered young woman in the back seat of a car is one of Walker Evans’s sneaky, overcoat-cloaked shots of passengers on New York City’s subway in the Thirties and Forties. “Ladies and gentlemen of the jury,” Evans called his unwitting subjects. In this example, a smirking, vaguely menacing young man is engrossed in his copy of the Daily News. PAL TELLS HOW GUNGIRL KILLS, reads the headline. An obese woman, obviously not an acquaintance, sits miserably beside him.

Next to the Evans, in turn, hangs Diane Arbus’s 1958 photograph, taken from her seat in a movie theater, of an on-screen actress playing a B-movie gun moll. Both photographs are reflections on murder as a species of mass entertainment, especially if the murderer is a woman. Plucked from the running display of ordinary life, they create an event. In the case of tabloid photographers—rushing to a crime scene and capturing what they can on the fly—it’s the event that dictates the picture. By remarking on the pornographic connection between violence and sex that is fed us to sell newspapers and movies, Evans and Arbus offer a different level of news.

In the 1970s, the sexualized outlaw was either a lone wolf existentialist hero (as in Norman Mailer’s The Executioner’s Song, his 1979 “true life novel” about the psychopathic killer Gary Gilmore) or a putatively idealistic warrior for political revolution, like Andreas Baader of the Baader-Meinhof gang in West Germany.

Patricia (Patty) Hearst during the Hibernia Bank robbery, San Francisco, 1974
The Metropolitan Museum of Art

“Photography and Foul Play” includes two potent photographs from 1975, the most recent year the exhibition covers. The first is of Patty Hearst, the nineteen-year-old kidnapped publishing heiress who became, in captivity, an armed Symbionese Liberation Army guerrilla. With her dark feathered hair, her alarmingly fragile thinness, her swung-open jacket, and her shoulder-strapped automatic rifle held waist high at a bank heist in San Francisco, she is like a stylized version of a punk rock goddess. The formalizing effect of the framed print on the gallery wall transforms it from the bombarded news and television “evidence” image that it had been into a deliberate and premeditated glamor portrait.

The second is a frankly erotic portrait by Larry Clark of two armed robbers in Oklahoma City. The robbers, maybe eighteen years old and stripped naked to the waist, are posing for Clark in what appears to be a motel room. One has a gun shoved in his jeans, white pearl handle protruding; the other is armed with a Mick Jagger haircut and pout, lit cigarette dangling from his lips, his revolver on a table beside him. To Clark, they are stars, and you can feel the tug of his admiration for them, his desire to be like them, his kinship and awe.

Armed robbers; photograph by Larry Clark, Oklahoma City, 1975
The Metropolitan Museum of Art

Richard Avedon is represented by his 1960 portrait of Dick Hickock, the Kansas murderer who was one of the subjects of Truman Capote’s In Cold Blood. In his typical style, Avedon presents a large-scale investigation of Hickock’s face: the black greased pompadour combed just so, the slightly fleshy nose, the disturbingly engaging eyes, all of it ever so slightly skewed by the impression of Hickock’s inscrutable lopsided grievance. (Hickock had suffered a brain injury in an automobile accident when he was nineteen.) As with Gardner’s 1865 portrait of Lewis Powell (who was also executed by hanging for his crime), you sense how seriously Hickock takes being photographed, his wish to give something of himself, to influence, if not control, the emanations of his image and how he is being portrayed.

“He’s hot!” said one of three teenage girls at the exhibition the day I was there. Giggling, they took turns standing for cell phone pictures next to Hickock’s face, as if showing off their latest romantic interest.

“Crime Stories: Photography and Foul Play” is showing at the Metropolitan Museum of Art through July 31.


Balkan Poison, Revisited

Serbian ultra-nationalist leader Vojislav Šešelj (left), with paramilitary leader Dragan Vasiljković (“Captain Dragan”) and other Serbian fighters, Croatia, circa 1991
Tanjug/EPA/picturedesk.com

What are the limits of international accountability for crimes of war? And what does it mean for the local populations in question? As human rights groups prepare for cases which might be brought when the war in Syria ends, the last few weeks have brought some stark results from the Balkan wars of the 1990s.

First was the March 24 conviction of Bosnian Serb leader Radovan Karadžić, the most senior figure from the wars to be convicted by the UN’s war crimes tribunal for the former Yugoslavia. The court in The Hague found him guilty on ten of eleven charges—including genocide for the executions of 8,000 Bosniak Muslim men and boys from Srebrenica in 1995. Then came the baffling acquittal, on March 31, of Serbian ultra-nationalist Vojislav Šešelj, who was a leading advocate of ethnic cleansing and whose militia was deeply involved in campaigns to drive out Croats and Bosniaks from regions claimed for a Greater Serbia in the early 1990s. Similarly shocking was the January murder conviction of Oliver Ivanović, a politician who had done much to work for reconciliation between Serbs and Albanians in Kosovo before being sent to jail by judges of the EU’s law and justice mission there. Something seems amiss here.

Faced with these erratic results after so many years, few in the Balkans are happy with the work of The Hague tribunal or local ones, except perhaps the victims who have testified in court and whose words and experiences are recorded for posterity. Nor is there much indication that leaders in war-torn countries today, like Syria, are even listening. What has been less predictable, however, is the extent to which some people, at least in Serbia, Bosnia, and Kosovo, are beginning to reassess what took place in their countries during the 1990s, even as there appears to be little political will to see that justice is fully done.

As it happens, the recent verdicts in The Hague have coincided with a series of stunning new films about the wars made in the region. There is no single explanation for why this is happening now, and it would be wrong to exaggerate how widespread the phenomenon is. But these works—including both documentaries and feature films—suggest that some are not prepared to forget or gloss over a past that is still dominated by a victim mentality in each of the countries in question. Meanwhile, an important new book has given us the full story of how men like Karadžić were finally tracked down—and why it took so long.

Radovan Karadžić in The Hague, July 31, 2008; and in disguise as a mystic healer before his arrest
AFP/Getty Images

In The Butcher’s Trail, Julian Borger, the world affairs editor of The Guardian and a former correspondent in Bosnia, has pulled together the many different strands of the complicated effort to bring suspected war criminals from the Balkans to justice. One way or another, every single one of the 161 suspects on the list of the UN’s war crimes tribunal was eventually accounted for. The so-called big fish included the Croatian general Ante Gotovina, but inevitably the most interesting stories are about the leading Serbs—Karadžić, his military commander Ratko Mladić, and their boss of bosses, Slobodan Milošević. Milošević died in 2006, before his trial had finished; Mladić’s trial is expected to wind up next year.

Accounting for each of these suspects was no mean feat, given that the tribunal had no force of its own to arrest the people it was indicting. It had to rely on UN troops in Croatia, soldiers from the NATO peacekeeping mission in Bosnia, national police forces if the suspects were no longer in the region, and the intelligence agencies of any countries prepared to help. “More than half the suspects were tracked down and captured,” writes Borger. “Others gave themselves up rather than lie awake every night wondering whether masked, armed men were about to storm into their bedroom. Two committed suicide. Others decided they would rather die in a blaze of gunfire and explosives than be taken alive.”

The single most extraordinary story is that of Karadžić himself. At first, NATO forces, which flooded into Bosnia after the peace deal of 1995, were deeply reluctant to arrest him, fearing reprisals against their own men. Eventually, as arrests began and it became clear that international forces would not be threatened, Karadžić fled, only to end up living in plain sight in the middle of Belgrade: using a stolen identity provided to him by the Serbian State Security Service, he grew a bushy beard and reinvented himself as a mystic healer. His cover was so good that the woman who lived opposite him and who worked for Interpol was utterly unaware of who he really was. But when Karadžić was finally tracked down in 2008, he shaved off his beard and, as though a switch had been flicked, turned back into his former self.

Borger’s book is specifically about the hunt for those the UN’s tribunal indicted, but his conclusion, written amid the disappointment with many of the tribunal’s verdicts already evident well before the Karadžić one, is eloquent. “Resurgent nationalists in the states of the former Yugoslavia are covering over the truth of what happened with a thick layer of revisionism and denial,” he writes, but adds, trying to be optimistic, “the meticulous record of the tribunal, with its seven million documents, cannot be buried forever. Nor can the demand for justice for humanity’s worst crimes.”

Ranko Momić, Slaviša Kastratović, and Milutin Nikolić, with other members of the “Jackals,” Kosovo, 1999, from the documentary The Unidentified, by Marija Ristić and Nemanja Babić, 2015
Balkan Investigative Reporting Network

Unfortunately, for all the efforts of the international tribunal to catch figures like Karadžić, many who live in the region may doubt that justice for the wars of the 1990s can ever be fully rendered. To them the recent acquittal of Šešelj and the previous acquittals of prominent Croats, Serbs, and Kosovo Albanians have shown how limited the court’s efforts may be in practice. The limits of international accountability, coupled with the apparent weaknesses of domestic courts, are certainly among the main themes of The Unidentified, a documentary about the Jackals, a Serbian paramilitary unit that operated in Kosovo during the period of the NATO bombing in 1999, made by Marija Ristić and Nemanja Babić, two journalists from the Balkan Investigative Reporting Network (the only pan-Balkan news organization; it also publishes news in English). I have watched this extraordinary film four times now and it still makes my flesh creep.

The Jackals were led by a local Kosovo Serb criminal called Nebojša Minić, aka Dead, a nickname he earned after showing up at a wake held in his honor when, a few years earlier, he had been reported dead. The story begins with a truck, full of exhumed Albanian corpses, that swerved off the road and fell into the Danube. It was being driven from Kosovo to Serbia in the period just after the war, when the Serbian authorities were trying to conceal massacres (not just by the Jackals) from incoming NATO troops and hence from the UN’s tribunal. The film goes on to show the extent of the Jackals’ brutal killings with testimony from survivors, who describe how the group rounded up civilian men and boys in a couple of villages and then gunned them down in cold blood.

But what makes The Unidentified remarkable is the participation of Zoran Rašković, one of the killers, who decided to speak about the Jackals’ activities during the war and who testified in court when some of them were later put on trial in Serbia. Rašković says he decided to talk “because, if this seed of hatred remains among us, we’ll get to each other’s throats again.” After a graphic description of mass murder, he says: “When they were all dead and quiet, I looked up at the minaret and I thought, ‘God, what a beautiful day.’ It was truly a nice, sunny day. And I could not bring myself to look down.” Then someone was sent to get beer, which they drank, and then they went “back home.” It is possible to imagine what might go through the head of someone who has just gunned down dozens of people, but hearing what they actually thought is something else.

Particularly disturbing is the power Dead seems to have had over his men. The paramilitary leader eventually died of AIDS in Argentina, but Rašković says unambiguously that, although Dead was “a fanatic with a deranged mind,” if he had still been alive he, Rašković, would not have testified against him. Nine members of the much larger Jackals unit were convicted of these massacres in a Serbian court, but last year that verdict was overturned on appeal and a retrial ordered. An expanded group of twelve is now on trial. Fred Abrahams of Human Rights Watch, who was among the first to reveal the crimes of the Jackals, describes these men in the film as “sacrificial lambs”: none of the senior military or police officials involved in crimes in Kosovo has ever been prosecuted in Serbia. (While there have been trials for Kosovo at the UN tribunal, Serbia, like the rest of the region, is supposed to prosecute those further down the chain of command, and its record of success, like that of courts in Bosnia, Croatia, and Kosovo, has been mixed.)

Mirjana Karanović in her film, A Good Wife, 2016
Cineplanet/This&That Production

The extent to which many of the atrocities of the 1990s have simply been ignored, minimized, or argued away (except, of course, when they were committed by the other side) is the subject of A Good Wife, a new feature film directed by Mirjana Karanović, one of Serbia’s most famous actresses. Though it was made for viewers in the countries of the former Yugoslavia and many of its references will be lost outside the region, the film should reach a broader audience. Karanović plays the protagonist, a middle-aged woman named Milena who lives in the town of Pančevo, just outside Belgrade, in a nice house with her successful builder husband. She goes for a routine checkup and her doctor, feeling a problem, orders her to have a mammogram. Fearful of what might be found, she delays. Then the doctor says: “Milena, why did you wait so long? The tumor is rather large.” The metaphor is clear: by this time Milena has discovered an old video of her husband and his friends murdering a group of Bosniaks after the fall of Srebrenica.

Now Milena understands the tensions between her husband’s friends, the threats one of them makes against her husband, implying he will tell investigators about their “secret,” and his subsequent death, which we are led to believe is not simply the result of his being drunk on a motorbike. Everyone in the Balkans will understand that the core of the plot is based on the actual discovery of such a videotape, which shocked the region when it emerged in 2005; and Balkan viewers will also recognize that the woman who appears on TV, and to whom Milena, in an attempt to deal with this tumor, hands the video, is supposed to be Nataša Kandić, the famous Serbian human rights activist.

In real life the paramilitary group that was filmed killing the Bosniaks was called the Scorpions, and during the Kosovo war they committed another massacre. A new short film from Kosovo called Shok (“Friend”), which received an Oscar nomination this year, involves the activities of such a group in the Kosovo war. Set in 1998, when that war began, it is the story of two twelve-year-old boys. Petrit prides himself on being a businessman and is selling cigarette papers to the Serbs. He brings along his friend Oki, but one of the Serbs declares that his nephew does not have a bike, and so Oki is forced to hand over his own. The story races to its tragic conclusion as Petrit’s family is thrown out of their home by the Serb paramilitaries and told that they will die if they look back. But this is no simple tale of goodies and baddies. It is a study of lingering guilt, with the crimes of the past returning to haunt the now-adult Petrit when he comes across a bike, or possibly the same one, years later. Shok is to be shown at the Tribeca film festival later this month.

Eshref Durmishi as Dragan and Andi Bajgora as Oki in Shok, 2015
Eagle Eye Films LLC/Ouat Media

Another feature film that has gained much attention in the former Yugoslavia is Death in Sarajevo. It takes place on June 28, 2014, the day the Bosnian capital is commemorating the centenary of the 1914 assassination of Archduke Franz Ferdinand by Gavrilo Princip—the event that sparked World War I. Directed by the Bosnian filmmaker Danis Tanović, who won an Oscar for his 2001 Bosnian war film, No Man’s Land, it is a tragicomedy set in the city’s once-iconic Holiday Inn hotel. On the roof, a TV reporter named Vedrana is doing a live interview with Gavrilo Princip, a descendant of the assassin. He wants to know whom a modern Gavrilo Princip would kill. They argue about the 1990s war, hurling accusations at one another, including a reminder by Princip that innocent Serbs were murdered in Sarajevo during the war, a fact that is generally ignored in Sarajevo and virtually unknown outside the region. In the middle of the hotel a Frenchman is practicing a dramatic speech for Hotel Europe, a play actually written by Bernard-Henri Lévy, the French philosopher, about the “death of Europe,” while a security man watching him on a hidden camera inside his room sniffs cocaine off his phone as his wife rings to nag him about buying a new couch they can’t afford. Meanwhile Omer, the hotel’s director, prowls the building, trying desperately to stave off a strike because the staff have not been paid for two months.

Again, it is a film made for a Balkan audience but it provides a revealing sense of how Bosnians remain stuck in the past, their leaders unable to work together for a better future. “We are ridiculous in our stupidity,” laments Vedrana to Princip just before the unexpected climax to the film, in a speech that everyone in the region will identify with. “Instead of helping each other we do everything to make each other’s lives miserable.”

Aleksandar Seksan as Enco and Izudin Bajrović as Omer in Danis Tanović’s Death in Sarajevo, 2016
Margo Cinema/SCCA/pro.ba/Betta Pictures

In many respects that could be a motto for the Balkans. Twenty-one years after the end of the wars in Bosnia and Croatia, and almost seventeen since the end of the Kosovo war, the region has come a long way, much of it for the better. Not a day goes by without the region’s leaders or their ministers meeting somewhere to discuss problems; people travel easily now, often needing only identity cards rather than passports; and there is much business between the countries. Yet so much more could be done by Balkan leaders to address the legacies of these brutal conflicts, which have not yet really become history. Sometimes it looks as though they are neither capable of nor interested in doing so, and verdicts like the Karadžić one gave Serbian and Bosniak leaders an opportunity to beat nationalist drums again and remind their voters that they had better vote for them or the enemy would one day be back.

The Šešelj verdict produced altogether more complex reactions, especially in Serbia, since the president and prime minister were once his trusted lieutenants (before, as Šešelj charges, betraying him). The prime minister has since renounced the aims of his extreme nationalist past. President Tomislav Nikolić, however, recently decorated Sudanese president Omar al-Bashir, who is wanted by the International Criminal Court on charges of war crimes and genocide in Darfur.

All this is gloomy, but there is an even gloomier specter haunting the region. In Serbia and Croatia, revisionism is in. Last July a court process began in Serbia to rehabilitate Milan Nedić, the Serbian quisling prime minister under the Nazis. In Croatia the new minister of culture is an open admirer of the genocidal Ustasha Independent State of Croatia, which existed during World War II. This year the Croatian Jewish community will boycott the annual ceremony of remembrance held at the wartime Jasenovac death camp because, it says, the government tolerates revisionism with regard to the Ustasha past. On April 4, a new Croatian documentary about Jasenovac, very much in this spirit, was released to praise from the culture minister.

The failure of the postwar Yugoslav Communist regime to deal with some of this dark history left space for spurious and revisionist claims to grow once the regime had lost its repressive power. This was one reason why, in the late 1980s and early 1990s, nationalists on all sides were able to blow on the embers of the past in order to mobilize their campaigns to come to power. Books and films about the wars of the 1990s may not be able to change politics but, as the UN’s tribunal winds down its work, they remind us how these tendencies remain very much alive. Dealing with them is necessary, not just for victims and societies today but for future generations too, lest, zombie-like, the past return to poison the future as it has done before in this part of the world.

Julian Borger’s The Butcher’s Trail is published by Other Press.


Living Happily Ever After

Colorado Springs, Colorado, 1968; photograph by Robert Adams from his 1974 book The New West, to be reissued by Steidl this April. His exhibition ‘Around the House and Other New Work’ is on view at the Fraenkel Gallery, San Francisco, March 10–April 23, with a monograph published by the gallery.
Robert Adams/Fraenkel Gallery, San Francisco


Although the Great Recession was set off when the United States housing bubble burst in 2007, amnesiac Americans are again speculating in domestic real estate. The steep rise in property values during the final two decades of the twentieth century still lingers in many people’s minds, as does the widespread memory among the baby-boom generation that the family house was the best investment their parents ever made. Perhaps we have not reverted to pre-crash irrational exuberance, but people seemingly cannot resist the temptation to “flip” residential real estate—that is, buying and reselling in the short term to make a quick profit.

Recovery in the national housing market remains spotty—7.4 million home mortgages are seriously “underwater” (i.e., the balance of the loan is more than 25 percent higher than the property’s current assessed value)—but a healthy local economy can support flipping even modest residences. Several cable TV channels broadcast series that follow this process, including Flip This House, Flipping Virgins, Rehab Addict, and my personal favorite, Flip or Flop. This program stars a telegenic young Southern California couple who take down-at-the-heels Orange County tract houses, remodel them to suit contemporary tastes (loftlike “open concept” floor plans, spa-inspired bathrooms, and obligatory granite kitchen countertops), and often make a five-figure return on their investment (although some of their cost estimates can seem low and their completion schedules speedy to anyone who has ever done home improvements).

Flip or Flop and other examples of what has been called real estate porn remind us that there is a huge inventory of postwar suburban housing stock all around the country, much of it in the style we now call midcentury modern, although more exists in hybrid modes that mix traditional and contemporary elements. Millions of detached single-family houses were erected on the outskirts of American cities between the end of wartime building material restrictions in 1947 and the onset of the oil embargo recession in 1973, but two new books differ greatly on the exact number.

Barbara Miller Lane’s Houses for a New World puts the figure for the first two postwar decades at thirteen million, whereas James A. Jacobs’s Detached America states that during the quarter-century after the war “private builders and building companies constructed about 35,500,000 housing units…[of which] the overwhelming majority…were the detached single-family houses that define suburbia.” Even adjusting for a somewhat different time frame, there is no question that the mass migration at midcentury—by 1950 more Americans lived in suburbs than in cities—represented a thoroughgoing reformulation of our domestic landscape and a colossal demographic upheaval.

This immense shift grew out of the pent-up demand created by nearly two decades of severely curtailed construction during the Great Depression and World War II. The acute housing shortage was exacerbated by a sharp increase in postwar birth rates, which resulted in a frenzy of suburban residential development that had been anticipated for some time. For if planners could not have predicted the back-to-back economic and military traumas of the 1930s and 1940s, housing reformers had long foreseen a looming crisis in the uncontrolled increase of urban centers.

The proliferation of automobile ownership during the century’s second and third decades gave new impetus to the Garden Cities movement, founded in 1898 by the British planning theorist Ebenezer Howard, who opposed unlimited urban growth in favor of multiple, moderate-sized “conurbations” with dwellings and workplaces in convenient proximity. Such medium-density centers would ideally be surrounded by undeveloped “greenbelts” for recreation and as barriers against pollution. Yet this congenial middle ground between urban overcrowding and rural isolation made little headway in the US, despite the advocacy of the critic Lewis Mumford and his fellow members of the Regional Planning Association of America, which was organized in 1923 by the architect and planner Clarence Stein.

A rare application of Garden City principles in this country was Radburn, New Jersey, of 1928–1930, Stein and Henry Wright’s only partially realized but nonetheless internationally influential “New Town for the Motor Age.” Yet however intelligently Radburn accommodated the burgeoning car culture, it did not incorporate businesses (beyond a few shops for local residents) and became a commuter suburb, albeit a very fine one, for people with jobs in nearby cities, especially New York. Importantly, Radburn was predicated on the radical belief of the nineteenth-century political economist Henry George that “we must make land common property.” Thus, although Radburn’s homeowners hold title to their dwellings, the ground beneath their houses is retained in trust for all by a cooperative association. This egalitarian principle removed a major financial incentive that typically drives developers and investors who can gain from sale of unbuilt plots (usually at steadily increasing prices) if a community thrives, and the Radburn idea never caught on in the US.

The great unanswered question about the suburbanization of mid-twentieth-century America is this: Could it have been done better? Certainly a better way was shown by the Case Study House group—the loose association of like-minded Modernist architects organized in 1945 by John Entenza, the editor of the California-based Arts & Architecture magazine, who solicited superior prototypes for residential living from some of the period’s most imaginative designers. However, their plans were neither conceived for mass production nor geared to providing the cheapest possible dwelling per square foot. Affordability, not design distinction, was the overriding motivation for the best-known postwar tract house manufacturers, the Levitt brothers—William, who headed the company (established in 1929 by their father, Abraham), and Alfred, its chief architect and planner.

The Levitt firm had built high-end houses on Long Island before World War II, but when the founder’s sons served in the military they observed a number of industrialized engineering and construction techniques that could be profitably applied to the mass production of lower-priced dwellings for returning servicemen, the postwar period’s most obvious growth market. To cut expenses, the Levitts’ no-frills, wood-frame structures did not have an excavated basement, but instead were erected atop a concrete slab poured directly onto the ground and inlaid with radiant heating (a method Frank Lloyd Wright pioneered in his low-cost Usonian houses of the 1930s onward, which featured several concepts that Alfred Levitt proudly admitted he’d swiped from the grand old master).

Although it is often assumed that Levitt houses were prefabricated, they were merely standardized, with interchangeable components and simplified assembly strategies that resulted in considerable economies. Therefore the company could deliver compact, freestanding “minimum houses”—a term that originated in Weimar Germany’s social-housing movement as das Existenzminimum (the minimum required dwelling area)—that war veterans could buy for less per month in total carrying charges than the going rental rates for many city apartments. The Levitt company required no down payment, and the low-interest, thirty-year mortgages provided by the Federal Housing Administration to vets put these houses within easy reach of even those with moderate incomes. Who could resist owning a brand-new homestead in the countryside with the last word in kitchens and bathrooms, instead of leasing an old inner-city flat with a claw-foot tub and no backyard?

The first houses offered by the Levitts in their now-famous development in Hempstead, New York, thirty-seven miles east of Manhattan—initially called Island Trees but renamed Levittown after marketers discovered that the family name signified quality to consumers—were completed in 1947. (They were initially for rent only, with an option to buy after one year, but became available for outright purchase in 1949.) The 750-square-foot floor plan comprised a living room, kitchen, two bedrooms, and one bathroom, with an unfinished partial attic space suitable for future expansion. This tidy Cape Cod cottage sold for $6,990 (more than $77,000 in current value), but the brothers soon found that buyers would willingly pay somewhat more for upgrades, and in 1949 they introduced an equally successful eight-hundred-square-foot ranch-style version for $7,990 (equivalent to nearly $78,000 in 2015). Today the median size of new single-family American houses is 2,521 square feet, more than three times the size of the Levitts’ first offering.
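That closing comparison is easy to verify from the two figures given (a quick check of my own, not the author’s):

\[
\frac{2{,}521\ \text{square feet (2015 median)}}{750\ \text{square feet (1947 Levitt Cape Cod)}} \approx 3.4.
\]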

The brothers’ venture into mass production was backed up by the bountiful provision of benefits through the Servicemen’s Readjustment Act of 1944. Commonly called the GI Bill, it contrasted dramatically with the disgraceful treatment of World War I veterans at the onset of the Great Depression, when the 1932 Bonus Army March on Washington was violently suppressed. Together with generous tuition subsidies for former servicemen—this country’s most extensive foray into socialized higher education—the 67,000 mortgages guaranteed through the GI Bill made home ownership possible for more citizens than ever before (except nonwhites, who received fewer than one hundred of those loans). To take advantage of this bonanza in government-guaranteed financing, canny entrepreneurs across the country quickly copied the Levitts’ cost-efficient formula.

In Houses for a New World, the Bryn Mawr professor emerita Barbara Miller Lane investigates the output of a dozen lesser-known tract house developers in four diverse regions—New England, the mid-Atlantic, the Midwest, and Southern California—and treats the period’s typical Cape Cods, ranches, and split-levels with the serious formal analysis once reserved for high-style architecture. Her tour de force of research is all the more impressive because she has assembled documentation akin to that previously available on the residential work of important postwar figures such as Richard Neutra, William Wurster, and Marcel Breuer but largely overlooked for builders other than the Levitts.

Both new books remind us of a time when a popular American middle-class weekend pastime was to pile the kids and in-laws into the family car and drive around looking at model houses, whether or not you were actively shopping for a new place. Lane has found newspaper advertisements and promotional materials for subdivisions that were clearly aimed at wives (who wielded huge influence over housing decisions even though their husbands were the breadwinners) and that stressed the transformational nature of life in these up-to-the-minute dwellings.

The cover of a Cinderella Homes sales brochure, 1955–1957; from Barbara Miller Lane’s Houses for a New World
Warwick Historical Society

A revealing example of that appeal to women can be found in a 1955–1957 sales brochure for Cinderella Estates, a new Anaheim, California, subdivision not far from the recently completed Disneyland. This booklet depicts a princess-like figure and regal coach next to a rendering of a sprawling ranch-style house and the words “your every wish for a home…come gloriously true.” The Disneyesque iconography chimes perfectly with the opening lines of “Young at Heart,” Johnny Richards and Carolyn Leigh’s 1953 hit song for Frank Sinatra: “Fairy tales can come true/It can happen to you….”


A phenomenon as pervasive as this vast population redistribution could not have gone unnoticed by commentators in various disciplines, and an extensive literature on the new suburbanization quickly developed. Critical responses tended to be negative from the outset, typified by such debunking books as the newspaperman John Keats’s The Crack in the Picture Window (1956) and a dubious pop psychological report by the physician Richard E. Gordon, his wife Katherine K. Gordon, and the journalist Max Gunther titled The Split-Level Trap (1960), which claimed that the new suburbs made people physically and mentally ill.

In due course several methodologically sound (and now classic) studies—led by the sociologist Herbert Gans’s The Levittowners: Ways of Life and Politics in a New Suburban Community (1967) and the urban historian Kenneth Jackson’s Crabgrass Frontier: The Suburbanization of the United States (1985)—presented more nuanced interpretations. So did The Suburban Myth (1969) by the historian (and future literary biographer) Scott Donaldson, who argued that mass-produced postwar housing had less to do with the malaise of corporate conformity than the persistence of the Jeffersonian ideal (“the champion American myth of all time”), which a century and a half earlier established patterns of individual exurban living that worked against the development of cohesive communities.

Lane attempts to make a case for the architectural virtues of midcentury tract houses, tracing their organizational formats to the specific parts of the country where they originated before spreading nationwide. Thus we learn that the once-trendy split-level, which typically combined three strata within a two-story shell,

a basement level on a slab, containing the garage, a utility enclosure, and sometimes a small “den”; then, half a level up, the living room and kitchen; then, another half level up, the bedrooms and bath (usually over the garage)

emerged in hilly areas where the semisubmerged garage could be dug into a sloping site, and was particularly suited to the terrain of New Jersey. She also points out that because the interior volumes of the split-level were stacked, its “footprint” could be half as large as that needed for a single-story ranch, which saved on land costs.

Yet for all this diligent research, Lane—whose previous work I greatly admire, especially her pathbreaking Architecture and Politics in Germany, 1918–1945—occasionally makes claims that seem dubious. She writes that tract house “furniture might be in some sort of neocolonial mode, but much more often it was spare and light looking, in a style that came to be known as ‘Danish Modern.’” But sleek Scandinavian-inspired furnishings were an advanced taste throughout the mid-1950s, when most suburban houses were decorated in a middle-of-the-road manner termed “Transitional.”

More baffling is her assertion that the stereotype of stay-at-home wives in suburbia was baseless. “Women were not isolated within them,” she says of the new housing developments,

consigned by lack of transportation and lack of work to being dependent “homemakers”: indeed they usually worked, and, when spouses carpooled or used the train, the wives had the use of the family car.

Yet according to AFL-CIO statistics, as late as 1972 women represented only 38 percent of the American workforce, and there were unquestionably fewer employment opportunities for women in postwar suburbia than in cities, except perhaps as schoolteachers, librarians, or supermarket checkout clerks.

Furthermore, Lane minimizes the damage caused by racial segregation, which systematically banned blacks from buying into new suburban developments for two decades. She writes:

There was a very significant melting-pot experience in the new postwar communities. Italians, Jews, Catholics, Irish, Polish, and others who had been segregated in American cities and excluded from earlier American suburbs now mingled freely, forming new kinds of communities that they valued intensely. I think that this experience may have helped Americans to become more accepting of diversity, even where color lines were initially maintained. In fact, suburbs became integrated more quickly than cities: by the 1970s, barriers were broken down nearly everywhere.

However Lane chooses to define segregation, there can be no question that an entire generation of African-Americans was unable to participate in what she sees as a grand social experiment, whereas white immigrants of many ethnicities had been mingling freely in cities for the better part of a century. As James A. Jacobs writes in his far more balanced, probing, and insightful history, Detached America, this gross injustice was committed by the federal government:

The FHA did not, as is commonly held, develop racially neutral policies that were then applied in a racist manner. Rather, FHA policy itself was purposefully written in a way to exclude nonwhite Americans, using the abstract notion of “market demands” as blanket justification for discrimination in sales. A prejudiced appraisal system for mortgages passed over existing houses that were believed by officials to pose a risk for devaluation—those in mixed-race subdivisions, for example, or properties in older urban neighborhoods—which further reduced access to the financial windfall that became available to white veterans and their families.

To be sure, many blacks did make it to suburbia, though not easily. By 1960 there were an estimated 2.5 million African-Americans living in suburbs; a decade later, after passage of the Fair Housing Act of 1968, that number almost doubled, to 4.6 million. But Jacobs observes that blacks moved into older inner-ring suburbs vacated by cynically manipulated “white flight”:

Such neighborhoods frequently became available for black ownership through blockbusting, a process structured by the purposeful use of racial anxiety and panic to undermine the residential housing market, ultimately for the profit of real estate brokers. These agents convinced white residents that they should sell their properties at below-market values because blacks were beginning to buy in the neighborhood, and their property values were bound to fall even more. Agents then turned around and sold the properties to black buyers at inflated prices…[and] departing white families often ended up in a new house on the suburban periphery.

I witnessed white flight in Camden, New Jersey, which was a thriving blue-collar community when I lived there from age six to eighteen. By the mid-1960s, the city’s industrial base began to erode and there was a rapid exodus of whites from the once-prestigious Parkside neighborhood—its tree-shaded streets were lined with large, handsome Victorian and Edwardian houses—which bigots dubbed “Darkside” when striving blacks moved in. By the time I brought my new wife to revisit the scenes of my youth in 1978, my parents and all their friends had long since escaped to the suburbs, and Camden was well on its way to becoming “the most dangerous city in America,” as one website named it last year.

Moving day at a new housing development, Los Angeles, 1952
J.R. Eyerman/Life Picture Collection/Getty Images

There were hardly any mid-twentieth-century equivalents of the exemplary Greek Revival house plans that were widely disseminated through carpenters’ pattern books for many decades after the American Revolution—the main reason vernacular construction in this country maintained such a high general standard, even in rural regions, until the Civil War. Instead, the semimodern forms devised by most 1950s and 1960s American developers were fundamentally derivative and debased, no matter how evocative they may now seem of their period. None of the houses documented in the new publications displays the nearly faultless proportional logic—especially the pleasing relation of the part to the whole—that distinguished even run-of-the-mill construction by itinerant early-nineteenth-century craftsmen in this country.

Thus the New Urbanists—revisionist planners who since the 1980s have promoted American housing developments with a stronger sense of community values, which they feel can be fostered by traditional elements including front porches and sidewalks—have looked not to postwar suburbia but to premodern prototypes, including pattern book designs. Yet the higher density of dwellings espoused by the New Urbanists is not in itself a solution, especially since few such enclaves have effectively addressed the continually vexing place of the automobile in a society slow to pursue more ecologically responsible transportation options.

Because midcentury tract houses tended to be built very close together (units were usually no more than twenty-five feet apart), their side elevations were often left as windowless blanks. Another accommodation to privacy was the strip window, inserted high on bedroom walls to admit light but prevent neighbors from looking in. This lack of direct visual connection with the outdoors could recall a jail cell, and although tract houses were physically proximate, they often felt emotionally isolated.

Another problem with postwar suburban developments was their dull uniformity, determined by lot lines more than any other factor. Though some builders tried to vary streetscapes by shifting and flipping similar layouts this way and that, there were limits to what could be achieved in such tightly packed settings. The major problem was that these designs were insufficiently conceived in all three dimensions. The backs of even architect-designed houses can sometimes seem like afterthoughts, but the minimally detailed rear elevations of typical 1950s and 1960s American subdivision schemes were perfunctory in the extreme, and presented a bleak aspect until landscaping could soften their stark appearance.

Developers focused almost solely on the street façade to make an arresting first impression on prospective buyers, a quality known in the real estate trade as “curb appeal.” Thus onto the front of these generic boxes they added an array of more-or-less traditional appliqués—touches of brick or fieldstone to give greater heft to the predominant asbestos shingles or wood (and occasionally aluminum) siding; nonfunctioning window shutters; wrought iron railings and Colonial-inspired lighting fixtures; and that ubiquitous symbol of the postwar suburban home, the picture window. But because such flourishes were typically treated as marketing devices rather than emanations of an integrated design, detailing tended to have a tacked-on quality that inadvertently underscored the thin character of mass-produced construction.

There were some notable exceptions to this lack of architectural distinction, particularly on the West Coast, where the public was more receptive to popularized forms of modern architecture and eagerly embraced the prevailing new style—updated reinterpretations of the traditional California ranch house. One of its most vigorous champions was the architect/builder Cliff May (1909–1989), whose flair for self-promotion—evident in such best-selling books as his Western Ranch Houses2—was equaled by strong design skills that have made his sprawling, light-filled, well-crafted structures a byword for pleasurable, informal indoor-outdoor living and desirable properties on the resale market to this day. The same is true of the developer Joseph Eichler (1900–1974), whose houses (mostly in the San Francisco Bay Area) were more fully informed by High Modernism than May’s but now are similarly prized as among the best of their kind.

Interesting though it is to see midcentury tract houses treated with scholarly gravitas, for the most part these are not designs particularly worthy of preservation or emulation. They paid little attention to sustainability or energy conservation, were predicated on a highly conventional model of family life that seems outdated if not oppressive to many Americans today, and were seldom so solidly built that they can do without substantial retrofitting as they enter their sixth or seventh decades. They now seem important more in sociological than in architectural terms.

In 1940, only about 40 percent of Americans were homeowners; by 1965 that figure had soared to almost 63 percent. The extraordinary social mobility indicated by these numbers seems all the more poignant when one considers the recent reversal of that trend in the United States. In 2007–2009, before the full impact of the Great Recession was felt, 66.4 percent of housing units were owned by their occupants. By 2014 that figure had dropped to 64.7 percent, the lowest since 1995.

Rather than glamorizing the white-bread world of Eisenhower-era suburbia, I prefer to think of more socially admirable efforts such as Reston, Virginia, twenty miles west of Washington, D.C., which was founded in 1964 by the developer Robert E. Simon, who died in September at the age of 101. (His initials formed the first three letters of this start-from-scratch community’s name.) With its sensible and salable mix of housing formats, including clustered townhouses and high-rise condominiums, and encouragement of businesses that provided jobs for its residents, Reston more closely resembled European new towns than postwar American suburbs.

As Simon’s New York Times obituary noted:

He laid out a town of open spaces, homes and apartments that would be affordable to almost anyone, racially integrated, economically self-sustaining, pollution-free and rich in cultural and educational opportunities.

That is something to be nostalgic about. You’d have to look very hard in the suburbs today to find a single builder following his example.

1. Harvard University Press, 1968.

2. Lane Books, 1958.


The Disaster of Richard Nixon


In the course of his twenty-eight years in politics and twenty more in active retirement, Richard M. Nixon uttered a great many dubious propositions. None was less accurate than the words he spoke on November 7, 1962—the day after he lost the governorship of California to Edmund G. “Pat” Brown, two years after losing the presidency to John F. Kennedy: “Just think of how much you’re going to be missing,” he told reporters gathered for what he billed as his last press conference. “You won’t have Nixon to kick around anymore.”

Richard Nixon at a press conference at the White House, October 1973 (David Burnett/Contact Press Images)

A flawed prognostication. The critics who first found fault with Nixon’s 1946 red-baiting campaign against Democratic congressman Jerry Voorhis of California have been disparaging him ever since. Reading these books twenty-one years after his death, one realizes that finding fault with Nixon still has a future. It may never end. Thanks to his gross abuses of presidential power symbolized by the Watergate scandal and to his own decision to record the details of his presidency on tape, Nixon seems destined to remain an object of fascination, amazement, scorn, and disgust for as long as historians pay attention to the American presidency. When the subject matter is their foreign policy, Nixon’s sidekick, Henry A. Kissinger, will be right there beside him.

Is Nixon’s historical reputation doomed forever? These books suggest that it is. Evan Thomas’s highly readable Being Nixon is, inadvertently, the most persuasive. Thomas set out to write a sympathetic account of Nixon’s life. He is persistently empathetic to his subject, but he is also a fine reporter and biographer (of Robert F. Kennedy, Edward Bennett Williams, John Paul Jones, and others). The good reporter gives his readers so many details of Nixon’s bad behavior that Thomas’s intention to write a sympathetic account collapses under the weight of its own facts. You can feel sorry for Nixon as a human being after reading Thomas’s book, but it is much harder to excuse his repeated transgressions—of ethical standards, of the law, of democratic values—and his quite abject reliance on alcohol and drugs. Thomas bends over too far in his effort to forgive Nixon’s misdeeds, particularly his Vietnam disaster and his ugly racial politics.

The other books in this collection of recent works are openly hostile to Nixon (Tim Weiner and Ken Hughes) or subtly devastating (William Burr and Jeffrey Kimball). Weiner, a former New York Times reporter and winner of the Pulitzer Prize, focuses on Vietnam and Watergate; he uses many of the most recently released tape transcripts and documents to give his version of these familiar stories new energy and salience, but his well-paced narratives don’t break much new ground.1 Hughes, a good researcher but an inelegant writer who has been studying the Nixon tapes since 2000 at the University of Virginia’s Miller Center, doesn’t hide his personal anger at Nixon and Kissinger for prolonging the Vietnam War when, according to the evidence of the tapes, they realized it couldn’t be won. He indicts them for sacrificing tens of thousands of American lives and over a million Asian ones to a lost cause. But Hughes oversimplifies when he claims that the war had to continue beyond November 1972 almost entirely because of political calculations: had it ended sooner, Nixon could not have won reelection. Burr and Kimball make more nuanced use of the same material.


Vietnam was the defining issue of Nixon’s presidency, as he knew it would be. Months before he became president, Nixon assured H.R. “Bob” Haldeman, his closest aide, that “I’m not going to end up like LBJ, Bob, holed up in the White House, afraid to show my face on the street. I’m going to stop that war. Fast.” Antiwar protesters had driven Lyndon Johnson into early retirement, which allowed Nixon to become president. Nixon played to the country’s war weariness in his 1968 campaign, implying that he had a plan to end the war.

But he had no plan. Ironically, even before he took office Nixon personally sabotaged an opportunity he might have had to avoid Johnson’s fate. The books under review suggest that this is one of the stories that will continue to stain Nixon’s reputation.

In late October 1968, when Johnson’s negotiators in Paris finally reached an agreement with North Vietnam to end American bombing and begin negotiations on a political settlement, Nixon took an enormous personal risk to derail the peace talks before they could begin. At the time, polls showed that Hubert H. Humphrey, Nixon’s Democratic opponent and Johnson’s vice-president, was rising fast—so fast that Nixon feared he might lose the presidency because of the peace deal. So he performed a dirty trick that foreshadowed many more to come.

For months Nixon had worried about a last-minute deal, or appearance of a deal, that would boost Humphrey. In July he opened his own channel to Nguyen Van Thieu, the president of South Vietnam. As his intermediaries to Thieu, Nixon chose his campaign manager, the New York attorney John Mitchell, and Anna Chennault, the exotic, Chinese-born widow of Claire Chennault, a former US Air Force general who led the Chinese Nationalist air force during World War II. In a secret meeting (Nixon loved secret meetings) in Mitchell’s New York office with Chennault and Bui Diem, Thieu’s ambassador to the United States, Nixon explained that when he had a message for Thieu, he would give it to Chennault, who would convey it to the ambassador to forward to Saigon.

In September the Nixon campaign learned that something big would soon be announced from Paris. Haldeman wrote a memo to Nixon on September 17, 1968, saying that he had learned from a source that Johnson would likely announce a halt in the bombing campaign in mid-October. In a diary entry of January 13, 1972, Haldeman identified this source as Kissinger, recording that “We’ve got to remember he [Kissinger] leaked things to us in ’68.” Kissinger at the time was a Harvard professor busily cultivating relationships with both the Humphrey and Nixon camps, apparently hoping for a big job in Washington whoever won the White House that year. Kissinger had been a consultant to the US delegation, although he wasn’t directly involved in the negotiations when he visited Paris in September 1968. Richard Holbrooke, a member of the delegation, said that “Henry was the only person outside the government we were authorized to discuss the negotiations with.”

On October 31, the day Johnson announced the suspension of bombing of North Vietnam and the imminent beginning of peace negotiations, Mitchell called Chennault, said he was speaking “on behalf of Mr. Nixon,” and told her it was “very important that our Vietnamese friends understand our Republican position”—that Thieu should wait for a better deal from Nixon. The same afternoon the FBI watched Chennault pay a call on Bui Diem, Thieu’s ambassador. A National Security Agency listening device in Thieu’s Saigon office heard him tell aides that Nixon wanted him to wait for the next president to take office. Thieu did refuse to send negotiators, and no peace talks began. Nixon won the election by a whisker—a popular vote margin of 0.7 percent, though he won in the electoral college more easily.2

LBJ was livid; he thought Nixon had violated the Logan Act, which makes it illegal for private citizens to interfere with official diplomacy. Aides talked Johnson out of making public what he knew about Nixon’s secret maneuverings, much of it based on wiretaps. LBJ believed that Nixon subverted any chance for peace before he left office.

There is no persuasive evidence that peace talks would have succeeded; but Nixon’s presidency began with a newly incurred political debt to Thieu, and with no prospect of an early exit from the war, which Nixon said privately was unwinnable. “There’s no way to win the war,” he told his own speechwriters months earlier. “But we can’t say that, of course. In fact, we have to seem to say the opposite, just to keep some degree of bargaining leverage.”

President Nixon with President Nguyen Van Thieu in South Vietnam, August 1969 (AP Images)


Richard Nixon: “I’m probably the toughest guy that’s been in this office since—probably since Theodore Roosevelt.”

Henry Kissinger: “No question.”

—White House conversation, June 30, 1971

The most revealing of the newest books is Nixon’s Nuclear Specter by William Burr and Jeffrey P. Kimball. It is better than its awkward title and subtitle. Burr and Kimball neatly recreate the Vietnam dilemma that Nixon and Kissinger confronted: they couldn’t win, but they couldn’t face losing. Nixon’s Nuclear Specter is a detailed and careful account of Nixon’s and Kissinger’s fruitless efforts during 1969 to find an “honorable” way out of Vietnam. As events that year unfolded, these authors demonstrate, honor had little to do with it.

Nixon’s one big idea for resolving the dilemma was to scare his Communist adversaries into making an acceptable deal to end the war. This is how he explained it to Haldeman, as reported by Haldeman in his book The Ends of Power (1978) and cited by Burr and Kimball:

They’ll believe any threat of force that Nixon makes because it’s Nixon…. I call it the Madman Theory, Bob. I want the North Vietnamese to believe I’ve reached the point where I might do anything to stop the war. We’ll just slip the word to them that, “for God’s sake, you know Nixon is obsessed about communism. We can’t constrain him when he’s angry—and he has his hand on the nuclear button”…and Ho Chi Minh himself will be in Paris in two days begging for peace.

Nixon and Kissinger mobilized an extraordinary combination of unpublicized threats and unannounced acts of violence to pursue this chimera. The bluntest was the bombing of Vietcong and North Vietnamese base areas in Cambodia, near the border with South Vietnam, a huge military campaign in March 1969 that, amazingly, remained a secret for many months. Nixon tried to send “Madman” signals, particularly to Moscow but also to Hanoi and Peking, as it then was called, by multiple means. One was a personal letter to Ho Chi Minh to be delivered by Jean Sainteny, a Frenchman who had personal ties to Vietnam’s leaders. Nixon asked Sainteny to act as his envoy.

The letter that Sainteny delivered to a North Vietnamese official in Paris was a respectful plea to accelerate negotiations and “bring the blessings of peace to the brave people of Vietnam….” But Sainteny was instructed to accompany Nixon’s letter with a threatening verbal message setting a deadline of November 1 for reaching an agreement—the first anniversary of Johnson’s bombing halt. If “no valid solution has been reached” by then, Sainteny was to warn, “he [Nixon] will regretfully find himself obliged to have recourse to measures of great consequence and force.” Kissinger repeated similar language in a secret meeting in Paris with the North Vietnamese. All this had no visible effect on Hanoi’s behavior.

Nixon and Kissinger could not get a helpful reaction from the Russians or the Chinese, either.3 In the spring of 1969, Kissinger had tried using the threat to impress Anatoly Dobrynin, the Soviet ambassador in Washington, to no effect.

Nixon and Kissinger both had high hopes for diplomacy with the two great Communist powers, the Soviet Union and China. Nixon’s boldest, most creative idea was finally to recognize “Red China,” “pulling it back into the world community,” as he wrote in 1967, a move that would also exploit the Sino–Soviet split, put pressure on the Soviets, and restore America’s global primacy. Both men thought there was a good opportunity to negotiate meaningful limits on the nuclear arms race with Moscow. But a principal goal of their diplomacy was also to find a way out of Vietnam, and they hoped to scare or persuade the Communist powers to help by pressuring North Vietnam to make a deal. They never got such help.

The secret bombing of Cambodia, launched in March 1969—advocated by Joint Chiefs of Staff Chairman General Earle Wheeler, endorsed by Kissinger, and embraced by Nixon—was their first attempt to impress the North Vietnamese with their seriousness of purpose. Over the subsequent four years American bombers dropped nearly 2.8 million tons of ordnance on Cambodian territory, a huge quantity, but North Vietnam never acknowledged that it knew this bombardment was occurring.

Kissinger, according to Burr and Kimball, favored a further escalation of the war with an aggressive bombing campaign against the North in 1969. Nixon authorized planning for such a campaign. Kissinger’s staff and Pentagon officials conceived Operation Duck Hook to be executed in October, shortly before Nixon’s November 1 deadline. In addition, the navy conducted exercises off the coast of North Vietnam that Nixon hoped Hanoi would interpret as practice for the mining of Haiphong harbor. Quite amazingly, Nixon and Kissinger, according to documents cited by Burr and Kimball, also ordered an unannounced, worldwide nuclear alert: an elaborate military exercise that put US strategic forces—missiles, missile-carrying submarines, and bombers—in a position of high readiness, as though the US was preparing to launch a nuclear attack.

These details were particularly fascinating for me because, as a young correspondent in Vietnam for most of 1969 and 1970, I knew nothing about any of this secret maneuvering. In early March 1969, I had stopped in Paris on my way to Saigon to meet with American officials who participated in the peace talks, Holbrooke among them. They told me how lucky I was to be getting to Vietnam just in time to witness the war’s last act.

My first months in Vietnam were dominated by “Vietnamization,” a plan conceived by Melvin Laird, Nixon’s politically astute secretary of defense. Laird’s priority was bringing American troops home. Vietnamization meant turning the fighting over to the South Vietnamese army and hoping for the best as Americans withdrew. By early 1970 an American force that peaked at about 540,000 had shrunk by 115,000, with more reductions promised. Bringing troops home was popular in the US, probably reducing the public pressure on Nixon to end the war.

Initially Vietnamization went smoothly. Communist forces inside South Vietnam, severely depleted by losses suffered in the Tet Offensive of January–February 1968 and in another offensive that May, could not immediately exploit American withdrawals. In 1969, the South Vietnamese held their own in a quieter war. A weaker enemy also enabled the American-sponsored pacification program to bring a growing portion of the population under government control, especially in the populous Mekong Delta, where the Vietcong—the local Communists—nearly disappeared in late 1969.

The earnest Americans tasked with pacification allowed themselves to become hopeful. I wrote a series of articles about them for The Washington Post that fall called “The New Optimists.” I described their hopefulness, but noted that none of them spoke of winning the war. The smartest American advisers realized that pacification could not create victory; winning would eventually require defeating an effective North Vietnamese army that had thousands of men inside South Vietnam, and tens of thousands more on its periphery in Cambodia and Laos. These were the troops that ultimately won the war.

The account of Burr and Kimball startled me by suggesting how little attention Nixon and Kissinger paid to pacification or Vietnamization. The war I was covering was largely separate from the war they were waging. Pacification and Vietnamization were delaying tactics; they knew they needed a deal with Hanoi—what they called “a political solution”—to end the war. Hence the secret bombing and threats. The trouble was, they had no apparent impact in Moscow, Peking, or Hanoi. As Nixon’s Nuclear Specter makes clear, by the end of 1969 the masters of American foreign policy had not managed to convince the targets of their strategy to make the deal they sought.

Their decision to withdraw American troops spoke louder than Nixon’s vague threat of “measures of great consequence and force.” We have no reliable account of North Vietnamese deliberations, but the growing antiwar movement in the United States seems to have impressed Hanoi more than the mining exercise in the Gulf of Tonkin. In a memo to Nixon on September 10, Kissinger acknowledged that antiwar protests and troop withdrawals encouraged Hanoi “to wait us out,” and admitted the weakness of Thieu and the South Vietnamese army. Kissinger initially wanted Duck Hook to go forward to “jar” the North Vietnamese into negotiating a deal.

But by October Nixon had lost his stomach for escalation. His secretaries of state and defense, William Rogers and Laird, might resign in protest, he feared, and the huge antiwar protests that fall seem to have scared him. Nixon called off Duck Hook.4

The nuclear alert did go forward, beginning on October 13. Apparently, according to Burr and Kimball, the Russians barely noticed. They describe Nixon’s and especially Kissinger’s excitement when Dobrynin called on October 17 to request a meeting with Nixon. Kissinger told Laird that Dobrynin’s call suggested that the alert “seems to be working.” Haldeman recorded in his diary: “K thinks this is good chance of being the big break, but that it will come in stages. P [Nixon] is more skeptical.”

The excitement soon faded. When Dobrynin came to the White House, the alert was not mentioned. “The toughest guy [in the Oval Office] since TR” had prepared for a showdown with Dobrynin. Kissinger had advised him that the purpose of the meeting “will be to keep the Soviets concerned about what we might do around 1 November.” But Nixon, according to Burr and Kimball, “failed to frighten or intimidate the veteran statesman Dobrynin.” He rambled; he lost his temper; he complained that both North Vietnam and the Soviet Politburo were trying to “break” him; he was, by Dobrynin’s account, nervous and sometimes agitated.

Nixon did remember at one point to warn that “the United States would have to pursue its own methods for bringing the war to an end” if talks failed. But he ended by saying that if Hanoi continued to restrain its forces in the South, the US would reciprocate by taking no new offensive action—hardly a threat. Later that night, he instructed Kissinger to call Dobrynin back to the White House and scare him. Kissinger ignored the order.

Reporting this conversation to Moscow,5 Dobrynin observed that Nixon lacked emotional self-control. “The main thing now for him…is to end the war in Vietnam, everything else is secondary,” Dobrynin concluded. “The fate of his predecessor Lyndon Johnson is beginning to really worry him. Apparently, this is taking on such an emotional coloration that Nixon is unable to control himself even in a conversation with a foreign ambassador.”


The utter failure of the threats of 1969 to persuade Hanoi to compromise left Nixon and Kissinger with few cards to play. Their fallback position was one that Kissinger had discussed in academic settings as early as 1967, and that he and Nixon realized from the beginning might have to be their policy. This was the so-called decent interval solution—making a deal with Hanoi providing for complete American withdrawal (and return of all US POWs). In return, the North Vietnamese and Vietcong would agree not to try to conquer South Vietnam for a brief period—“say eighteen months or some period,” Kissinger told Zhou En-lai in an after-dinner conversation on his first secret visit to Peking in July 1971 (which did not appear in Kissinger’s subsequent published account of the meeting).

Kissinger had offered this solution to Dobrynin four months after Nixon took office, on May 14, 1969. Dobrynin’s report on this meeting suggests that Kissinger’s message surprised him. “Nixon is even prepared to accept any political system in South Vietnam, ‘provided [here Dobrynin is quoting Kissinger] there is a fairly reasonable interval between conclusion of an agreement and such a system’” coming into being. From this evidence, both the Soviets and the Chinese knew that Nixon was ready to betray Thieu if he got a face-saving peace agreement.

The two men running US policy neither told members of Congress nor informed the American public that this was their position; nor did they tell President Thieu about it. Perhaps these choices were also sensible, given the absence of practical alternatives, but choosing them and then omitting later comment on them—as both did in books they published years later6—will leave them vulnerable when future generations of historians address these events.

As a practical matter, the problem with the decent interval strategy was the implicit requirement that North Vietnam agree to it. Though Nixon did launch unilateral withdrawals with Vietnamization, he rejected walking away from South Vietnam—what he called the “bug out” option. He wanted Hanoi to collaborate on the decent interval to give him political cover and allow him to claim that he found the long-promised “honorable” end to the war. Not surprisingly given both their history and Nixon’s, the North Vietnamese said openly that they did not trust or believe him. The result was more war.

Toward the end of 1970, his frustrations again mounting, Nixon considered simply announcing the total withdrawal of American troops by the end of the following year. On December 15, 1970, Haldeman recorded a memorable conversation with Kissinger:

He [Kissinger] thinks that any pullout next year would be a serious mistake, because the adverse reaction to it could set in well before the ’72 elections. He favors instead a continued winding-down and then a pullout right at the fall of ’72 so that if any bad results follow, they’ll be too late to affect the election.7

Kissinger, the diplomatic expert, had here become a political adviser giving guidance to Nixon on his reelection campaign. Whether because of Kissinger’s advice or his own calculations, Nixon did not pursue the idea of getting out in 1971.

Prolonging the war was an expensive choice. More than 21,000 Americans died in Vietnam after Nixon became president, more than a third of our total losses in the war. Tens of thousands more were wounded. But Americans suffered the least; hundreds of thousands of Vietnamese lives were lost after 1969. The bombing of Cambodia, and then Nixon’s 1970 invasion in search of a target that never really existed, the “Central Office for South Vietnam,” COSVN, which US intelligence thought was a field headquarters for the Vietcong, contributed to the destabilization of Cambodia.

The March 1970 coup that ousted Prince Norodom Sihanouk, the skillful Cambodian leader who had preserved his country’s independence and neutrality in a dangerous neighborhood, brought on more instability, encouraged by Nixon and Kissinger’s policy of supporting the creation of a large new Cambodian army under the coup’s leader, Lon Nol. That army never proved effective, and Lon Nol’s bumbling government could not cope with the chaos created by the widening war inside Cambodia that Nixon promoted. The Khmer Rouge rebels, who started out as a small band opposing Sihanouk, exploited that chaos. By 1975 they had taken over the country, and they eventually killed some two million Cambodians.8

The bombing of Cambodia was part of a failed effort to avoid what ultimately could not be avoided: the reunification of Vietnam. For more than four years Nixon and Kissinger looked desperately for a way to salvage the American commitment in South Vietnam and minimize the repercussions of losing the war. But they did so cynically, clumsily, and ultimately forlornly. Robert Dallek captured the essence of their Vietnam policy in two words: “a disaster.”

The disaster extended to Nixon’s presidency. In Haldeman’s memorable statement, “Without the Vietnam war, there would have been no Watergate.” Haldeman used the term not to describe just the break-in at the Democratic National Committee, but more broadly to cover all the craziness that John Mitchell memorably called “the White House horrors.” Haldeman realized how the war poisoned Nixon’s presidency. As Carl Bernstein wrote in a review of the books by Thomas and Weiner, “Vietnam and Watergate are inextricably linked in the Nixon presidency. They are an intertwined tale—one story—of sordid abuse of presidential power, vengeance, cynicism and lawlessness.”9 The connection between Vietnam and Watergate is often missed. Thomas ignores it; Weiner doesn’t, but he makes too little of it.

Deceit and disregard for the law were the common threads. The abuses that constituted Watergate began with events tied to the Vietnam war: first was the attempt to sabotage LBJ’s peace talks in October 1968. In 1969 came the secret bombing of Cambodia and the wiretapping of reporters and White House aides, provoked by a leak to The New York Times about the secret bombing. Then the break-in at the office of the psychiatrist of Daniel Ellsberg, the man who leaked the Pentagon Papers about the war. The Huston Plan, drawn up by a White House aide in 1970 and approved by Nixon, proposed break-ins and black-bag jobs aimed at radicals, especially anti-Vietnam activists. The plan was rescinded, but many were kept under surveillance. Nixon explicitly ratified the use of illegal break-ins when he ordered aides to “blow the safe” at the Brookings Institution in Washington in search of Vietnam secrets from the Kennedy and Johnson administrations. That order was also never carried out, but soon after Nixon issued it, Mitchell and others came up with the idea of breaking into the Democratic committee offices. Ultimately, deceit and lawlessness forced Nixon from office, and sent twenty-two of his colleagues to jail.10

We have learned more about the Nixon presidency than about any other, but, astoundingly, there is much more to come. Nearly 2,700 hours of Nixon tapes have been released, but 774 hours more are still being withheld for various reasons. So are hundreds of thousands, perhaps more than a million pages of White House documents. Some of this material is still classified; some involves personal records of the Nixon family; some is being withheld without explanation. Eventually everything will come out, assuring that Nixon will live on as the subject of new books with new revelations. None of this seems likely to be exculpatory.

1. The best book on the Nixon presidency, I think, is Richard Reeves’s President Nixon: Alone in the White House (Simon and Schuster, 2001). It was published days before the September 11 attacks and never got the attention it deserves. Another excellent book on these subjects is Robert Dallek’s Nixon and Kissinger: Partners in Power (HarperCollins, 2007). Elizabeth Drew, who writes often in these pages, has written two books on Nixon, Washington Journal: The Events of 1973–1974 (Random House, 1975) and Richard M. Nixon (Times Books, 2007).

2. This story is best told in Ken Hughes’s Chasing Shadows (University of Virginia Press, 2014). Thomas and Weiner provide brief accounts in their books. Whether Thieu’s boycott determined the outcome of the 1968 election is far from clear, but Nixon worried that it might have, and that his involvement might be revealed.

3. Nixon’s bold opening to China—certainly his biggest accomplishment as president—was in part a complicated effort to use the Sino–Soviet split to help him persuade Hanoi to end the war on satisfactory terms. This failed, too. Nixon’s China policy changed the course of history, but not of the Vietnam War.

4. The proposal for a large-scale air campaign against North Vietnam, including mining of the principal port of Haiphong, was revived in 1972, when Nixon and Kissinger tried to compel Hanoi to complete the peace agreement Kissinger had negotiated.

5. Dobrynin’s cables to Moscow reporting on his meetings with Nixon and Kissinger were published jointly by the State Department and the Russian Ministry of Foreign Affairs in 2007—a valuable source for researchers that Burr and Kimball exploit effectively.

6. Nixon avoids the question entirely in his long memoir RN (Grosset and Dunlap, 1978). In a later book, No More Vietnams (Arbor House, 1985), he described the decent interval option that others had proposed: “I believed that this was the most immoral option of all. As president, I could not ask any young American to risk his life for an unjust or unwinnable cause.” But that’s just what he did. Kissinger in his White House Years (Little, Brown, 1979) wrote that it was incorrect to suggest they sought only “a ‘decent interval’ before a final collapse of Saigon.” He did not quote from the meetings with Zhou and Soviet officials where he described Nixon’s objective of having an interval—described as a “reasonable interval,” according to Dobrynin.

7. Hughes and his colleagues at the University of Virginia have created a website where many of the tapes of the Nixon presidency can be heard. You can hear Haldeman dictating this diary entry here: prde.upress.virginia.edu/conversations/4006726.

8. In Sideshow, published in 1979, William Shawcross argued that the US bombing had led to the rise of the Khmer Rouge. In a recent letter to the editor of The New York Review, Shawcross wrote that the history of the Khmer Rouge conquest was more complex:

    But Sihanouk (with whom I later became friendly) also made huge mistakes. The most appalling, which he had told me he always regretted, was siding with China and the Khmer Rouge immediately after his overthrow in 1970. That did far more to guarantee the destruction of the country than the secret bombing.

9. See “Watergate Reporter: Nixon Is Still Tricky After All These Years,” The Washington Post, July 24, 2015.

10. For a lively and revealing recent account of Watergate, see John Dean’s The Nixon Defense (Viking, 2014).


How Shakespeare Lives Now

John Martin: Macbeth, circa 1820 (Scottish National Gallery, Edinburgh/Bridgeman Images)

Shakespeare’s death on April 23, 1616, went largely unremarked by all but a few of his immediate contemporaries. There was no global shudder when his mortal remains were laid to rest in Holy Trinity Church in Stratford. No one proposed that he be interred in Westminster Abbey near Chaucer or Spenser (where his fellow playwright Francis Beaumont was buried in the same year and where Ben Jonson would be buried some years later). No notice of Shakespeare’s passing was taken in the diplomatic correspondence of the time or in the newsletters that circulated on the Continent; no rush of Latin obsequies lamented the “vanishing of his breath,” as classical elegies would have it; no tributes were paid to his genius by his distinguished European contemporaries. Shakespeare’s passing was an entirely local English event, and even locally it seems scarcely to have been noted.

The death of the famous actor Richard Burbage in 1619 excited an immediate and far more widespread outburst of grief. England had clearly lost a great man. “He’s gone,” lamented at once an anonymous elegist,

and, with him, what a world are dead,
Which he revived, to be revivèd so
No more: young Hamlet, old Hieronimo,
Kind Lear, the grievèd Moor, and more beside
That lived in him have now for ever died.

William Herbert, Earl of Pembroke, was so stricken by the actor’s death that months later he could not bring himself to go to the playhouse “so soon after the loss of my acquaintance Burbage.” It was this death that was publicly marked by him and by his contemporaries, far more than the vanishing of the scribbler who had penned the words that Burbage had so memorably brought alive.

The elegy on Burbage suggests that for some and perhaps even most of Shakespeare’s contemporaries, the real “life” of the characters and their plays lay not in the texts but in the performances of those texts. The words on the page were dead letters until they were “revived” by the gifted actor. This belief should hardly surprise us, since it is the way most audiences currently respond to plays and, still more, to film.

There was also a social dimension specific to the age. A grand aristocrat like William Herbert could acknowledge his acquaintance with a celebrity actor like Burbage (though his father was a carpenter) far more readily than he could show a connection to a social nonentity—a bourgeois entrepreneur and playwright without Oxbridge honors or family distinction—like Shakespeare. A hidden connection may all the same have existed: William Herbert is one of the perennial candidates for the sonnets’ “Mr. W.H.” But it would not do to display it in public.

Though Shakespeare’s theatrical artistry gave pleasure, it was not the kind of pleasure that conferred cultural distinction on those who savored it. He was the supreme master of mass entertainment, as accessible to the unlettered groundlings standing in the pit as to the elite ensconced in their cushioned chairs. His plays mingled high and low in a carnivalesque violation of propriety. He was indifferent to the rules and hostile to attempts to patrol the boundaries of artistic taste. If his writing attained heights of exquisite delicacy, it also effortlessly swooped down to bawdy puns and popular ballads.

In Twelfth Night, one of those ballads, sung by a noisy, festive trio of drunkard, blockhead, and professional fool, enrages the censorious steward Malvolio. “Is there no respect of place, persons, nor time in you?” he asks indignantly, to which he gets a vulgar reply—“Sneck up!”—followed by a celebrated challenge: “Dost thou think because thou art virtuous there shall be no more cakes and ale?” Shakespearean cakes and ale may have been beloved by the crowds drawn to the Globe, but they were not fit fare for the champions of piety or decorum. The pleasure they offered was in indefinable ways subversive.

It was not until seven years after his death that Shakespeare’s comedies, histories, and tragedies were gathered together by his friends John Heminges and Henry Condell in an expensive edition, dedicated to William Herbert and his brother, that first laid claim to their status as high culture. And it was only then, in his commendatory poem to the volume, that Ben Jonson for the first time evoked a larger landscape in which to understand the significance of Shakespeare’s career, one that would make it appropriate for a nobleman to acknowledge a connection to a middle-class writer of popular plays. “And though thou hadst small Latin and less Greek,” Jonson wrote, “I would not seek/For names; but call forth thund’ring Aeschylus,/Euripides, and Sophocles to us.” These immortals could worthily bear witness to the greatness of Shakespeare as a tragic playwright; as for his comedies, Jonson added, these surpass everything “that insolent Greece or haughty Rome/sent forth.”

Jonson made Shakespeare into a global artist. Not in the sense that he imagined his work was or would ever become famous outside of England, but that he insisted it could bear comparison with the best that the world of letters had ever brought forth. Even if nothing in Shakespeare’s personal circumstances—his birthplace, parentage, education, affiliations, and the like—bore recording, he was nonetheless a national treasure. “Triumph, my Britain,” Jonson proclaimed, “thou hast one to show,/To whom all scenes of Europe homage owe.” To this proud boast he added the famous line: “He was not of an age, but for all time!”

The enduring and global success of Shakespeare’s work is due in part to his willingness to let go of it, a willingness perhaps conveyed by titles like As You Like It, Much Ado About Nothing, What You Will (the subtitle of Twelfth Night), and All’s Well That Ends Well. It is as if he were refusing to insist upon his own identity and proprietary claim. It goes without saying that Shakespeare was a genius who left his mark on everything he touched. But there is also a strange sense that his characters and plots seized upon him as much as he seized upon them.

Even at this distance in time, Shakespeare’s greatest contemporary playwrights, Christopher Marlowe and Ben Jonson, both seem directly and personally present in their work in a way that Shakespeare does not. In the case of Jonson—too eager to display his scholarly mastery over his source materials, too bound up with the drama of his own life, and too anxious to retain absolute control over his own finished work—that presence is explicitly avowed in a variety of prefaces, prologues, and authorial interventions, with the result that his work, though splendid, seems entirely of a particular time and place and author.

Shakespeare seems to have felt no comparable desire to make himself known or to cling tenaciously to what he had brought forth. The consequence is that it is not really necessary to know the details of Shakespeare’s life in order to love or understand his plays. This does not mean that Shakespeare was not present in every moment of his work. On the contrary, his vocation obliged him to use his personal experience, and his mastery of his medium meant that he managed to use an uncanny amount of it, mixing it with what he had read and observed and digested.

He was an expert—perhaps the greatest expert the world has ever known—in what the brilliant English anthropologist Alfred Gell called “distributed personhood.” Gell’s interest was exclusively in visual representations, paintings, sculptures, and the like. But the core of what he discovered in the analysis of Polynesian tattoos or Malangan carvings may be found as well in literature: the ability of an artist to fashion something—Gell called it an “index”—that carries agency, his own and that of others, into the world where it can act and be acted upon in turn.* A part of the personhood of the creator is detached from his body and survives after he or she has ceased physically to exist. Transformed often out of recognition, feared or attacked or reverenced, these redistributed parts live on, generating new experiences, triggering inferences, harming or rewarding those they encounter, arousing love.

Shakespeare created out of himself hundreds of secondary agents, his characters, some of whom seem even to float free of the particular narrative structures in which they perform their given roles and to take on an agency we ordinarily reserve for biological persons. As an artist he literally gave his life to them.

We speak of Shakespeare’s works as if they were stable reflections of his original intentions, but they continue to circulate precisely because they are so amenable to metamorphosis. They have left his world, passed into ours, and become part of us. And when we in turn have vanished, they will continue to exist, tinged perhaps in small ways by our own lives and fates, and will become part of others whom he could not have foreseen and whom we can barely imagine.

On April 23, 2014—the 450th anniversary of Shakespeare’s birth—a company of actors from London’s Globe (the modern reconstruction of the Elizabethan playhouse) embarked on a two-year tour with the ambition of performing Hamlet in every country of the world. The project makes vivid what has already been happening for a very long time. Shakespeare’s works have been translated, it is estimated, into more than a hundred languages. They have profoundly shaped national literary cultures not only in Great Britain and the United States but also in countries as diverse as Germany and Russia, Japan and India, Egypt and South Africa.

Shakespeare; drawing by David Levine

A few years ago, during a merciful remission in the bloodshed and mayhem that has for so many years afflicted Afghanistan, a young Afghan poet, Qais Akbar Omar, had an idea. It was, he brooded, not only lives and livelihood that had been ruthlessly attacked by the Taliban, it was also culture. The international emblem of that cultural assault was the dynamiting of the Bamiyan Buddhas, but the damage extended to painting, music, dance, fiction, film, and poetry. It extended as well to the subtle web of relations that link one culture to another across boundaries and make us, each in our provincial worlds, feel that we are part of a larger humanity. This web is not only a contemporary phenomenon, the result of modern technology; it is as old as culture itself, and it has been particularly dense and vital in Afghanistan with its ancient trade routes and its endless succession of would-be conquerors.

Omar thought that the time was ripe to mark the restoration of civil society and repair some of the cultural damage. He wanted to stage a play with both men and women actors performing in public in an old garden in Kabul. He chose a Shakespeare play. No doubt the choice had something to do with the old imperial presence of the British in Afghanistan, but it was not only this particular history that was at work. Shakespeare is the embodiment worldwide of a creative achievement that does not remain within narrow boundaries of the nation-state or lend itself to the secure possession of a particular faction or speak only for this or that chosen group. He is the antithesis of intolerant provinciality and fanaticism. He could make with effortless grace the leap from Stratford to Kabul, from English to Dari.

Omar did not wish to put on a tragedy; his country, he thought, had suffered through quite enough tragedy of its own. Considering possible comedies, he shied away from those that involved cross-dressing. It was risky enough simply to have men and women perform together on stage. In the end he chose Love’s Labour’s Lost, a comedy that arranged the sexes in distinct male and female groups, had relatively few openly transgressive or explicitly erotic moments, and decorously deferred the final consummation of desire into an unstaged future. As a poet, Omar was charmed by the play’s gorgeous language, language that he felt could be rendered successfully in Dari.

The complex story of the mounting of the play is told in semifictionalized form in a 2015 book Omar coauthored with Stephen Landrigan, A Night in the Emperor’s Garden. Measured by the excitement it generated, this production of Love’s Labour’s Lost was a great success. The overflow crowds on the opening night gave way to ever-larger crowds clamoring to get in, along with worldwide press coverage.

But the attention came at a high price. The Taliban took note of Shakespeare in Kabul and what it signified. In the wake of the production, virtually everyone involved in it began to receive menacing messages. Spouses, children, and the extended families of the actors were not exempt from harassment and warnings. The threats were not idle. The husband of one of the performers answered a loud knock on the door one night and did not return. His mutilated body was found the next morning.

What had seemed like a vigorous cultural renaissance in Afghanistan quickly faded and died. In the wake of the resurgence of the Taliban, Qais Akbar Omar and all the others who had had the temerity to mount Shakespeare’s delicious comedy of love were in terrible trouble. They are now, every one of them, in exile in different parts of the world.

Love’s labors lost indeed. But the subtitle of Omar’s account—“A True Story of Hope and Resilience in Afghanistan”—is not or at least not only ironic. The humane, inexhaustible imaginative enterprise that Shakespeare launched more than four hundred years ago in one small corner of the world is more powerful than all the oppressive forces that can be gathered against it. Feste the clown at the end of Twelfth Night sings a farewell ditty:

A great while ago the world begun,
With hey, ho, the wind and the rain,
But that’s all one, our play is done.

For a split second it sounds like it is all over, and then the song continues: “And we’ll strive to please you every day.” The enemies of pleasure beware.

* Art and Agency: An Anthropological Theory (Oxford University Press, 1998).


An Amazon Without Certainty

Antonio Bolivar as Karamakate and Jan Bijvoet as Theo in Ciro Guerra’s Embrace of the Serpent, 2015 (Oscilloscope Laboratories)

It’s a story as old as Alexander von Humboldt: white explorer treks into the Amazon, becomes lost and disoriented, paints face with mud, eats beetles, and has visions of galaxies and exotic reptiles, before finally achieving enlightenment—or total madness. Werner Herzog’s Aguirre, the Wrath of God and Fitzcarraldo are the archetypes of the genre, which also includes Luis Buñuel’s Death in the Garden, John Boorman’s The Emerald Forest, and Roland Joffé’s The Mission. There was a boomlet of jungle quest films during the Eighties and early Nineties, not all of them set in the Amazon, reflecting a dissatisfaction with what Jimmy Carter called the “moral and…spiritual crisis” of modern society. The heroes of these films come to the jungle with predatory or utopian intentions, only to discover the folly of their ways. The plot tends to resolve with the explosion of a forest-clearing project: a river dam in The Emerald Forest, a logging road in Medicine Man, a missionary camp in The Mosquito Coast. “In the end, Robinson Crusoe went back home!” says the hero of Paul Theroux’s novel, on which the latter was based. “But we’re staying.”

Ciro Guerra’s Embrace of the Serpent, a finalist for best foreign film at this year’s Oscars, features the familiar fever-addled explorers, close-ups of vigilant jaguars and snakes baring their fangs, long pans of the jungle canopy, and indigenous tribesmen imparting portentous wisdom (“The jungle is fragile; if you attack her, she’ll fight back”). But the film is strange enough to resist the worst of the old clichés, which is to say it resists moral certainty. Guerra, a young Colombian director, shoots in a muted black and white (apart from a stunning montage in Technicolor during a climactic hallucinatory sequence), employs a counterintuitive narrative structure, and, most significant, empathizes not with the two white explorers whose misadventures drive the plot but with their indigenous guide, a profoundly conflicted figure who bears no resemblance to the customary noble or murderous savages.

The plot is based loosely on the stories of two scholars who traveled to the Amazon decades apart. The German anthropologist Theodor Koch-Grünberg (1872–1924) made several expeditions in the early twentieth century, and shot documentary footage that appears to have inspired Guerra’s sets and costume design. The Harvard biologist Richard Evans Schultes (1915–2001) lived in the South American rainforests in the 1940s and 1950s, becoming an expert on hallucinogenic plants and natural rubber. Guerra unites the two men through Karamakate, a shaman who is the last surviving member of a tribe destroyed by rapacious Colombian rubber plantation barons. In the Koch-Grünberg scenes, set in 1909, a skeptical young Karamakate (played by Nilbio Torres, a hunter who lives on the Vaupés River) agrees to help the feverishly ill anthropologist find the fictional Yakruna flower, the only known cure for his disease. (Koch-Grünberg did, in fact, die on one of his expeditions, but of malaria, in 1924.) In the Schultes scenes, set three decades later, the biologist begs the now hermetic, senile Karamakate (Antonio Bolivar, one of the last fifty remaining members of the Ocaína tribe) to help him find the elusive Yakruna for an equally selfish, but more cynical, reason. Guerra alternates between the two stories reluctantly, allowing each to expand and respire. The film’s pace is never slow but patient, lingering, accretionary, dilating with the logic of a nightmare.

We see the two white explorers from Karamakate’s perspective. Both are gaunt, physically frail, obsessive, covetous of their few personal possessions and keepsakes. They are multilingual, immersed in the cultures of the local tribes, but remain aloof. They carry mementos of their old lives like talismans to ward against the jungle’s transformative powers. Karamakate does not trust either of them, with good reason. Among the lost-in-the-jungle films, Guerra’s is distinguished by his decision to begin the story after the initial encounter between civilized and primitive man. When Embrace of the Serpent begins, the rubber industry has already eradicated Karamakate’s tribe and brutalized the landscape. He has seen other tribes become weak, alcoholic, and debased in servitude. He agrees to help the two scientists not out of naiveté but in order to prevent further cataclysm. “If we can’t get the whites to learn, it will be the end of us,” says another native. “The end of everything.” Karamakate suspects it is already too late but he hopes he’s wrong. He is, in this way, a strikingly contemporary figure.

His belligerence and bravado mask a violent ambivalence. “Why do you whites love your things so much?” he asks as Theo (Jan Bijvoet), the Koch-Grünberg character, hauls his bulging suitcases over a steep portage. When Evan (Brionne Davis) consults his map, the older Karamakate ridicules him: “Listen to all of reality,” he says, “not just your map.” But a different aspect of Karamakate is revealed in one of the film’s most surprising episodes. After a convivial night spent with a friendly tribe on the bank of the river, Theo discovers that his compass is missing. He becomes furious at the chieftain, demanding its return. It is not simply a question of theft, he explains to Karamakate. If the tribe acquires Western navigational skills, it will lose the folk wisdom that has served it for generations. Karamakate responds with fury. “You can’t deny them knowledge,” shouts Karamakate, shocking Theo and, perhaps, himself.

As an old man, Karamakate has lost his own compass. When Evan first encounters him, Karamakate cannot understand the ideograms he himself has written on a rock wall. “The line is broken,” he says. “Now, I’m empty.” The expedition on which he embarks with Evan, retracing the expedition of decades earlier, helps him to recall his identity. But as his memory returns, he comes to conclude that it is better left erased. Guerra dedicates the film to all the lost Amazonian cultures “we’ll never know,” though he suggests that some horrors are best forgotten.

The film is punctuated by brief reprieves from a steadily encroaching madness. When we meet Theo, he is addled by fever, supine in the bow of a canoe. Although only the Yakruna flower can cure him, Karamakate offers in the meantime a restorative: a white powder that the shaman blows forcefully through a horn up Theo’s nostrils. It is a startlingly intimate act, performed each morning, a ritual that binds the two strangers. The powder’s effects are temporary, however. When the illness returns, the shaman explains, it will be even more deadly—and it is.

Embrace of the Serpent works in the same way. It is a glorious excursion, full of beauty and wonder, but it leaves its viewers more unsettled than before. In the most haunting scene Theo and the young Karamakate come upon an abandoned rubber plantation. While the buildings crumble and burn, the few surviving natives sit in a dark shack getting drunk. The liquor, it turns out, is distilled from the Yakruna, the rare, priceless flower now squandered for a fleeting drunk. “You bring hell and death to earth!” says Karamakate, horrified. The men shrug. They offer him a drink. “We’re toasting,” they say, “to the end of the world.” Aren’t we all?

Ciro Guerra’s Embrace of the Serpent is now playing in select theaters.


Crackdown in China: Worse and Worse

Chinese Communist Party General Secretary Xi Jinping, right, with Wang Qishan, who has been a major force in the recent crackdown as secretary of the Central Commission for Discipline Inspection (CCDI), at the National People’s Congress, Beijing, March 2015 (China Daily/Reuters)

“As a liberal, I no longer feel I have a future in China,” a prominent Chinese think tank head in the process of moving abroad recently lamented in private. Such refrains are all too familiar these days as educated Chinese professionals express growing alarm over their country’s future. Indeed, not since the 1970s, when Mao still reigned and the Cultural Revolution still raged, has the Chinese leadership been so possessed by Maoist nostalgia and Leninist-style discipline.

As different leaders have come and gone, China specialists overseas have become accustomed to reading Chinese Communist Party (CCP) tea leaves as oscillating cycles of political “relaxation” and “tightening.” China has long been a one-party Leninist state with extensive censorship and perhaps the largest secret police establishment in the world. But what has been happening lately in Beijing under the leadership of Chinese Communist Party General Secretary Xi Jinping is no such simple fluctuation. It is a fundamental shift in ideological and organizational direction that is beginning to influence both China’s reform agenda and its foreign relations.

At the center of this retrograde trend is Xi’s enormously ambitious initiative to purge the Chinese Communist Party of what he calls “tigers and flies,” namely corrupt officials and businessmen both high and low. Since it began in 2012, the campaign has already netted more than 160 “tigers” whose rank is above or equivalent to that of the deputy provincial or deputy ministerial level, and more than 1,400 “flies,” all lower-level officials.1 But it has also morphed from an anticorruption drive into a broader neo-Maoist-style mass purge aimed at political rivals and others with differing ideological or political views.

To carry out this mass movement, the Party has mobilized its unique and extensive network of surveillance, security, and secret police in ways that have affected many areas of Chinese life. Media organizations dealing with news and information have been hit particularly hard. Pressured to conform to old Maoist models requiring them to serve as megaphones for the Party, editors and reporters have found themselves increasingly constrained by Central Propaganda Department diktats. Told what they can and cannot cover, they find that the limited freedom they had to report on events has been drastically curtailed.

The consequences of running afoul of government orders have become ever more grave. Last August, for instance, a financial journalist for the weekly business magazine Caijing was detained after reporting on government manipulation of China’s stock markets and forced to denounce his own coverage in a humiliating self-confession on China Central Television (CCTV). And more recently media outlets were reminded in the most explicit way not to stray from the Party line when Xi himself dropped by the New China News Agency, the People’s Daily, and CCTV.

“All news media run by the Party [which includes every major media outlet in China] must work to speak for the Party’s will and its propositions, and protect the Party’s authority and unity,” Xi warned. In front of a banner declaring “CCTV’s family name is ‘the Party,’” Xi urged people who work in the media to “enhance their awareness to align their ideology, political thinking, and deeds to those of the CCP Central Committee.” Then, only days later, the Ministry of Industry and Information Technology announced new regulations banning all foreign-invested media companies from publishing online in China without government approval.

But the crackdown has hardly been limited to the media. Hundreds of crosses have been ripped from the steeples of Christian churches, entire churches have been demolished, pastors arrested, and their defense lawyers detained and forced to make public confessions. And even as civil society has grown over the past few decades, a constraining new civil society law is now being drafted that promises to put NGOs on notice against collaborating with foreign counterparts or challenging the government.

At the same time, independent-minded researchers at think tanks and outspoken professors at universities worry about the “chilling effect” of Xi’s policies on academic life in both China and Hong Kong. Feminist activists demonstrating against sexual harassment have been arrested for “picking quarrels and provoking trouble,” while human rights lawyers have been swept up in a mass wave of arrests for “creating public disorder,” and even for “subverting state power.”

But what has been perhaps most unexpected about this trend is the way that Beijing has begun to extend its claim to control people and organizations beyond its borders. Despite its stubborn defense of the sanctity of sovereignty, its agents have begun reaching overseas to manipulate the foreign dialogue by setting up hundreds of Confucius Institutes, newspapers, magazines, and even TV networks that answer to the Central Propaganda Department and the CCP.

The Chinese government is also denying visas to “unfriendly” (buyouhao) foreign journalists and scholars; blocking foreign websites with which it disagrees; demanding that public figures like the Dalai Lama, Hong Kong activists, or Chinese dissidents be refused foreign platforms; threatening the advertising bases of overseas media outlets that challenge its positions; and now even abducting foreign nationals abroad and “renditioning” them back to China where it forces them into making televised confessions. It is hardly surprising that Chinese have started whispering about a new “climate of fear” (kongbude qifen), what Eva Pils of King’s College London School of Law calls “rule by fear.”

What is most striking about these new tactics is their boldness and unrepentant tone. Instead of denying or apologizing for them, the CCP seems to proudly proclaim them as part of a new Chinese model of development, albeit one that has no use for liberal values from the West. In the new world of resurgent Chinese wealth and power, what is valued is strong leadership, short-term stability, and immediate economic growth.

Sitting at the very epicenter of this new nationwide campaign to more tightly control and rejuvenate China through a combination of more muscular leadership, regimented thought, and deeper loyalty to Xi is the Central Commission for Discipline Inspection (CCDI). Long one of the Party’s most powerful, secretive, and feared internal organs, the CCDI is dedicated to “maintaining Party discipline.” But when Xi came to power and appointed Vice-Premier and Politburo Standing Committee member Wang Qishan as its secretary, he also charged it with launching an unprecedented new anticorruption campaign.

Wang is the “princeling” son-in-law of former Vice-Premier Yao Yilin. The son of a university professor and himself a student of history, he has headed up the China Construction Bank and also creatively handled China’s financial and commercial affairs under Hu Jintao, when he worked closely with US Secretary of the Treasury Henry Paulson to guide the early years of the Strategic and Economic Dialogue between the US and China. That period is looked back on as a particularly constructive one in US–China relations. Why Wang gave up this portfolio to become an anonymous grand inquisitor is unknown, but his friendship with Xi, formed when both were “sent down” (xiafang) as youths to the same dirt-poor region of Shaanxi province in the early 1970s, may help explain his willingness.

According to Li Ling of the University of Vienna, who has written about the CCDI, “the party disciplinary system was and remains primarily a means for consolidating the authority of the Party Central Committee and preserving party unity.”2 But since Wang took over in 2012, its already significant network of twelve branch offices, along with the Central Commission itself, has expanded the number of investigations from twenty in 2013 to more than a hundred in 2016, making the CCDI one of the most important organs in Xi’s effort to bolster China’s one-party system. Its work is considered so important that it is even allowed to hire and fire outside the Organization Department, the centralized clearinghouse that controls other high-level appointments.

As an old-style Leninist party in a modern world, the CCP is confronted by two major challenges: first, how to maintain “ideological discipline” among its almost 89 million members in a globalized world awash with money, international travel, electronically transmitted information, and heretical ideas; and second, how to cleanse itself of its chronic corruption, a blight that Xi himself has described as “a matter of life and death.”

Xi Jinping, center, meeting with representatives and descendants of revolutionary martyrs in Nanchang, Jiangxi province, during a trip over the Lunar New Year that also included a pilgrimage to Jinggangshan, Mao’s first revolutionary base, February 2016 (Lan Hongguang/Xinhua/Eyevine/Redux)

The primary reason the Party is so susceptible to graft is that while officials are poorly paid, they do control valuable national assets. So, for example, when property development deals come together involving real estate (all land belongs to the government) and banking (all the major banks also belong to the government), officials vetting the deals find themselves in tempting positions to supplement their paltry salaries by accepting bribes or covertly raking off a percentage of the action. Since success without corruption in China is almost a contradiction in terms, officials and businessmen (and heads of state-owned enterprises are both) are all easily touched by what Chinese call “original sin” (yuanzui), namely, some acquaintance with corruption.

Although secret investigations, censorship, and political trials are nothing new in China, what is unique about the CCDI’s part in Xi’s anticorruption campaign is its explicitly extrajudicial status. The investigations it launches take clear precedence over the judicial processes that police, lawyers, and judges would normally carry out in democratic societies. The CCDI is unencumbered by any such legal niceties, except when show trials are needed at the very end of a case so that a formal sentence for, say, corruption, can seem to have been delivered “according to law,” a phrase the CCP tirelessly uses as if incantation alone could make it true. But by then, of course, “guilt” has long since been established and all that is usually needed is a little legal theater to give the CCDI’s investigation an air of legitimacy.

Besides investigating corruption and violations of “Party discipline,” the CCDI has one other, more nebulous charge: to “achieve an intimidating effect” on wrongdoing, as its website described it in 2014. In other words, by “killing a few chickens to frighten the monkeys” (xiaji jinghou), as the ancient adage puts it, it hopes to discourage other potential malefactors. The commission has even launched a new website and smartphone app that allow whistle-blowers to upload incriminating photographs and videos of officials caught violating new sumptuary rules or even in flagrante delicto.

As if the CCDI’s own investigative arm, the Discipline Inspection Supervision Office (Jijian jianchashi), were not up to the ambition of Xi’s purge, the Party has now also breathed new life into a second organ, the Central Inspection Patrolling Group (CIPG, xunshizu). It was originally set up in 2003 to investigate “leading cadres” whom the CCDI may have shielded owing to its own nepotism and cronyism. With each of its teams headed by a retired ministry-level official and reporting to the Central Committee’s new “Central Leadership Inspection Work Leading Group,” the CIPG has grown quickly into an important and feared investigatory unit within China’s already extensive security apparatus. Although, like the CCDI, it technically reports directly to the Party Central Committee, its day-to-day activities are under the command of Wang Qishan, making him the capo di tutti capi of China’s secretive investigations units.

When a “tiger or fly” comes under suspicion by either investigative branch, the suspect can be detained for what is called “double designation” (shuanggui), meaning that he or she must report for investigation at a designated time and place, answerable only to the CCDI. Detainees are kept in isolation, often under an around-the-clock suicide watch by multiple “accompanying protectors”; there are only murky limitations on the length of time a suspect can be held and no provisions for habeas corpus, legal counsel, or appeal. The object of shuanggui, according to the scholar Li Ling, “is to destroy the detainees’ psychological defense system so that he or she will ‘start to talk.’” Although some reform measures have recently been taken, in the past forced confession and physical abuse, even torture and death, have not been uncommon. Because any investigation comes with strong presumptions of guilt, shuanggui is usually as much a verdict as the start of an evidentiary process. Needless to say, few things strike more terror in the hearts of officials than news that they, or their “work unit” (danwei), are on the CCDI’s hit list.

“The CCDI’s anticorruption campaign is chillingly evocative of the draconian repressions launched by the Eastern Depot during the Ming dynasty,” one historically minded corporate consultant told me. She was referring to a period in imperial history that represented a high tide of Chinese despotism. As most Chinese know from histories, popular novels, and TV dramas, the Ming dynasty was characterized by factionalism, intrigue, paranoia, intimidation, fratricide, and extrajudicial ruthlessness. Trusting no one and fearing treason everywhere, the Yongle Emperor (reigned 1402–1424) sought to protect the throne with an elaborate network of internal surveillance and espionage.

When, like Xi Jinping, the Ming emperor decided that his existing security apparatus, the so-called “Embroidered Guard” (jinyiwei), was inadequate to the task of protecting his reign against subversion, he set up the infamous “Eastern Depot” (dongchang) and put it under the leadership of loyal palace eunuchs. Here secret files were maintained on all officials, just as today’s “dossier” (dangan) system keeps files on contemporary Chinese. With its epic history of forced confessions, torture, and grisly assassinations, this Ming dynasty security apparatus became a “diabolical force behind the throne,” writes historian Shih-shan Henry Tsai, “a monstrous secret police apparatus” whose “power grew like a giant octopus, extending to every corner of the empire.”

However, so rife with paranoia was the Ming court that later emperors came to distrust even the “Eastern Depot” and so set up the “Western Depot” (xichang) as well, yet another security organ outside of regular bureaucratic channels. The proliferation of security organizations under Xi Jinping today is hauntingly suggestive of this Ming precursor.

Moving away from the “consensus-style leadership” that came to distinguish Chinese politics after Mao’s repressive rule, Xi Jinping has not only recentralized power but, just as Ming emperors abolished the position of prime minister, has marginalized the position of the modern-day premier. Instead, he has set up a series of new “leading small groups” (lingdao xiaozu) and made himself head of the most important ones (covering such fields as military reorganization, cyberhacking, economic reform, maritime rights, etc.). More than primus inter pares, Xi has become what Party propaganda organs now grandly tout as the “core” (hexin) of the Party. As a well-known Chinese cultural figure recently complained in private, “Our leadership now has an indelibly ‘dictatorial personality’ (ducaide xingge).”

As popular as Xi’s battle against corruption has been among ordinary people (a 2014 Harvard study showed him to have the highest approval rating of any world leader), it has had an undeniably chilling effect on anyone hoping to speak truthfully to power. And with its evolution from an anticorruption drive into a far broader purge of political and ideological rivals, many fear that China is now regressing into a period of neo-Maoism.

Such fears were only reinforced when, over the Lunar New Year holiday, Xi made a televised pilgrimage to Jinggangshan, where Mao had set up his first revolutionary base in 1927. Here Xi was seen paternalistically “at one with the masses,” sharing a meal with peasants in front of a reverential poster of Chairman Mao. The trip has generated a great many photographs, news clips, fawning pop tunes, and videos, all extolling the benevolence of “Uncle Xi” (Xi dada).

Then in late February, he ordered a yearlong socialist education campaign, especially designed for those comrades who might be experiencing “wavering confidence in communism.” He particularly recommended careful study of Mao’s 1949 essay “Methods of Work of Party Committees.”

The notion that the “Mao Zedong Thought” that had dominated the Cultural Revolution would ever make a comeback in China had long seemed as unlikely as it was unwelcome. But now that China is sliding ineluctably backward into a political climate more reminiscent of Mao Zedong in the 1970s than Deng Xiaoping in the 1980s, more and more educated Chinese are making allusions to such frightening periods of Chinese history as the Cultural Revolution and the Ming dynasty. And more and more of them are also seeking to financially anchor themselves abroad by finding ways to park assets outside their country, making it hardly surprising that China has been hemorrhaging foreign currency, with $1 trillion said to have fled the country last year alone.

When in 1978 the twice-purged Deng returned to power to lay out an ambitious reform agenda that allowed post-Mao China to enjoy greater liberalization in both its economic and political life, there was great relief. And during the relatively tolerant decade that followed, prior to the Tiananmen Square crackdown in 1989, it was possible to imagine that with the passage of time China would become not only more market-oriented, politically open, and committed to the rule of law, but also more engaged with the world. Such optimism was only reinforced by notions such as China’s “peaceful rise,” propounded later under Hu Jintao.

However, since Xi Jinping’s investiture such roseate hopes of a China slowly evolving away from its Leninist past have become increasingly remote. Indeed, in recent weeks, just as China’s annual “Two Meetings” (the National People’s Congress and the People’s Political Consultative Congress) were being held in Beijing, Xi’s efforts to command greater Party discipline and to censor the media began to provoke surprising levels of popular protest, including a flurry of unprecedented public challenges to both his policies and authority posted on the Internet. For example, an open letter by New China News Agency reporter Zhou Fang criticized censors for their “crude” and “extreme” violations of online freedom of expression. “Under the crude rule of the Internet control authorities,” Zhou wrote, “online expression has been massively suppressed and the public’s freedom of expression has been violated to an extreme degree.”

Zhou’s letter spread like wildfire online before being taken down by censors. Another open letter appeared on the government-linked news site “Watching” (Wujie). Signed by an anonymous group calling themselves “loyal Communist Party members,” it not only accused Xi of launching “a cult of personality” but publicly urged him to step down from office. “You do not possess the capabilities to lead the Party and the nation into the future,” it declared.

Xi’s authoritarian style of leadership at home and belligerent posture abroad are ominous because they make China’s chances of successfully reforming its own economy, on which the entire world now depends, increasingly slim. At the same time, because they seem bound to make the Party more dependent on nationalism and xenophobia, Xi’s policies also seem destined to prevent Beijing from recasting its inflamed relations with its neighbors around the South and East China seas. Finally, because such policies also grow out of a deeply paranoid view of the democratic world, they make it extremely difficult for China to cooperate effectively with countries like the US on crucial areas of common interest such as antiterrorism, climate change, pandemics, and nuclear proliferation.

Whatever may come, China is undergoing a retrograde change that will require every person, business, and country dealing with it to radically reassess China’s willingness to seek convergence with the rest of the world.
