
Why Britain Needs Its Own Mueller

Glyn Kirk/AFP/Getty Images
Nigel Farage speaking at a press conference the day after the Brexit referendum, flanked by Arron Banks (left) and Andy Wigmore (right), Westminster, London, June 24, 2016

At the end of January 2017, days after Donald Trump’s inauguration, I sat in a busy Pret a Manger sandwich bar in central London, a stone’s throw from the mother of parliaments, and flicked through snapshots of Donald Trump on a mobile phone.

The phone belonged to Andy Wigmore, an associate of Nigel Farage’s, the long-time leader of Britain’s insurgent anti-Europe campaign and latterly a friend and supporter of the man he refers to on his frequent appearances on Fox TV as “The Donald.” Wigmore, a businessman who has a sideline as a trade envoy to Belize, a Central American country known, among other things, for its sugar cane and money-laundering, had taken a photo of Farage and Trump standing in front of Trump’s golden elevator a month earlier. The photo went viral almost instantly.

This was Trump’s first meeting with a foreign politician, the man he called “Mr. Brexit,” and Wigmore was there for the ride alongside his business partner, a previously unremarkable insurance entrepreneur from Bristol in the west of England named Arron Banks. In the run-up to the 2016 Brexit referendum, Banks had given upward of £8 million to Nigel Farage’s successful Leave.EU campaign, an act that overnight had made him Britain’s biggest ever political donor.

Collectively, Farage, Banks, and Wigmore refer to themselves as “the Bad Boys of Brexit,” the title of Arron Banks’s memoir and a nod to Britain’s habit of celebrating the buffoonish provocateur (see also, Boris Johnson). The book makes clear that Nigel Farage, Arron Banks, and Andy Wigmore were the clueless outsiders who somehow triumphed over both the establishment and the odds to take Britain outside the European Union.

As French and Japanese tourists ebbed and flowed around us, Wigmore swiped through the photos. There was The Donald in his suite at Trump Tower. There was Raheem Kassam, a polemicist who had bounced between stints working for Farage and editing the British outpost of Breitbart News and who has now graduated to a position as handmaid-in-chief to Trump’s former chief strategist Steve Bannon. Another image was of Kellyanne Conway—“a very old and dear friend,” according to Wigmore. And then he sat back and told me about all the clever things they’d done with data during the Brexit campaign, and how it was a business named Cambridge Analytica that taught them how.

It was because of Cambridge Analytica that I’d asked to meet Wigmore, though I knew very little about the company at that stage, or indeed him. Neither he nor Banks was widely known then—though, nearly two years on, they’ve gone on to achieve something like notoriety. At the time, Wigmore was a friendly, convivial figure who’d said he’d be happy to talk to me about how Leave.EU had leveraged technology in revolutionary ways.

My interest was accidental. I had mentioned Cambridge Analytica’s work for both Trump and the Leave campaign in an article about Google in December 2016. And I’d been increasingly baffled by a series of letters from the firm vociferously claiming it had done no such work, despite the ample evidence for it all across the Internet. Why the denials? It made no sense. I’d asked Wigmore if he would meet. And he was happy to set the record straight. 

“Cambridge Analytica did work for us, yes,” he said. “We just didn’t pay them. They were happy to help.” Help? They had “the same goals. We were part of the same family.” Farage and Bannon—a vice president at the firm—were close, he explained. Why wouldn’t they help?

In 2014, Steve Bannon set up Breitbart News in London with the specific intention of helping and supporting Farage’s campaign to take Britain out of the EU. The money came from Robert Mercer, the hedge-fund billionaire who would go on to become the single biggest donor to the Trump campaign. And Cambridge Analytica was another star in their firmament. Of course, they would help. Brexit, Wigmore explained, was the “petri dish” for Trump.

Fast forward twenty-one months and the story that was set in motion that day keeps spinning on. It was Wigmore’s words that led me to hunt down Christopher Wylie, the Cambridge Analytica employee-turned-whistleblower with whom I worked for a year to get on record for The Observer in the UK and The New York Times in the US. Cambridge Analytica is no more. Since then, Mark Zuckerberg has been dragged before Congress to account for Facebook’s actions. And in Britain, over the course of two years, the story has spawned a laundry list of official inquiries and investigations: into illegal use of data, into illegal electoral spending, into the source of Banks’s donation, into illegal campaign co-ordination—investigations whose final results we are unlikely to know until after Britain has exited the European Union at the end of March 2019.

But the most vital questions have not yet even started to be answered. What is Nigel Farage’s relationship to Donald Trump? How might that connect to Russian interference in Anglo-American politics and elections? And, crucially, why is the British government silent on these matters? Why has it refused to answer parliamentary questions on these issues? Why is it ignoring senior politicians’ calls for a wider public inquiry?

Britain and America, Brexit and Trump, are inextricably entwined. By Nigel Farage. By Cambridge Analytica. By Steve Bannon. By the Russian ambassador to London, Alexander Yakovenko, who has been identified by Special Counsel Robert Mueller as a conduit between the Trump campaign and the Kremlin. The same questions that dog the US election dog ours, too.

There is one vital difference on this between the US and the UK. America has the Mueller investigation. And Britain does not. With only months to go before Britain exits the European Union, there’s a rotten stench coming off this story. As Mueller painstakingly investigates Russian involvement in Trump’s campaign, compiling detail after detail about Russia’s deployment of information warfare and subversion of social media platforms—the same social media platforms at the center of British lives—this remains something apart from Britain’s Brexit trauma.

At the time of writing, our government is in meltdown. The Brexit deal is a national emergency. It takes all the heat and light. Our political journalists have, for months, reported every spit and cough, oblivious to what is happening in the world beyond and how it connects to Britain.

Only once, in November 2017, has Theresa May made a speech that accused Russia of trying to interfere in British democracy. Otherwise, the government has maintained silence. In this, it has been aided and abetted by Facebook.


Arron Banks is married to, now separated from, a Russian woman named Katya. Around 2000, Katya (née Paderina) was close to an MP named Mike Hancock, who reportedly helped resolve her immigration status and find an apartment after her first marriage broke up. A member of Parliament’s defense select committee and chair of a parliamentary Russia group, Hancock was warned by Britain’s security service MI5 that he was a target for Russian intelligence operatives. In a nugget of information that came to light during an immigration tribunal in 2011, the Home Office had attempted to revoke the permission to stay in Britain of a young Russian woman, Katia Zatuliveter, who’d had a four-year affair with Hancock and had been identified by the intelligence services as a Russian agent and national security threat.

It is a testimony to the British way of doing things that due process was followed so impeccably. The Home Office made a case that Zatuliveter, then twenty-six, who’d previously had a relationship with a senior NATO official, had set out to seduce Hancock, sixty-five. Oleg Gordievsky, a former KGB colonel, described her to one newspaper as the “most useful KGB agent for thirty years,” but a judge found the relationship to be “enduring and genuine.” In his ruling, Justice Mitting wrote: “We cannot exclude the possibility that we have been gulled—but if we have been, it has been by a supremely competent and rigorously trained operative.”

In a long interview I conducted with Arron Banks back in March 2017, it was he who brought the subject up, joking about Katya being “a spy.” Did you know about her past, I asked him. He shook his head. “First I knew was a front-page story in the Mail!” he said. It was during the Zatuliveter trial, before Banks became a figure on the national stage; a business associate had rung him up, he told me, and said, “You’d better sit down.”

After the Zatuliveter business, Banks bought Katya a personalized plate for her car, X MI5 SPY, a classic Banksian ploy. He is a jovial figure, with a fondness for a drink, and, like Nigel Farage, trades on an image of himself as what Britons call a “good bloke.”

It was at the same interview that Banks told me about his “boozy” lunch with the Russian ambassador in the months before the referendum. In The Bad Boys of Brexit, he noted that it was a rip-roaring affair in which they put away bottle after bottle, first of vodka and then of brandy. In our interview, it was two lunches, not one. And then he added, a minute later, “Not a single penny of Russian money went into Brexit.” In the resulting piece, I noted this was “a perfectly reasonable answer, if he had been asked if Russia had put money into Brexit.” But he hadn’t been. He asked and answered his own question.

There was something else he said to me. Banks’s business interests, besides insurance, included diamond mines in Lesotho and South Africa, as well as a jewelry shop in Bristol, and again, this was something Banks brought up, unprompted. Lefties are always “triggered” by diamonds, he claimed. This is from my transcript of the interview:

CC: Well, the argument is that diamond mines are the perfect vehicle for money-laundering because you own the entire flipping supply chain! So you own it from mine to shop, and you just throw some extra diamonds in.
AB: But that’s pure speculation.
CC: Yeah, it is pure speculation.
AB: (Laughs) You haven’t got a clue if that’s true or not.

It’s true. I didn’t. And still don’t. And Banks denies any allegations. All his wealth, he says, was generated in Britain, and it was his company, Rock Services, that made the Leave.EU donation. 

For two years, a tiny group of journalists has pursued the trail of crumbs over the financing of the Leave.EU campaign. The question of where the money came from that paid for Brexit did not add up. The independent website Open Democracy has plugged away at Banks’s business interests. Cynthia O’Murchu on the Financial Times has examined his insurance company. A former Metropolitan Police officer turned crowdfunded journalist, James Patrick, and Wendy Siegelman, a citizen journalist in the US, have tried to untangle the Lesotho connection. 

And I wrote articles and columns, and sparred with Banks on Twitter, often late at night. About the number of lunches he’d had with the Russian ambassador. About his diamonds. About why, if he’s so rich, he lets out the manor house in which he claims to reside for weddings and lives in a cottage down the road. About why he and Nigel Farage had gotten involved with “Calexit,” a plan for California to secede from the union. About Farage’s relationship with Dana Rohrabacher, the representative from California who, until he just lost re-election, was known as “Putin’s favorite congressman.” About what Farage was doing when he visited Julian Assange in Ecuador’s London embassy in early 2017.

In November 2017, the story took a darker turn. The Russian embassy wrote to call me a “bad journalist,” and Banks’s and Wigmore’s tone changed from laddish banter to a coarse threat of violence. They posted a spoof video that had my face Photoshopped into a scene from the film Airplane. A queue of people lined up to belt me over the head. The last passenger had a gun.

The new hostility was perhaps not only the result of our reporting. The timing may have been coincidental, but something else had also changed: the first indictments from Robert Mueller’s prosecutors revealed a web of ties that ran through London, including the identification of the Russian ambassador as a contact between the Trump campaign and Moscow. It was via the London connection that Mueller claimed his first scalp, thanks to George Papadopoulos’s meeting with an Australian diplomat in a Kensington wine bar.

London: the city that Bill Browder, the Russian businessman who has been pursuing a global Magnitsky Act, says is irredeemably polluted by Russian money. Among London’s Russian “residents” is the sanctioned Russian businessman Oleg Deripaska—just one of a whole class of oligarchs who’ve mixed with British politicians and donated to British politics. Deripaska is now in Robert Mueller’s sights as an associate of Paul Manafort, the Special Counsel’s leading conviction so far; just last week, Deripaska was revealed to have been shuttling Konstantin Kilimnik, a suspected Russian intelligence agent also charged by Mueller, around the world in his jet.

It was in London, too, that the voter data of some 230 million US citizens was processed by Cambridge Analytica. In March, its servers were seized in a high-profile raid by the Information Commissioner’s Office, which is now co-operating with the FBI. And it is in London that “Organisation 1” is based—the name the Mueller investigation has apparently given WikiLeaks in its explanation of the Russian military intelligence operation to subvert the US presidential election.

We know that the release of the Clinton campaign’s emails was a defining moment in changing the course of the election. We know Mueller is following the data trail for evidence of coordination between the Trump campaign and Russian disinformation. And we know he’s circling Assange’s contacts: Roger Stone, Jerome Corsi, and Ted Malloch, another friend and associate of Nigel Farage, are all pieces of the puzzle. We know that American intelligence must be working with British intelligence on this. On the movements in and out of the Ecuadoran embassy. On the movements in and out of the Russian embassy, as detailed in the first indictment. We know that Mueller has talked, repeatedly, to Bannon. We know that he is asking questions about Farage.

Yet the implications for Britain of Mueller’s indictments have barely been reported, let alone understood, in the UK. Theresa May must have been briefed on all this. She knows. She’s just not telling.

Daniel Leal-Olivas/AFP/Getty Images
Wigmore and Banks arriving to give evidence to a parliamentary committee, London, June 12, 2018

Then, four months ago, a new door swung open. The journalist Peter Jukes and I were handed a stash of Arron Banks’s emails. The contents were jaw-dropping. There wasn’t one boozy lunch with the Russian ambassador. There weren’t even two. Banks had multiple meetings with the Russian ambassador and Russian officials in the run-up to the 2016 referendum. And then there were more meetings afterwards, during the US presidential campaign, when Banks and Wigmore also joined Farage in America as he campaigned for Trump.

The leaked emails revealed that on the very day Leave.EU held its press launch (with a panel that included a Cambridge Analytica employee), Banks and Wigmore visited the Russian embassy where the ambassador introduced them to a businessman, Siman Povarenkin. There, inside the embassy, Povarenkin delivered a presentation: he had not one but two potentially lucrative deals he wanted to pitch to them. The first was an offer to buy and consolidate a group of gold mines; the other was to participate in the privatization of a state diamond company. We know from the emails that Banks and Wigmore pursued the deals for months: there were further meetings and evidence of a trip to Moscow to meet Sberbank officials, who were providing the financing. This was months before Putin publicly announced the first major privatization of state firms for years, including of Alrosa, the diamond firm Povarenkin had first pitched to Banks. Later that year, Rosneft was also privatized—the state oil company that the Steele dossier claimed was part of a sweetheart deal offered to Donald Trump.

Both privatizations went ahead. Days after the EU referendum, 11 percent of Alrosa was sold; days after the US presidential election, 19.5 percent of Rosneft went private. Banks denies that he went ahead with any deal or that he profited from any introductions the Russian ambassador made. He has also denied that “a penny of Russian money” went into the Brexit campaign. One difficulty for him is that a recent parliamentary report on Russian influence in British politics described him as misleading and evasive, and called his spokesman Wigmore “unreliable” and a “self-confessed liar.”

Banks even invited Yakovenko to Leave.EU’s headquarters for a party on the night of the Brexit referendum, together with the embassy’s third political secretary, a diplomat who was expelled from the UK in March this year in the wake of the attack on Sergei Skripal. And Banks and Wigmore were back inside the Russian embassy on the day that it was announced that Bannon was taking over from Manafort as Trump’s campaign manager. Shortly after that, they travelled with Farage to Mississippi, where Trump introduced Farage at a rally as “Mr. Brexit.”

Nigel Farage is the central figure in all this—close to Bannon, close to Banks and Wigmore, close to Trump, linked to Assange. But Britain does not see this. Farage, the former head of the now-defunct UK Independence Party, is still the BBC’s leading go-to for Brexit comment. It took a German journalist to ask him about his Russian ties. He has reported me and Jukes to the police and has called me a criminal, a lunatic, a conspiracy theorist, a loony, a blackmailer, a thief, a hacker, a mad cat lady. The attempted demonization is a distraction story that has worked. The BBC has still not reported on his relationship to Russia, an omission compounded this week by its star politics presenter’s echoing of Banks’s abuse by tweeting that I was the “mad cat lady from the Simpson’s [sic], Karol Kodswallop.”

And that was that, until two weeks ago when Britain’s Electoral Commission announced it had referred a year-long investigation to the National Crime Agency, which deals with serious and organized crime. The Electoral Commission noted that it was unclear where Banks’s £8 million donation had come from, or whether it was “permissible” under election rules that do not allow funding from non-UK sources.

The trouble keeps coming for Banks. There’s already an investigation by the Met Police into Leave.EU’s spending returns. Last week, he was found guilty of data crimes in a continuing investigation. The Financial Conduct Authority is looking into his insurance business. BBC and Channel 4 News reports into his business dealings in Africa have led to referrals to the Serious Fraud Office. All told there are currently nine criminal or serious investigations into possible misconduct during the referendum (at least four of which involve Banks).

Critically, it’s what happened on Facebook that remains the biggest question of all. The control of money spent during elections is the very basis of our electoral laws. But they no longer work. Facebook has become a giant funnel not just for dark ads, but for dark money that evades election finance laws. It is now not in doubt that Facebook facilitated data crimes; what we’ve failed to reckon with is how it has broken our democracy, too.

Five times, a parliamentary committee has asked Mark Zuckerberg to answer its questions. And five times, Zuckerberg has refused. In the last instance, he said no to five national assemblies, when, in an unprecedented act, the British Parliament joined forces with counterparts in Canada, Australia, Argentina, and Ireland to invite Facebook’s chief executive to an extraordinary joint international committee on November 27. It’s often said that Facebook is more powerful than a nation-state. It’s not; it’s more powerful than five nation-states.

Facebook’s lack of co-operation has been assisted by Theresa May’s government, which won’t co-operate with Parliament either. It has refused to answer the committee’s questions, refused to back its calls on Zuckerberg, refused its demand for a Mueller-style public inquiry. That demand has support from senior politicians from both main parties, including Damian Collins, the Conservative chair of the parliamentary committee, and Tom Watson, the deputy leader of the Labour Party, as well as from Hillary Clinton, who raised the issue in a speech in Oxford last month—and for whom the implications of crossovers between Brexit and Trump are abundantly clear. 

Just this week, May refused to deny a Daily Mail report that she had prevented the intelligence services from investigating Banks in 2016 when she was still Home Secretary. Earlier this year, the prime minister’s political secretary used her office to smear a whistleblower who exposed what has now been proven to be electoral fraud by the official Vote Leave campaign.

Britain sees none of this. All eyes are transfixed on the EU exit sign. Yet the skulduggery, mayhem, and amateur dramatics of the Brexit negotiations have become perfect cover for something far more chilling. From the corridors of Whitehall and Westminster to the studios of the BBC, Britain’s establishment has buried its head in the sand while democracy and the rule of law have been subverted in plain sight. The government is colluding with an omniscient surveillance superpower. A hostile foreign power has deployed a chemical weapon on our streets.

However bleak and dark and troubled America seems right now, it’s not as bleak and dark and troubled as Britain. You have Robert S. Mueller. We don’t. 

Carole Cadwalladr and Peter Jukes are contributors to the new podcast episode Untold: Dial M for Mueller.


The Other Constitutions

Keystone-France/Gamma-Keystone/Getty Images
Children of immigrants pledging allegiance to the flag in a classroom on Ellis Island, 1940s

In a law review article published over forty years ago, Supreme Court Justice William J. Brennan argued that state constitutions are a “font of individual liberties” and that their protections, in matters like search and seizures and the right to a jury trial, often extend beyond the protections of federal law.* In 51 Imperfect Solutions: States and the Making of American Constitutional Law, Jeffrey Sutton, a well-respected judge who sits on the United States Court of Appeals for the Sixth Circuit, endorses Brennan’s thesis and provides four examples in which state constitutional protections were or are more robust than federal ones. These examples demonstrate that the law may be best served if proponents of a new or expanded right give priority to a claim based on their state constitution, and that state judiciaries can set an example for the federal judiciary. Each of them—as well as a fifth example, regarding partisan gerrymandering, that Judge Sutton does not take up but is also worthy of study—merits separate discussion.

The first example involves the equality and adequacy of funding for public education. In Brown v. Board of Education (1954), the Supreme Court removed a major barrier to equal educational opportunities by prohibiting the segregation of public schools, but it left in place the financial obstacle faced by poor school districts unable to provide their students with an education comparable to that offered by wealthier districts. In San Antonio Independent School District v. Rodriguez (1973), over the dissent of four justices, including Thurgood Marshall, the Court rejected a challenge to Texas’s school-financing system, which was based on local property taxes and thus created wealth-based barriers to equal educational opportunities in Texas’s public schools.

The plaintiffs received no relief in their federal case, but in three cases in the 1980s and 1990s the Supreme Court of Texas held that the state’s school-financing system violated the Texas Constitution. Courts in other states, like New Jersey and Ohio, have similarly held that their state constitutions require some measure of equal funding among school districts, demonstrating that state constitutions sometimes offer greater protections than the US Constitution does.

In the second example, Sutton traces the development of the rule requiring that evidence obtained by unconstitutional means be excluded from a trial (the “exclusionary rule”) and the subsequent development of the good-faith exception to that rule. In Mapp v. Ohio (1961), the Supreme Court, noting the states’ “impressive” experience with such a rule, established that the exclusionary rule applies to evidence obtained as a result of an unreasonable search or seizure. Twenty-three years later, the Court established a good-faith exception to the exclusionary rule so that, if police officers reasonably rely on a warrant that is later found to be invalid, the evidence obtained pursuant to that warrant need not be excluded from trial.

However, at least one third of the states have decided not to apply that exception and instead allow for the exclusion of evidence even when officers act in good-faith reliance on the validity of a warrant. The approaches of those states not only show that state constitutions sometimes offer greater protections than the US Constitution does, they also undermine the assumption that life-tenured federal judges invariably provide greater protection to defendants than elected state judges do. Accordingly, that concern should not discourage plaintiffs from pursuing claims in state court. (Nevertheless, Sutton’s discussion does not persuade me to change my settled opposition to the popular election of judges.)

The third example that Sutton discusses grew out of the eugenics debate at the beginning of the twentieth century, and it serves as a warning to advocates not to rely solely on the Supreme Court to protect individual rights. Many national leaders at that time—including, for example, Theodore Roosevelt and John D. Rockefeller—believed that the genetic quality of the human population could be improved through selective breeding. Many eugenicists thus advocated the forced sterilization of people whom they viewed as inferior members of society, such as the mentally ill, the immoral, and the criminally inclined. In Buck v. Bell (1927), Justice Oliver Wendell Holmes wrote an opinion joined by all but one of his colleagues that upheld an order approving the sterilization of a woman who medical authorities claimed was “feebleminded.”

Sutton explains that before Buck v. Bell, many state courts had granted relief from forced sterilization on various state and federal constitutional grounds. But after the Supreme Court upheld the practice under the US Constitution, state courts were reluctant to conclude that it violated state constitutions. In this case, the Supreme Court’s decision had the effect of dissuading state courts from protecting individual rights as robustly as they had been protecting them before Buck v. Bell, underscoring the potential danger of focusing exclusively on federal rights and federal courts.

Sutton’s fourth example describes the treatment of Jehovah’s Witnesses who continued to refuse to allow their children to pledge allegiance to the American flag at school after Justice Felix Frankfurter’s opinion for the Supreme Court in Minersville School District v. Gobitis (1940) authorized schools to compel students to salute the flag. Following that ruling, hundreds of Jehovah’s Witnesses were attacked across the country. Although many state courts reached a conclusion similar to that of the Supreme Court in Gobitis, several prohibited such compulsion and thus set the stage for Justice Robert Jackson’s opinion overruling that decision in West Virginia Board of Education v. Barnette (1943).

A fifth example that Sutton does not discuss in his book, but that I believe is important to explore, is the possibility of mounting challenges to partisan gerrymandering in state courts.

Many state constitutions have equal protection provisions requiring that state officials act impartially when performing official duties. Those provisions provide standards that may impose a duty to draw neutral maps of voting districts, which would presumably prevent state legislators from adopting district maps that are specifically intended to give them an advantage in future elections.

Although the Supreme Court has been unable to find in the US Constitution manageable standards for judging the permissibility of partisan gerrymanders, state courts are not foreclosed from construing a state constitutional provision as mandating impartiality. In other words, state law may well offer an effective remedy against partisan gerrymandering regardless of whether federal law does. Indeed, just this year, in League of Women Voters v. Commonwealth of Pennsylvania, the Pennsylvania Supreme Court held that the state’s gerrymandered congressional map violated the state constitution. In light of the continuing uncertainty about when, if ever, the Supreme Court will identify federal standards for judging the constitutionality of partisan gerrymanders, it is particularly important for voting rights advocates to look to state courts and constitutions in mounting such challenges.

As Sutton’s book demonstrates, state judiciaries can set an example for the federal judiciary and ultimately persuade it to endorse rights that they have recognized and that should have prevailed as a matter of federal law for decades. I hope that 51 Imperfect Solutions convinces advocates to bring claims in state courts so that jurisprudence may continue to develop in this way.

* “State Constitutions and the Protection of Individual Rights,” Harvard Law Review, Vol. 90, No. 3 (January 1977).


Our Concentration Camps: An Open Letter

To the Editors:

In Tornillo, Texas, in rows of pale yellow tents, some 1,600 children who were forcibly taken from their families sleep in lined-up bunks, boys separated from the girls. The children, who are between the ages of thirteen and seventeen, have limited access to legal services. They are not schooled. They are given workbooks but they are not obliged to complete them. The tent city in Tornillo is unregulated, except for guidelines from the Department of Health and Human Services. Physical conditions seem humane. The children at Tornillo spend most of the day in air-conditioned tents, where they receive their meals and are offered recreational activities. There are three workers for every group of twenty children. The children are permitted to make two phone calls per week to their family members or sponsors, and are made to wear belts with their emergency contacts written on them.

However, the children’s psychological conditions are anything but humane. At least two dozen of the children who arrived in Tornillo were given just a few hours’ notice in their previous detention center before they were taken away—any longer than that, according to one of the workers at Tornillo, and the children may have panicked and tried to escape. Because of these circumstances, the children of Tornillo are inevitably subjected to emotional trauma. After their release (the date of which has not yet been settled), they will certainly be left with emotional scars, and it’s hard to imagine they could have any but the harshest feelings about a country that condemned them to this unjust imprisonment.

The workers at the Tornillo camp, which was expanded in September to a capacity of 3,800, say that the longer a child remains in custody, the more likely he or she is to become traumatized or enter a state of depression. There are strict rules at such facilities: “Do not misbehave. Do not sit on the floor. Do not share your food. Do not use nicknames. Do not touch another child, even if that child is your hermanito or hermanita [younger sibling]. Also, it is best not to cry. Doing so might hurt your case.” Can we imagine our own children being forced to go without hugging or being hugged, or even touching or sharing with their little brothers or sisters?

Federal officials will not let reporters interview the children and have tightly controlled access to the camp, but almost daily reports have filtered through to the press. Tornillo is part of a general atmosphere of repression and persecution that threatens to get worse. The US government is detaining more than 13,000 migrant children, the highest number ever; as of last month, some 250 “tender age” children aged twelve or under had not yet been reunited with their parents. Recently, the president has vowed to “put tents up all over the place” for migrants.

This generation will be remembered for having allowed concentration camps for children to be built in “the land of the free and the home of the brave.” This is happening here and now, but not in our names.

Rabih Alameddine
Jon Lee Anderson
Margaret Atwood
Paul Auster
Andrea Bajani
Alessandro Baricco
Elif Batuman
Neil Bissoondath
José Burucúa
Giovanna Calvino
Emmanuel Carrère
Javier Cercas
Christopher Cerf
Roger Chartier
Michael Cunningham
William Dalrymple
Robert Darnton
Deborah Eisenberg
Mona Eltahawy
Álvaro Enrigue
Richard Ford
Edwin Frank
Garth Greenwell
Andrew Sean Greer
Linda Gregerson
Ethel Groffier
Helon Habila
Rawi Hage
Aleksandar Hemon
Edward Hirsch
Siri Hustvedt
Tahar Ben Jalloun
Arthur Japin
Daniel Kehlmann
Etgar Keret
Peter Kimani
Binnie Kirshenbaum
Khaled Al Khamissi
Dany Laferrière
Jhumpa Lahiri
Laila Lalami
Herb Leibowitz
Barry Lopez
Valeria Luiselli
Norman Manea
Alberto Manguel
Yann Martel
Guillermo Martínez
Diana Matar
Hisham Matar
Maaza Mengiste
Rohinton Mistry
Benjamin Moser
José Luis Moure
Azar Nafisi
Guadalupe Nettel
Mukoma Wa Ngugi
Ruth Padel
Rajesh Parameswaran
Dawit L. Petros
Caryl Phillips
Nelida Piñon
Francine Prose
Sergio Ramírez
David Rieff
Salman Rushdie
Alberto Ruy Sánchez
Aurora Juana Schreiber
Wallace Shawn
Patti Smith
Susan Swan
Santiago Sylvester
Madeleine Thien
Colm Tóibín
Kirmen Uribe
Juan Gabriel Vásquez
Juan Villoro
Susan Yankowitz


In the Review Archives: 2000–2004

A Confederate flag, illustrating James M. McPherson’s 2001 review of three books about the Civil War

To celebrate the Review’s fifty-fifth anniversary in 2018, we have been going back into our archives year by year. Today we go back to the turn of the millennium, with Tatyana Tolstaya on Russia’s new president, Tony Judt on the future of Israel, James McPherson on enduring Civil War fantasies, William Nordhaus on what war in Iraq would cost, and Marcia Angell on the deceptions of the pharmaceutical industry.


David Levine

A month after Vladimir Putin was elected president for the first time, Tatyana Tolstaya reviewed Putin’s book First Person: An Astonishingly Frank Self-Portrait. “I’ve heard that the training program for US Green Berets includes an exercise in ‘determining the contents of a box without opening it,’” Tolstaya wrote. “Let’s try to apply something of the same approach to the freshly elected president of Russia as well.” Her review was translated from the Russian by Jamey Gambrell.

The Making of Mr. Putin

Tatyana Tolstaya

When we see Putin skiing down a mountain on TV, mixing with the crowd (his bodyguards also pretend to be red-cheeked skiers), or watch him drop by a little restaurant, supposedly to eat blinis, you understand that Putin himself is absent, that you are watching a kind of national fun-house mirror in which the projected fears, hopes, tastes, and customs of the electorate are reflected.

Also: Lars-Erik Nelson: Legacy


Reviewing three books about the Civil War, historian James McPherson provided a definitive refutation of the Lost Cause view, that it “was not a war to preserve the nation and, ultimately, to abolish slavery, but instead a war of Northern aggression against Southern constitutional rights.”

Southern Comfort

James M. McPherson

The Lost Cause myth helped Southern whites deal with the shattering reality of catastrophic defeat and impoverishment in a war they had been sure they would win. Southerners emerged from the war subdued but unrepentant; they had lost all save honor, and their unsullied honor became the foundation of the myth.

Also: Ronald Dworkin: A Badly Flawed Election


David Levine

William Nordhaus, who was recently named winner of the 2018 Nobel Prize in economics, published this analysis in 2002 of the costs of going to war in Iraq. At a time when the White House and the Department of Defense were predicting that a war would be over quickly and cost less than $50 billion, Nordhaus was nearly alone in estimating that the eventual cost could be well over a trillion dollars.

Iraq: The Economic Consequences of War

William D. Nordhaus

An assessment of the costs of a war with Iraq needs to be based on scenarios for the conduct of the war, the aftermath of hostilities, the impacts on the oil market and other related markets, and the “macroeconomic” impacts, i.e., on the overall US economy. It is impossible to project detailed military strategies. However, we can describe the general contours of a “quick victory” and a “protracted conflict” and attempt to put price tags on each.

Also: James Fallows: He’s Got Mail


“The very idea of a ‘Jewish state’—a state in which Jews and the Jewish religion have exclusive privileges from which non-Jewish citizens are forever excluded—is rooted in another time and place. Israel, in short, is an anachronism,” wrote the late Tony Judt in what was perhaps his most controversial article in the Review.

Israel: The Alternative

Tony Judt

The time has come to think the unthinkable. The two-state solution—the core of the Oslo process and the present “road map”—is probably already doomed. With every passing year we are postponing an inevitable, harder choice that only the far right and far left have so far acknowledged, each for its own reasons. The true alternative facing the Middle East in coming years will be between an ethnically cleansed Greater Israel and a single, integrated, binational state of Jews and Arabs, Israelis and Palestinians.

Also: An Alternative Future: An Exchange


“Over the past two decades the pharmaceutical industry has moved very far from its original high purpose of discovering and producing useful new drugs,” wrote Marcia Angell, a physician and the former editor-in-chief of the New England Journal of Medicine. “Now primarily a marketing machine to sell drugs of dubious benefit, this industry uses its wealth and power to co-opt every institution that might stand in its way, including the US Congress, the FDA, academic medical centers, and the medical profession itself.”

The Truth About the Drug Companies

Marcia Angell

In the past two years, we have started to see, for the first time, the beginnings of public resistance to rapacious pricing and other dubious practices of the pharmaceutical industry. It is mainly because of this resistance that drug companies are now blanketing us with public relations messages. And the magic words, repeated over and over like an incantation, are research, innovation, and American. Research. Innovation. American. It makes a great story. But while the rhetoric is stirring, it has very little to do with reality.

Also: Michael Massing: Now They Tell Us


Decolonizing Commemoration: New War Art

South African Native Labour Corps troops around a brazier at their camp in Dannes, France, March 1917 (Imperial War Museum)

What sounds like a mechanical air-raid siren morphs into a human cry of loss at the start of The Head & the Load, a powerful ninety-minute stage production about the role and remembrance of Africans in the first truly global war. Conceived and directed by the South African artist William Kentridge, the work was created for the World War I centenary that culminated with Armistice Day on November 11, and will be staged in New York in December. It is part of a wave of work by artists and historians that has challenged World War I’s monochrome image. Kentridge’s piece and other ambitious centennial art works and exhibitions raise profound questions about the selectiveness of remembrance and how those who have been willfully erased can best be restored to memory.

The Head & the Load was co-commissioned for 14–18 NOW, a UK arts program to mark the centenary whose imaginative projects have ranged from a stunning staging at London’s Barbican Theatre of Alice Oswald’s Memorial (2011), an elegiac poem for the hundreds of dead soldiers named in Homer’s Iliad, to a fireboat in the New York Harbor painted by Tauba Auerbach to recall the “dazzle”-camouflaged ships of World War I naval battles. “Lest We Forget?” at the Imperial War Museum North in Manchester, curated by Laura Clouting, traces both intimate and official ways in which the war dead and disabled of Britain and its Empire have been mourned and memorialized. The objects on display range from dried poppy petals enclosed in letters from France to a Ouija board for the séances that became popular as casualties soared. Along with a scrawled manuscript of Wilfred Owen’s poem “Dulce et Decorum Est” are official war paintings for a Hall of Remembrance that was never built. An unstated reason for that neglect may be surmised from works such as Gassed, John Singer Sargent’s Renaissance-scale oil painting of a piteous line of soldiers blinded by mustard gas. In terms of the official response to the terrible losses, the uniformity of Western Front graves is described in the exhibition as “an unprecedented democracy in death.” Yet not everyone was enfranchised, and not all theaters of war were equal.

John Singer Sargent: Gassed, 1919 (Imperial War Museum)

An opening section on “Dealing with the Dead,” with a wall of slides of strewn corpses and battlefield graves, traces how an official “commemorative legacy” grew in response to unprecedented carnage caused by pulverizing weapons. By 1919, “some 559,000 British and Empire casualties had no known graves.” When a controversial wartime decision not to repatriate the dead was affirmed, the question of how to honor these men at the fronts—so many of whom were volunteers—led to cemeteries of standardized gravestones, to be tended by the Imperial (later Commonwealth) War Graves Commission in perpetuity, and monuments inscribed with the names of the missing. Sir Frederic Kenyon’s 1918 report, War Graves: How the Cemeteries Abroad Will be Designed, laid down the universal principles. “No difference could be made between the graves of officers or men,” wrote Rudyard Kipling, the commission’s literary adviser.

Partly because religion had been a catalyst of the Indian Uprising of 1857, equality of race and creed was also affirmed by the commission. Makeshift wooden crosses were replaced with rounded white gravestones (a secular move vehemently opposed by Lady Florence Cecil and 8,000 co-petitioners). The architect Sir Edwin Lutyens designed London’s austere Cenotaph in 1919, arguing that, for a symbolic empty tomb for the “faithful” across the Empire, “It would be an unchristian act to offend men of different faiths.” On display in Manchester are Herbert Baker’s designs with stone tigers for the 1927 Neuve-Chapelle Memorial in France that named the 4,742 Indian soldiers and laborers missing at the Western Front.

A coda to “Lest We Forget?” labeled “The Empire Remembered?” hints at how far reality could diverge from the stated aim of “identical treatment of British and native troops,” as “circumstances permit.” In East and West Africa, for example, where 8,000 black soldiers, including West Indians, were killed fighting for Britain under white officers, the names of an estimated 120,000 African laborers and porters of the so-called Carrier Corps who died, mainly from sickness, “were not inscribed onto formal memorials. A complete record of their identities does not exist.”

Some historians have gone further in impugning a “systematic inequality” in official remembrance. According to Michèle Barrett, writing in Santanu Das’s exemplary collection, Race, Empire and First World War Writing (2011), many Africans’ graves were deliberately abandoned. Some names were reallocated to memorials and officially “sent missing,” as though those men had no known graves—which was a lie. Their families were never to know where they were buried. Barrett, who contends that “the argument about inadequate records often functions as a screen,” also found evidence that a common grave at Salaita Hill in East Africa had been sifted and sorted to determine the racial identities of the skeletal remains (including by skull shape), which would then determine the care with which they were memorialized.

Monuments in Mombasa, Nairobi, and Dar es Salaam, inscribed to “Arab and Native African troops” and the “Carriers and Porters who were the feet and hands of the army,” display no names or even numbers of dead. Instead, they bear Kipling’s vacuous consolation: “If you fight for your country, even if you die, your sons will remember your name.” Official reasons for not respecting colonial war graves included “waste of public money,” and that Africans had not reached a “stage of civilization” to appreciate them. Yet the colonial authorities were also fearful that such vast reminders of the scale of sacrifice made by British subjects would lead to unrest. According to Barrett, the British governor of Tanganyika urged that the “vast Carrier Corps cemeteries in Dar es Salaam and elsewhere should be allowed to revert to nature as speedily as possible.”

In wartime, the service of Africans and Asians was widely reported. Yet, as the abandoned war graves suggest, enforced forgetting can be a more brutal business than the tweaking of textbooks, the mere “airbrushing” of history—particularly when that service and sacrifice had stoked expectations around the globe. As the British-Nigerian historian David Olusoga said in a talk at Tate Modern before The Head & the Load, “Indians, North Africans, French West Africans on parade was one of the great stories of the war that journalists couldn’t get enough of, because it was romantic and exotic. Then that stops almost immediately the great guns fall silent. Decade by decade, the war becomes monochrome while the bones are still leached in the jungles.”

Over the past two decades, this illusion has been assailed by a growing body of scholarship, as well as oral history and archival finds (by historians including Das and Olusoga, on whom I have drawn). The World’s War: Forgotten Soldiers of Empire (a 2014 book tied to a BBC TV series) is Olusoga’s powerful account of how the European nations fought on global fronts for imperial spoils, using colonial manpower—both voluntary and press-ganged. The industrial killing required a vast mobile labor force to move supplies, dig trenches, and bury the dead. Over the course of the war, for example, nearly 1.5 million volunteers from pre-partition India served in the British army, while some 3,000 Africans a month were forced into its Carrier Corps. In French West Africa, sons were demanded as a “blood tax” for the benefits of French civilization. German forces in East Africa, later in the war, openly abducted men from their villages. Though many volunteers joined up to keep hunger at bay, independence leaders from Mahatma Gandhi to Marcus Garvey argued that loyal service would reap rewards in self-rule—a dream harshly dispelled.

In wartime, the racial hierarchies of the day had largely determined who could bear arms or command, who could eat with or nurse whom. Men of color fighting against white men in Europe, and alongside them, risked undermining the racial mystique that underpinned colonial rule. That this fear was well founded was later evident. The late Senegalese novelist and filmmaker Ousmane Sembène, drafted into the Free French Army in 1944, discovered with his generation the irony of helping Nazi-occupied France fight for a liberation denied his own people. He also told me during an interview in 2005: “In the army we saw those who considered themselves our masters naked, in tears, some cowardly or ignorant. When a white soldier asked me to write a letter for him, it was a revelation—I thought all Europeans knew how to write. The war demystified the coloniser; the veil fell.”

Fear of this demystification often overruled military imperatives. Indian sepoys initially had a decisive influence on the Western Front. But those infantrymen who had docked in Marseilles to shouts of “Vivent les Hindous!” were swiftly redeployed to Mesopotamia. Britain’s African troops were barred from European fronts. France, which recruited 500,000 troops from Africa and Indochina, used Tirailleurs Sénégalais on the Western Front, but as cannon fodder. Germany, whose propaganda claimed that deployment of African and Asian troops in Europe was a war crime and race treason, recruited 14,000 Askaris (mercenaries) for East Africa, where they would fight other Africans. Some 75 percent of the British West Indies Regiment and 80 percent of African-Americans were confined to labor units. Discrimination in pay and rations while under indiscriminate shell fire partly explains why these men sought the status of soldiers, not auxiliaries—usually to no avail.

African-Americans, under French command, “fought a double battle in France,” wrote Jessie Redmon Fauset in 1924, “one with Germany, and one with white America.” French respect, in particular, alarmed US authorities. A secret memo from the American Expeditionary Force headquarters in August 1918 urged French officers and civil authorities not to treat African-American officers as equals (by eating with them or shaking their hands), nor to commend the troops too highly or allow “familiarity” with French women (Mississippi Senator James K. Vardaman expressed loathing, in a choice phrase, for “French-women-ruined Negro soldiers”). Any such indulgence, the memo argued, would upset both white Americans, as “an affront to their national policy,” and “experienced” French colonials, “who see in it an overweening menace to the prestige of the white race.” The intimacy entailed by nursing was also frowned upon as “lowering the prestige of the white woman.”

Some of the thousands of wounded Indians evacuated to England were photographed, for recruitment purposes, convalescing under the faux-Mughal domes of Brighton’s Pavilion hospital. Yet the Army Council sought to bar women from nursing them. Patients were fenced in, and allowed out only under male escort.

Official betrayal was epitomized in Britain by the Victory Parade of July 19, 1919. Lutyens, whose Cenotaph in London was the saluting point, may have sought to embrace all the Empire faithful, but Colonial Office officials deemed it “impolitic to bring coloured detachments to participate in the peace processions.” Indians were among the 15,000 soldiers, sailors, and airmen on parade, but West Indians and Nigerians were not. Nor were there African-Americans in the US contingent in Paris. Officially, these men’s wartime service was to be buried as deliberately as the carriers’ graves under weeds.

William Kentridge: The Head & the Load, 2018 (Stella Olivier/Tate Modern)

Amid institutionalized forgetting, art shakes us awake. The world première of The Head & the Load, which I saw at London’s Tate Modern last summer, was met with a standing ovation, five-star reviews, and murmurings of shock at an “unknown,” “untold,” or “forgotten” scandal. In line with Kentridge’s past in Brechtian theater during apartheid, the work is part memorial, part provocation. On a fifty-five-meter-wide runway of a stage in the gallery’s cavernous Turbine Hall, it had moments of terrible beauty. The music of composers Philip Miller and Thuthuka Sibisi, and choreography of Gregory Maqoma, provide an electrifying current of African resistance, inciting questions not only about colonial plunder and wartime exploitation, but also about the process by which these facts were obscured—and their recovery by art.

“It’s about what we’ve chosen not to remember,” the Johannesburg-based artist told me at the flat near the British Museum where he was staying for the week-long London run. The son of anti-apartheid lawyers, Kentridge read African studies at the University of Witwatersrand, but knew almost nothing of World War I in Africa: “Frustration at my own ignorance was part of the goad.” With influences from Goya, Hogarth, George Grosz, and Max Beckmann to Dumile Feni, Kentridge became internationally known in the 1980s for animation shorts made by filming his charcoal drawings, erasing, redrawing, and re-filming them. This “stone-age” technology leaves stubborn traces that highlight the erasures. According to a persuasive new monograph, William Kentridge: Process as Metaphor & Other Doubtful Enterprises (2018) by Leora Maltz-Leca, this technique was developed in response to apartheid censorship, at a time when the black marks of redaction in newspapers were being banned as subversive. The regime’s ultimate goal was to prevent awareness that anything had been withheld.

The scandal at the heart of Kentridge’s new production concerns the porters recruited from across the continent to serve British, French, and German forces in Africa. The Carrier Corps absorbed most of the 2 million Africans mobilized in the war, of whom at least 200,000 died. The real toll is unknowable; no army kept complete records. As many as 1 million Africans may have perished as marching armies requisitioned food and field hands, spreading famine and disease (with 200,000 deaths from Spanish flu alone). The wartime brutality followed the atrocities of colonial campaigns such as the Maji Maji war of 1905–1907 in East Africa and the German genocide against the Nama and Herero in Southwest Africa (today’s Namibia) in 1904–1908—the subject of Kentridge’s 2005 installation Black Box. Testing grounds for the machine-gun, these forays foreshadowed the mechanized slaughter about to come home to Europe.

The first British shot of the entire war was fired by a West African soldier in Togoland, a German imperial protectorate encompassing what is now Togo and part of Ghana. Togo, Cameroon, and Namibia soon fell to the Allies. But their invasion of Tanganyika led to terrible loss of life in protracted fighting; the German General Paul von Lettow-Vorbeck, a veteran of the colonial genocide in Namibia, even fought on, oblivious, for two weeks after the Armistice.

On the Western Front, horses and mules carried supplies. In Africa, where the tsetse fly and mosquito killed off the oxen, loads were borne on men’s backs. “For every soldier, three carriers. For every officer, nine carriers. For every machine-gun, twelve carriers. For every cannon, three hundred carriers,” intones a master of ceremonies in mustard-colored jacket, played by the actor Mncedisi Shabangu, relaying the British military’s “advanced arithmetic” in English and Xhosa. “For every man who dies from bullets, thirty-one die from disease.” Yet, he goes on, “they are not men because they have no name, or soldiers because they have no number.” Their deaths were recorded, if at all, as “wastage.”

William Kentridge: The Head & the Load, 2018 (Stella Olivier/Tate Modern)

The production’s title is taken from a Ghanaian proverb—“The head and the load are the troubles of the neck”—which may allude to both physical burdens and colonized minds. For Kentridge, “people carrying the world on their shoulders, moving across the world with loads, is the dominant image of the war.” It is also his signature motif of the past three decades: from live actors to puppet projections, processions have been his versatile metaphor for everything from history and regime change to failed utopias. The carriers’ procession harks back to enslaved forebears (“Ropes were put through their ears as they were tied together”), and toward descendant miners and industrial workers.

The new project represents history through absurdist collage. Along with projections of Kentridge’s drawings of African waterfalls are telegraph text, film clips, and ledgers. The libretto is composed of mutually unintelligible fragments: Dadaist poems translated into isiZulu, lines from Wilfred Owen into French accompanied by dog-barks. There are instructions from Kiswahili for Beginners, and Setswana proverbs from a 1920 collection by the black South African writer Sol Plaatje: “God’s opinion is unknown” and “Hunger makes no man wise.” Fanon, too, is quoted: “When the whites feel they have become too mechanized, they turn to men of color and ask for a little human sustenance.” As is a version of Conrad: “Annihilate all the brutes.” Amid the sonic terror of war rendered by singers (“kaboom,” “ta ta ta”), Tristan Tzara’s mocking words ring out: “This is a fair idea of progress.”

An eagle-helmeted Kaiser sputtering German-accented gibberish is trolleyed across the stage; his French equivalent declaims from a watchtower—the burlesque cabaret recalls what Dada’s founders recognized in 1916, that war had robbed words of meaning, just as the Surrealists mimicked shell-shock. But the bricolage, which includes projected footage of enormous scissors devouring maps, is also a response to lacunae in colonial history and the carve-up of a continent: Versailles, where Germany’s African territories were divided as spoils among the victorious Allies, sealed the Scramble for Africa. Rare ethnographic recordings made in prisoner-of-war camps near Berlin were also a starting point for Kentridge’s composers, whose own collage of musical quotations ranges from Satie and Schoenberg to a Mandinka melody.

“Constructive amnesia” is Kentridge’s term for the process by which the porters were actively forgotten. The Head & the Load, Kentridge told me, was “to note my own ignorance and that of others. Here are things that are gone that should be remembered.” Why they are gone is hinted at in a quotation from a wartime military official: “Lest their actions merit recognition, their deeds must not be recorded.”

Another 14–18 NOW commission, Mimesis: African Soldier, a video installation at the Imperial War Museum in London by the British artist and filmmaker John Akomfrah, gives poetic expression to the sense of rewards due yet withheld. The seventy-five-minute, three-channel video installation draws on obscure film footage, including archival material from Senegal, Ghana, and Kenya, to build an astonishing composite of soldiers and laborers from Africa, Asia, and the Americas being signed up, feeding supply lines, and converging on fronts, from Flanders mud to jungle, desert, and veld.

Akomfrah, who had his first US retrospective recently at the New Museum in New York, splices the black-and-white documentary footage with staged tableaux of seven actors in distinctly colored colonial uniforms, wearing fezzes, kepis, or turbans. We see them solitary on ship decks, or wrenched from women in silent, depopulated villages, or wandering on overgrown battlefields where flags mark the fallen of many nationalities. A soundtrack with songs and whizzbangs hisses to simulate the terrifying release of mustard gas as yellowish smoke fills the screen. The artful placement of props, from displaced furniture and ownerless kit bags to archive photographs under flowing water, creates a powerful sense of rupture and desolation. Subtitled The Ambiguities of Colonial Disenchantment and signposted with titles such as “rude awakening” and “letdown,” the new footage with actors becomes a little formulaic only toward the end.

The final, ironic footage is of colonial commanders embracing troops and pinning medals in a hollow masquerade of recognition. The film’s epigraph from Rosa Luxemburg, “Those who do not move do not notice their chains,” fits the mass mobilization of both the colonized and the carriers. Both Akomfrah’s new work and Kentridge’s reminded me of something the late Nobel laureate Günter Grass, whose novel The Tin Drum (1959) countered German amnesia after World War II, said to me when I interviewed him in 2010 at his home in Lübeck: “Perhaps in time your country, England, will think about its colonial crimes… Everybody has to empty their own latrine.”

The filmmaker John Akomfrah in front of Mimesis: African Soldier, Imperial War Museum, London (Smoking Dog Films/IWM)

“We have obligations to the dead,” Akomfrah writes in a note on Mimesis, quoting the historian Carlo Ginzburg. Yet memorials are for the living, too.

At London’s busy Hyde Park Corner, four urn-topped pillars flanking the road to Buckingham Palace commemorate the “service and sacrifices of five million men and women from the Indian Sub-continent, Africa and the Caribbean, who volunteered to fight with the British in the two World Wars.” The Memorial Gates on Constitution Hill—erected in 2002 after decades of lobbying—are inscribed as a “debt of honour.” Another memorial was unveiled last year in Windrush Square in Brixton, South London, to commemorate African and Caribbean war service. Recently announced is one for London’s Docklands to honor the 140,000 Chinese laborers who served the Allies on the Western Front. Their duties included clearing shells and reinterring the remains from mass graves.

The pointed inscription on the 2002 monument reads: “With so many descendants of these volunteers now living in the United Kingdom, the Memorial Gates serve to remind us all of our shared sacrifices in times of greatest need.” War service has become a certificate of still-contested belonging. Gaps in official versions of the past can undermine people’s claim to the future. During the Windrush scandal this year, those demanding residency rights for the generation of Caribbean migrants who settled in Britain, as British citizens, in the 1940s and 1950s invoked the West Indian contribution to both world wars. As the “Lest We Forget?” exhibition traces, the Black Poppy Campaign to recognize African and Caribbean war service began in 2010, partly to counter the far right’s co-opting of the red poppy as a nationalist symbol.

Yet curious blind spots persist. Up the Thames from Tate Modern, “Aftermath: Art in the Wake of World War One” at Tate Britain ostensibly explored the artistic legacy of the war in Britain, France, and Germany, and “how memories of the war were filtered through… political agendas.” Yet this otherwise thoughtful exhibition placed Europe’s overseas territories outside its scope. It traced the manipulation of war memories through the visibility of disabled veterans at official parades (France made much of mutilés de guerre; Britain hid them), but made no mention of black soldiers whose invisibility was as politically driven.

Other than a reference to memorial sculptures depicting “white bodies,” there seemed little recognition of Britain and France as imperial powers whose empires were enlarged by Germany’s defeat—a silence filled with unspoken ironies. Votes for women in Britain and Germany were held up as a reward for war service without reference to the millions—women and men alike—whose hopes of emancipation were crushed. No insight was offered into what a nostalgic “return to order” might mean for much of the globe, or a French rediscovery of classicism aligned with “national identity” and “values of civilization.”    

In 1918, the British War Memorials Committee was part of the Ministry of Information, the government’s propaganda arm. After the war, William Orpen’s oil painting To the Unknown British Soldier in France (1921–1928) was barred from the official collection until the artist removed the skeletal spirits of naked, dead soldiers and hovering cherubs he had painted beside the flag-draped coffin at Versailles. The catalogue for “Aftermath” shows both the altered painting and a photograph of the original with its ghostly apparitions. The absence of black servicemen from the Victory Parade was as much a propaganda exercise as censoring war art—with one eye to colonial control, the other to projecting a desired image of the nation into the future.

“Aftermath” noted that there were “few official memorials to the servicemen from Asia, Africa and the Caribbean.” But at a preview I attended, the Cenotaph in Frank Owen Salisbury’s oil painting The Passing of the Unknown Warrior, 11 November 1920 (1920) was described as representing all British and Empire dead. This essay was triggered by that half-truth: Could it be that, in an age of high imperialism, there was genuine equality in war remembrance? Or was the idea that Britain and Empire fought shoulder to shoulder, as brothers-in-arms, achieving a democracy in death, a projection back from our own time—itself a kind of airbrushing of history?

British war graves policy is now, where possible, to “correct anomalies.” Yet, if the past is another country, how they did things differently—at times, pathologically—should be spotlit, not smoothed away. We need to be aware of the skeletons beside the coffin. Like Kentridge’s charcoal traces, we can at least remember what was erased, or made so painfully and violently to disappear from memory.

The Head & the Load will be at the Park Avenue Armory in New York from December 4 through 15. Mimesis: African Soldier is at the Imperial War Museum, London, until March 31, 2019; and “Lest We Forget?” is at the Imperial War Museum North in Manchester, England, until February 24, 2019.

How Brexit Broke Up Britain

Mary Turner/Bloomberg via Getty Images: A sticker reading “No Border, No Brexit” on a road sign near the “Hands Across The Divide” sculpture, Derry, Northern Ireland, July 22, 2018

So, at long last, it seems that the negotiations on Brexit between the United Kingdom and the European Union have produced a draft agreement. We do not yet know what it contains but it will be a compromise that falls far short of the high expectations of June 2016 when the British voted to leave. It will tie Britain to the EU’s customs union and single market for an indefinite but probably very long time. Instead of making a glorious leap to independence, Britain will become a satellite orbiting the European planet, obliged to follow rules it will have no say in devising.

This is an exercise in damage limitation, not a bold break from the recent past. But the question is whether the British political system is capable of resigning itself to this least bad outcome. Theresa May will put the draft deal to her cabinet Wednesday and thereafter try to cobble together a parliamentary majority for it at Westminster. Can a chaotic political establishment find a way to swallow a complex, ambiguous, and deeply disillusioning necessity? Nothing in this story so far suggests that this will be easy.

Oscar Wilde’s Lady Bracknell did not quite say that when a government loses its mind it may be regarded as a misfortune but when the opposition does so as well, it begins to look like carelessness. But had she been around for Brexit, she might well have done. The British government’s journey toward Brexit is like a ride in Disneyland: every bout of soaring optimism is followed by a vertiginous plummet into despair.

It is easy to blame May and her bitterly divided Conservative Party for creating a situation in which a deal has been done with just four months to go before the UK leaves the EU, and in which nothing is yet certain about its fate. Easy because entirely justified: the Tories have plunged their country into its biggest crisis since World War II and seem utterly incapable of providing a credible or coherent collective leadership.

But what makes that crisis all the more profound is that what we would usually expect in a parliamentary democracy—that the main opposition party provides a distinct alternative to a failing and flailing government—is patently not happening either. The Labour Party’s members are overwhelmingly opposed to Brexit: a poll in September showed that 86 percent of them say they want a second referendum. In a recent large-scale national survey for Channel 4 News, 75 percent of Labour voters said they want the UK to retain a close relationship with the EU. Surveys show that in Labour’s old industrial heartlands, where working-class voters strongly backed Brexit in 2016, opinion is swinging sharply toward a rethink.

Yet Labour leader Jeremy Corbyn told the German newspaper Der Spiegel last week that Article 50 (the clause in the European treaty that allows a member state to leave and that May triggered in March 2017) is irrevocable and that his party had to “recognise the reasons why people voted leave.” He was then almost immediately contradicted both by his chief foreign affairs spokeswoman Emily Thornberry and by his senior Brexit spokesman Keir Starmer, each of whom insisted that a second referendum is still possible. The party’s divisions are now as public as the Conservatives’.

Labour, like the Tories, is being held together only by a fantasy. Its official position is that it supports Brexit but will oppose any deal with the EU that does not “deliver the ‘exact same benefits’ as we currently have as members of the Single Market and Customs Union.” This is either utterly delusional or, more probably, deeply dishonest. The EU cannot give the “exact same benefits” to a non-member as its remaining members enjoy. If it did so, it would cease to exist: Who would accept the responsibilities and costs of being in the club if all the club’s facilities were freely available to non-members? The Labour leadership undoubtedly knows this, but it maintains this illusion so it can talk out of both sides of its mouth at the same time: supporting Brexit but condemning May for failing to secure in the negotiations an outcome that was inherently impossible.

So what’s going on here? The most recent evidence from that Channel 4 News survey, the largest of its kind since the 2016 referendum, is that the UK would now vote to remain in the EU by a majority of 54 percent to 46 percent. The very least that might be said is that there is a large political constituency for a coherent opposition to Brexit, based on the demand that whatever deal (or no deal) emerges from the talks be put back to a popular vote. How can it be that the entire British political system seems incapable, at a moment of national crisis, of presenting citizens with a clear set of alternatives?

One can blame poor leadership and there is plenty of that to go around. But there is surely more to it than that. There is a deeper problem of articulation. Two very big things—both of them central to Brexit—are not being addressed at all. They are being ignored because they are the great contradictions of the whole crisis. The EU has repeatedly expressed frustration at the inability of the British to say exactly what it is they want. But this is not just a failure of negotiation. The British government and its technocrats can’t say exactly what they want because the whole Brexit process is fundamentally tongue-tied. It is driven by two things that dare not speak their name.

The energy of Brexit is contained in the brilliant slogan of the Leave campaign in 2016: Take back control. It is brilliant because it slides smoothly over two very awkward questions: What is “control”? And who is to have it?

Another word for “control” is “regulation.” The fundamental appeal of Brexit is that the British have had too much regulation imposed from Brussels and desire in the future to regulate themselves. Thus the British will control their own environmental safeguards, their own food safety, their own labor standards, their own laws on competition and monopolies. The EU does indeed do many of these things and there is a perfectly coherent argument to be made that the British state should do them instead. It is a safe bet that this is what most people who voted for Brexit want and expect.

But that’s not actually what Brexit is about. The real agenda of the Hard Brexiteers is not, in this sense, about taking back control; it is about letting go of control. For people like Dominic Raab, the Brexit secretary, the dream is not of a change in where regulation happens, but of a completion of the deregulating neoliberal project set in motion by Margaret Thatcher in 1979. The Brexit fantasy is of an “open” and “global” Britain, unshackled from EU regulation, that can lower its environmental, health, and labor standards and unleash a new golden age of buccaneering hyper-capitalism. Again, this is a perfectly coherent (if repellent) agenda. But it is not what most of those who voted for Brexit think it is supposed to be. And this gap makes it impossible to say what “the British” want—they want contradictory things.

The second question is who is supposed to be taking control: Who, in other words, are “the people” to whom power is supposedly being returned? Here we find the other thing that dare not speak its name: English nationalism. Brexit is in part a response to a development that has been underway since the turn of the century. In reaction to the Belfast Agreement of 1998 that created a new political space in Northern Ireland and the establishment of the Scottish Parliament in 1999 that did the same for another part of the UK, there has been a rapid change in the way English people see their national identity. Increasingly, they are not British, but English. This resurgent identity has not been explicitly articulated by any mainstream party, and surveys have shown a growing sense of English alienation from the center of London government in Westminster and Whitehall. Brexit, which is overwhelmingly an English phenomenon, is in part an expression of this frustration. In Anthony Barnett’s blunt and pithy phrase from his 2017 book The Lure of Greatness: England’s Brexit and America’s Trump, “Unable to exit Britain, the English did the next-best thing and told the EU to fuck off.”

There is stark and overwhelming evidence that the English people who voted for Brexit do not, on the whole, care about the United Kingdom and in particular do not care about that part of it called Northern Ireland. When asked in the recent “Future of England” survey whether “the unravelling of the peace process in Northern Ireland” is a “price worth paying” for Brexit that allows them to “take back control,” fully 83 percent of Leave voters and 73 percent of Conservative voters in England agree that it is. This is not, surely, mere mindless cruelty; it expresses a deep belief that Northern Ireland is not “us,” that what happens “over there” is not “our” responsibility. Equally, in the Channel 4 survey, asked how they would feel if “Brexit leads to Northern Ireland leaving the United Kingdom and joining the Republic of Ireland,” 61 percent of Leave voters said they would be “not very concerned” or “not at all concerned.”

This may be startling but it is also a pretty clear message. The problem, though, is that no one in either of the two main parties wants to talk about it. In one of history’s little jokes, the English national revolution that is Brexit led to Northern Ireland’s small ultra-unionist Democratic Unionist Party holding the balance of power at Westminster and keeping Theresa May in office. Thus, while the people who voted for Brexit are waving goodbye to the UK, May—with, in this, the support of Labour—has turned up the volume on her declarations of love for the United Kingdom: “I will always fight to strengthen and sustain this precious, precious Union.”

The future of the Union, moreover, has become central to the negotiations with the EU. The emerging deal will be horribly complex, largely because of British insistence that no arrangements must be made to prevent a hard border in Ireland that would in any way differentiate Northern Ireland from the rest of the UK. Brexit cannot be properly articulated because it has made a sacred cause of fighting for the very thing that Brexit’s voters don’t care about. As Lady Bracknell remarked, “This shilly-shallying with the question is absurd.”

‘I’m not the Resistance, I’m a reporter’: An Interview with April Ryan

Alex Brandon/AP Photo: April Ryan raising her hand to put a question to White House press secretary Sarah Huckabee Sanders, Washington, D.C., October 31, 2017

At the top of Donald Trump’s journalistic enemies list is April Ryan, the fifty-one-year-old American Urban Radio Networks correspondent. Ryan—who has covered the presidency for more than two decades—is also an on-air political analyst for CNN and the author of three books, including the recently released Under Fire: Reporting from the Front Lines of the Trump White House. Around Washington, D.C., Ryan holds the title “Dean of the White House Press Corps.”

“I watched her get up,” the president fumed last week before departing for Paris. “I mean, you talk about somebody that’s a loser. She doesn’t know what the hell she’s doing… She’s very nasty, and she shouldn’t be… You’ve got to treat the White House and the Office of the Presidency with respect.”

What Trump may find disrespectful is that Ryan has a penchant for asking tough questions on topics he doesn’t want to hear about: voter suppression, civil rights, Russia. Ryan is also black, female, middle-aged, and resolute. In January 2018, she asked, “Mr. President, are you a racist?”

This boldness has made Ryan the target of Trump’s more ardent followers. She receives frequent death threats. On a reporter’s salary, she’s had to hire a full-time bodyguard. There are reports that Cesar Sayoc Jr., who is accused of sending pipe bombs to Hillary Clinton, George Soros, Barack Obama, and others, also had Ryan on his mailing list.

We spoke for two hours in New York City in the early fall, and then again, twice after the November 6 election, by telephone. An edited and condensed version of the conversations follows.

Claudia Dreifus: Like the late Helen Thomas of the UPI, you’re known as the “Dean of the White House Press Corps.” It’s an honorific earned by covering four presidents. When did you realize that reporting on the Donald Trump presidency would be very different?

April Ryan: I saw it during the 2016 campaign. I knew in my gut he was going to win. You could see it, if you were honest with yourself and listened to what he was saying. He’d say to a 90 percent white crowd, “We built this nation!”

He used code words and they solidified the crowd. With Donald Trump, there wasn’t political decorum. It was shock and awe. This was a street game and he was playing “the Dozens.” His opponents, in both the primaries and the general election, were too polished to understand what was happening. I used to go on MSNBC and tell Chris Matthews, “This man could be our president.” 

Chris would say, “No, it will never happen!”  

When it did happen, how did life change for journalists covering the White House?

There used to be an atmosphere of mutual respect there. You had journalists like Bill Plante of CBS and Ann Compton of ABC. They were tough questioners, but they always were respected.

I didn’t have any problem with any president until now. George W. Bush, I really have a fondness for him because we were able to talk about race. Bill Clinton, I asked him hard questions about Monica Lewinsky. I dogged Obama about unemployment in the black community. They may not have agreed with the questions I threw out, but they respected me.

This president changed the dynamic. The press room now, every day, it’s something different. Now looking at this new crop, you’ve got conservative journalists over here, you’ve got liberal journalists, and you got those few who are in the middle. We didn’t have that back then. If a reporter had politics, you didn’t know about it. It wasn’t out like it is now.

Where do you sit on the spectrum?

I sit in the middle. I sit with the tradition of Walter Cronkite. You didn’t know his politics until he left journalism. And you don’t know if I’m a Republican or a Democrat. I don’t talk about my politics. No one knows my politics.

Whatever they are, you’re a journalist with a bodyguard. Why?

There have been threats. It’s gone beyond emails, beyond the phone calls, beyond threatening messages to the company website. The security: it’s not just for myself, it’s for protecting me and mine. I have kids. I’m trying to keep my life and the lives of those I love safe.

The other thing is that people have tried to intimidate me. Right now, I can’t cover a Trump rally in a red state, even if I wanted to. I mentioned that to Steve Bannon when I talked to him. He said, “No. I wouldn’t advise it.” 

The questions you ask the president appear to be the source of the hostility. This began in February of 2017 at Donald Trump’s first solo presidential press conference when you asked what would, under ordinary circumstances, have been a rather mundane question.

Yes. I asked him a question about urban America and it wound up being crazy. I asked if, in his plans for an inner cities agenda, he was going to talk with the Congressional Black Caucus. The president went, “Well, I would… I’ll tell you what, do you want to set up the meeting? Are they friends of yours?” 

All I could say was something like, “I’m just a reporter. I know some of them.” But the president went on, “Let’s go… let’s set up a meeting. I would love to meet with the Black Caucus.”

When he started speaking those words, I was at a loss. Blood rushed to my ears because that’s my way when things get tough. I was shaking my head, not realizing what I was doing. “No, I can’t do that, sir.” I was in shock. In my mind, I thought, but didn’t say, “That’s not my job.”

This exchange was the worst thing ever for me. That moment is forever etched into history because that video went viral. For better or worse, I’m a meme from that. 

Who do you actually work for?

American Urban Radio Networks. We send newsfeeds to black-owned radio stations or stations with a black focus. Our focus is on minority America, though we ask questions about all of America—health disparities, lead poisoning, police shootings, Freddie Gray, Eric Garner. 

How do you develop your questions for the president? Dan Rather, who once covered the White House, told me that he spent months crafting a single question for Richard Nixon.

Months? It doesn’t take me that long. We’re in a different day now. You’re responding to the moment. The 24/7 news machine and social media have changed the dynamic of how we ask questions. I do a lot of research. I have sources who tell me things. My questions are often driven by the events of the day.   

When I asked Sarah Huckabee Sanders if the president had ever thought of resigning, that was driven by the news that a federal prosecutor had just raided Michael Cohen’s office. I had sources telling me that things were going on, like wiretaps and things of that nature.

This past January, at a public ceremony celebrating Martin Luther King Jr. Day, you bluntly asked the president if he was a racist. What was your process in developing that question?

As a journalist, as you know, you have your ear to the ground. You have a Rolodex of people you talk to quite frequently to find out what they’re thinking. You hear grumbling. 

I’d heard that federal lawmakers came out of a meeting where the president allegedly said “shithole nations,” versus people from Norway who’d be welcome here. Not only that: before that was Charlottesville.

So I was hearing this groundswell from black leaders and white leaders. And then I called the NAACP and asked, “What is the definition of a racist?” And it was simple: the intersection or the meeting of prejudice and power.

I was torn about asking it. Even in the room, I was back and forth. I’ll never forget the president kept looking at me from there like he doesn’t like me. When I asked it, I knew I had done something. I realized that it’s a sad day when you have to ask a US president if he’s a racist. It hurt me.

Does the president ever call on you at events?

Not anymore. I’ll yell out a question sometimes. He’ll give me a squint and close his mouth and skip to the next questioner. He doesn’t like me. That’s okay.

At his recent post-Election Day press conference, the one where the president grew furious with CNN’s Jim Acosta, he also rounded on you. “Sit down,” he told you. “I didn’t call on you.” What were your feelings as he said that?

I was shocked, though I wasn’t as surprised as some people. He’s done this before to me. Whatever he thinks about me, I’ll be doing what I’m doing. In this instance, I was asking him about voter suppression and he was avoiding talking about the issue.

I don’t know why people are shocked. He’s gone after women before. We’ve seen how he is when he talks to Kaitlan Collins or Cecilia Vega.  

He has said to ABC’s Cecilia Vega, “You’re not thinking, you never do.”

I love Cecilia. She’s amazing. And a thinker. And she asks really good questions. It’s wrong to denigrate a woman, a journalist. It shows where women, where journalists, stand with him. In a time when women are walking away from him and his message is against women, that’s a dangerous game to play. Independent women are leaving him and going toward the Democrats.

Just last week, the president revoked Jim Acosta’s White House press credentials. In the wake of that, some journalists are suggesting that the entire White House press corps should boycott some of the president’s events. Is that feasible?

If they do boycott, I would have to be part of it. But there are ways that we can be just as effective without a boycott. We have to be all together and we’ll have to work together to find those areas. We have to figure out what to report and what not to report. We can be more strategic in our reporting.

You’ve had some run-ins with Sanders. During Brett Kavanaugh’s confirmation hearings, you threw a remarkable question at her. You asked why the president had so readily believed in Kavanaugh’s innocence, when in 1989 Trump had rushed to judgment in the case of five black teenagers accused of raping a white jogger in New York’s Central Park. Many years later, the real rapist was arrested. Now, the 1980s is a long time ago; why ask that question now?

I didn’t initiate the question. Ayesha Rascoe of NPR did. The minute Ayesha asked it, I saw Sarah getting riled, as if to say, “Leave that question.” And I was like, “You are not going to step over this.” I went right in.

What it does show is that he’s selective in who he believes and who he doesn’t. The reason I asked is because people know what happened there. There’s a record. And he has yet to say, “I’m sorry.”

About your bodyguard, how do you afford him? I know what radio reporters earn.

I’m creative. I put a lot of jobs together. I’m on CNN. I write books, I lecture. The security: it costs a lot, but I can’t leave this job. I’m not a quitter. If people threaten me because I’m asking questions, that’s not right. They view me as the Resistance; I’m not the Resistance, I’m a reporter.

How do your colleagues in the White House press corps react to your being targeted?

Some of them don’t care. They’re of the mindset, “Oh, she gets so much attention.” Then there are others, they feel bad for me. I had a co-worker walk with me when she saw a couple of in-my-face attempts at intimidation.

A lot of the conservative newbies who think I’m not friendly to the president or who want to write stories that are for the president, they want to challenge me. I’m like, “You just got here, who are you?”

You’ve suggested that Sarah Huckabee Sanders ought to be paying for your security detail. Why?

It’s because every time Sarah comes out and says something against me, I get these emails that go, “Oh, I’m going to do this and I’m going to do that to you.” Every time, there’s an elevated level of hate. They are generating this hate. I am “the enemy.” Or one of the enemies.

A hypothetical question: If your predecessor as the dean of the White House press corps, the late Helen Thomas, were, in some mythical way, to come down from journalism heaven, what do you think she’d tell you?

“Keep doing what you’re doing.” She’d be the first person banging on the door for answers. She had the doors closed quite a bit on her, though it never stopped her. But people in power were afraid of her. She wielded real power.   

Like her, I’m not looking for approval. I’m looking to do my job.

Paul J. Richards/AFP Photo/Getty Images: April Ryan assisting Helen Thomas, then known as the dean of the White House press corps, Washington, D.C., November 12, 2008

‘This Is a Reality, Not a Threat’

David Levinthal: Untitled, from his 2008 series ‘I.E.D.,’ about the US wars in Afghanistan and Iraq. Levinthal’s work is on view in ‘David Levinthal: War, Myth, Desire,’ at the George Eastman Museum, Rochester, New York, until January 1, 2019. The accompanying book is published by the museum and Kehrer.

The Reign of George VI, 1900–1925, published anonymously in London in 1763, makes for intriguing reading today. Twentieth-century France still groans under the despotism of the Bourbons. America is still a British colony. “Germany” still means the sprawling commonwealth of the Holy Roman Empire. As the reign of George VI opens, the British go to war with France and Russia and defeat them both. But after a Franco-Russian invasion of Germany, the war reignites in 1917. The British invade and subdue France, deposing the Bourbons. After conquering Mexico and the Philippine Islands, the Duke of Devonshire enters Spain, and a general peace treaty is signed in Paris on November 1, 1920.

The impact of revolution on the international system lies far beyond this author’s mental horizons, and he has no inkling of how technological change will transform modern warfare. In his twentieth century, armies led by dukes and soldier-kings still march around the Continent reenacting the campaigns of Frederick the Great. The Britannia, flagship of the Royal Navy, is feared around the world for the devastating broadsides of its “120 brass guns.” The term “steampunk” comes to mind, except there is no steam. But there are passages that do resonate unsettlingly with the present: English politics is mired in factionalism, Germany’s political leadership is perilously weak, and there are concerns about the “immense sums” Russian Tsar Peter IV has invested in British client networks, with a view to disrupting the democratic process.

Predicting future wars—both who will fight them and how they will be fought—has always been a hit-and-miss affair. In The Coming War with Japan (1991), George Friedman and Meredith Lebard solemnly predicted that the end of the cold war and the collapse of the Soviet Union would usher in an era of heightened geopolitical tension between Japan and the US. In order to secure untrammeled access to vital raw materials, they predicted, Japan would tighten its economic grip on southwest Asia and the Indian Ocean, launch an enormous rearmament program, and begin challenging US hegemony in the Pacific. Countermeasures by Washington would place the two powers on a collision course, and it would merely be a matter of time before a “hot war” broke out.

The rogue variable in the analysis was China. Friedman and Lebard assumed that China would fragment and implode just as the Soviet Union had, leaving Japan and America as rivals in a struggle to secure control over it. It all happened differently: China embarked upon a phase of phenomenal growth and internal consolidation, while Japan entered a long period of economic stagnation. The book was clever, well written, and deftly argued, but it was also wrong. “I’m sure the author had good reasons in 1991 to write this, and he’s a really smart guy,” one reader commented in an Amazon review in 2014 (having failed to notice Meredith Lebard’s co-authorship). “But, here we are, 23 years later, and Japan wouldn’t even make the list of the top 30 nations in the world the US would go to war with.”

This is the difficult thing about the future: it hasn’t happened yet. It can only be imagined as the extrapolation of current or past trends. But forecasting on this basis is extremely difficult. First, the present is marked by a vast array of potentially relevant trends, each waxing and waning, augmenting one another or canceling one another out; this makes extrapolation exceptionally hard. Second, neither for the present nor for the past do experts tend to find themselves in general agreement on how the most important events were or are being caused—this too, bedevils the task of extrapolation, since there always remains a degree of uncertainty about which trends are more and which are less relevant to the future in question.

Finally, major discontinuities and upheavals seem by their nature to be unpredictable. The author of The Reign of George VI failed to predict the American and French Revolutions, whose effects would be profound and lasting. None of the historians or political scientists expert in Central and Eastern European affairs predicted the collapse of the Soviet bloc, the fall of the Berlin Wall, the unification of Germany, or the dissolution of the Soviet Union. And Friedman and Lebard failed to foresee the current economic, political, and military ascendancy of China.

Lawrence Freedman’s wide-ranging The Future of War: A History is aware of these limits of human foresight. It is not really about the future at all, but about how societies in the Anglophone West have imagined it. The book doesn’t advance a single overarching argument; its strength lies rather in the sovereign presentation of a diverse range of subjects situated at various distances from the central theme: the abiding military fantasy of the “decisive battle,” the significance of peace conferences in the history of warfare, the impact of nuclear armaments on strategic thought, the quantitative analysis of wars and their human cost, the place of cruelty in modern warfare, and the changing nature of war in a world of cyberweapons and hybrid strategy.

In modern societies, as Freedman shows, imagining wars to come has been done not just by experts and military planners but also by autodidacts and writers of fiction. The most influential early description of a modern society under attack by a ruthless enemy was H.G. Wells’s best seller The War of the Worlds (1897), in which armies of Martians in fast-moving metal tripods poured “Heat-Rays” and poisonous gas into London, clogging the highways with terrified refugees who were subsequently captured and destroyed, their bodily fluids being required for the nourishment of the invaders. The Martians had been launched from their home planet by a “space gun”—borrowed from Jules Verne’s From the Earth to the Moon (1865)—but the underlying inspiration came from the destruction of the indigenous Tasmanians after the British settlement of the island, an early nineteenth-century epic of rapes, beatings, and killings that, together with pathogens carried by the invaders, wiped out virtually the entire black population (a few survived on nearby Flinders Island). The shock of Wells’s fiction derived not so much from the novelty of such destruction, which was already familiar from the European colonial past, but from its unexpected relocation to a white metropolis.

The most accurate forecast of the stalemate on the Western Front in 1914–1918 came not from a professional military strategist but from the Polish financier and peace advocate Ivan Stanislavovich Bloch (1836–1901), whose six-volume study The War of the Future in Its Technical Economic and Political Relations (1898) argued that not even the boldest and best-trained soldiers would be able to cut through the lethal fire of a well-dug-in adversary. The next war, he predicted, would be “a great war of entrenchments” that would pit not just soldiers but entire populations against one another in a long attritional struggle. Bloch’s meticulously detailed scenario was an argument for the avoidance of war. If this kind of thinking failed to have much effect on official planning, it was because military planners foresaw a different future, one in which determined offensives and shock tactics would still carry the day against defensive positions. Their optimism waned during the early years of World War I but was revived in 1917–1918, with the return to a war of movement marked by huge offensive strikes and breakthroughs into enemy terrain.

The prospect of aerial warfare aroused a similar ambivalence. Wells’s War in the Air (1908) imagined a form of warfare so devastating for all sides that a meaningful victory by any one party was unthinkable. He depicted America as under attack from the east by German airships and “Drachenfliegers” and from the west by an “Asiatic air fleet” equipped with swarms of heavily armed “ornithopters” (lightweight one-man flying machines). The book closed with a post-apocalyptic vision of civilizational collapse and the social and political disintegration of all the belligerent states.

But others saw aerial warfare as a means of recapturing the promise of a swift and decisive victory. Giulio Douhet’s The Command of the Air (1921) aimed to show how an aerial attack, if conducted with sufficient resources, could carry war to the nerve centers of the enemy, breaking civilian morale and thereby placing decision-makers under pressure to capitulate. The ambivalence remains. To this day, scholars disagree on the efficacy of aerial bombing in bringing the Allied war against Nazi Germany to an end, and the Vietnam War remains the classic example of a conflict in which overwhelming air superiority failed to secure victory.

The ultimate twentieth-century weapon of shock was the atomic bomb. The five-ton device dropped on Hiroshima by an American bomber on August 6, 1945, flattened four square miles of the city and killed 80,000 people instantly. The second bomb, dropped three days later on Nagasaki, killed a further 40,000. The advent of this new generation of nuclear armaments—and above all the acquisition of them by the Soviet Union—opened up new futures. In 1954, a team at the RAND Corporation led by Albert Wohlstetter warned that if the leadership of one nuclear power came to the conclusion that a preemptive victory over the other was possible, these devastating weapons might be used in a surprise attack. On the other hand, if the destructive forces available to both sides were in broad equilibrium, there was reason to hope that the fear of nuclear holocaust would itself stay the hands of potential belligerents. “Safety,” as Winston Churchill put it in a speech to the British Parliament in March 1955, might prove “the sturdy child of terror, and survival the twin brother of annihilation.”

This line of argument gained ground as the underlying stability of the postwar order became apparent. The “function of nuclear armaments,” the Australian international relations theorist Hedley Bull suggested in 1959, was to “limit the incidence of war.” In a nuclear world, Bull argued, states were not just “unlikely to conclude a general…disarmament agreement,” but would be “behaving rationally in refusing to do so.” In an influential paper of 1981, the political scientist Kenneth Waltz elaborated this line of argument, proposing that the peacekeeping effect of nuclear weapons was such that it might be a good idea to allow more states to acquire one: “more may be better.”*

Most of us will fail to find much comfort in this Strangelovian vision. It is based on two assumptions: that the nuclear sanction will always remain in the hands of state actors and that state actors will always act rationally and abide by the existing arms control regimes. The first still holds, but the second looks fragile. North Korea’s nuclear deterrent is controlled by one of the most opaque personalities in world politics. This past January, Kim Jong-un reminded the world that a nuclear launch button is “always on my table” and that the entire United States was within range of his nuclear arsenal: “This is a reality, not a threat.”

For his part, the president of the United States taunted his Korean opponent, calling him “short and fat,” “a sick puppy,” and “a madman,” warning him that his own “Nuclear Button” was “much bigger & more powerful” and threatening to rain “fire and fury” down on his country. Then came the US–North Korea summit of June 12, 2018, in Singapore. The two leaders strutted before the cameras and Donald Trump spoke excitedly of the “terrific relationship” between them. But the summit was diplomatic fast food. It lacked, to put it mildly, the depth and granularity of the meticulously prepared summits of the 1980s. We are as yet no closer to the denuclearization of the Korean peninsula than we were before.

Meanwhile Russia has installed a new and more potent generation of intermediate-range nuclear missiles aimed at European targets, in breach of the 1987 INF Treaty. The US administration has responded with a Nuclear Posture Review that loosens constraints on the tactical use of nuclear weapons, and has threatened to pull out of the treaty altogether. The entire international arms control regime so laboriously pieced together in the 1980s and 1990s is falling apart. In a climate marked by resentment, aggression, braggadocio, and mutual distrust, the likelihood of a hot nuclear confrontation either through miscalculation or by accident seems greater than at any time since the end of the cold war.

Freedman is unimpressed by Steven Pinker’s claim that the human race is becoming less violent, that the “better angels of our nature” are slowly gaining the upper hand as more and more societies come to accept the view that “war is inherently immoral because of its costs to human well-being.” Pinker’s principal yardstick of progress, the declining number of violent deaths per 100,000 people per year across the world over the span of human history, strikes Freedman as too crude: it fails to take account of regional variations, phases of accelerated killing, and demographic change; it assumes excessively low death estimates for the twentieth century and fails to take account of the fact that deaths are not the only measure of violence in a world that has become much better at keeping the maimed and traumatized alive.

However the numbers stack up, there has clearly been a change in the circumstances and distribution of fatalities. Since 1945, conflicts between states have caused fewer deaths than various forms of civil war, a mode of warfare that has never been prominent in the fictions of future conflict. Two million are estimated to have died under the regime of Pol Pot in Cambodia in the 1970s; 80,000–100,000 of these were actually killed by regime personnel, while the rest perished through starvation or disease. In a remarkable spree of low-tech killing, the Rwandan genocide took the lives of between 500,000 and one million people.

The relationship between military and civilian mortalities has also seen drastic change. In the early twentieth century, according to one rough estimate, the ratio of military to civilian deaths was around 8:1; in the wars of the 1990s, it was 1:8. One important reason for this is the greater resistance of today’s soldiers to disease: whereas 18,000 British and French troops perished of cholera during the Crimean War, in 2002, the total number of British soldiers hospitalized in Afghanistan on account of infectious disease was twenty-nine, of whom not one died. On the other hand, civilians caught up in modern military conflicts, especially in situations where medical services and humanitarian supplies are disrupted, remain highly exposed to disease, thirst, and malnutrition.

A further reason for the disproportionate ballooning of civilian deaths is the tendency of military interventions to morph into chronic insurgencies and civil wars. Counting the dead is extremely difficult in a dysfunctional or destroyed state riven by civil strife, but the broad trends are clear enough. Whereas the total number of Iraqi combat deaths from the air and ground campaigns in the 1991 Gulf War appears to have been between 8,000 and 26,000, the total number of “consequential” Iraqi civilian deaths was around 100,000. Several tens of thousands of Iraqi military personnel were killed in the second Gulf War; the total civilian death toll may have been as high as 460,000 (the Lancet’s estimate of 655,000 is widely regarded as too high). The deaths incurred by the coalition forces in these two conflicts were 292 and 4,809 respectively. The problem is that even the most determined and skillful applications of military force, rather than definitively resolving disputes, inaugurate processes of escalation or disintegration that exact a much higher human toll than the military intervention itself.

British Library/Bridgeman Images: Martian Tripods; illustration by Jacobus Speenhoff for a Dutch edition of H.G. Wells’s The War of the Worlds, 1899

Today, the phenomenon of the “battle” in which highly organized state actors are engaged is making way for a decentered form of ambient violence in which states engage “asymmetrically” with nonstate militias or civilians; cyberattacks disrupt elections, infrastructures, or economies; and missile-bearing drones cruise over insurgent suburbs. The resulting deterritorialization of violence in regions marked by decomposing states makes the kind of “decision” Clausewitz associated with battle difficult to achieve or even to imagine. “With the change in the type and tactics of a new and different enemy,” Robert H. Latiff writes in Future War, “we have evolved in the direction of total surveillance, unmanned warfare, stand-off weapons, surgical strikes, cyber operations and clandestine operations by elite forces whose battlefield is global.”

In pithy, flip-chart paragraphs, Latiff, a former US Air Force major general, sketches a vision of a future that resembles the fictional scenarios of William Gibson’s Neuromancer.

In the wars of the future, Latiff suggests, the “metabolically dominant soldier” who enjoys the benefits of immunity to pain, reinforced muscle strength, accelerated healing, and “cognitive enhancement” will enter the battlespace neurally linked not just to his human comrades but also to swarms of semiautonomous bots. “Flimmers,” missiles that can both fly and swim, will menace enemy craft on land and at sea, while undersea drones will seek out submarines and communication cables. Truck-mounted “Active Denial Systems” will deploy “pain rays” that heat the fluid under human skin to boiling point. Enemy missiles and aircraft will buckle and explode in the intense heat of chemical lasers. High-power radio-frequency pulses will fry electrical equipment across wide areas. Hypersonic “boost-glide vehicles” will ride atop rockets before being released to attack their targets at such enormous speeds that shooting them down with conventional missiles will be “next to impossible.” “Black biology” will add to these terrors a phalanx of super-pathogens. Of the more than $600 billion the US spends annually on defense, about $200 billion is allocated to research, development, testing, and procurement of new weapons systems.

Latiff acknowledges some of the ethical issues here, though he has little of substance to say about how they might be addressed. How will the psychology of “human-robot co-operation” work out in practice? Will “metabolically dominant” warriors returning from war be able to settle back comfortably into civilian society? What if robots commit war crimes or children get trapped in the path of “pain rays”? What if radio-magnetic pulse weapons shut down hospitals, or engineered pathogens cause epidemics? Will the growing use of drones or AI-driven vehicles diminish the capacity of armed forces personnel to perceive the enemy as fully human? “An arms race using all of the advanced technologies I’ve described,” writes Latiff toward the end of his book, “will not be like anything we’ve seen, and the ethical implications are frightening.”

Frightening indeed. A dark mood overcame me as I read these two books. It’s hard not to be impressed by the inventiveness of the weapons experts in their underground labs, but hard, too, not to despair at the way in which such ingenuity has been uncoupled from larger ethical imperatives. And one can’t help but be struck by the cool, acquiescent prose in which the war studies experts portion out their arguments, as if war is and will always be a human necessity, a feature of our existence as natural as birth or the movement of clouds. I found myself recalling a remark made by the French sociologist Bruno Latour when he visited Cambridge in the spring of 2016. “It is surely a matter of consequence,” he said, surprising the emphatically secular colleagues in the room, “to know whether we as humans are in a condition of redemption or perdition.”

The principled advocacy of peace also has its history, though it receives short shrift from Freedman. The champions of peace will always be vulnerable to the argument that since the enemy, too, is whetting his knife, talk of peace is unrealistic, even dangerous or treacherous. The quest for peace, like the struggle to arrest climate change, requires that we think of ourselves not just as states, tribes, or nations, but as the human inhabitants of a shared space. It demands feats of imagination as concerted and impressive as the sci-fi creativeness and wizardry we invest in future wars. It means connecting the intellectual work done in centers of war studies with research conducted in peace institutes, and applying to the task of avoiding war the long-term pragmatic reasoning we associate with “strategy.”

“I don’t think that we need any new values,” Mikhail Gorbachev told an interviewer in 1997. “The most important thing is to try to revive the universally known values from which we have retreated.” And it must surely be true, as Pope Francis remarked in April 2016, that the abolition of war remains “the ultimate and most deeply worthy goal of human beings.” There have been prominent politicians around the world who understood this. Where are they now?

* See Kenneth N. Waltz, “The Spread of Nuclear Weapons: More May Be Better,” The Adelphi Papers, Vol. 21, No. 171 (1981).


An Artist’s Menagerie

Childhood in France:

It was the summer of 1947; the war had ended only two years earlier and food was still fairly scarce. My adoptive parents brought me to a farm on an isolated mountaintop of the Morvan region in the center of France. I was nine, pale, and as skinny as a rail. Madame Durand, the farmer, was a wonderful woman. She would take me along when she milked her favorite cow, Blanchette, and she let me drink the milk fresh from the pail, warm, sweet, and foamy. Blanchette’s milk was more delicious than anything I had ever tasted.

When I was a little girl, I used to spend my summers in a village on the French Riviera, staying with my aunt Mada and an old Russian lady who raised goats and rabbits. She only spoke Russian and so did her goats. I soon learned the words you need in order to tell a Russian-speaking goat to go left or right—though there was something absolutely contrary about these goats, and likable as they might be, if you wanted them to go left, you had to say, “Go right!” and vice versa. I was told my Russian accent was terrible, but the goats seemed to understand anyway, and always went the right way.

Except for the fact that they are always clucking and flapping about, hens are not devoid of a certain shapely elegance. Some hens, in fact, are downright beautiful. If I lived in the country, I might like to keep a few hens and give them French names. Then, every morning, I would go looking for fresh eggs to eat “à la coque,” boiled and with a little salt, the way I did when I was young.

Summers in Vermont:

One day, a big white dog came lolloping down the meadow where I was painting in Vermont, and, with a contented sigh, settled himself right between me and the easel. I couldn’t continue without stumbling over him or splashing him with paint. And if I moved my easel, he moved with it. There was nothing to do but surrender and pack up for the day. The dog observed my defeat with a funny look—disdainful or disappointed, I couldn’t tell—and then lolloped off again without a backward glance. I never saw him again.

For years, we left New York to spend the summer in northern Vermont, where we had a cabin. One summer, a cat named Booties, which belonged to some folks in the village, got into the habit of visiting us daily. He appeared at the top of the road, then walked in leisurely fashion through our door. We were always delighted, even somewhat honored, to receive his visits. I have never known a more personable cat. He was very smart and dignified; our whole family was crazy about him. The next summer, when we came back, Booties was gone. He’d been killed by a car that winter, we were told. I was always forgetting he was gone, and kept looking toward the top of the road, expecting to see him.

Occasional meetings:

This seal belongs to the Brooklyn Zoo. She barks like a dog and even looks a bit like a dog. She also looks like an old man—Winston Churchill, specifically—and depending on her expression, like another, more contemporary politician who will remain nameless. She is a fantastic swimmer. She catches small fish and the children love to watch. Altogether, she seems content enough with her life, if a bit pensive.

In his Travels with a Donkey in the Cevennes (1879), Robert Louis Stevenson described a donkey better than anyone ever had. Modestine, as she was named, “was patient, elegant in form, the color of an ideal mouse, and inimitably small. Her faults were those of her race and sex; her virtues were her own.” I have always liked donkeys. I like how they look and I love to paint them. (I find donkeys, goats, and cats best to paint in general.) If I were an animal, I probably would not want to be a donkey as most of them have pretty rotten lives, but I feel that of all the animal kingdom, donkeys are my landsmen, my kin, and chances are I would be one.

Fabulous creatures:

There is something self-assured, brave, even gallant, about this bird. It doesn’t care what you think. All you need to know is that blue-footed boobies live on the coasts of Central and South America, mostly the Galapagos Islands. I have never met a booby in the flesh, but I would recognize one at once if we ever crossed paths.

The wolf in “Little Red Riding Hood” is not very smart. Why can’t he just eat Red Riding Hood right there in the woods? Why all the rigmarole about running to Grandmother’s house, and worse yet, donning her cap and nightgown? It’s completely unrealistic! Wolves are dignified and proud—but dangerous, too: you wouldn’t want to meet a pack of them as you crossed the frozen steppes of Russia alone in a troika. But from afar, there is much to admire about this untamed and beautiful beast.

Between France and America:

I found this portrait of a young badger the other day, and I thought he looked so appealing I decided to consult my French-English dictionary to see what this interesting animal was called in my native French: blaireau! Well, of course, I knew what a blaireau was, just as I knew what a badger was. But I hadn’t thought of either one in a long time, and I had never put the two together. Will the wonders of language and nature never cease?


Obama and the Legacy of Africa’s Renaissance Generation

Reuters/Obama For America/Handout: Barack Obama as a child with his father Barack Obama Sr., 1960s

It came to be a core belief held by the American public and media that Barack Obama was a self-creation who had stepped out of nowhere. In a racially divided society, for some the idea that he belonged to no tribe made it possible to vote for him. For his detractors, of whom Trump and his birther movement were the most visible, the belief provided an opportunity to claim that Obama was not a true American. Indeed, he cut a solitary figure: parents and American grandparents dead, no full siblings; what else there was of his family lived in Kenya, which might as well have been the moon to many Americans. Marriage to Michelle gave Obama what he appeared to lack, a family and a community, though his Kenyan ancestry meant he was a member of the African-American community by adoption rather than birthright.

Against the backdrop of the fantasy of normality to which American (and not just American) popular culture subscribes—that is to say, the insistence that all but a few grow up in the same town and live there all their lives—Obama’s story appeared unusual. The truth is that his grandparents made the move to Hawaii (after several moves around the country), doing what millions of Americans before them have done and continue to do: searching for better opportunities. One result is that families become stretched over distance and time until the links between uncles, aunts, cousins, and generations are broken and reformed with new generations in new places.

Even so, the stand-out fact of Obama’s biography remained and remains that he had been born of a Kenyan father and a white mother. “No life could have been more the product of randomness than that of Barack Obama,” wrote David Maraniss in his 2012 biography of the former president. This, though, is the case only when his life is viewed from an American perspective. From an African perspective, the tradition of sending young men to study overseas, as was the case with Barack Obama Sr., is a familiar and longstanding one. In 1852, William Wells Brown, the American playwright, fugitive slave, and abolitionist, noted that he might meet half a dozen black students in an hour’s walk through central London. Some sixty years before that, in 1791, the Temne King Naimbana (of what became Sierra Leone in West Africa) sent his son John Frederick to England, for reasons of political expediency (he sent another to France, and a third to North Africa to acquire an Islamic education). Tragically, John Frederick never made it home, but died on the return passage.

In the second half of the twentieth century, geopolitical events—the end of empires, the rise of nationalism in African countries, the cold war, communism, and the second “red scare”—would see an exponential rise in the numbers of Africans sent to study overseas. So the meeting of Obama’s parents came about more as the unintended consequence of political policy than by random chance. For me, Obama’s story is remarkably familiar. My parents met under very similar circumstances. My father was born in 1935 in Sierra Leone; Barack Obama Sr. was born in Kenya in 1936. My mother was white and British; Obama’s mother was a white American. Both women met and married the men who would become our respective fathers when those men were selected to study at university abroad—a story Obama relates only briefly in his memoir Dreams from My Father:

My father grew up herding his father’s goats and attending the local school, set up by the British colonial administration, where he had shown great promise. He eventually won a scholarship to study in Nairobi; and then on the eve of Kenyan independence, he had been selected by Kenyan leaders and American sponsors to attend a university in the United States, joining the first wave of Africans to be sent forth to master Western technology and bring it back to forge a new, modern Africa.

Obama was wrong about one thing: his father was not in the first wave of students sent overseas to master Western technology, though he was in the first wave of Kenyans who were sent to America. Up until then, most African students had been destined for Britain and, starting after World War II, to the Soviet Bloc and China. In fact, the adventures of this generation of Africans would one day inspire a genre of literature, collectively known as the “been to” novels, exemplified by Ayi Kwei Armah’s Fragments, No Longer at Ease by Chinua Achebe, and Ama Ata Aidoo’s Dilemma of a Ghost, fictions that told of the challenges both of leaving the motherland for the West and of return.


My father’s insistence that only a British boarding school was able to provide an education good enough for his children had me in tears at Freetown’s Lungi Airport three times a year as we waited to board the plane to London. My father was unyielding, reminding us constantly of the value of the enterprise we were undertaking and about which I didn’t care in the slightest. Paying for our education came before buying a house, before foreign travel, before everything. My father’s own story was both extraordinary and yet, in its own way, entirely typical of the changing times in which he was born. The son of a wealthy farmer and a regent chief from the north of Sierra Leone, Mohamed Forna had won a scholarship at an early age to Bo School, “the Eton of the Protectorate,” as it was known, many miles from home in the south of the country.

Aminatta Forna: Mohamed Forna, 1957

At the time, Sierra Leone was a British colony, though one that was never settled by whites, who, unable to tolerate the climate, died in such droves from malaria and tropical illnesses that the country was dubbed “the white man’s grave.” British fragility made a crucial difference to the style of governance Britain chose to adopt in West Africa. Instead of a full-fledged colonial government such as existed in Kenya, where the climate of the Highlands was suited to both coffee and Europeans, in Sierra Leone the guardians of empire relied instead on a system of “native administration.” Bo School was founded by the British for the sons of the local aristocracy, who, according to plan, would play a leading role in governing Sierra Leone on behalf of the British.

Generally, the British were cautious about allowing their colonial subjects much in the way of book-learning. The colonial project had begun with a great deal of hubris, talk of a civilizing mission and the belief that Britain could create the world in its own image. Education was a part of that mission. But by the time Lord Lugard, the colonial administrator and architect of native administration, became the governor of Nigeria in 1912, he was sounding warnings against “the Indian disease,” namely the creation, through education, of an intellectual class who would embrace nationalism. Burned by the threat of insurrection elsewhere in the Empire, though still intent on pursuit of an administration staffed by local talent, the British allowed a few Africans just enough education to create a core of black bureaucrats, but no more.

Sierra Leone’s beginnings were a little different from those of Britain’s other African holdings. In the late eighteenth century, British philanthropists had established settlements there of people freed from slavery, many of whom had fled from America to Britain following Lord Mansfield’s 1772 ruling that protected escaped slaves. As part of this social engineering experiment, schools and even a university were established in the capital, Freetown. Fourah Bay College, established in 1827, was the first institute of higher education built in West Africa since the demise of the Islamic universities in Timbuktu. Elsewhere in Britain’s African dominions, and in the early days of empire, most educational establishments were built by evangelically motivated Christian missionaries, and they were tolerated but not encouraged by the colonial administration.

In Kenya in the 1920s, precisely what Lugard feared began to happen: missionary-educated Kenyan men established their own churches and challenged white rule. The locals had a name for Western-educated Kenyans: Asomi. Harry Thuku, the father of Kenyan nationalism (whose story is narrated in Ngũgĩ wa Thiong’o’s tale of the Mau Mau rebellion, A Grain of Wheat) was one such. In their churches, Asomi pastors accused the missionaries of distorting the Bible’s message to their own ends and preached an Africanized version of Christianity, and the Asomi founded associations to represent African interests and built their own schools in which pupils were imbued with a sense of patriotism and pride.

Still, whatever resistance Britain’s Colonial Office offered to the idea of the educated native, by the later days of empire, faced with ever-growing demands for colonial reform, the British began to build a limited number of government institutions, with the intention, in the words of the Conservative minister Oliver Stanley in 1943, of guiding “Colonial people along the road to self-government within the framework of the British Empire.” Any future form of self-governance was intended to create the basis for neocolonialism and a bulwark against the threat of communism.

Shifts in British attitudes, however, were soon outstripped by African ambitions. One million African men had fought on the Allied side during World War II, and those experiences had broadened their worldview. Many had learned to read and write—among them, Obama’s grandfather, Onyango, who, according to Obama family lore, traveled to Burma, Ceylon, the Middle East, and Europe as a British officer’s cook. Whether Onyango knew how to read and write English before he was recruited is unknown; it is possible, though unlikely. By the time he came back, however, he was able to teach his young son his letters before sending him to school. In Dreams from My Father, Barack Obama recounts the memories of his grandfather related by Onyango’s surviving sister, Obama’s great-aunt Dorsila: “For [Onyango] knowledge was the source of all the white man’s power, and he wanted to make sure his son was as educated as any white man.”

Across the continent, emerging nationalist movements were gaining ground. For them, literacy followed by the creation of an elite class of professionals were the necessary first steps toward full independence. The courses on offer at the government colleges were restricted in subject and scope (syllabuses had to be approved by the colonial authorities) and the colleges themselves could admit only limited numbers of students. Energized and impatient, a new generation refused to wait or to play by the Englishman’s rules. With too few opportunities on the continent, they set their sights overseas, on Britain itself.

Few had the means to cover the costs of travel and fees. There were a limited number of scholarships available through the colonial governments, mainly to study subjects the local universities were not equipped to teach, such as medicine. A lucky few found wealthy patrons; others still were sponsored by donations from their extended families, and sometimes from entire villages. The Ghanaian nationalist and politician Joe Appiah, father of the philosopher Kwame Anthony Appiah, ditched his job in Freetown without telling his employers and bought himself a one-way ticket on a ship bound for Liverpool, hoping to get by on his luck and wit.


Aminatta Forna: The author’s parents, Mohamed Forna and Maureen Margaret Christison, on their wedding day, 1961

My mother Maureen has a particular memory of my father. On April 27, 1961, the day Sierra Leone became a self-governing nation, he got roaring drunk at a sherry party held by African students at the premises of the British Council in Aberdeen. The couple had married at the registry office in Aberdeen one month before, in a ceremony attended by their friends among the West African students. On the way home, on the top deck of the bus, my father lit six cigarettes and puffed on them all at once. “But Mohamed, you don’t even smoke,” my mother protested. And my father replied: “I’m smoking the smoke of freedom, man. I’m smoking the smoke of freedom.”

In the decades between the two world wars, Britain emerged as “the locus of resistance to empire,” where anti-colonial movements were shaped by the growth of Pan-Africanist ideals among artists, intellectuals, students, and activists from the colonies. The Kenyan writer and activist Ngũgĩ wa Thiong’o, recalling his arrival in Leeds in 1964, remarked to me:

For the first time I was able to look back at Kenya and Africa, from outside Kenya. Many of the things that were happening in Africa at that time, independence and all that, were not clear to me when I was in Kenya but made sense when I was in Leeds meeting other students from Africa, Nigeria, Ghana, students from Australia, every part of the Commonwealth, students from Bulgaria, Greece, Iraq, Afghanistan—we all met there in Leeds, we had encounters with Marx, with Lenin, and all that began to clarify for me a change of perspective.

Among those elites who gathered there, driven by, and driving, the desire for self-rule, were Jomo Kenyatta, Kwame Nkrumah, Michael Manley, Marcus Garvey, C.L.R. James, Seretse Khama, Julius Nyerere, as well as a number of African Americans, including Paul and Eslanda Goode Robeson. In London, anti-colonial and Pan-Africanist ideas were shared and enlarged, spurred by a shared experience as colonial subjects in their homelands and as the victims of racism and the color bar in Britain. “They were brought together too by the fact that the British—those who helped and those who hindered—saw them all as Africans, first of all,” writes Anthony Appiah. And so those who may previously never have identified themselves as such began to do so and explore the commonalities of race, racism, and nationalism. And out of those conversations arose new political possibilities involving international organizations and the opportunity for cultural exchange.

Arrival in Britain brought with it many shocks for the colonial student. Whereas before they were Sierra Leonean and Temne, Luo and Kenyan, Hausa and Nigerian, suddenly they were simply black, subject to all the attitudes and reactions conferred by their skin color. Signs declaring “No Irish, No Dogs, No Blacks” were still common on rental properties during my father’s time in Scotland. My mother told me of the insults my father endured in the street—directed at her as well, when they were together. Later, my father’s second wife—my stepmother, who also went to university in Aberdeen and vacationed in London, staying in the apartments of other African students—recalled the gangs of racist skinheads who arrived to break up their gatherings. “Somebody would run and call for the West Indians,” she told me, their Caribbean neighbors being more experienced in fending off such attacks. In a reversal of the immigrant dream story, Sam Selvon’s 1956 novel The Lonely Londoners depicts black arrivals of the 1950s who come in search of prosperity and a new life, only to discover cruelty and misery.

In order to confront the challenges of their new lives, as well as to keep abreast of political developments back home, the colonial students organized themselves into societies and associations. One such was the hugely influential West African Students’ Union, or WASU. If London was the heart of resistance, then WASU was its circulatory system. My father and his friends were all WASU members, as was every former student of that time from a West African country to whom I have ever spoken. WASU was the center of their social, cultural, and, especially, political life. It also “functioned as a training ground for leaders of the West African nationalist movement,” wrote the historian Peter Fryer; indeed, both Kwame Nkrumah and Joe Appiah were among the leading names who served on WASU’s executive committee.

Unnerved by the pace at which calls for independence were gathering, the Colonial Office kept a close eye on the students’ activities. In London, the department funded two student hostels, which aided the many students whom the color bar prevented from finding decent lodging (and also kept the students conveniently in one location). The civil servants also spied on the African students through MI5. A tug-of-war was taking place within the Colonial Office: on one side were the “softly-softlies,” who favored an approach designed to promote good relations with the future leaders; on the other were the hardliners, concerned that Communist ideas might take root among the rising generation. Such was the fear of Communist-inspired insurrection in West Africa that in those countries Marxist literature was banned and travel to Eastern European countries was restricted.

The colonial administrator Lord Milverton once described WASU as “a communist medium for the contact of communists with West Africans” through the Communist Party of Great Britain. Then-parliamentarian David Rees Williams even accused the Communist Party of using prostitutes to spread its message and called for restrictions on the numbers of students entering the country from the colonies. Though MI5 did not go so far as to keep individual files on all the students, they did do so for the most visible leaders like Nkrumah, whose phone they tapped.

Certainly, there were Marxist sympathizers among the WASU leadership and the African student body in general. Ngũgĩ wa Thiong’o talked to me about his road to Marxism, which began during his student years in Leeds, when he saw poor whites for the first time and witnessed, during student demonstrations there, white policemen turning on their own, a “vicious crushing of dissent.” Julius Nyerere turned to socialism during his time in Edinburgh, returning to Tanzania in 1952 to become a union organizer and later the first president of a new, socialist republic.

wasuproject.org.uk: Members of the West African Students’ Union (WASU), London, 1920s–1930s

By the 1960s, with the colonies gaining independence one by one, and China and the Soviet bloc beginning to offer their own scholarships, the softly-softly approach had prevailed within Britain’s Colonial Office. The administration of the students’ affairs was handed over to the British Council, which began a diplomatic charm offensive. Before they even left home, students on government scholarships were offered induction seminars on what to wear and how to conduct themselves in the homes of British people, and shown films on how to navigate the challenges of daily life. In one of these films, entitled Lost in the Countryside, a pair of Africans abroad (dressed in tweeds, they emerge from behind a haystack) are instructed firmly: “Do not panic! Find a road. Locate a bus-stop. Join the queue [and there in the middle of nowhere is a line of people]. A bus will arrive. Board it and return to town.” Once the students were in the UK, the British Council arranged home-stays for those Africans who wanted an up-close experience of the British (some 9,500 said they did). My stepmother recalls being advised never to sit in the chair of the head of household, a faux pas of which she has retained a dread all her life.

And finally, there were social events at the Council’s premises in various British cities. At a Christmas dance in the winter of 1959, my father, a third-year medical student at Aberdeen University, approached a young woman, a volunteer named Maureen who was helping to pour drinks for the party, put out his hand and said: “I’m Mohamed.”


If the attitude of the British authorities toward the West Africans was one of wavering welcome, the attitude toward the East Africans, Kenyans in particular, was even more complicated. In 1945, there were about 1,000 colonial students in Britain, two-thirds of whom came from West Africa and only sixty-five of whom came from East Africa. In Kenya, a simmering mood of rebellion had by the 1950s given rise to the Mau Mau, a movement that explicitly rejected white rule and gave voice to the resentment against colonial government taxes, low wages, and the miserable living conditions endured by many Kenyans. The Mau Mau, which found its support mainly among the Kikuyu people who had been displaced from their lands by white farmers, demanded political representation and the return of land. Facing armed insurrection, in 1952 the British declared a state of emergency and tried and imprisoned Jomo Kenyatta, the nationalist leader (and future first president of Kenya) who had returned to his homeland from London in 1947.

Upon Kenyatta’s imprisonment, Kenyan nationalists turned to the United States for support. The activist Tom Mboya, a rising political star who in 1960 featured on Time magazine’s cover as the face of the new Africa, became the strongest voice calling for independence in Kenyatta’s absence. In 1959, Mboya began working with African-American organizations—in particular, the historically black private and state colleges, as well as civil rights champions such as Harry Belafonte, Sidney Poitier, Jackie Robinson, and Martin Luther King Jr.—and toured the United States talking about black civil rights and African nationalism as two sides of the same coin. His aim was to raise money for a scholarship program to bring Kenyan students to the US. Over two months, Mboya gave a hundred speeches and met with then Vice President Richard Nixon at the White House. By that point, independence for Kenya was a matter of when, not if—after all, Ghana had already attained independence—and it looked very much as though Britain was deliberately refusing Kenyans the help they needed to prepare for self-governance.

So here was Mboya offering the United States a foothold of influence in Africa, which Britain, even against the backdrop of a cold war scramble for the allegiance of African nations, was too churlish or too arrogant to secure. Although Nixon stopped short of agreeing to meet Mboya’s request for help, John F. Kennedy, the Democratic candidate in the 1960 presidential election, did, and his family’s foundation donated $100,000 to what became known as the “African student airlifts,” the first of which had taken place in 1957.

Mboya was a member of the Luo people, a friend of Onyango’s, and sometime mentor to Onyango’s son, Barack Obama Sr. On his own initiative, Obama Sr. had managed to secure himself an offer from the University of Hawaii, and this won him a place on a later airlift in 1959. Here was a young man with an excellent brain, and here, too, was a new dawn on the horizon bringing with it a new country—Obama Sr. saw himself as part of it all. The writer Wole Soyinka, who had himself studied at Leeds in the 1950s, had a name for them, the young men and women who came of age at the same time as their countries: he called them the “Renaissance Generation.”

Evening Standard/Getty Images: Jomo Kenyatta, the first president of Kenya, with Ghanaian Prime Minister Kwame Nkrumah at the Commonwealth Prime Ministers’ Conference, Marlborough House, London, 1965

Just as the West African students bound for Britain had been coached in what to expect, so the Kenyans were briefed on arrival in the United States, including on the prevailing racial attitudes they could expect to encounter there. The world-renowned anthropologist Mahmood Mamdani, now director of the Makerere Institute of Social Research, who traveled to the US on a 1963 Kenyan airlift, recalls being told it would be “preferable for us to wear African clothing when going into the surrounding communities because then people would know we were African and we would be dealt with respectfully.” Under colonial rule, Kenyans certainly did not share the privileges of whites; even so, for many African students the daily indignities of racial segregation in America came as a shock. At least one was arrested for trying to buy a sandwich at a whites-only lunch counter, and some of those studying at universities in the South were prompted by their experience of Southern racism to ask to be transferred to Northern colleges. As had been the case for their counterparts in Britain, a close eye was kept on their activities. Returning from a trip to Montgomery, Alabama, Mamdani got a visit from FBI agents; he recalls that they asked if he liked Marx, to which Mamdani replied in perfect innocence that he had never met the man. Informed that Marx was dead, he replied: “Oh no! What happened?” And as he told me in our conversation many years later: “The abiding outcome of that visit was that I went to the library to look up Marx.”

Obama Sr.’s choice of the University of Hawaii was, in many ways, an unfortunate one. Though Hawaii was more cosmopolitan than other parts of the United States, and he at least escaped some of the racist attitudes that confronted other African students, he was far from all the debates, meetings, lobbying, and activism about independence that were taking place at the universities and historically black colleges on the mainland. When the opportunity arose, he chose to continue his studies at Harvard—and part of the reason was undoubtedly that he wanted to get closer to the action. In 1961, Kenyatta was released from jail; two years later, Kenya declared independence. When all that happened, Obama Sr. was still a long way from home—just as my father was when Sierra Leone won its independence.

In time, Ngũgĩ would return from Leeds, and Mamdani from the United States. Ngũgĩ was by then a published author, having abandoned his studies to write Weep Not, Child. Mamdani went on to teach at Makerere University, which had been the venue for the famous 1962 African Writers’ Conference, and he helped to transform it from a colonial technical college into a vibrant university. One of the few women on the airlift, Wangari Maathai, flew back home from Pittsburgh in 1966, later to found the Green Belt Movement, an initiative focusing on environmental conservation that today is credited with planting fifty-one million trees in Kenya and for which Maathai would be awarded a Nobel Peace Prize. Still, for Kenya, as for every one of the new African nations, independence proved a steep and rocky road. Five hundred students who had earned their degrees overseas returned home, a significant proportion of them the American-educated Asomi. They would become the educators, administrators, accountants, lawyers, doctors, judges, and businessmen in the new Kenya. Despite the best efforts of Tom Mboya and his supporters, Kenya had only a fraction of the college-educated young professionals it needed.


Aminatta Forna: The author with her father, 1966

Eight years after he had left Sierra Leone, my father returned. His elder brother had died and his family wrote that Mohamed was needed at home. By then, he was a qualified medical doctor, with a wife and three children. The year before, Obama Sr. had also returned home after the US government declined to renew his visa. Medical students and those who went on to higher degrees, especially, had found themselves away for long periods, as much as a decade. Unsurprisingly, in that time, many of the men had formed romantic attachments with local women. If those relationships were frowned upon in Britain, they were illegal in much of America. Loving v. Virginia, the case before the Supreme Court that finally overturned the ban on interracial unions, was not decided until 1967. When the Immigration and Naturalization Service declined Obama Sr.’s request to remain in the country, his relations with women were reported to be part of the problem. Already, he had fathered one child with Ann Dunham, a son also named Barack, but that marriage was over, and he had formed a new relationship with another white woman, Ruth Baker.

In Britain, the authorities, though they did not encourage such unions, did not intervene, except, notably, in the marriage of Seretse Khama, heir to the Bangwato chieftaincy in Bechuanaland (now Botswana), to Ruth Williams. This was at the behest of white-ruled South Africa, whose government would not tolerate an interracial marriage within its borders. Jomo Kenyatta had a child, Peter, with his British wife. I used to pass Peter in the corridors of the BBC, where for a time we both worked; he was in management, while I was a junior reporter awed by the prestige of his last name. The marriage of Joe Appiah to Peggy Cripps, the daughter of the Labour politician Sir Stafford Cripps, was one of the most high-profile unions of the day that also happened to be a mixed marriage.

Of Ann Dunham, first wife to Obama Sr. and mother of the future president, a childhood friend would later say: “She just became really, really interested in the world. Not afraid of newness or difference. She was afraid of smallness.” The same could be said of my mother, Maureen Christison. Aberdeen was simply too small for her. The African students represented a world beyond the gray waters of the North Sea. In the Scottish writer Jackie Kay’s Red Dust Road, her 2010 memoir of her search for her Nigerian father, who studied in Scotland in the 1950s, her father overturns conventional wisdom by remarking on how popular the male African students were with the local girls. The men frequently came from aristocratic families—both Appiah and Khama were royal, and my father was the son of a regent chief and landowner. “You must remember,” a contemporary of my parents observed during the time when I was researching my own memoir of my father, “they were the chosen ones.”

In 2017, in a New York Times op-ed assessing President Obama’s foreign policy legacy, Adam Shatz described Obama as “a well-traveled cosmopolitan… seemingly at home wherever he planted his feet. His vision of international diplomacy stressed the virtues of candid dialogue, mutual respect and bridge building.” Obama’s cosmopolitanism had several roots. The fact of his Kenyan father (though not his immediate influence, since Obama Sr. was gone from the family before Obama was old enough to remember him), and later his painstaking search to assemble the pieces of his birthright, did much to extend his vision. But before all of that, it was his mother, Ann, who instilled in him the foundations of his internationalism. She rehearsed for her son the version of his father’s story that Obama Sr. told of himself: that of the idealist devoted to building a new Kenya, even though in reality he was an unreliable husband and father whose career fell well short of his own expectations. It was Ann who remained true to that vision of a new world, who made friends easily with people of different nationalities, and who subsequently married an Indonesian and took her son to spend a formative period of his childhood in Indonesia, where she ran development projects for many years. My mother Maureen never returned to Scotland after the break-up of her marriage to my father. She married again, to a New Zealander who worked for the United Nations, and spent her life moving around the world, in time building her own international career within the UN.

Both women entered an international professional class, a group that the British historian David Goodhart disparagingly describes as the “anywheres”: people whose sense of self is not rooted in a single place or readymade local identity. If Obama’s search in Dreams from My Father was a quest for his African identity, it was also, and conversely, an attempt to discover whether he could ever be a “somewhere,” whether that somewhere was a place (in time, he would choose Chicago) or a people, part of an African-American community.

His next book, The Audacity of Hope, became, by contrast, a plea for complexity. Of his extended family of Indonesians, white Americans, Africans, and Chinese—in which I find a mirror for mine: African, European, Iranian, New World, and Chinese—Obama writes: “I’ve never had the option of restricting my loyalties on the basis of race or measuring my worth on the basis of tribe.” Obama knew and understood that he had more than one identity, that all of us do. Anthony Appiah credits his own avowed cosmopolitanism to his father Joe’s relaxed way with people from different worlds. I believe my father thought that his children would grow up to be both Sierra Leonean and British, a new kind of citizen, a new African, comfortable with our place in the world.

Kwame Anthony Appiah: Peggy and Joe Appiah with their children, Ghana, circa 1972

For all the hope, there were bitter disappointments as well. Shortly after Obama Sr. returned to Kenya, his mentor Tom Mboya was assassinated. Obama Sr. would lose himself to drink and die in a car crash. My father arrived back in Sierra Leone to a government openly talking of introducing a one-party system, a threat to his democratic ideals. As politically opportunistic leaders across the continent quickly realized how easily the newborn institutions of democracy could be subverted to personal gain, the returning graduates would find themselves forced to confront the very governments they had come home to serve. In Ghana, Joe Appiah was jailed by his former good friend Nkrumah; Ngũgĩ wa Thiong’o would be imprisoned for sedition against the Kenyan government and then exiled; in Nigeria, Soyinka encountered a similar fate. My father was jailed and killed. Many would pay a high price for the privilege of having traveled beyond Africa, for coming of age at the same time as their countries, for working and dreaming of a Renaissance yet to come.

How many times in my own travels in this world have I come across one of them, the chosen, of my father’s generation? There’s a quality of character they wear, whose origins I have come to understand. They carry, alongside a worldly ease, a sense of duty, of obligation and responsibility, that imbues all they say and do. Unlike the generations that followed, they never saw their own future beyond Africa. I try to imagine an Africa if they had never been, and I cannot. There are those the world over who decry the failings and weaknesses of the post-independence African states at the same time as many in the West—after Afghanistan, after Iraq, and facing assaults on their own democratic institutions—have slowly come to the realization that nation-building is no simple task, that democracy takes more than a parliament building. The generation of Africans to whom the task fell of creating new countries knew, or came to know, that alongside the desires and dreams, and the promise of a new-found freedom, they had been set up to fail. Their real courage lay in the fact that they did not surrender, that they tried to do what they had promised themselves and their countries they would. They went forward anyway.