Sunday, June 14, 2015

The Old Standbys

Here are my two most recent publications (or one of them, anyway):

I reviewed volume one of Zachary Leader's The Life of Saul Bellow for the good people over at The Hudson Review (where, in a nice twist, Bellow published some of his earliest work). Unfortunately, it isn't available online, but the Spring edition is in well-stocked bookstores across our great land.

You can, however, read my little five-paragraph memoir of growing up with Jonathan Franzen's How To Be Alone, which my friends at The American Scholar kindly invited me to write for their new online series "Reading Lessons."

Lots of fun.

Thursday, May 21, 2015

Words, Words, Words

One of the few apps on my smartphone is the Merriam-Webster dictionary app. It's perhaps the first innovation that has caused me to truly marvel at the speed with which technology is progressing. Unlike my parents, I haven't yet witnessed leaps of the kind that begin with IBM punch cards and end with iPads; by contrast, the path that leads from the first PC we owned, a boxy Compaq, to my trusty MacBook feels natural and gradual.

The Merriam-Webster app is different. Just ten years ago, when I was a freshman in high school, I was required to lug the one-pound dictionary to English class at least once a week. Now that same book, which must have made us 15-year-olds appear like so many laborers building the Pyramid at Giza, fits in my pocket, along with my photos, my correspondence, my voice messages, my calendar, and my music. It's easy to imagine Samuel Johnson gawping at the sight: the language itself at one's fingertips, wherever one goes.

Aside from its size, the app's handiest feature is a "favorite" button that allows me to flag words for later recall. I find myself adding to this list almost daily. It's especially useful when my subway reading strands me upon the rocks of "infrangible" (adj: not capable of being broken or separated into parts) or "epigone" (n: follower, disciple; also: an inferior imitator). I look up a word, click the little star button at the bottom of the entry, and then put my phone away, buzzing with the warm satisfaction of having expanded my vocabulary.

But have I? Somewhere in my childhood desk in St. Louis, there sits a sheet or two of looseleaf paper covered in definitions copied from that high school dictionary. I filled these pages when I was newly ambitious of becoming a writer, and I had read somewhere that one author or another had made a habit of writing down the definitions of words he didn't know as he encountered them, thus enriching his prose. I pursued this Franklinesque exercise in self-improvement for at most a few weeks, but to this day, I can still remember the definition of one word I copied onto those sheets: "crepuscular" (adj: of, resembling, or relating to twilight), which I encountered in the first chapter of Jonathan Franzen's The Corrections. It isn't a word I deploy on a regular basis, but when I do use it, I feel the special gratification of education put to work.

I can't say the same about the words I accumulate on my phone. The speed with which I can stow them gives me little chance to absorb their meaning; the ease with which I do so deters me from revisiting them immediately--they'll always be there, so what's the hurry?

Of course, they won't always be there. Lately, my phone has displayed some of the early symptoms of its inevitable demise: a less robust battery life, a slower loading time, the occasional freeze-up. And if the thing dies without me having recorded my list of words, I'll be back where I started, as ignorant of the definitions of "inculpate" or "chatelaine" as ever. Realizing this the other day, I decided that I'd better store these definitions somewhere more permanent. I'd better get them down on paper.

Thursday, April 23, 2015

The Novelist’s Eye View of Conservation and Climate Change: An Exchange

I so thoroughly enjoyed my exchange with Andrew Ivers that I decided to rope another one of my blogging friends, Conor Gearin, into a back-and-forth. This time our topic was Jonathan Franzen's recent New Yorker essay about global warming--an ideal subject to discuss with Conor, a biology and English literature major about to graduate from Truman State University. He's off to MIT to study in their science writing program, a perfect home for his deep appreciation and understanding of both science and the humanities. Conor contributed two entries to this post; my laggardly self offered only one. Conor blogs at Notebook, which is also the title of the wonderful column he wrote for the Prep News at St. Louis U. High, where our friendship began.


Science is about skepticism, and that’s what makes it so valuable, beautiful, and hard. It fits well into an essay when things appear as tidy opposites—when just a small cost increase would buy special glass to reduce bird window-strike mortality, but the bad men won’t buy it, to cut costs. But I work with an ornithologist studying window-strikes. He’s skeptical about the techniques currently marketed to reduce such bird deaths, and hopes to experiment with them to find out if any truly work. It might turn out that the one act we think would restore balance to Middle Earth won’t save a single hobbit.

In “Carbon Capture,” a recent piece in the New Yorker, novelist Jonathan Franzen offers a corrective to our “Puritan” focus on climate change to the detriment of more immediately impactful conservation efforts. His opening gambit is convincing: when the Vikings built their new glass-walled football stadium, the Minneapolis Star Tribune’s Jim Williams argued that it was worthless to worry about the thousands of birds projected to die from striking the glass walls when “the real threat to birds was climate change.” Williams seemed to be drawing from a contemporary report from the National Audubon Society stating that climate change was the “greatest threat” to American birds, even though the apocalyptic date for this threat was 2080. Arguments like this allowed the sponsors to decline increasing the cost of the project by one tenth of one percent to install specially patterned glass that could have helped prevent bird deaths.

As a polemic, Franzen’s essay is successful in getting his readers out of the rut of worrying about climate change and forgetting about the issues close at hand. It’s not that we shouldn’t worry about climate change—it’s that not driving to get groceries doesn’t get you five minutes less time in climate change purgatory. You should focus on “helping something you love, something right in front of you, [where] you can see the results”—for the sake of our own inner peace, and for the birds.

But first, a word of reproach. In criticizing the Audubon’s climate change PR tactics, Franzen puts scare quotes around “citizen science data,” as well as “report,” and scare quotes seem to be implied around ‘scientists’ in the phrase “its own scientists.” While it is true that a report from a nonprofit is different from peer-reviewed literature, Franzen would do well to remember that those individuals are actual scientists, and that online citizen birdwatcher data—often collected by people with more experience than he has—has made possible continental and global population monitoring.

Being someone “who cares about birds more than the next man” does not qualify one to make sweeping generalizations about bird biology and responses to climate change. Ornithologists with PhDs are not exempt from this—the only difference is that there’s more pressure on an ornithologist to cite her sources and provide convincing evidence than there is for a science journalist. Franzen’s claim, early in the essay, that “North America’s avifauna may well become more diverse,” is the sort of thinly supported assertion that only a polemic can get away with. The range expansions from climate change we have seen are not good ones, not ones that increase biodiversity: mosquitoes moving up mountains and armadillos tearing up the southern United States. Franzen conveniently overlooks the utter havoc that would likely result if tropical species expanded their ranges into temperate North American ecosystems, and the possibility of next-to-irreversible loss or alteration of northern habitats that depend on particular climatic conditions.

A dose of humility, in other words, would be refreshing.

If it matters, I agree with Franzen that some models based on range-shifts have overestimated the impacts of climate change on bird populations. But I would only go so far as the cautious correctives written by those with the data: Dawson and colleagues in 2011, or Millar and Herdman in 2004, for example. Vertebrate responses to climate change was the topic of a literature review I wrote for a class last spring. I cited 17 sources in that paper, and still sometimes think of how my teacher, a seasoned ornithologist and evolutionary biologist, suggested I should have included more studies, more systems. I do wonder how many peer-reviewed articles Franzen read for his essay, and how many he thought were enough.

Later in the piece, after his arrogance plays out, Franzen stays and listens to longtime conservationist Daniel Janzen. This tropical ecologist acts as a sort of doppelgänger for Franzen—instead of devoting his life to fiction like Franzen, Janzen and his wife Winnie Hallwachs have spent “nearly half [their] lives” creating the Área de Conservación Guanacaste (ACG) in Costa Rica, a massive reserve that includes much of the tropical dry forest remaining in the world. (Aside: Franzen makes the comment that “the forest in Santa Rosa seemed desperately dry to me,” aware that he is visiting a dry forest in the dry season. He next concludes all cows in Scotland are brown.) Listening to Janzen’s stories about the many struggles of ACG—“the story of Oliver North’s airstrip for the contras … the story of Janzen’s discovery that dry-forest moth species spend part of their life cycle in humid forest, and how this led him and Hallwachs to expand the scope of their already ambitious project … the story of how Janzen and Hallwachs learned to do business with multiple landowners simultaneously”—leads Franzen to his most admirable idea: that conservation work “is novelistic.” It is about narrative, and “no narrative is simple.”

My time in the company of ecologists could be reduced to this elegantly simple idea. The traditional scientific manuscript format is also known as a “narrative report”—it tells a complex story. Here’s what we thought at first, here’s where we are now, and here’s all the weird shit that happened in between. And the usually unspoken reality is that ecologists fall in love with their subject, and that this and only this can sustain a worthwhile career. It is this close association between fiction writing and conservation that allows me to continue on a trajectory towards becoming a full-time biologist rather than a full-time writer, and I thank Franzen for his flawed but beautiful reminder of what it is I think I’m doing with my life.

But if I see him, the first thing I’ll say is that you don’t “census” birds, you survey them. (You can never detect every individual.) It matters.



You have the virtue of being both scientist and writer; I can lay claim only to the latter title. Still, as someone who has spent more time thinking about Franzen than the next man (har har), I was intrigued by his argument that our fixation on the “eschatology” of climate change has come at the expense of more traditional conservation efforts—that, in environmental terms, we’ve allowed anxiety about the future to license indifference to the present.

The essay bears many Franzonian hallmarks: a certain arrogance, which you have deftly identified; a seductive either/or argument that suffers under scrutiny; a rueful faith in mankind’s ability to somehow stumble through the end times; and a tendency toward contrarianism.

Along with his significant novelistic success, these are the things that drive people nuts about Franzen. As an unabashed fan, I admit to wishing he would avoid the clumsy scare-quoting that you identify. From what I can tell, Franzen is a pretty knowledgeable birdwatcher, and he was also a devoted high school science student. You’re right: he ought to know better than to denigrate researchers whom he should probably consider allies. (A separate bad-science question: Franzen quotes Don Alberto, a leader of an indigenous Colombian community, as saying that the sun feels hotter to him in recent years. Franzen accepts Don Alberto’s testimony, without question, as a sign of a warming planet. That’s not really how climate change works, is it?)

I think the weakest part of the essay is the either/or argument; a professor responding to Franzen’s essay in the April 20th New Yorker called Franzen’s opposition between conservation and climate change a “false choice.” Your post gets at the same point, and it confirmed some of the doubts that arose in my mind as I read the essay. Is there really demonstrable evidence that climate change has diverted resources that would have once gone to what Franzen conceives of as “pure” conservation activities? Is there really a division between the two?

The nut of Franzen’s argument is that such a cleavage does exist: that climate change is a “done deal,” and that, in the absence of the kind of intergovernmental efforts that are required to make a real dent in climate trends (sorry, Prius drivers), we ought to do more to mitigate its immediate effects. As Franzen puts it, “We can dam every river and blight every landscape with biofuel agriculture, solar farms, and wind turbines, to buy some extra years of moderated warming. Or we can settle for a shorter life of higher quality, protecting the areas where wild animals and plants are hanging on, at the cost of slightly hastening the human catastrophe.”

This dichotomy in turn rests on an argument about human nature, an argument that I found compelling. Basically, it goes like this: we now know that climate change is so large a problem that only massive governmental action can make a difference. This knowledge, coupled with the abstract quality of climate change that Franzen rightly identifies, makes us somewhat fatalistic about the prospects of reversing the effects of global warming: Governments aren’t likely to act, so what can I meaningfully do about rising tides and melting glaciers? The answer, of course, is nothing—and the cruel joke is that my indifference is precisely what makes my government less likely to act (and the American government is the key player here; without meaningful action by the United States on carbon emissions, the international community isn’t likely to do much). As Franzen writes, summarizing Dale Jamieson’s book “Reason in a Dark Time,” “…America’s inaction on climate change is the result of democracy. A good democracy, after all, acts in the interests of its citizens, and it’s precisely the citizens of the major carbon-emitting democracies who benefit from cheap gasoline and global trade, while the main costs of our polluting are borne by those who have no vote: poorer countries, future generations, other species. The American electorate, in other words, is rationally self-interested.”

This is a cruel spiral in which to be trapped, and if my own experience is any indication, Franzen is right about the despair that this helplessness can engender. In high school, when I’d been reading Elizabeth Kolbert’s landmark New Yorker essays about the impending catastrophe of climate change, I believed that if we all just recycled and drove hybrids, everything would be all right. Today, I understand the utter futility of that position. (“The problem here,” Franzen says, “is that it makes no difference to the climate whether any individual…drives to work or rides a bike.”) I still recycle, but without the hope that used to accompany dropping my completed New Yorkers into the blue bin.

I don’t think I’m alone in feeling powerless against climate change, and this is what makes Franzen’s argument so hypnotic. I agree with you that the high point of the essay—the point where Franzen, as one of our leading novelists, has something truly unique to contribute to the discussion around environmental decline—is Franzen’s discussion of the relative narrative characteristics of conservation and of the fight against climate change. This is the kind of language that my English major self finds too often lacking in our scientific discourse (and, for what it’s worth, it’s the kind of language that you are uniquely equipped to bring to our scientific discourse, my multi-disciplinary friend).

Climate change, Franzen argues, is a “story [that] can be told in fewer than a hundred and forty characters: We’re taking carbon that used to be sequestered and putting it in the atmosphere, and unless we stop we’re fucked.”

This is the story I’ve succumbed to since high school. It’s both harder and easier to tell oneself this story, at once depressingly hopeless and perversely liberating. “Climate change is everyone’s fault—in other words, no one’s,” Franzen says.

But I wonder if Franzen is too quick to wave the flag of surrender. Of the three letters that the New Yorker printed in its April 20th issue in response to the essay, one from Jane Alexander, an Audubon board member, caught my attention in particular. Alexander contends that “Climate change is not, as Franzen writes, an abstract idea. Each of us experiences global warming as a local, and visceral, phenomenon—as drought, typhoons, snow, melting ice, or rising tides.”

This story—the story of climate change as indeed my problem—is the story that is lost as soon as we become too sanguine about climate change as a fait accompli. By rehearsing the fait accompli—and, it should be noted, by interviewing only people in South America—Franzen inadvertently reinforces it. The alternate tale—that no, I’m not off the hook, and that yes, climate change is touching my life in immediate, painful ways (in, for instance, the ways it affects the birds that nest in my back yard)—has yet to be told successfully, Al Gore’s efforts notwithstanding. But that doesn’t mean we should stop trying to tell it.



You’re right to point out that scientific prose often lacks storytelling quality. But in most cases, this is intentional—scientists’ skepticism makes them very cautious about adopting a narrative to explain the data. Only with extensive testing and independent confirmation do they begin to believe in their own stories. Certainly, without hunches, no one would try anything new. But unlike you and me (I don’t feel like I’ve earned my science stripes quite yet), these people ruthlessly poke holes in their own ideas for the sake of trying to get at realities outside themselves. These realities can be counter-intuitive, boring, or bizarre. Often, telling a story too early, even a good one, can lead to a wrong conclusion.

The task of a science communicator writing for a broad audience, I suppose, is to tell a story about what we know so far, or how we’ve gotten where we are, without trampling over the uncertainty that remains. And while narrative can be crucial, I think the currency of science communication is metaphor—that bridge between what readers know and what they are about to imagine. As a poet and a science journalist, I’m intensely interested in this bridge.

And I think you have a great point about the essay that still needs to be written about climate change in our backyards. I wonder if you’d agree with me that the reasons for this lack have to do with our imperial economics. As with debilitating manual labor, we’ve outsourced the current effects of climate change to developing nations and impoverished regions: the tropics, Siberia, the Lower Ninth Ward of New Orleans. Its most immediate effects (increased storm frequencies, higher average temperatures, and rising sea levels) are not felt in the temperate climate of Franzen’s backyard, but are felt in real ways on the margins—in latitudinal extremes and the tropics. So even though science doesn’t work that way, perhaps Don Alberto does know more about climate change at the level of lived experience than you or I.

Wednesday, April 8, 2015

Imperial Presidency, Incorrigible Congress

Last week, I wrote in favor of the preliminary nuclear accord that the United States and other powers have reached with Iran. The tentative agreement is a heartening reminder of what diplomacy can achieve when parties keep larger interests in sight and negotiate in good faith. The accord is not without its significant risks, of course—diplomacy takes as much, if not more, courage than war—but it deserves to be welcomed as a significant step toward peacefully resolving what has been one of the most destabilizing issues in international relations.

Unfortunately, the breakthrough in Lausanne was quickly followed by the usual tawdriness in Washington, D.C. On this count, the Obama Administration deserves its share of blame. When The New York Times’s Thomas Friedman asked the President about Congress’s role in approving a final deal with Iran, Obama replied, “My hope is that we can find something that allows Congress to express itself but does not encroach on traditional presidential prerogatives and ensures that if in fact we get a good deal that we can go ahead and implement it.” Surely the President, as a scholar of Constitutional Law, is aware that Article I grants Congress powers far greater than self-expression.

But Obama answered the way he did in large part because he is the prisoner of larger forces. To see what I mean, consider the case of Senator Bob Corker, a Republican from Tennessee. Corker has been trying to gather support for a bill that would force President Obama to send any final Iran agreement before Congress for approval. He appears to be acting in good faith; as the Times reported last week, Corker is one of the last of the old-fashioned dealmakers in the Senate, a statesman who “sees his mandate as criticizing the president for his failings but searching for a way to forge agreement.” Corker senses the real good that could come from a deal, but he also wants to reassert Congressional primacy in the age of the expanding presidency. His sentiment is welcome: in the post-9/11 years, the presidency run rampant has given us the invasion of Iraq, endless drone strikes, and a secret program of detention and torture.

The problem is that most of Corker’s colleagues aren’t like him. They’re interested in the president’s failure rather than the nation’s success. The likes of Ted Cruz would vote against the bill out of partisan self-interest rather than national disinterestedness. And it isn’t just Republicans who are playing politics; Chuck Schumer, the presumptive successor to departing Minority Leader Harry Reid and a staunch supporter of Israel, has expressed his support for the Corker bill. 
And so we've reached the all-too-familiar impasse where the imperial presidency meets the incorrigible Congress, the disheartening juncture where the good intentions of the few Bob Corkers are used for the cynical ends of the far more numerous Ted Cruzes. The only solution is mutual withdrawal, but the circumstances that have given rise to this situation—the expanding bureaucracy, the widening division between urban and rural, the undue influence of wealthy activist donors—are complex and not easily unwound.  
What follows is drearily predictable: Congressional grandstanding (rather than honest debate); presidential recalcitrance; and, if the deal survives the final round of negotiations (hardly a foregone conclusion), a deepening of the nation's partisan divisions.

Nobody wins.

Friday, April 3, 2015

A Small Victory

“Peace comes dropping slow,” wrote W.B. Yeats.

In terms of advancing peace in the Middle East, yesterday’s preliminary nuclear accord between the United States and Iran is just that: a drop. At a time when Saudi Arabia and Egypt intensify their campaign against the Shiite Houthi rebels in Yemen, the Islamic State clings to its territory on the Iraqi-Syrian border, Libya collapses into civil war, and the spiral of sectarian conflict coils ever more tightly in Syria, a tentative framework between Iran, the U.S., and five other world powers is hardly the “game-changer” that it would have been in the Middle East of five years ago.

Mohammad Javad Zarif, Iran’s lead nuclear negotiator, downplayed the deal’s implications beyond nuclear policy. “Iran-U.S. relations have nothing to do with this,” he said after the deal was announced. “This was an attempt to resolve the nuclear issue.” He added, “We have serious differences with the United States.”

Zarif’s comments are in part designed to calm the hard-liners who will be alarmed by this deal, which outlines the parameters of a more detailed agreement to be finalized by the end of June. President Obama will have to reckon with his own hard-liners, the Congressional skeptics like House Speaker John Boehner, who called the prospective deal “alarming.” This is to say nothing of the massaging that Israeli Prime Minister Benjamin Netanyahu—who lately has been acting as if his office entitles him to a binding say in American foreign policy—will require in the days ahead.

What galls the critics is that this deal is risky. Their basic complaint is that the deal does not guarantee with one-hundred percent certainty that Iran will no longer pursue a nuclear weapon. (If you ask John Bolton, only bombing Iran can provide that kind of assurance.)

They are correct, but a pact between Iran and the U.S. was never going to totally eliminate Iran’s nuclear capabilities; the Ayatollah would never have allowed his government to take a seat at the table if the Americans had demanded that kind of absolute reduction. What John Kerry and his counterparts have instead hammered out is the stuff of diplomacy: an agreement, a compromise—and it seems like a surprisingly good one, all things considered.

As the talks have unfolded, President Obama has been given to quoting John F. Kennedy’s dictum that we should “never negotiate out of fear, but let us never fear to negotiate.” That wonderful Ted Sorensen chiasmus, at face value full of Cold Warrior swagger, bears a subtler point: diplomacy is often frightening; it involves trusting adversaries and sacrificing smaller interests for the sake of larger ones.

This is a fundamental truth of diplomacy, and yet in this instance, critics seem to have taken it as proof that we shouldn’t be negotiating in the first place. They argue that President Obama has sold out, given too much, shown himself to be weak. A stronger president would demand more; if the negotiations failed, he or she would ramp up sanctions, further isolating Iran; if Iran then calculated that it might as well develop a nuclear weapon, this heroic president would be free to attack. It’s all symptomatic of our American tendency to equate war with strength and diplomacy with weakness.

Force can never entirely be ruled out as a tool of foreign policy, but it should hardly be our first option where Iran is concerned, not when President Rouhani has expressed a desire to ease relations with the West; not when the Ayatollah has allowed his representatives unprecedented freedom in their dealings over the nuclear program; not when Iran’s willingness to repeatedly extend negotiation deadlines has shown the extent to which it wants an agreement, even one with an exceptionally invasive inspection protocol.

President Obama has demonstrated the strength that diplomacy requires. He's accepted the risk that Iran is negotiating in bad faith. Against doubt and criticism, he has steered the U.S. to the verge of a peaceful accord with Iran over its nuclear program. It is only a drop of peace, yes. But in a region parched by war--and in a relationship long defined by enmity--it’s worth celebrating.

Wednesday, March 18, 2015

An Exchange

I decided to try something new here at Traction: an exchange with my friend Andrew Ivers, a fellow St. Louisan behind the great blog Loomings and an editor at World Affairs (all views expressed here his own, etc etc). Andrew graciously agreed to a joint-blog discussion about the intersection of free speech, extremism, and liberalism, a topic that's taken on new urgency after the attacks in Paris and Copenhagen. This could've gone on for a while, but for time's sake, we did one post each; our entries are below.



The attacks in Denmark were depressingly unsurprising to me. I'm afraid that the Charlie Hebdo massacre has ushered in a new epoch in the age of terrorism. Rather than wondering if another 9/11 will happen--which isn't to say it won't; I suspect the West has been beating the odds on that score--it seems that we will have to wonder when the next Charlie Hebdo will occur. If the speed with which the violence in Copenhagen followed that in Paris is any indication, I fear that our expectations on that front will be calculated in months and even weeks, rather than years.

In light of the attacks, I've been reading a lot lately about liberalism and its response to radical Islam. The New York Review of Books published two essays on the topic in the latest issue, one about France and one about Norway.

The situations in the two countries are by no means identical. I suppose I've had an inkling of the peculiarities of French republicanism, but Mark Lilla does a nice job of clearly laying them out in his piece. To wit:

"...[republicanism] was used to describe a very specific kind of democratic ideal. It is one that guarantees rights but also envisages a strong state to provide for the public welfare and control the economy, and is proudly national--and therefore hostile to outside influences like Catholicism, international communism, the United States, and now the global economy and Islamism. Classic republicanism is not libertarian or communitarian; it presumes that rights come with public obligations, and that fraternity must be built through a common, quasi-sacred education in those rights and duties. One is not born a French republican citizen, one becomes one in school by being initiated into the republican ideal."

It's interesting to contrast that view of republicanism with the one we hold in the United States. The French conceive of the state as this kind of ideal realm into which one is initiated: Everyone has a right to liberty, fraternity, and equality, but only if one first makes clear one's allegiance to liberty, fraternity, and equality. There's something noble about this, yes—safe to say that our sense of liberty in the US of A is problematically devoid of responsibilities—but radical Islam lays bare the snake-eating-its-own-tail logic implicit in this notion of the state. France, being a liberal society that values free speech, free assembly, the right to worship, and all those good Enlightenment verities, leaves room for an ideology like radical Islam to take hold. But that ideology is totally hostile to the liberal idea itself; it wants to destroy that which gives it room to flourish.

The problem for the French people, then, is how to maintain an open society while reasonably managing the threat of radical Islam. Many (especially on the far right) would solve this dilemma by propounding an ever-more exclusive notion of what it is to be French. But at what point do liberty, fraternity, and equality become as stifling and antidemocratic as the monarchies and dictatorships that the French so proudly rejected?

France's situation has its peculiarities, but really, it seems like a question facing all democracies. What is the answer to this dilemma? The War on Terror has never struck me as a terribly good idea. As President Obama has said (though he has not acted accordingly), it's a war that will never end--at least not with endless drone strikes. It strikes me as an even worse idea as Islamism's preferred m.o. shifts from big, organization-driven attacks to smaller atrocities committed by self-radicalized individuals. You'll never prevent every single one of those kinds of attacks, though we seem willing to try.

Years ago I read an essay by Robert Pape, a terrorism specialist at the University of Chicago. He argued that we should conduct the War on Terror as we conducted the Cold War at our best: by keeping a long-term faith in our ideals and trusting that we would win out in the end.

Given that Islamism won't be beaten on the battlefield, it seems like Pape's approach may be the right one, even if we feel (or in fact become) less secure as a result. Do what we can to strike groups like ISIS abroad, take reasonable measures at home, but put even more effort into shoring up democracy and the rule of law, both of which have been sorely degraded by the specter of terrorism. The irony, of course, is that Islamism's greatest victory against the West is not any of the attacks it has successfully carried out, but rather the extent to which it has caused us to abandon our ideals. That is why the march in solidarity through the streets of Paris after Charlie Hebdo was so heartening. But I fear it was a scene more indicative of liberalism's past than its future.


Andrew's reply:

The Last Days of Europe by Walter Laqueur lays out pretty well the way in which Muslim immigrants have failed to assimilate—especially in Britain, France, and Germany—despite serious efforts by governments to bring them into society. It's basically the liberal case that multiculturalism has failed—liberal in the classical European sense, which these days Americans tend to regard as moderate or conservative—and it's pretty convincing. One line that struck me in particular: “There is considerably more phobia vis-à-vis Westerners and things Western than Islamophobia.” I trust Professor Laqueur because he's been covering Europe as a journalist and historian for decades. (Disclosure: He also writes for World Affairs sometimes.) The Last Days of Europe came out in 2009, and I doubt much that has happened since would change his argument.

However, I'm also aware that chauvinism and xenophobia exist in all societies, especially ones that used to run empires. Poverty and other problems that have little to do with Islam are also in play. So I can't bring myself to fully subscribe to the anti-immigration arguments as such, even though I think they're an important part of the conversation and often right about serious problems that progressives and politicians in general would rather not consider. David Rieff had a great essay for us a while back about the tendency to oversimplify this issue. It's a review of Christopher Caldwell's book Reflections on the Revolution in Europe (also from 2009), but it's a good primer on the complexities of this topic as well. This line really sums up where I stand:

"There is a global crisis in Islam, and it is foolish and stupid of some European multiculturalists (whom Caldwell justifiably skewers) to deny that it exists. It is even more foolish to deny that, because of this crisis, not only does Europe face ongoing, serious, and likely long-lived threats from homegrown jihadist terrorists, but also that assimilating the overwhelming majority of Muslim immigrants has become more difficult than it was a generation ago—and such difficulties are unlikely to abate in the future. But to derive from these challenges the conviction that Europe is doomed seems to me quite unwarranted."

Another thing that comes to mind is the very question of what makes an open society. This rousing broadside from Christopher Hitchens says it pretty well. People should be able to say whatever they want, and everyone has a right to hear all that is said. People have a right to harbor private beliefs, but if they want to propose that other people should take them up, or use them to make policy, those ideas should be presented for all to hear, consider, and debate. In order for this to work you have to have a society that trusts this process, trusts that ideas will be heard and seriously considered. And also trusts that no one will be judged always and forever for ideas put forth—that the contest will be fair, in other words, that players will “tackle the ball not the man.” (Americans might be more polite when they argue, but the British tend to be much better at not taking debates personally, and at not expecting them to actually solve problems so much as ease tension and air ideas.) Such a society must also understand that the debate is not just a means to an end but society itself—that society is “an argument without end.” If I could name a cornerstone of open society with representative government, this would be it. I suppose it's also what French republicanism is attempting in its own way: If you can get the most people living on, and holding up, some basic liberal principles—conflicts are never resolved, but at least they're less destructive when fought with honest debate rather than physical violence; plurality is a good thing, especially compared with the tyranny of one faction over the others—then you've come farther than you might think.

It's also worth noting that this process should not be limited to mainstream conflicts. If the mainstream regards an idea as abhorrent, it should be brought out into the light of day and debated. There has to be public trust in the power of better arguments and values prevailing. Society is weaker when these ideas fester in private. Chauvinism, xenophobia, bigotry, racism—these all occur naturally in human societies, as do exploitation, domination, submission, revolution. Society should discuss these things, debate why they are both natural and usually toxic for an open, liberal society that values human rights and equality under law. Nothing should be beyond the pale of questioning, in other words, not even the process itself. It should be understood as imperfect but (so far as we know) defensible: Unlike religions or other regimes of strict ideology, a Socratic way of life is less reassuring but truer to the nature of things, messy and adaptable, and as a result fairer to more people. Even children can appreciate this, which could cut into the indoctrination that keeps religions alive. In a truly open society superstition would be perfectly acceptable in private life, but wouldn't get very far in public life because it is unreasonable. But it could still be presented for debate. I think most people, whether religious or not, would think this a fair compromise, compared to a society in which one ideology or religion is imposed on all members.

I think the best hope for reducing violent extremism would be for superstitions to be vetted in a fair and honest and ongoing debate. If superstitions lead to actions that harm others, society has a right to punish the actor and treat those who approve of the action with suspicion, so as to anticipate future harmful acts before they're carried out. However, society has no right to punish the superstition itself, or the words of agreement. It's imperfect, but I think it's fair to say that, whereas no ideas should be censored in a society that can be mature about its openness, people have a right to protect themselves against physical violence. France apparently thinks that expressing ideas that could lead to physical violence is worth punishing, but I'd disagree with that for the reasons Mr. Hitchens discusses. An open society can punish acts that harm others, but I don't believe it can reasonably punish ideas or expressions it deems potentially harmful, because an open society believes that no idea is beneath debate, and that better arguments will keep bad thoughts from turning into bad deeds. The risk that this does not work is the cost of liberty, a cost most people are willing to pay. This stance also makes it harder for critics to demonize necessary security measures that limit personal liberty as nothing more than political censorship. Tackle the ball not the man, in other words, until that rare moment when they're one and the same. This principle alone threatens Islamist extremism, which exploits vulnerable young people by convincing them they have no personhood and should submit to barbaric religious dictates. It would also clash with Mr. Hitchens's proposal that religion should be “treated with ridicule, and hatred, and contempt,” which only garners more sympathy for it. As long as someone comes forward with an idea, it's worth debating. I have no doubt the argument based on reason rather than superstition will win every time.

I realize that having an open society does not fix all problems. It's human nature to fuck things up, and a society will always fail, often gruesomely, if it tries to change human nature. I do believe, though, that most people brought up to appreciate these liberal values would choose them over superstition, xenophobia, violence. The question is whether we can present liberal values to more people so they have at least a chance of trusting in, and contributing to, a more open society. This crucially involves fixing material problems, but it also involves values promotion, which I think is Robert Pape's point. The leftist elitism that pervades Western media and education, and is therefore a heavy influence on society right now, is often squeamish about the idea of values promotion because it confuses it (too easily) with values enforcement or imposition. Indeed it sometimes is, but that's not all it is. It can just as often be the strong but respectful presentation of values, in words and deeds, for serious consideration by those who, not without cause, don't trust mainstream society. Society also gets stronger when it brings in more perspectives, especially differing ones. At the moment, though, values promotion tends to fall mainly to the right and far right, who of course aren't necessarily promoting the same values and are usually doing so in an aggressive, defensive, exclusionary manner. I say save the defensive impositions for times when there are actual security threats, which will also be when the most people will find them legitimate.

My conclusion for the moment is that even though Islamist terrorism will require European societies to undertake defensive security measures, these societies can also do more to help Muslims assimilate in a peaceful and healthy way. If immigrants or other outsiders actually trust that they’ll be welcomed in, assimilation won’t look so impossible, and the fight against terrorism will also gain a native ally. The desire for liberty, after all, is also part of human nature.

Saturday, March 7, 2015

Comments: (0)

Of the people, by the people

Laura Poitras's documentary Citizenfour (available on HBO GO to anyone with their parents' password) is in part a gripping portrait of a very particular individual in very particular circumstances. The individual is Edward Snowden; the circumstances are the few days in June 2013 when he met Poitras and journalist Glenn Greenwald in a Hong Kong hotel and revealed that the United States and its allies were engaging in the mass and indiscriminate surveillance of their citizens.

As a human story alone, Citizenfour is incredible, not least because we in the public tend to experience these kinds of momentous, policy-shaping moments through the scrim of newspaper-ese or the fun-house mirrors of fictionalization. Citizenfour, by contrast, puts us in the room as Snowden reveals the details of the NSA PRISM program to Greenwald and Poitras.

This privileged vantage allows us to watch as a person becomes a story. Poitras is especially good at juxtaposing Snowden the person and Snowden the story; she shows us, for instance, Snowden agonizing over his hairstyle while a talking head on the TV in the other room discusses Snowden and his actions. It is an oddly poignant moment, a reminder that at the center of history, there are human beings. Even Napoleon had to use the bathroom.

I have to admit that my main feeling for the Snowden of Citizenfour is admiration. It's always worth being wary of leakers, who tend to have an axe to grind or a complex to fulfill. Snowden is clearly a quirky guy who subscribes to the kind of slightly paranoiac libertarianism at which I--with my continued (and perhaps naive) reverence for the United States Senate--tend to roll my eyes. But the film makes clear that Snowden made a very hard choice at great personal cost. It's clear that he agonized over his decision to go public, ultimately choosing to obey the dictates of his conscience. Perhaps it's my Jesuit theology classes talking, but I have to admire his courage.

(Those who argue that Snowden was merely seeking fame should see Citizenfour, where he comes across as self-serious but not self-aggrandizing, and in any case, it's hard to think that holing up in a Hong Kong hotel and then fleeing to Putin's Russia are prices worth paying for notoriety.)

For all its interest in Snowden, however, Citizenfour's most valuable point is that the debate over whether he was a traitor or a patriot is an incredibly stupid one. Snowden should not be our focus; the programs he revealed demand our attention. Unfortunately, NSA surveillance has been fodder for late-night comedians as often as it's been the subject of public debate; we shake our heads and then return to our iPhones. Citizenfour might indulge in a bit of paranoia (I could've done without the buzz-y Trent Reznor soundtrack), but it foregrounds the seriousness of what our intelligence agencies are doing in the name of national security, cutting through the pablum about metadata that dominated official statements after Snowden's leaks. Greenwald, in particular, has a gift for explaining what's transpired: without our consent, the government has begun massively intruding on our personal lives in the name of national security. Hopefully, the film can stir a new sense of outrage among the American people.

That outrage is more important than ever, because the depressing conclusion that Citizenfour leaves one with is that our government, at least where national security is concerned, operates beyond the reach of the citizenry, using sophistic legalisms to justify what it's (confidentially) up to: you voted for your representatives, and they voted for the Patriot Act, and so we can do these things (see footnote). Two moments in Citizenfour bring this reality home: one is the enraging footage of James Clapper, the Director of National Intelligence, telling Senator Ron Wyden that the U.S. does not "wittingly" collect data on its citizens; the other is President Obama's frustratingly measured assertion that America should have a debate on surveillance, but that Americans shouldn't have found out about these programs through a leaker like Snowden. Maybe so, but what evidence is there that the president had any plans to make these programs public?

Unfortunately, if the government can't claim the express consent of the American people to engage in these kinds of activities, it can claim a different kind of consent, the consent of apathy. Whatever public demonstrations there have been against surveillance, torture, and drone strikes have been far too small to put any pressure on those in power to change their ways. The vox populi has offered little more than a shrug on these issues.

Perhaps the American public really does want its government to pursue security at all costs. Most opinion polls, for instance, suggest wide public support for drone strikes. But I think there may be a kind of chicken-and-egg problem here, too: if the people are apathetic, it may be in part because the government feels so large and unresponsive, which gives rise to the kind of wised-up "Don't vote for any of the bastards" cynicism that in turn allows the government to become even larger and more unresponsive.

I don't know how to end this democratic death spiral, but I hold out hope that we can find a way. We haven't done much about it so far, and Citizenfour makes clear the costs of our lethargy.

Footnote: yes, this is basically how representative government works. The difference when it comes to national security is just how broadly intelligence agencies have been willing to interpret post-9/11 statutes. Add in the fact that the NSA's secret surveillance program is reviewed by a secret court, and it's hard to justify any of it as constitutional.