The Gnostic Gospels (Modern Library Nonfiction #72)

(This is the twenty-eighth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: James Joyce.)

As I write these words — some eight months before a fateful presidential election threatens to steer my nation into a theocratic hellscape that will permanently erode many of the liberties and freedoms I have been humbled to partake in for nearly fifty years — the tireless researchers at PRRI inform me that Christian nationalism has substantive support in nearly every state (California, New York, and Virginia are the exceptions, where 75% remain skeptics or outright reject it), the Pew Research Center reports that 45% of Americans believe that our democratic republic should be “a Christian nation,” and 55% of Latino Protestants support Christian nationalism. Blind zealotry, even with white supremacy mixed into the sickening formula, comes in many colors.

Undoubtedly, many of these hayseed fanatics are easily manipulated and illiterate. They conveniently overlook the “love thy neighbor” ethos from Western civilization’s best-known zombie in favor of a greater affinity for the limitless imbecility of zealous violence and tyranny, falsely believing themselves to be misunderstood rebels living in a new Roman Empire — this as the very institutional framework continues to uphold their right to yap and bellow in hateful and discriminatory terms as they line the pockets of wealthy telegenic carpetbaggers like Joel Osteen. They lead campaigns to ban books and to deracinate from schools vital areas of knowledge that offend their delicate and autocratically vanilla sensibilities. While the Book of Luke informs us that Christ asked us to “love and pray for our enemies,” you will find these unremarkable lemmings keeping their traps shut as trans kids commit suicide or another maniac massacres dozens in the week’s latest mass shooting. (Unable to summon true comity for anyone who deviates from their ugly and crudely formed politics, right-wing statesmen have substituted “thoughts” for “love,” presumably so they can show up to church on Sunday with a “clean” Christian conscience — even though they do nothing to curb this malignant cancer and care no more for these victims than any garden-variety sociopath.)

It has frequently been observed that atheists like myself know the Bible better than these monomaniacal morons. I have often been surprised by how easy it is to thoroughly rebut some born-again loser based on a single reading of the King James more than twenty years ago and my apparent recall of specific passages that are well outside the soft and useless hippocampi of my hopelessly dim opponents. It never occurs to Christians to question their faith or even to comprehend (much less read) the very words they purport to uphold in their everyday living. And it certainly wouldn’t occur to them to consider that, much like any moment in history, the narrative and the very belief structure upholding this nonsense was written by the winners, by those who spent the majority of their lives silencing (and even murdering) anyone who offered perfectly reasonable questions about a man who rose from the dead.

Elaine Pagels’s excellent book, The Gnostic Gospels, is an equitable study of the many Gnostic sects that dared to question the Christian status quo. Indeed, had the fifty-two treatises not been discovered near Nag Hammadi in 1945, there is a good chance that many of us who tirelessly call out bullshit on all fronts would have lacked a far more seminal faith than the one in Christ — namely, a boundless pride in our ancestors practicing the vital art of critical thinking.

The orthodox position of the Resurrection, as defined by Tertullian, is quite clear. Jesus Christ rose from the dead with full corporeal gusto. It was “this flesh, suffused with blood, built up with bones, interwoven with nerves, entwined with veins” (one might add “consummated with claptrap” and “molded with malarkey” to this laundry list). Tertullian further adds, “it must be believed, because it is absurd!” And, look, I’d like to believe in kaiju secretly emerging from the oceans to stomp on every megachurch from here to Alpharetta, Georgia, but I have confined my love for absurdity to my deviant imagination and my performative antics on TikTok.

What’s especially astonishing about Tertullian is how literal he is. The New Testament is rife with stories in which Jesus’s disciples are invited to prod and touch the newly reanimated corpse. (There is curiously nothing in the Bible in which anyone asks Jesus about why he doesn’t carry the pungent smell of the dead or how the bearded wonder managed to rid himself of all the maggots gnawing at his decaying flesh.) And yet Pagels points out that not every story within the New Testament aligns with Tertullian’s “my way or the highway” interpretation of full-fledged concrete return. Acts 9:3-4 informs us that Christ’s Resurrection is merely “a light from heaven” with a voice. Acts 22:9 even points out that some observed the light, but “heard not the voice that spake to me.” And if that’s the case, would Tertullian have declared the Apostles heretics? In Acts, Christ’s “return” sounds very much like a low-rent Vegas act without a PA system.

And that’s just in the Bible, folks! I haven’t even snapped my fingers to summon the Gnostics on stage. Depending upon what part of the Bible you read, it is either Peter or Mary Magdalene who first sees Christ rise from the dead. Paul tells us that Christ said hello to five hundred people all at once. And if we take that literally, any of us could now do the same thing on social media. Pagels informs us that from the second century onward, “orthodox churches developed the view that only certain resurrection appearances actually conferred authority on those who received them.” And just like that, the manner in which you contend with Christ’s reappearance isn’t all that different from telling the right story to some bouncer on a Saturday night to slip past the velvet rope!

Believe in the power of this two-bit magician and the terms of the deal, as set up by Luke, are as follows: Christ returned from the dead, walked the earth for forty days, and then rose to the heavens in a bright coruscating light. This may not have the razzle-dazzle of Cirque du Soleil, but it is a belief that has nevertheless been swallowed whole and without question by generations of gullible rubes.

The Gnostics were the first to call this “the faith of fools.” In The Acts of John, one of the rare Gnostic texts that had survived in fragmented form before the Nag Hammadi discovery, John offers the completely reasonable argument that, because Christ did not leave any footprints, he could not possibly be human, but rather spiritual. The Gnostics clearly had a more sophisticated interpretation of the Resurrection: it was not the literal observation of Christ’s Resurrection that counted, but the spiritual meaning behind it. But the underlying facts didn’t matter nearly as much as winning over the authorities who could confer a position of trust upon you:

Consider the political implications of the Gospel of Mary: Peter and Andrew, here representing the leaders of the orthodox group, accuse Mary — the gnostic — of pretending to have seen the Lord in order to justify the strange ideas, fictions, and lies she invents and attributes to divine inspiration. Mary lacks the proper credentials for leadership, from the orthodox viewpoint: she is not one of the ‘twelve.’ But as Mary stands up to Peter, so the gnostics who take her as their prototype challenge the authority of those priests and bishops who claim to be Peter’s successors.

It thus became necessary for the Gnostics to expand authority to those who stood outside the Twelve. Some Gnostics were generous enough to ascribe VIP treatment to the Disciples, claiming that they had received the kind of custom vision that is a bit like the gift you receive nine months after you donate to a Kickstarter campaign. But as you can imagine, all this resulted in many elbowing their way into a vicious power grab over which interpretation of the Resurrection represented the “true” belief. And there was another important consideration. If Christ himself served as the truest source of spiritual authority, who then would be the authority in the years after his crucifixion and his “Hey there, baby!” sojourn from the great beyond?

The more bellicose strains of Christianity continue to endure in large part because a belief in Christ conveniently allows you to disguise your own sinister lunges for power. Enter Pope Clement I, who was arguably the first significantly ruthless monster who saw an opportunity. Clement insisted that, in the absence of his august presence, God delegates his authority to the “rulers and leaders on earth.” Naturally, these “rulers and leaders” were bishops, deacons, and priests. And if you didn’t bend at the knee to these sham practitioners, then Clement stated, with his great gift for speaking without nuance, that you would receive the death penalty.

Of course, this raises the question of whom you can trust within the church: an issue that has become ever more important given the decades of sexual abuse carried out by men of the cloth within the Catholic Church. A bloodthirsty fellow by the name of Irenaeus succeeded in widening the divide between orthodoxy and the Gnostics by suggesting that any interpretation existing outside Clement’s stern terms was not only heretical, but originated from Satan himself, thus paving the way for Christians to denounce any belief or behavior they disagreed with as “Satanic” over the next two thousand years. In the centuries that followed, they proceeded to execute innocent women in Salem and imagine Satanic messages in records.

These developments spelled trouble for the poor Gnostics. Within a few centuries, their texts were buried and destroyed. Their reasonable questions and liberal interpretations became casus belli to string them up. The Christians had the good sense to market themselves as victims persecuted by the Roman Empire and they began to realize sometime in the second century that pointing out how Christians suffered was a great draw for new acolytes. (Eighteen centuries later, Israel would employ the same tactic: use the suffering from the Holocaust to recruit Zionists, who could then justify the seizure of Palestinian land and the mass-murdering of children on the Gaza Strip.) All this is a pity, because the Gnostics were often far more interesting in their radicalism and their creative liturgical analysis than what we find in the so-called Holy Book. Consider The Gospel of Philip’s inventive spin on the virgin birth. How can the Spirit be both Virgin and Mother? By a union between the Father of All and the Holy Spirit. And unlike the Christians, The Gospel of Peter ascribed a third quality to the Divine Mother (the first two being the Silence and the Holy Spirit): Wisdom, clearly delineated as a feminine power.

It is a testament to Christianity’s enduring evil that few people listen to the Gnostics in the twenty-first century. But if their reasonable transposition of literal interpretation into metaphor had become the dominant tradition, it is quite possible that the millions of nonbelievers who died during the Crusades might have survived and that the present plague of Christian nationalism, which remains highly dangerous and ubiquitous in our dystopian epoch, might have nestled into the less injurious category of “optional only.”

(Next Up: William H. McNeill’s The Rise of the West!)

James Joyce (Modern Library Nonfiction #73)

(This is the twenty-seventh entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Florence Nightingale.)

“Mr. Joyce, first of all, is a little bourgeois Irishman of provincial tastes who has spent a lifetime on the continent of Europe in a completely fruitless attempt to overcome the Jesuit bigotry, prejudice, and narrowness of his childhood training. Mr. Joyce began his literary career as a fifth-rate poet, from there proceeded to become a seventh-rate short-story writer, graduated from his mastery in this field into a ninth-rate dramatist, from this developed into a thirteenth-rate practitioner of literary Mumbo-Jumboism which is now held in high esteem by the Cultured Few and I believe is now engaged in the concoction of a piece of twenty-seventh-rate incoherency, as if the possibilities in this field had not already been exhausted by the master’s preceding opus.” — Thomas Wolfe, The Web and the Rock

James Joyce was probably the greatest writer of the 20th century, although opinions vary. (Many of today’s young whipper-snappers sound astonishingly similar to a dead-inside academic like Thomas Wolfe’s Mr. Malone when dispensing their rectal-tight rectitude and uncomprehending pooh-poohs on social media.) But as a wildly ambitious literary athlete nearing fifty (353 books read so far this year, with a little more than a week left), I cannot think of any other writer whom I have returned to with such regularity and gusto. Even the dreaded “Oxen of the Sun” chapter in Ulysses, which caused at least six hundred grad students to faint from fatigue in the last year (and a good dozen young scholars to permanently lose their minds), demands that you peruse it anew to appreciate its multitudinous parodies.

Only a handful of living writers can summon a similar obsession in me through the power of their words. But even when these hypergraphic bards descend from the Mount with their thick portentous volumes, they are hopelessly outmatched by the Dublin bard’s mighty polyglot yardstick. (Certainly Anthony Burgess spent his prolific literary career forever lost in Joyce’s formidable fug, forever resenting the fact that his best-known work, A Clockwork Orange, with its captivating Nadsat, caught on perhaps because it represented some attempt to mimic Joyce’s word-soaked playfulness.)

When I visited the Martello Tower at Sandycove Point not long before the pandemic, it was the closest thing that an atheist like me has ever had to a religious experience. It had never occurred to me — a relentlessly abused white trash kid who fought off bullies (and still has to do so in his forties) when not filling his voracious noggin with too many books, a reader from the age of two, an accidental provocateur who still manages to piss off PhDs and varying mediocre literary types whenever I quote long passages from memory culled from books they claim to have read but have somehow forgotten — that I would ever have the divine privilege of standing at the very location where “Telemachus” begins. My first walk alongside the Mississippi River last summer in deference to another literary hero of mine was close, but Joyce was the clear winner when it came to summoning such heartfelt psychogeographical wonder. As I sauntered along the swerve of shore to bend of Scotsman’s Bay back to the Dublin train, I trembled with tears of joy, feeling great shudders push me into a state of awe that I did not know was writhing within me. I simply could not believe it. I had already been impressed by the social code of the great Irish people, who would always give you at least five minutes of banter and who were never shy in expressing their opinions and who immediately unlocked the key to further appreciating “Ivy Day in the Committee Room” through their innate conversational finesse. But was I actually standing in the same room in which Samuel Trench (the basis for Haines) had shot at an imaginary panther that had plagued him in his sleep? And was that truly Joyce’s guitar? The good people who run this landmark were incredibly kind to this wildly voluble and incredibly excited Brooklynite. I flooded their robust Irish souls with endless questions and an irrepressible giddiness. A kind woman, who did her best to suppress laughter over my ostentatious literary exuberance, remarked that they had not seen such a visitor display such bountiful passion in months.

But I am and always will be a Joyce stan. I own five Joyce T-shirts, including an artsy one in which the opening words of Finnegans Wake are arranged in a pattern matching one of Joyce’s most iconic photographs. Before I deleted all of my TikTok accounts, my handles were various riffs on Joyce’s most difficult volume. There has rarely been a week in which I have not thought about Ulysses or “The Dead” or, on a whim or in need of a dependable method to restore my soul, picked up my well-thumbed copy of Finnegans Wake and recited pages and laughed my head off. When I went through the roughest patches of my life nine years ago, it was James Joyce who helped save me. I reread Ulysses while living in a homeless shelter. And had I not had that vital volume on me to renew my fortitude and passion, it is quite likely that I would be dead in a ditch somewhere and that the words I am presently writing would not exist.

So I’m obviously already in the tank for Joyce and deeply grateful to him. He has proven more reliable and loyal to me than my toxic sociopathic family. These moments I have chronicled would be enough. But Richard Ellmann hath made my cup run over. He somehow achieved the unthinkable, writing what is probably the best literary biography of all time. Other biographers have combed through archives and badgered aging sources, hoping to stitch their tawdry bits with dubious “scholarship.” Small wonder that Joyce himself referred to these highfalutin ransackers, who have more in common with TMZ reporters than academics, as “biografiends.”

But one cannot lay such a mildewed wreath at Ellmann’s feet. There are very few details in Ellmann’s book that do not relate directly to the work. We learn just how invaluable Stanislaus Joyce was to his brother. Stanislaus — an adept peacemaker who documented his fractious fraternal relationship in his own book, My Brother’s Keeper — is liberally excerpted. If Stanislaus hadn’t pushed back hard on the alleged “Russian” feel of Joyce’s great short story “Counterparts,” would we have had “The Dead”? (“The Dead” was written three years after the other fourteen tales contained in Dubliners.) To cite just one of Ellmann’s many cogent connections between Joyce’s life and work, we learn that Edy Boardman — Gerty MacDowell’s friend in the “Nausicaa” chapter of Ulysses — was a faithful recreation of neighbors that the Joyce family knew on North Richmond Street and that “the boy that had the bicycle always riding up and down in front of her window” was, in fact, a callout to one Eddie Boardman, who had the first pneumatic-tired bike in the hood. Joyce’s crazed jealousy towards any man who he suspected had designs on Nora Barnacle — with his insecure interrogations of Nora by letter and in person — is duly chronicled. The boy that Nora had dated before Joyce came along was Sonny Bodkin (who died tragically young of tuberculosis) and she was initially attracted to Joyce because of his close physical resemblance to Bodkin. And while Joyce was forward-thinking when it came to presenting Jewish life in Dublin (and arguably creating one of the most fully realized Jewish heroes in literature with Leopold Bloom), his regressive masculinity could not stand the notion that his great love’s heart had stirred long before he came along. And yet, even with his nasty and unfair and unreasonable accusations, he was able to find a way to broach this in fiction with Gretta Conroy recalling her dead lover Michael Furey in “The Dead.” It is often the darkest personal moments that fuel the best of fiction.

And let’s talk about that ugly side of Joyce. The great Dublin exile was also an unapologetic leech, a shrewd manipulator, and a master of dodging creditors. He fantasized about pimping his wife Nora out to other men while also being naive enough to believe Vincent Cosgrave’s claim that he had been sleeping with Nora before Joyce in the fateful summer of 1904, nearly sabotaging his relationship with a series of angsty transcontinental missives. For better or worse, Joyce refused to see the full extent of his poor daughter Lucia’s troubles. He treated many who helped him very poorly. And, of course, he despised explaining his work. He wanted to keep the scholars busy for centuries. And he succeeded. Here we are still discussing him, still mesmerized by him. Even when his life and work are often infuriating.

If there is any weakness to Ellmann’s formidable scholarship, it is with the women who were vital to Joyce’s life. Ellmann was so focused on finding precise parallels between Joyce’s life and work — usually with only Jim and his brother Stanislaus at the center — that he often portrays these invaluable lieutenants in superficial terms — that is, if he even mentions them at all. Let us not forget that Joyce was a man terrified of dogs, violence, and thunderstorms. The women in his life empathized with the effete qualities of this indisputable genius and provided financial and scholarly resources for Joyce to continue his work, even when they found Finnegans Wake baffling and not to their taste. Perhaps most criminally, there is no mention in Ellmann’s book of Myrsine Moschos (who was Lucia Joyce’s lover at one point), the dutiful woman who toiled at the famous bookstore Shakespeare & Company and spent long days in the dank chambers of Parisian libraries, sifting through decaying volumes that often crumbled to dust in search of obscure words and other arcane lexical associations that Joyce included in Finnegans Wake. Moschos often returned from these scholarly journeys so exhausted that Sylvia Beach — arguably the greatest bookseller in all of human history and the woman who took significant risks to get Ulysses published — had stern words for Joyce about Moschos’s health.

In 2011, Gordon Bowker published a biography — something of a quixotic project, given the long imposing shadow cast by Ellmann — that was more inclusive of Nora Barnacle, Sylvia Beach, and Harriet Shaw Weaver. I also recommend Brenda Maddox’s Nora, Carol Loeb Shloss’s Lucia Joyce: To Dance in the Wake (with significant reservations), and Noel Riley Fitch’s Sylvia Beach and the Lost Generation as volumes that fill in these significant gaps that Ellmann, in his efforts to portray Joyce as his own master, often failed to address. (Even Jo Davidson, the sculptor who was instrumental in making the New York theatrical run of Joyce’s play Exiles happen, is merely afforded a footnote by Ellmann.)

Can one literary biography be the all-encompassing volume that captures a life? Even one that was as complicated as Joyce’s? Perhaps not. But Ellmann has certainly come closest. Now that Joyce’s famously hostile grandson Stephen has passed away and the copyright for much of Joyce’s work has at long last been released into the public domain, it’s possible that another biographer will be better situated to come closer to revealing the Joyce mystique without being strangled by the bitter hands of some unremarkable apple twice removed from the great tree. But I doubt that any future scholar will match Ellmann. For all of his modest limitations, he was the right man at the right time to capture a seminal literary life in perspicacious and tremendously helpful form.

(Next Up: Elaine Pagels’s The Gnostic Gospels!)

Florence Nightingale (Modern Library Nonfiction #74)

(This is the twenty-sixth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Great War and Modern Memory.)

Of the four illustrious figures cannonaded in Eminent Victorians, Florence Nightingale somehow evaded the relentless reports of Lytton Strachey’s hard-hitting flintlocks. Strachey, of course, was constitutionally incapable of entirely refraining from his bloodthirsty barbs, yet even he could not find it within himself to stick his dirk into “the delicate maiden of high degree who threw aside the pleasures of a life of ease to succor the afflicted.” Despite this rare backpedaling from an acerbic male tyrant, Nightingale was belittled, demeaned, and vitiated for many decades by do-nothings who lacked her brash initiative and who were dispossessed of the ability to match her bold moves and her indefatigable logistical acumen, which were likely fueled by undiagnosed bipolar disorder.

As someone who has been diagnosed with bipolar, I am inclined to stick up for my fellow aggrieved weirdos. We bipolar types can be quite difficult, but you can’t gainsay our superpowers. A relentlessly productive drive, a magnetism and a magnanimity that bubbles up at our high points, an overwhelming need to help and empathize with others, and a crushing paralysis during depressive spells that often has us fighting the urge to stay in bed. And yet we get up every day anyway, evincing an energy and an eccentric worldview that others sometimes perceive as magical, but that our enemies cherrypick for lulz and fodder — the basis for unfounded character assassination campaigns, if not permanent exile. Hell hath no greater fury than that of aimless and inexplicably heralded mediocrities puffed up on their own prestige and press.

But regular people who aren’t driven by the resentful lilts of petty careerism do get us. And during her life, they got Florence Nightingale. She was flooded with marriage proposals, all of which she rebuffed and not always gently. She was celebrated with great reverence by otherwise foulmouthed soldiers. Yet she also suffered the slings and arrows of bitter schemers who resented her for doing what they could not: obtaining fresh shirts and socks and trays and tables and clocks and soap and any number of now vital items that one can find ubiquitously in any ward, but that were largely invisible in 19th-century hospitals and military medical theatres. She had the foresight to study the statistics and the fortitude to work eighteen-hour days practicing and demanding reform. And whatever one can say about Nightingale’s mental state, it is nigh impossible to strike at Florence Nightingale without coming across as some hot-take vagabond cynically cleaving to some bloodless Weltanschauung that swiftly reveals the superficial mercenary mask of a boorish bargain hunter.

Florence Nightingale nobly and selflessly turned her back on the purse strings of privilege, hearing voices caracoling within her head that urged her to do more. While she was not the only nurse who believed in going to the front lines to improve conditions (the greatly overlooked Mary Seacole, recently portrayed by the wildly gifted and underrated Tina Fabrique in a play, also went to Crimea), it is now pretty much beyond question that she revolutionized nursing and military medicine through her uncommon will and a duty to others in which she sacrificed her own needs (and caused a few early suitors to suffer broken hearts). That she was able to do all this while battling her own demons is a testament to her redoubtable strength. That her allies returned to her, determined to see the best in her even after she was vituperative and difficult, is a tribute to one of humanity’s noblest qualities: putting your ego aside for the greater good.

A century before PowerPoint turned 90% of all meetings into meaningless displays of vacuous egotism, Florence Nightingale was quite possibly the first person to use colorful graphical data at great financial expense (see above — it’s beautiful, ain’t it?) to persuade complacent men in power to care for overlooked underlings wounded in war and dying of septic complications in overcrowded and unhygienic hospitals. She was savvy and charismatic enough to win the advocacy of Lord Sidney Herbert, who, despite being a Conservative MP, had the generosity and the foresight to understand the urgent need for Nightingale’s call for revolution. Herbert secured funds. The two became close confidants. Yet poor Herbert suffered a significant erosion in his health and died at the age of fifty because he could not keep up with Nightingale’s demands.

I suspect that men in power resented such noble sacrifices, which could account for why Nightingale was often portrayed as a freak and a deranged outlier in the years immediately following her death. But biographer Cecil Woodham-Smith saw a different and far more complex woman than the haters did. Her terrific and mesmerizing and well-researched 1950 biography of Nightingale greatly helped to turn the tide of opinion back in favor of one of the most astonishing and inspiring women that medicine has ever known. And Woodham-Smith did so not through preordained hagiography, but by taking the time to carefully and properly sift through her papers (and even a well-preserved lock of her bright chestnut hair, still, more than a century later, as robust and as lambent as the lamp Nightingale carried in the dark). There is a vital lesson here for today’s social media castigators, especially the testosterone-charged troglodytes who casually smear women, that they will likely ignore.

Next Up: Richard Ellmann’s James Joyce!

The Great War and Modern Memory (Modern Library Nonfiction #75)

(This is the twenty-fifth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The City in History.)

The men went to war. Their psyches were scarred and sotted by the sights and sounds of death and dreary dissolution — all doled out as a hellish and unprecedented new normal. Machine guns, mustard gas, the ear-piercing shrieks of shrapnel and shells, rats gnawing on nearby corpses. The lush fields of France anfracted into a dark flat wasteland.

The war was only supposed to last a few months, but it went on for more than four years. Twenty-two million lost their lives in the First World War. Many millions more — the ones who were lucky to live — were shattered by the experience. Their bodies were bent and their souls were broken. As Richard Aldington observed in his bleak comic novel, Death of a Hero, the trauma that the soldiers carried home became all too common, unworthy of commiseration and often received with scorn.

But, despite the scars and notwithstanding the cruel homeland rebuke, these men somehow sustained a culture during hard-won moments when they weren’t fighting in the trenches and when they weren’t watching their close friends mowed down by the newer and deadlier weapons. Their noble commitment, their fervent faith in some lambent hope plucked from the maws of a mottled landscape, forever changed the way we saw, heard, and expressed ourselves. As Paul Fussell nimbly argues in The Great War and Modern Memory, we are indebted to these soldiers in ways that most people today cannot appreciate.

* * *

While The Great War and Modern Memory doesn’t contain the intoxicating sweep and ambition of Frazer’s The Golden Bough in identifying the underlying rituals that have come to define the manner in which we reckon with disruptive and often inexplicable quagmires, it is nevertheless a remarkable volume, one quite essential in charting the trajectory of how humans expressed themselves through poetry, letters, fiction, and even postwar mediums. I first read this book in my early twenties — many years before I would stumble onto sound design as a method of communicating feelings often untranslatable through words — and, even then, I was startled by how Fussell identified early phonographic recordings as a liminal theatre sprinkled with sounds of attack. This was evidenced not only in the hit novelty records scooped up by supercilious aristocrats comfortably ensconced in cushy sacrosanct parlors without a care in the world, but further immortalized in such unlikely texts as Anthony Burgess’s underrated dystopian novel, The Wanting Seed.

There are so many bones baked into the silt of the Somme that human remains were still being exhumed in Fussell’s day. Forensic experts have continued to make efforts to identify skulls in more recent years. But beyond all these history-shattering casualties, there were also significantly influential linguistic precedents derived from these disfiguring events. The “us vs. them” vernacular that was to become a regular feature of all subsequent wars began with the Great War’s “we” and the xenophobia that was swiftly ascribed to the other side through epithets like “Boche,” as well as the cartoonish pastiches that no soldier in history has been immune from assigning to a mortal enemy. Germans were depicted as giants, memorialized in Robert Graves’s “David and Goliath.” Blunden’s Undertones of War described German barbed wire as having “more barbs in it and foreign-looking.” Whether John Crowe Ransom explicitly derived his notion of the other from Blunden, as Fussell imputes, is anyone’s guess. But Fussell’s confidence and deep dive into phrases and terms of art is strangely persuasive. He has, unlike any other scholar since, made a vigorous and spellbinding examination of how language pertaining to division and the unshakeable sense that the war would go on forever influenced the Modernists (and even the postmodernists) as they rolled out their comparatively more peaceful masterpieces to the literary front lines in the 1920s.

Contrary to the cliches, life on the front wasn’t just about poetry and gardening. There was the unappetizing perdition of stale biscuits and Maconochie stew, a hideous tinned concoction (which at least one YouTuber has attempted to recreate!) involving bully beef that reminded the men of meals tendered to dogs. There were startlingly brave figures like Siegfried Sassoon, who not only took a bold stance against the war, but evoked the sordid memories of the trenches and a forgotten England in his Sherston trilogy (which dropped just as autofiction practiced by the likes of Dorothy Richardson and Proust was being quietly celebrated and, in turn, inspired Pat Barker to write her terrific Regeneration trilogy). The stertorous gunfire on the front was so loud that, as Fussell helpfully notes, even Pynchon was compelled to memorialize the idea of shells being heard hundreds of miles away in Gravity’s Rainbow. There was even a series of Illustrated Michelin Guides to the Battlefields that made the rounds after the Treaty of Versailles. Fussell repeatedly points to maps as shaky palimpsests staggered with thick wavy lines and often wry notations, but the lack of tangible geography had to spill over somewhere. Poetry was fated to account for the ambiguity.

Fussell makes a strong case for a tectonic shift in expression being practiced even before the war began. Indeed, the war gave E.M. Forster’s famous “Only connect” sentiment some completely unanticipated momentum as the landed gentry attempted to reckon with the period between the two world wars. If the Great War had not happened, what would be the trajectory of literature? Fussell doesn’t mention Rebecca West’s 1918 novel, The Return of the Soldier, but this was one of the first Great War novels to explicitly deal with shellshock and one can read this book today as a fascinating glimpse into a period between frivolous prewar innocence and the stark and gravid sentences that were to come with Eliot, Hemingway, Woolf, and Fitzgerald. Fussell suggests that the young Evelyn Waugh was emboldened in his poetic and often brutal satire by much of the lingering language that the war had extracted from the patina of once regular summer comforts. The charred scenery on the front lines caused soldiers and servicemen to look upward into the possibilities contained within the sky — itself a predominant fixation within Ruskin’s Modern Painters — and not only did Waugh mimic this in the opening pages of his later novel, Officers and Gentlemen, but one cannot read John McCrae’s “In Flanders Fields” without being acutely aware of the “sunset glow” or the sky serving as an anchor for the poppies blowing beneath the crosses or the singing larks still “bravely singing” amidst the destruction.

It’s possible that Fussell may not have arrived at his perspicacious observations had he not gone through wartime and its preceding ablutions himself. In his memoir Doing Battle, Fussell notes that he could not have unpacked Wilfred Owen’s veiled sensuality had he not been smitten himself with the looks of boys in his adolescent years. He also writes of identifying strongly with Robert Graves’s sentiment that one could not easily be alone in the thronged throes of battle. In The Great War and Modern Memory, Fussell sought to unpack irony and poetic elegy as it became increasingly expressed during the First World War. He claimed his study to be “an act of implicit autobiography” and “a refraction of current events.” In Fussell’s case, he had sickened of the Vietnam War’s overuse of “body count” and perceived perspicacious parallels between that rhetoric and Owen’s “Insensibility,” a poem which suggests that expressing “sufferings” is simply not enough to understand real loss. One must have palpable experience of warfare’s devastation in order to reckon properly with it.

And perhaps The Great War and Modern Memory is more serious than Fussell’s “stunt books” (Class, which The Atlantic‘s Sandra Tsing Loh rightfully described as a “snide, martini-dry American classic,” and Bad) because Fussell could not find it within himself to betray his own personal connection to war.

Even so, Jay Winter, Daniel Swift, and Dan Todman have rightfully censured Fussell for leaving out or even demeaning the contributions of working stiffs. Make no mistake: Paul Fussell is an elitist snob and more than a bit of a sneering egomaniac. To cite but one of countless examples, Fussell overreaches and reveals his true colors when he suggests that all letters home from the soldiers adhered to what he calls “British Phlegm” (“The trick here is to affect to be entirely unflappable; one speaks as if the war were entirely normal and matter-of-fact.”). War censors certainly created a creative smorgasbord of workaround phrases, but, as someone who has reviewed World War I letters for research, I can say this is an unequivocal load of bollocks — as a cursory plunge into the National Archives swiftly reveals. Fussell is much better when tracking idioms like “in the pink” and using his mighty forensic chops to expose undeniable lexical influence.

As our present world moves ever closer to a potential third world war — with Ukraine standing in for a “trouble in the Balkans” — The Great War and Modern Memory reminds us that all the trauma on our shoulders — whether endured by soldiers or civilians — is destined to spill somewhere. We may not have five centuries of democracy and peace to give us the cuckoo clock that Orson Welles famously snarked up in The Third Man, but there are certainly plenty of unknown Michelangelos and da Vincis waiting in the wings to make sense of the ordeals of 2022 life. History, to paraphrase Stephen Dedalus’s famous sentiment, is a nightmare from which all of us are trying to awake.

Next Up: Cecil Woodham-Smith’s Florence Nightingale!

The City in History (Modern Library Nonfiction #76)

(This is the twenty-fourth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Battle Cry of Freedom.)

I’ve been a city man all my life. Certainly all of my adult life. At the age of twenty, I escaped from the dour doldrums of suburban Sacramento — the kind of hideous Flintstones-style recurring backdrop that seems to encourage broken dreams, angry tears, and rampant abuse behind model home replica doors — for the bright foggy beauty and the joyful pastels of San Francisco.

That gorgeous place from the not so distant past — with the co-op movie theatres playing weirdass indie flicks you couldn’t find on video or teevee, the cafes pattering with political idealism and the streets rattling with the chatty pugnacious jingle of strange conceptual punks, the crumbling encyclopedic bookstores and the boldly strange dive bars of the Tenderloin, and the wonderful mariachi players serenading Valencia Street taquerias for a quick buck, a Mexicoke, and a smile — was exactly the urban realm I needed at the time. Only real souls committed to an increasingly rarefied inclusiveness like Michelle Tea and William T. Vollmann knew how to capture these meat-and-potatoes freak-friendly details in their novels. What I didn’t know, as San Francisco became an unaffordable playground invaded by elitist and not especially perspicacious techbro affluents, was that this coastal metropolis was no longer a place for weirdos like me. I was outpriced and outmatched, like so many who bolted to Oakland, Los Angeles, and elsewhere. It was an all-too-common tale of gentrification and migration, of a city permanently regurgitating its most promising inhabitants and falling victim to an influx of wealth that forever altered its essence. Like any foolish romantic, I fell in love with someone who was absolutely wrong for me and became seduced by the Brooklyn brownstones, the skyscrapers spiring along the rivers, and the giddy pace of a megacity demanding that all of its inhabitants make something of themselves. I’ve been in New York City now for fourteen years — most of my thirties and all of my forties. I hope to continue to live here. But like anything in life, it’s largely the luck of the draw, hoping that the law of averages will work out in your favor. Especially in this age of mass unemployment and pandemic uncertainties, with anybody who doesn’t make more than $200,000 a year left in the cold and declared the enemy.

I mention these bona fides in advance of my thoughts on the great Lewis Mumford to give you a sense of why his amazing book, The City in History, took me much longer to read than I anticipated. The problem with an encyclopedic smartypants like Mumford is that he’ll drop a casual reference that is supremely interesting if you are even remotely curious. One paragraph will send you down an Internet rabbit hole. The next thing you know, you’ve just spent hours of your life trying to find any information on the ancient Greek artisans who hustled their goods in the agora and why slavery was simply accepted as a part of city life for centuries. An email correspondent, learning that I was taking a deep dive into Mumford, urged me to plunge into the four volumes kick-started by Technics and Civilization. And I have to say, given the many months I spent not so much reading The City in History as taking in the many references orbiting its scholarship, I will probably have to wait until perhaps my seventies — should I live that long — for such an enormous undertaking. I could easily see myself as an old bachelor on a beach — filling in crossword puzzles, tendering stories about my misspent youth to any sympathetic ear, respectfully flirting with any lingering divorcée with the decency to not see me as invisible, and carrying along the four Mumford volumes with me (along with whatever will then pass for a tablet to look up all the references) in a satchel.

This is my roundabout way of saying that Lewis Mumford’s The City in History is a wonderfully robust and often grumbly tome from a dude who spent most of his years considering how cities thrive through technological and architectural development. One of the book’s charms is seeing Mumford gradually becoming more pissed off as he gets closer to the modern age. It’s almost as if he resents what the city transformed into in the twentieth century. For example, in a weird aside, Mumford complains about the increased number of windows in residential buildings after the seventeenth century, bemoaning the lack of privacy with a touch of principle rarely remembered by people who grew up with nothing but the Internet’s exhibitionistic cadences. He also has a healthy aversion to the “often disruptive and self-defeating” nature of constant growth. It is, after all, possible for a city or a small town to develop too much. Once cities ditched their walls, there were no longer any physical boundaries to how far any teeming area could spread while arguably becoming lesser the further it rolled along. (See, for example, the anarchic sprawl of Texas today. Everyone from the likes of the Manhattan Institute’s Michael Hendrix to James Howard Kunstler has spoken, in varying degrees of horror, about this endless expansion.) On this point, Mumford pushes back against the myth of the medieval town as a place of static boredom. He points to religious edifices somehow transforming these clusters where, for the first time in history, “the majority of the inhabitants of a city were free men.” Even when mercantile centers dried up as trade died, Mumford points to the limitless evolution of the countryside. Feudalism subsided in favor of a stabler and more available food supply and new forms of home-spun industry that made many of these smaller villages special. Textile industries flourished in northern Italy and not only resulted in innovations such as the spinning wheel, but also in some healthy revolutionary pushback against tyrants — such as the weavers rebelling against the ruling elite in 1370-1371. In short, Mumford argues that a reasonably confined city was capable of nearly anything.

But what of the modern metropolis? The cities that called to people like me as a young man? Mumford’s view was that the enormity of a place like Paris or Rome or London or New York City wasn’t merely the result of technological progress. As he argues:

…the metropolitan phase became universal only when the technical means of congestion had become adequate — and their use profitable to those who manufactured or employed them. The modern metropolis is, rather, an outstanding example of a peculiar cultural lag within the realm of technics itself: namely, the continuation by highly advanced technical means of the obsolete forms and ends of a socially retarded civilization.

Well, that doesn’t sound too nice. So the punks who I jammed with in Mission District warrens and the scrappy filmmakers piecing together stories and the bizarre theatre we were putting on while eating ramen and Red Vines were cultural atavists? Gee, thanks, Lewis! Would Mumford apply this same disparaging tone to the CBGB punk crowd and artists who flourished in the East Village and arguably altered the trajectory of popular music? Or, for that matter, the 1990s hip-hop artists who flourished in Bed-Stuy and Compton? This is where Mumford and I part ways. Who are any of us to dictate what constitutes cultural lag? In my experience, obsolete forms tend to square dance with current mediums and that’s usually how the beat rolls on. Small wonder that Jane Jacobs and Mumford would get involved in a philosophical brawl that lasted a good four decades.

It’s frustrating that, for all the right criticism Mumford offers, he can be a bit of a dowdy square. He’s so good at showing us how the office building, as we still know it today, originated in Florence thanks to Giorgio Vasari. It turns out that this amazing Italian Renaissance man wasn’t just committed to corridors. He designed an interior with an open-floor loggia — those reception areas that can now be found in every damned bureaucratic entity. We now have someone to blame for them! Mumford offers us little details — such as the tendency of early cities to repave streets over the layers of trash that had been thrown out over the previous twenty years. This resulted in developments such as doorways increasingly becoming lower — often submerged beneath the grade entirely — as history carried on. There are very useful asides in Mumford’s book on the history of multistory buildings. We learn how Roman baths and gymnasiums did make efforts to accommodate the rabble, despite the rampant exploitation of humans. Calvino was only scratching the surface. As long as cities have been around, humans have created new structures and new innovations. For all we know, the coronavirus pandemic could very well lead to some urban advancement that humankind had hitherto never considered.

Because of all this, I can’t square Mumford’s elitism with the beautiful idealism that he lays down here:

The final mission of the city is to further man’s cautious participation in the cosmic and the historic process. Through its own complex and enduring structure, the city vastly augments man’s ability to interpret these processes and take an active, formative part in them, so that every phase of the drama it stages shall have, to the highest degree possible, the illumination of consciousness, the stamp of purpose, the color of love. That magnification of all the dimensions of life, through emotional communion, rational communication, technological mastery, and, above all, dramatic representation, has been the supreme office of the city in history. And it remains the chief reason for the city’s continued existence.

Who determines the active and formative development of the city? Do we leave it to anarchy? Do we acknowledge the numerous forces duking it out over who determines the topography? I can certainly get behind Mumford railing against mercantilism. But who establishes the ideal? One of the most underrated volumes contending with such a struggle between social community and the kind of “high-minded” conservative finger-wagging that Mumford too often espouses is Samuel R. Delany’s excellent book, Times Square Red, Times Square Blue, a brilliant portrait of the undeniable “color of love” practiced in the Times Square adult movie theatres through the mid-1990s — until Mayor Giuliani declared war on what he deemed unseemly. In a sidebar, Delany, buttressing Jane Jacobs, observes that the problem here is that this sort of idealism assumes two conditions: (1) that cities are fundamentally repugnant places and that we must therefore hide the poor and the underprivileged and (2) that the city is defined by the big and the monumental.

The sheer amount of suffering undergone by the impoverished is something that Mumford, to his credit, does broach — particularly the unsanitary conditions that those in London and New York lived in as these cities expanded. (For more on the working stiffs and those who struggled, especially in New York, I highly recommend Luc Sante’s excellent book Low Life.) But while Mumford is willing to go all in on the question of bigness, he’s a little too detached and diffident on the issue of how the have-nots contribute to urban growth, although he does note how “the proletariat had their unpremeditated revenge” on the haves as New York increasingly crammed people like sardines into airless cloisters. And, as such, I found myself pulling out my Jane Jacobs books, rereading passages, and saying, with my best Mortal Kombat announcer voice, “Finish him!”

But maybe I’m being a little too hard on Mumford. The guy wasn’t a fan of architect Leon Battista Alberti’s great rush for suburban development, with this funny aside: “one must ask how much he left for the early twentieth-century architect to invent.” Mumford had it in for Le Corbusier and his tower-centric approach to urban planning (which is perhaps best observed in Chandigarh, India — a place where Le Corbusier was given free rein), but he was also a huge fan of Ebenezer Howard and his “Garden City” movement, whereby Howard suggested that some combination of city and country represented the best living conditions. Even if you side with Jane Jacobs, as I do, on the whole Garden City question, believing that there can be some real beauty in staggering urban density, you can’t help but smile at Mumford’s prickliness:

For the successor of the paleotechnic town has created instruments and conditions potentially far more lethal than those which wiped out so many lives in the town of Donora, Pennsylvania, through a concentration of toxic gases, or that which in December 1952 killed in one week an estimated five thousand extra of London’s inhabitants.

Oh, Mumford! With endearingly bleak observations like this, why couldn’t you be more on the side of the people?

Next Up: Paul Fussell’s The Great War and Modern Memory!

Battle Cry of Freedom (Modern Library Nonfiction #77)

(This is the twenty-third entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Why We Can’t Wait.)

In his 1966 essay “The White Man’s Guilt,” James Baldwin — never a man to mince words or to refrain from expressing searing clarity — declared that white Americans were incapable of facing the deep wounds suppurating in the national fabric because of their refusal to acknowledge their complicity in abusive history. Pointing to the repugnant privilege that, even today, hinders many white people from altering their lives, their attitudes, and the baleful bigotry summoned by their nascent advantages, much less their relationships to people of color, Baldwin noted:

For history, as nearly no one seems to know, is not merely something to be read. And it does not refer merely, or even principally, to the past. On the contrary, the great force of history comes from the fact that we carry it within us, are unconsciously controlled by it in many ways, and history is literally present in all that we do. It could scarcely be otherwise, since it is to history that we owe our frames of reference, our identities, and our aspirations.

Fifty-four years after Baldwin, America now finds itself mired in its most seminal (and long-delayed) civil rights movement in decades, awakened from its somnambulistic malaise through the neck-stomping snap of systemic racism casually and ignobly practiced by crooked cops who are afforded impunity rather than significant consequences. The institution of slavery has been replaced by the indignities of racial profiling, income disparity, wanton brutality, constant belittlement, and a crass cabal of Karens who are more than eager to rat out people of color so that they can scarf down their soy milk lattes and avocado toast, rarely deviating from the hideous cues that a culture — one that prioritizes discrimination first and equality last — rewards with all the perfunctory mechanics of a slot machine jackpot.

Thus, one must approach James McPherson’s mighty and incredibly impressive Civil War volume with mindfulness and assiduity. It is not, as Baldwin says, a book that can merely be read — even though it is something of a miracle that McPherson has packed as much detail and as many considerations as he has within more than 900 pages. McPherson’s volume is an invaluable start for anyone hoping to move beyond mere reading, to significantly considering the palpable legacy of how the hideous shadow of white supremacy and belittlement still plagues us in the present. Why does the Confederate flag still fly? Why do imperialist statues — especially monuments that celebrate a failed and racist breakaway coalition of upstart states rightly starved and humiliated and destroyed by Grant and Sherman — still stand? Battle Cry of Freedom beckons us to pay careful attention to the unjust and bestial influences that erupted before the war and that flickered afterwards. It is thankfully not just a compilation of battle summaries — although it does do much to mark the moments in which the North was on the run and geography and weather and lack of supplies often stood in its way. The book pays welcome scrutiny to the underlying environment that inspired the South to secede and required a newly inaugurated Lincoln to call for 75,000 volunteers a little more than a month after he had been sworn in as President and just after the South Carolina militia had attacked Fort Sumter.

* * *

It was technological innovation in the 1840s and the 1850s — the new machines putting out watches and furniture and bolts and damn near anything onto the market at a rapid clip previously unseen — that helped sow the seeds of labor unrest. To use the new tools, a worker had to go to a factory rather than operating out of his home. To turn the most profit possible and sustain his venal wealth, the aspiring robber baron had to exploit the worker at subhuman wages. The South was more willing to enslave people. A barbaric racist of that era ranting in a saloon could, much like one of Trump’s acolytes today, point to the dip in the agricultural labor force from 1800 to 1860. In the North, 70% of labor was in agriculture in 1800, but this fell to 40% by 1860. In the South, meanwhile, the rate remained steady at 80%. But this, of course, was an artificial win built on the backs of Black lives.

You had increasing territory in the West annexed to the United States and, with this, vivacious crusaders who were feeling bolder about their causes. David Wilmot, a freshman Congressional Representative, saw the Mexican War as an opportunity to lay down a proviso on August 8, 1846. “[N]either slavery nor involuntary servitude shall ever exist in any part of said territory” were the words that Wilmot added to an appropriations bill amendment. Like any politician, Wilmot was interested in settling scores. The Wilmot Proviso was as much the product of long pent-up frustration among a cluster of Northern Democrats, men who cared more about holding onto power than about pushing abolition forward, as it was an antislavery measure. The proviso kept being reintroduced and the Democratic Party of the time — much of it composed of racists from the South — began to splinter.

Northern Democrats shifted their support from the Wilmot Proviso to an idea known as popular sovereignty, which placed the decision on whether to sustain or abolish slavery into the hands of settlers moving into the new territories. But Wilmot’s more universal abolition approach still had the enthusiastic support of northern Whigs. The Whigs, for those who may not recall, were essentially middle-class conservatives living it large. They represented the alternative to Democrats before the Republican Party was created in 1854. The Whigs emerged from the ashes of the Nullification Crisis of 1832 — which you may recall me getting into when I was tackling Herbert Croly a few years ago. Yes, Andrew Jackson was responsible for (a) destroying the national bank, thus creating an economically volatile environment and (b) creating enough fury for Henry Clay and company to form an anti-Jackson opposition party. What’s most interesting here is that opposing Jackson also meant opposing one of his pet causes: slavery. And, mind you, these were pro-business conservatives who wanted to live the good life. This is a bit like day trading bros dolled up in Brooks Brothers suits suddenly announcing that they want universal healthcare. Politics may make strange bedfellows, but sometimes a searing laser directed at an enemy who has jilted you in the boudoir creates an entirely unexpected bloc.

Many of the “liberals” of that era, especially in the South, were very much in favor of keeping slavery going. (This historical fact has regrettably caused many Republicans to chirp “Party of Lincoln!” in an attempt to excuse the more fascistic and racist overtures that these same smug burghers wallow in today.) Much like Black Lives Matter today and the Occupy Wall Street movement nine years ago, a significant plurality of the Whigs, who resented the fact that their slave-owning presidential candidate Zachary Taylor refused to take a position on the Wilmot Proviso, were able to create a broad coalition at the Free Soil convention of 1848. Slavery then became one of the 1848 presidential election’s major issues.

In Battle Cry, McPherson nimbly points to how all of these developments led to a great deal of political unrest that made the Civil War inevitable. Prominent Republican William H. Seward (later Lincoln’s Secretary of State) came out swinging against slavery, claiming that compromise on the issue was impossible. “You cannot roll back the tide of social progress,” he said. The 1854 Kansas-Nebraska Act (authored by Stephen Douglas) repealed the Missouri Compromise, which in turn led to “Bleeding Kansas” — a series of armed and violent struggles over the legality of slavery that carried on for the next seven years. (Curiously, McPherson downplays Daniel Webster’s 1850 turncoat “Seventh of March” speech, which signaled Webster’s willingness to enforce the Fugitive Slave Act and forever altered his base and political career.) And while all this was happening, cotton prices in the South were rising and, as the faction of Southern unionists dwindled, the Southern states increasingly considered secession. The maps of 1860 reveal the inescapable problem.

* * *

The Whigs were crumbling. Enter Lincoln, speaking eloquently on a Peoria stage on October 16, 1854, and representing the future of the newly minted Republican Party:

When the white man governs himself that is self-government; but when he governs himself, and also governs another man, that is more than self-government — that is despotism. If the negro is a man, why then my ancient faith teaches me that “all men are created equal;” and that there can be no moral right in connection with one man’s making a slave of another.

Enter the Know Nothings, a third party filling a niche left by the eroding Whigs and the increasingly splintered Democratic Party. The Know Nothings were arguably the Proud Boys of their time. They ushered in a wave of nationalism and xenophobia that was thoughtfully considered by the Smithsonian‘s Lorraine Boissoneault. What killed the Know Nothings was their failure to take a stand on slavery. You couldn’t afford to stay silent on the issue when the likes of Dred Scott and John Brown were in the newspapers. The Know Nothings further scattered political difference to the winds, giving Lincoln the opportunity to unite numerous strands under the new Republican Party and win the Presidency during the 1860 election, despite not being on the ballot in ten Southern states.

With Lincoln’s win, seven slave states seceded from the Union, and the Confederacy was born. Historians have been arguing for years over the precise reasons for this disunion. If you’re a bit of a wonk like me, I highly recommend this 2011 panel in which three historians offer entirely different takeaways. McPherson, to his credit, allows the events to unfold and refrains from too much editorializing, although throughout the book he does speak from the perspective of the Union.

* * *

As I noted when I tackled John Keegan’s The Face of Battle, one of my failings as an all-encompassing dilettante resides with military history, which I find about as pleasurable to read as sprawling myself naked, sans hat or suntan lotion, upon some burning metal bed on a Brooklyn rooftop during a hot August afternoon — watching tar congeal over my epidermis until I transform into some ugly onyx crust while various spectators, saddled with boredom and the need to make a quick buck, film me with their phones and later email me demands to pay up in Bitcoin, lest my mindless frolicking be publicly uploaded to the Internet and distributed to every pornographic website from here to Helsinki.

That’s definitely laying it on thicker than you need to hear. But it is essential that you understand just how much military history rankles me.

Anyway, despite my great reluctance to don a tricorne of any sort, McPherson’s descriptions of battles (along with the accompanying illustrations) did somehow jolt me out of my aversion and make me care. Little details — such as P.G.T. Beauregard designing a new Confederate battle flag after troops could not distinguish the Confederate “stars and bars” banner from the Union flag in the fog of battle — helped to clarify the specific innovations brought about by the Civil War. It also had never occurred to me how much the history of ironclad vessels began with the Civil War, thanks in part to the eccentric marine engineer John Ericsson, who designed the famed USS Monitor as a counterpoint to the formidable Confederate vessel Virginia, which had been created to break the Union blockade at Hampton Roads. What was especially amazing about Ericsson’s ship was that it was built and launched rapidly — without testing. After two hours of fighting, the Monitor finally breached the Virginia’s hull with a 175-pound shot, operating with barely functioning engines. For whatever reason, McPherson’s vivid description of this sea battle reminded me of the Mutara Nebula battle at the end of Star Trek II: The Wrath of Khan.

But even for all of McPherson’s synthesizing legerdemain, the one serious thing I have to ding him on is his failure to describe the horrors of slavery in any form. Even William L. Shirer in The Rise and Fall of the Third Reich devoted significant passages to depicting what was happening in the Holocaust death camps. Despite my high regard for McPherson’s ability to find just the right events to highlight in the Civil War timeline, and his brilliantly subtle way of depicting the shifting fortunes of the North and the South, can one really accept a volume about the Civil War without a description of slavery? McPherson devotes more time to Andersonville’s brutal statistics (a prisoner mortality rate of 29% and so forth) than to the lived experience of the enslaved before closing his paragraph with this sentence:

The treatment of prisoners during the Civil War was something that neither side could be proud of.

But what of the treatment of Black people? Why does this not merit so much as a paragraph? McPherson is so good at details — emphasizing, for example, that Grant’s pleas to have all prisoners, white and Black, exchanged in the cartel actually came a year after negotiations had stopped. He’s good enough to show us how southern historians have perceived events (often questionably). Why then would he shy away from the conditions of slavery?

The other major flaw: Why would McPherson skim over the Battle of Gettysburg in just under twenty pages? This was, after all, the decisive battle of the war. McPherson seems to devote more time, for example, to the Confederate raids of 1862. And while all of this is useful to understanding the war, the imbalance is still inexplicable to me.

But these are significant complaints about a book that was published in 1988 and that is otherwise a masterpiece. Still, I’m not the only one out here kvetching about this problem. The time has come for a new historian — ideally someone who isn’t a white male — to step up to the challenge and outdo Ken Burns and James McPherson (and Shelby Foote, whom I’ll be getting to when we hit MLNF #15 in perhaps a decade or so) and fully convey the evils and brutality of slavery and why this war both altered American life and exacerbated the problems we are still facing today.

Next Up: Lewis Mumford’s The City in History!

Why We Can’t Wait (Modern Library Nonfiction #78)

(This is the twenty-second entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Rise of Theodore Roosevelt.)

It was a warm day in April when Dr. Martin Luther King was arrested. It was the thirteenth and the most important arrest of his life. King, wearing denim work pants and a gray fatigue shirt, was manacled along with fifty others that afternoon, joining close to a thousand more who had bravely submitted their bodies over many weeks to make a vital point about racial inequality and the unquestionable inhumanity of segregation.

The brave people of Birmingham had tried so many times before. They had attempted peaceful negotiation with a city that had closed sixty public parks rather than uphold the federal desegregation law. They had talked with businesses that had debased black people by denying them restaurant service and asking them to walk through doors labeled COLORED. Some of these atavistic signs had been removed, only for the placards to be returned to the windows once the businesses believed that their hollow gestures had been fulfilled. And so it became necessary to push harder — peacefully, but harder. The Birmingham police unleashed attack dogs on children and doused peaceful protesters with high-pressure water hoses and seemed hell-bent on debasing and arresting the growing throngs who stood up and said, without raising a fist and always believing in hope and often singing songs, “Enough. No more.”

There were many local leaders who claimed that they stood for the righteous, but who turned against King. White leaders in Birmingham believed — not unlike pro-segregation Governor George Wallace just three months earlier — that King’s nonviolent protests against segregation would incite a torrent of violence. But the violence never came from King’s well-trained camp; it emerged instead from the savage police force upholding an unjust law. King had been very careful with his activists, asking them to sign a ten-point Commitment Card that included these two vital points:

6. OBSERVE with both friend and foe the ordinary rules of courtesy.

8. REFRAIN from the violence of fist, tongue, or heart.

Two days before King’s arrest, Bull Connor, the racist Birmingham Commissioner of Public Safety and a man so vile and heartless that he’d once egged on Klansmen to beat Freedom Riders to a pulp for fifteen minutes as the police stood by and did not intervene, had issued an injunction against the protests. He raised the bail bond from $200 to $1,500 for those who were arrested. (That’s $10,000 in 2019 dollars. When you consider the lower pay and the denied economic opportunities for Birmingham blacks, you can very well imagine what a cruel and needless punishment this was for many protesters who lived paycheck to paycheck.)

And so on Good Friday, it became necessary for King, along with his invaluable fellow leaders Ralph Abernathy and Fred Shuttlesworth, to walk directly to Birmingham Jail and sing “We Shall Overcome.” King took a very big risk in doing so. But he needed to set an example for civil disobedience. He needed to show that he was not immune to the sacrifices of this very important fight. The bondsman who had been providing bail for the demonstrators told King that he was out, just as King pondered the campaign’s nearly depleted funds. In jail, King would not be able to use his contacts and raise the money that would keep his campaign going. Despite all this, and this is probably one of the key takeaways from this remarkable episode in political history, King was dedicated to practicing what he preached. As he put it:

How could my failure now to submit to arrest be explained to the local community? What would be the verdict of the country about a man who had encouraged hundreds of people to make a stunning sacrifice and then excused himself?

Many who watched this noble march, the details of which are documented in S. Jonathan Bass’s excellent book Blessed Are the Peacemakers, dressed in their Sunday best out of respect for King’s efforts. Police crept along with the marchers before Connor gave the final order. Shuttlesworth had left earlier. King, Abernathy, and their fellow protestors were soon surrounded by paddy wagons and motorcycles and a three-wheel motorcart. They dropped to their knees in peaceful prayer. The head of the patrol squeezed the back of King’s belt and escorted him into a police car. The police gripped the back of Abernathy’s shirt and steered him into a van.

King was placed in an isolation cell. Thankfully, he did not suffer physical brutality, but the atmosphere was dank enough to diminish a weaker man’s hope. As he wrote, “You will never know the meaning of utter darkness until you have lain in such a dungeon, knowing that sunlight is streaming overhead and still seeing only darkness below.” Jail officials refused a private meeting between King and his attorney. Wyatt Tee Walker, King’s chief of staff, sent a telegram to President Kennedy. The police did not permit King to speak to anyone for at least twenty-four hours.

As his confidantes gradually gained permission to speak with him, King became aware of a statement published by eight white clergy members in Birmingham — available here. This octet not only urged the black community to withdraw support for these demonstrations, but risibly suggested that King’s campaign was “unwise and untimely” and that the matter should be settled in the courts. They completely missed the point of what King was determined to accomplish.

King began drafting a response, scribbling in the margins of a newspaper. Abernathy asked King if the police had given him anything to write on. “No,” King replied, “I’m using toilet paper.” Within a week, he had paper and a notepad. King’s “Letter from Birmingham Jail,” contained in his incredibly inspiring book Why We Can’t Wait, is one of the most powerful statements ever written about civil rights. It nimbly argues for the need to take direct action rather than wait for injustice to be rectified. It remains an essential text for anyone who professes to champion humanity and dignity.

* * *

King’s “Letter” against the eight clergymen could just as easily apply to many “well-meaning” liberals today. He expertly fillets the white clergy for their lack of concern, faulting them for resting content with “the superficial kind of social analysis that deals with effects and does not grapple with underlying causes.” He points out that direct action is, in and of itself, a form of negotiation. The only way that an issue becomes lodged in the national conversation is when it becomes dramatized. King advocates a “constructive, nonviolent tension that is necessary for growth” — something that seems increasingly difficult for people on social media to understand as they block viewpoints that they vaguely disagree with and cower behind filter bubbles. He is also adamantly, and rightly, committed to not allowing anyone’s timetable to get in the way of fighting a national cancer that had then ignobly endured for 340 years. He distinguishes between the just and the unjust law, pointing out that “one has a moral responsibility to disobey unjust laws.” But he is very careful and very clear about his definitions:

An unjust law is a code that a numerical or power majority group compels a minority group to obey but does not make binding on itself. This is difference made legal. By the same token, a just law is a code that a majority compels a minority to follow and that it is willing to follow itself. This is sameness made legal.

This is a cogent philosophy applicable to many ills beyond racism. This is radicalism in all of its beauty. This is precisely what made Martin Luther King one of the greatest Americans who ever lived. For me, Martin Luther King remains a true hero, a model for justice, humility, peace, moral responsibility, organizational acumen, progress, and doing what’s right. But it also made King dangerous enough for James Earl Ray, a staunch Wallace supporter, to assassinate him on April 4, 1968. (Incidentally, King’s family have supported Ray’s efforts to prove his innocence.)

* * *

Why We Can’t Wait‘s scope isn’t just limited to Birmingham. The book doesn’t hesitate to cover a vast historical trajectory that somehow stumps for action in 1963 and in 2019. It reminds us that much of what King was fighting for must remain at the forefront of today’s progressive politics, but also must involve a government that acts on behalf of the people: “There is a right and a wrong side in this conflict and the government does not belong in the middle.” Unfortunately, the government has doggedly sided against human rights and against the majestic democracy of voting. While Jim Crow has thankfully been abolished, the recent battle to restore the Voting Rights Act of 1965, gutted by the Supreme Court in 2013, shows that systemic racism remains very much alive and that the courts in which the eight white Birmingham clergy professed such faith and fealty are stacked against African-Americans. (A 2018 Harvard study discovered that counties freed from federal oversight saw a dramatic drop in minority voter turnout.)

Much as the end of physical slavery inspired racists to conjure up segregation as a new method of diminishing African-Americans, so too do we see such cavalier and dehumanizing “innovations” in present day racism. Police shootings and hate crimes are all driven by the same repugnant violence that King devoted his life to defeating.

The economic parallels between 1963 and 2019 are also distressingly acute. In Why We Can’t Wait, King noted that there were “two and one-half times as many jobless Negroes as whites in 1963, and their median income was half that of the white man.” Fifty-six years later, the Bureau of Labor Statistics informs us that the unemployment rate for African Americans is nearly twice that of whites, even in a flush economic time with low overall unemployment, and the U.S. Census Bureau reports that the median household income for African-Americans in 2017 was $40,258 compared to $68,145 for whites. In other words, a black family now makes only 59% of the median income earned by a white family.

If these statistics are supposed to represent “progress,” then it’s clear that we’re still making the mistake of waiting. These are appalling and unacceptable baby steps towards the very necessary racial equality that King called for. White Americans continue to ignore these statistics and the putatively liberal politicians who profess to stand for fairness continue to demonstrate how tone-deaf they are to feral wrongs that affect real lives. As Ashley Williams learned in February 2016, white Democrats continue to dismiss anyone who challenges them on their disgraceful legacy of incarcerating people of color. The protester is “rude,” “not appropriate,” or is, in a particularly loaded gerund, “trespassing.” “Maybe you can listen to what I have to say” was Hillary Clinton’s response to Williams, to which one rightfully replies in the name of moral justice, “Hillary, maybe you’re the one here who needs to listen.”

Even Kamala Harris, now running for President, has tried to paint herself as a “progressive prosecutor,” when her record reveals clear support for measures that actively harm the lives of black people. In 2015, Harris opposed a bill that demanded greater probing into police officer shootings. That same year, she refused to support body cams, only to volte-face with egregious opportunism just ten days before announcing her candidacy. In the case of George Gage, Harris held back key exculpatory evidence that might have freed a man who did not have a criminal record. Gage was forced to represent himself in court and is now serving a 70-year sentence. In upholding these savage inequities, I don’t think it’s a stretch to out Kamala Harris as a disingenuous fraud. Like many Democrats who pay mere lip service to policies that uproot lives, she is not a true friend to African Americans, much less humanity. It was hardly a surprise when Black Lives Matter’s Johnetta Elzie declared that she was “not excited” about Harris’s candidacy back in January. After rereading King and being reminded of the evils of casual complicity, I can honestly say that, as someone who lives in a neighborhood where the police dole out regular injustices to African-Americans, I’m not incredibly thrilled about Harris either.

But what we do have in this present age is the ability to mobilize and fight, to march in the streets until our nation’s gravest ills become so ubiquitously publicized that they can no longer be ignored. What we have today is the power to vote and to not settle for any candidate who refuses to heed the realities that are presently eating our nation away from the inside. If such efforts fail or the futility of protesting makes one despondent, one can still turn to King for inspiration. King sees the upside in a failure, galvanizing the reader without ever sounding like a Pollyanna. Pointing to the 1962 sit-ins in Albany, Georgia, King observes that, while restaurants remained segregated after months of protest, the activism did result in more African-Americans voting and Georgia at long last electing “the first governor [who] pledged to respect and enforce the law equally.”

It’s sometimes difficult to summon hope when the political climate presently seems so intransigent, but I was surprised to find myself incredibly optimistic and fired up after rereading Why We Can’t Wait for the first time in more than two decades. This remarkable book from a rightfully towering figure seems to have answered every argument that milquetoasts produce against radicalism. No, we can’t wait. We shouldn’t wait. We must act today.

Next Up: James M. McPherson’s Battle Cry of Freedom!

The Rise of Theodore Roosevelt (Modern Library Nonfiction #79)

(This is the twenty-first entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Studies in Iconology.)

One of many blistering tangerines contained within Mark Twain’s juicy three-volume Autobiography involves his observations on Theodore Roosevelt: “We have never had a President before who was destitute of self-respect and of respect for his high office; we have had no President before who was not a gentleman; we have had no President before who was intended for a butcher, a dive-keeper or a bully, and missed his mission by compulsion of circumstances over which he had no control.”

He could just as easily have been discussing the current doddering charlatan now forcing many otherwise respectable citizens into recuperative nights of heavy drinking and fussy hookups with a bespoke apocalyptic theme, but Twain’s sentiments do say quite a good deal about the cyclical American affinity for peculiar outsiders who resonate with a populist base. As I write these words, Bernie Sanders has just decided to enter the 2020 Presidential race, raising nearly $6 million in 24 hours and angering those who perceive his call for robust social democracy to be unrealistic, along with truth-telling comedians who are “sick of old white dudes.” Should Sanders run as an independent, the 2020 presidential race could very well be a replay of Roosevelt’s Bull Moose Party run in 1912.

Character ultimately distinguishes a Chauncey Gardiner couch potato from an outlier who makes tangible waves. And it is nearly impossible to argue that Teddy Roosevelt, while bombastic in his prose, often ridiculous in his obsessions, and pretty damn nuts when it came to the Rough Riders business in Cuba, did not possess it. Edmund Morris’s incredibly compelling biography, while subtly acknowledging Teddy’s often feral and contradictory impulses, suggests that Roosevelt was not only the man that America wanted and perhaps needed, but also a man who had the good fortune of being in the right place at the right time. Had not Vice President Garret Hobart dropped dead because of a bum ticker on November 21, 1899, and had not a sour New York Republican boss named Tom Platt been so eager to run Teddy out of Albany, there is a good chance that Roosevelt might have ended up as a serviceable two-term Governor of New York, perhaps a brasher form of Nelson Rockefeller or an Eliot Spitzer who knew how to control his zipper. Had not an anarchist plugged President McKinley twice at the Temple of Music, it is quite possible that Roosevelt’s innovative trust busting and his work on food safety and national parks, to say nothing of his crazed obsession with military might and giving the United States a new role as international police force, would have been delayed or derailed altogether.

What Roosevelt had, aside from remarkable luck, was a relentless energy which often exhausts the 21st century reader nearly as much as it fatigued those within Teddy’s orbit. Here is a daily timetable of Teddy’s activities when he was running for Vice President, which Morris quotes late in the book:

7:00 A.M. Breakfast 
7:30 A.M. A speech
8:00 A.M. Reading a historical work
9:00 A.M. A speech
10:00 A.M.  Dictating letters
11:00 A.M. Discussing Montana mines
11:30 A.M. A speech
12:00 Reading an ornithological work
12:30 P.M. A speech
1:00 P.M. Lunch
1:30 P.M. A speech
2:30 P.M. Reading Sir Walter Scott
3:00 P.M. Answering telegrams
3:45 P.M. A speech
4:00 P.M. Meeting the press
4:30 P.M. Reading
5:00 P.M. A speech
6:00 P.M. Reading
7:00 P.M. Supper
8-10 P.M. Speaking
11:00 P.M. Reading alone in his car
12:00 To bed

That Roosevelt was able to do so much in an epoch before instant messages, social media, vast armies of personal assistants, and Outlook reminders says a great deal about how he ascended so rapidly to great heights. He could dictate an entire book in three months, while also spending his days climbing mountains and riding dozens of miles on horseback (much to the chagrin of his exhausted colts). Morris suggests that much of this energy was forged from the asthma he suffered as a child. Standing initially in the shadow of his younger brother Elliott (whose later mental collapse he callously attempted to cover up to preserve his reputation), Teddy spent nearly his entire life doing, perhaps sharing Steve Jobs’s “reality distortion field” in the wholesale denial of his limitations:

In between rows and rides, Theodore would burn off his excess energy by running at speed through the woods, boxing and wrestling with Elliott, hiking, hunting, and swimming. His diary constantly exults in physical achievement, and never betrays fear that he might be overtaxing his strength. When forced to record an attack of cholera morbus in early August, he precedes it with the phrase, “Funnily enough….”

Morris is thankfully sparing about whether such superhuman energy (which some psychological experts have suggested to be the result of undiagnosed bipolar disorder) constitutes genius, only reserving the word for Roosevelt in relation to his incredible knack for maintaining relations with the press — seen most prominently in his fulsome campaign speeches and the way that he courted journalistic reformer Jacob Riis during his days as New York Police Commissioner and invited Riis to accompany him on his nighttime sweeps through various beats, where Roosevelt micromanaged slumbering cops and any other layabout he could find. The more fascinating question is how such an exuberant young autodidact, a voracious reader with preternatural recall eagerly conducting dissections around the house when not running and rowing his way with ailing lungs, came to become involved in American politics.

Some of this had to do with his hypergraphia, his need to inhabit the world, his indefatigable drive to do everything and anything. Some of it had to do with deciding to attend Columbia Law School so he could forge a professional career with his new wife Alice Hathaway Lee, who had quite the appetite for social functions (and whose inner life, sadly, is only superficially examined in Morris’s book). But much of it had to do with Roosevelt regularly attending Morton Hall, the Republican headquarters for his local district. Despite being heckled for his unusual threads and side-whiskers, Roosevelt kept showing up until he was accepted as a member. The Roosevelt family disapproved. Teddy reacted in anger. And from that moment forward, Morris writes, Roosevelt desired political power for the rest of his life. Part of this had to do with the need for family revenge. Theodore Roosevelt, Sr. suffered a swift decline in health (and quickly died) after Roscoe Conkling and New York State Republicans set out to ruin him over a customs collector position.

These early seeds of payback and uncompromising individualism grew Roosevelt into a fiery oleander who garnered a rep as a fierce and feisty New York State Assemblyman: the volcanic fuel source that was to sustain him until mortality and dowdiness sadly caught up with him during the First World War. But Roosevelt, like Lyndon B. Johnson later with the Civil Rights Act (documented in an incredibly gripping chapter in Robert A. Caro’s Master of the Senate), did have a masterful way of persuading people to side with him, often through his energy and speeches rather than creepy lapel-grabbing. As New York Police Commissioner, Roosevelt upheld the unpopular blue laws and, for a time, managed to get both the grumbling bibulous public and the irascible tavern keepers on his side. Still, Roosevelt’s pugnacity and tenacity were probably more indicative of the manner in which he fought his battles. He took advantage of any political opportunity — such as making vital decisions while serving as Acting Secretary of the Navy without consulting his superior John Davis Long. But he did have a sense of honor, seen in his refusal to take out his enemy Andrew D. Parker when given a scandalous lead during a bitter battle in New York City (the episode was helpfully documented by Riis) and, as a New York State Assemblyman, in voting with Democrats on March 7, 1883 to sustain then Governor Grover Cleveland’s veto of the Five-Cent Bill once the bill was found to be unconstitutional. Perhaps it was his often impulsive instincts, punctuated by an ability to consider the consequences of any action as it was being carried out, that made him, at times, a remarkable leader. Morris documents one episode during Roosevelt’s stint as Assistant Secretary of the Navy in which he was trying to build up an American navy and swiftly snapped up a Brazilian vessel without a letter. When the contract was drafted for the ship, dealer Charles R. Flint noted, “It was one of the most concise and at the same time one of the cleverest contracts I have ever seen.”

Morris is to be praised for writing about such a rambunctious figure with class, care, and panache. Seriously, this dude doesn’t get enough props for busting out all the biographical stops. If you want to know more about Theodore Roosevelt, Morris’s trilogy is definitely the one you should read. Even so, there are a few moments in this biography in which Morris veers modestly into extremes that strain his otherwise eloquent fairness. He quotes from “a modern historian” who asks, “Who in office was more radical in 1899?” One zips to the endnotes, only to find that the “historian” in question was none other than the partisan John Allen Gable, who was once considered to be the foremost authority on Teddy Roosevelt. Morris also observes that “ninety-nine percent of the millions of words he thus poured out are sterile, banal, and so droningly repetitive as to defeat the most dedicated researcher,” and while one opens a bountiful heart to the historian prepared to sift through the collected works of a possible madman, the juicy bits that Morris quotes are entertaining and compelling. Also, to be fair, a man driven to dictate a book-length historical biography in a month is going to have a few runts in the litter.

But these are extremely modest complaints for an otherwise magnificent biography. Edmund Morris writes with a nimble focus. His research is detailed, rigorous, and always on point, and he has a clear enthusiasm for his subject. Much of Morris’s fall from grace has to do with the regrettable volume, Dutch, in which Morris abandoned his exacting acumen and inserted a version of himself into a biography of Reagan. This feckless boundary-pushing even extended into the endnotes, in which Morris inserted references to imaginary people. He completely overlooked vital periods in Reagan’s life and political career, such as the Robert Bork episode. Given the $3 million advance and the unfettered access that Morris had to Reagan, there was little excuse for this. Yet despite returning valiantly to Roosevelt in two subsequent volumes (without the weirdass fictitious asides), Morris has been given the Wittgenstein treatment (“Whereof one cannot speak, thereof one must be silent”) by his peers and his colleagues. And I don’t understand why. Morris, much like Kristen Roupenian quite recently, seems to have been needlessly punished for being successful and not living up to a ridiculous set of expectations. But The Rise of Theodore Roosevelt, which rightfully earned the Pulitzer Prize, makes the case on its own merits that Morris is worthy of our time, our consideration, and our forgiveness and that the great Theodore Roosevelt himself is still a worthwhile figure for contemporary study.

Next Up: Martin Luther King’s Why We Can’t Wait!

Studies in Iconology (Modern Library Nonfiction #80)

(This is the twentieth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Face of Battle.)

Titian’s Sacred and Profane Love (pictured above) is one of my favorite paintings of the 16th century, in large part because its unquestionable beauty is matched by its bountiful and alluring enigma. We see two versions of love at opposing ends of a fountain — one nearly naked without apology, but still partially clad in a windswept dark salmon pink robe and holding an urn of smoke as she languorously (and rebelliously?) leans on the edge of a fountain; meanwhile the other Love sits in a flowing white gown on the other end, decidedly more dignified, with concealed legs that are somehow stronger and more illustrious than her counterpart, and disguising a bowl that, much like the Kiss Me Deadly box or the Pulp Fiction suitcase, could contain anything.

We know that the Two Loves are meant to coexist because Titian is sly enough to imbue his masterpiece with a sartorial yin-yang. Profane Love matches Sacred with a coiled white cloth twisting around her waist and slipping down her left leg, while Sacred has been tinctured by Profane’s pink with the flowing sleeve on her right arm and the small slipper on her left foot. Meanwhile, Cupid serves as an oblivious and possibly mercenary middleman, his arm and his eyes deeply immersed in the water and seemingly unconcerned with the Two Loves. We see that the backdrops behind both Loves are promisingly bucolic, with happy rabbits suggesting prolific promiscuity and studly horsemen riding their steeds with forelegs in the air, undoubtedly presaging the stertorous activity to commence sometime around the third date.

Sacred’s backdrop involves a castle situated on higher ground, whereas Profane’s is a wider valley with a village, a tableau that gives one more freedom to roam. The equine motif carries further on Sacred’s side with a horse prancing from Sacred to Profane in the marble etching just in front of the fountain, while Profane’s side features equally ripe rapacity, a near Fifty Shades of Grey moment where a muscled Adonis lusts over a plump bottom, hopefully with consensual limits and safewords agreed upon in advance. Titian’s telling takeaway is that you have to accept both the sublime and the salacious when you’re in love: the noble respect and vibrant valor that you unfurl upon your better half with such gestures as smoothing a strand of hair from the face along with the ribald hunger for someone who is simultaneously desirable and who could very well inspire you to stock up on entirely unanticipated items that produce rather pleasurable vibrations.

There are few works of art that are so dedicated to such a dichotomous depiction of something we all long for. And Titian’s painting endures five centuries later because this Italian master was so committed to minute details that, rather incredibly, remain quite universal about the human condition.

But what the hell does it all mean? We can peer into the canvas for hours, becoming intoxicated by Titian’s fascinating ambiguities. But might there be more helpful semiotics to better grasp what’s going on? Until I read Panofsky’s Studies in Iconology, I truly had no clue that Titian had been influenced by Bembo’s Asolani or that the Two Loves were a riff on Cesare Ripa’s notion of Eternal Bliss and Transient Bliss, which was one of many efforts by the Neoplatonic movement to wrestle with a human state that occupied two modes of shared existence. Panofsky also helpfully points out that Cupid’s stirring of the fountain water was a representation of love as “a principle of cosmic ‘mixture,’ act[ing] as an intermediary between heaven and earth” and that the fountain can also be looked upon as a revived sarcophagus, meaning that we are also looking at life and love springing from a coffin. And this history gave me additional context with which to expand my own quasi-smartypants, recklessly dilettantish, and exuberantly instinctive appreciation of Titian. In investigating iconology, I recalled my 2016 journey into The Golden Bough (ML NF #90), in which Frazer helpfully pointed to the symbolic commonality of myths and rituals throughout multiple cultures and across human history, and, as I examined how various symbolic figures morphed over time, I became quite obsessed with Father Time’s many likenesses (quite usefully unpacked by Waggish‘s David Auerbach).

Any art history student inevitably brushes up against the wise and influential yet somewhat convoluted views of Erwin Panofsky. Depending upon the degree to which the prof resembles Joseph Mengele in his teaching style, there is usually a pedagogical hazing in which the student is presented with “iconology” and “iconography.” The student winces at both words, so similar in look and sound, and wonders if the distinction might be better understood after several bong hits and unwise dives into late night snacks, followed by desperate texts to fellow young scholars that usually culminate in more debauchery which strays from understanding the text. Well, I’m going to do my best to explicate the difference right now.

The best way to nail down what iconography entails is to think of a painting purely in terms of its visuals and what each of these elements means. Obvious examples of iconography in action include the considerable classroom time devoted to interpreting the green light at the end of The Great Gatsby or the endless possibilities contained within the Mona Lisa‘s smile. It is, in short, being that vociferous museum enthusiast pointing at bowls and halos buried in oil and doing his best to impress with his alternately entertaining and infuriating interpretations. All this is, of course, fair game. But Panofsky is calling for us to think bigger and do better.

Enter iconology, which is more specifically concerned with the context of this symbolism and the precise technical circumstances and historical influences that created it. Let me illustrate the differences between iconography and iconology using Captain James T. Kirk from Star Trek.

Here are the details everyone knows about Kirk. He is married to his ship. He is a swashbuckling adventurer who gets into numerous fights and is frequently seen in a torn shirt. He is also a nomadic philanderer, known to swipe right and hook up with nearly every alien he encounters. (In the episode “Wink of an Eye,” there is a moment that somehow avoided the censors in which Kirk is seen putting on his boots while Deela brushes her hair.) This is the iconography of Kirk that everyone recognizes.

But when we begin to examine the origins of these underlying iconographic qualities, we begin to see that there is a great deal more than a role popularized by William Shatner through booming vocal delivery, spastic gestures, and an unusual Canadian hubris. When Gene Roddenberry created Star Trek, he perceived Captain Kirk as “Horatio Hornblower in Space.” We know that C.S. Forester, author of the Hornblower novels, was inspired by Admiral Lord Nelson and a number of heroic British authors who fought during the Napoleonic Wars. According to Bryan Perrett’s The Real Hornblower, Forester read three volumes of The Naval Chronicle over and over. But Forester eventually hit upon a trope that he identified as the Man Alone — a solitary individual who relies exclusively on his own resources to solve problems and who carries out his swashbuckling, but who is wedded to this predicament.

Perhaps because the free love movement of the 1960s made the expression of sexuality more open, Captain Kirk was both a Man Alone and a prolific philanderer. But Kirk was fundamentally married to his ship, the Enterprise. In an essay collected in Star Trek as Myth, John Shelton Lawrence ties this all into a classic American monomyth, suggesting that Kirk also represented

…sexual renunciation, a norm that reflects some distinctly religious aversions to intimacy. The protagonist in some mythical sagas must renounce previous sexual ties for the sake of their trials. They must avoid entanglements and temptations that inevitably arise from satyrs, sirens, or Loreleis in the course of their travels…The protagonist may encounter sexual temptation symbolizing ‘that pushing, self-protective, malodorous, carnivorous, lecherous fever which is the very nature of the organic cell,’ as Campbell points out. Yet the ‘ultimate adventure’ is the ‘mystical marriage…of the triumphant hero-soul with the Queen Goddess’ of knowledge.

All of a sudden, Captain Kirk has become a lot more interesting! And moments such as Kirk eating the apple in Star Trek II: The Wrath of Khan suddenly make more sense beyond the belabored Project Genesis metaphor. We now see how Roddenberry’s idea of a nomad philanderer and Forester’s notion of the Man Alone actually take us to a common theme of marriage with the Queen Goddess of the World. One could very well dive into the Kirk/Hornblower archetype at length. But thanks to iconology, we now have enough information here to launch a thoughtful discussion — ideally with each of the participants offering vivacious impersonations of William Shatner — in which the assembled brainiacs discuss why the “ultimate adventure” continues to crop up in various cultures and how Star Trek itself was a prominent popularizer of this idea.

Now that we know what iconology is, we can use it — much as Panofsky does in Studies in Iconology — to understand why Piero di Cosimo was wilder and more imaginative than many of his peers. (And for more on this neglected painter, who was so original that he even inspired a poem from Auden, I recommend Peter Schjeldahl’s 2015 New Yorker essay.) Panofsky points out how Piero’s The Finding of Vulcan on Lemnos (pictured above) differs in the way that it portrays the Hylas myth, whereby Hylas went down to the river Ascanius to fetch some water and was ensnared by the naiads who fell in love with his beauty. (I’ve juxtaposed John William Waterhouse’s Hylas and the Nymphs with Piero so that you can see the differences. For my money, Piero edges out Waterhouse’s blunter version of the tale. But I also chose the Waterhouse painting to protest the Manchester Art Gallery’s passive-aggressive censorship from last year. You can click on the above image to see a larger version of both paintings.) For one thing, Piero’s painting features no vase or vessel. There is also no water or river. The naiads are not seductive charmers at all, but more in the Mean Girls camp. And Hylas himself is quite helpless. (The naiad patting Hylas on the head is almost condescending, which adds a macabre wit to this landlocked riff.) Piero is almost the #metoo version of Hylas to Waterhouse’s more straightforward patriarchal approach. And this is largely because Piero not only had a beautifully warped imagination, but was also relying, like many Renaissance painters, upon post-classical commentaries rather than the direct source of the myths themselves. And we are able to see how a slight shift in an artist’s inspiration can produce a sui generis work of art.

Panofsky is on less firm footing when he attempts to apply iconology to sculptures and architecture. His attempts to ramrod Michelangelo into the Neoplatonic school were unpersuasive to me. In analyzing the rough outlines of a monkey just behind two of Michelangelo’s Slaves (the “dying” and the “rebellious” ones) in the Louvre, Panofsky rather simplistically ropes the two slaves into a subhuman class and then attempts to suggest that Ficino’s concept of the Lower Soul — which is a quite sophisticated concept — represents the interpretive smoking gun. This demonstrates the double-edged sword of iconology. It may provide you with a highly specific framework with which to reconsider a great work of art, but it can be just as clumsily mistaken for the absolute truth as any lumbering ideology.

Then again, unless you’re an insufferable narcissist who needs to be constantly reminded how “right” you are, it’s never any fun to discuss art and ideas with people who you completely agree with. Panofsky’s impact on art analysis reminds us that iconology is one method of identifying the nitty-gritty and arguing about it profusely and jocularly for hours, if not decades or centuries.

Next Up: Edmund Morris’s The Rise of Theodore Roosevelt!

The Face of Battle (Modern Library Nonfiction #81)

(This is the nineteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Strange Death of Liberal England.)

Thy fanes, thy temples to thy surface bow,
Commingling slowly with heroic earth,
Broke by the share of every rustic plough:
So perish monuments of mortal birth,
So perish all in turn, save well-recorded worth;

— Lord Byron, Childe Harold’s Pilgrimage

I must confess from the outset that the study of armed human conflict, with its near Spartan fixation on tactics and statistics, has long filled me with malaise. It is among the least sexy subjects that a polymath of any type can devote her attentions to, akin to cracking open a thick, limb-crushing tax code volume written in a way that obliterates all joy and finding a deranged pleasure within this mind-numbingly dull amalgam of numbers and turgid prose. As Margaret Atwood once quipped in a poem, the military historian says, “I don’t ask why, because it is mostly the same.” And when the song remains the same, why would anyone other than a ketamine fiend dance to it?

I’ve long pictured the military historian as some aging jingoistic white male whose idea of a good time involves blasting John Philip Sousa from a set of speakers that should be devoted to happening hip-hop: a lonely and humorless parasite who moves cast-iron figures across a threadbare map in some dusty basement, possibly talking to himself in a gruff tone that uncannily mimics Rod Steiger’s inebriated cadences. He seems overly enamored of the dry details of ordnance, mirthless arrows, and terrain circles. Perhaps he fritters away his time in some homebuilt shack far off the main artery of Interstate 76, ready to reproduce well-studied holes with his Smith & Wesson should any nagging progressive come to take away his tattered Confederate flag or any other paleolithic memorabilia that rattles his martial disposition. But let’s say that such a man is committed to peace. Then you’re left with his soporific drone as he dodders on about some long dead general’s left flank attack in the most unpalatable ramble imaginable. He prioritizes a detached tabulative breakdown over the more palpable and poignant truths that motivates men. He doesn’t seem to care about how a soldier experiences trauma or summons bravery in impossible conditions, or how these battles permanently alter nations and lives. The military historian is, in Napoleonic short, a buzz killer despite his buzz cut. Indeed, military history is so embarrassing to read and advocate that, only a few weeks ago, I was forced to hide what I was reading when a woman started flirting with me at a bar. (I sheepishly avoided revealing the title to her for fifteen minutes. Nevertheless, she persisted. And upon seeing The Face of Battle, the woman in question rightfully headed for the hills, even after I offered to buy her a drink.)

There are quite a few military history books on the Modern Library list. So I’m more or less fucked. It is not that war itself does not interest me. Human beings have been fighting each other since the beginning of time and only a soulless anti-intellectual fool resolutely committed to the vulgar act of amusing himself to death would fail to feel anything pertaining to this flaw in the human makeup. The podcaster Dan Carlin, who specializes in military history, is one of the few people who I can listen to in this medium for many hours and remain completely enthralled. But that is only because Carlin is incredibly skilled at showing how the paradigm shifts of war influence our everyday lives. Christopher Nolan’s Dunkirk was a remarkable film that hurled its audience into the dizzying depths of war, but this is merely a vicarious sensory experience. I can get behind Paul Fussell’s The Great War and Modern Memory (ML NF #75) because of that book’s cogent observations on how war influenced literary culture. Neil Sheehan’s A Bright Shining Lie (ML NF #84) remains a journalistic masterpiece that I very much admire — in large part because of its razor-sharp commitment to human psychology, which in turn allows us to understand the miasmic madness of making tactical decisions (see that book’s incredible “Battle of Ap Bac” chapter). But I’d hesitate to categorize either of these two brilliant volumes within the exacting genre of unadulterated military history. I’ve always had the sense that there’s an underlying bellicosity, if not an outright advocacy of warfare, with books that are exclusively dolled up in camo.

So upon reading The Face of Battle, it was something of a relief to see that John Keegan was up front from the get-go about what military history fails to do, and why accounts of battles are so problematic. He begins the book saying that he has never seen or been in a battle. And this is a hell of a way to open up a book that professes to give us the lowdown on what war is all about. It is a genuinely humble statement from someone who has made a career out of being an expert. He openly points to military history’s major weakness: “the failure to demonstrate connection between thought and action.” “What of feeling?” I thought as I read this sentence. According to Keegan, historians need to keep their emotions on a leash. And the technical example he cites — the British Official History of the First World War — is an uninspiring passage indeed. So what is the historian to do? Quote from the letters of soldiers. But then Keegan writes, “The almost universal illiteracy, however, of the common soldier of any country before the nineteenth makes it a technique difficult to employ.” Ugh. Keegan!

From Ilya Berkovich’s Motivation in War: The Experience of Common Soldiers in Old-Regime Europe:

Considering the social origins of most eighteenth-century soldiers, one might think that literate soldiers were uncommon. However, literacy among the lower classes in old-regime Europe was becoming less exceptional. It is estimated that up to 40 per cent of the labouring poor in Britain were literate. Between 1600 and 1790, the portion of French bridegrooms signing their parish records doubled to about half of the total male population. Interestingly, the corresponding figures in northern and eastern frontier regions, which provided most French recruits, were much higher, with some areas coming close to universal literacy. Literacy rates in the Holy Roman Empire fluctuated widely, yet it is telling that over 40 per cent of the day labourers in mid-century Coblenz were able to sign their names. In rural East Prussia, one of the poorest regions in Germany, comparable figures were reached in 1800, although this was still a fourfold increase compared to only half-a-century before….

And so on. Fascinating possibilities for scholarship! It seems to me that someone here did not want to roll up his sleeves and get his hands dirty.

You see the problems I was having with this book. On one hand, Keegan wants to rail against the limitations of military history (and he should! you go, girl!). On the other hand, he upholds the very rigid ideas that stand against the execution of military history in a satisfying, fact-based, and reasonably emotional way that allows voluble chowderheads like me an entry point.

But that’s not the main focus of this book. Keegan settles upon three separate events — the Battle of Agincourt on October 25, 1415, the Battle of Waterloo on June 18, 1815, and the first day of battle on the Somme (July 1, 1916) — to seek comparisons, commonalities, and various parallels that we might use to understand military mechanics. He is duly reportorial in each instance, but overly fond of taxonomy rather than tangibility. Still, there are moments when Keegan’s bureaucratic obsessiveness is actually interesting — such as his examination of British archers and infantry running up against French cavalry during Agincourt. After all, if a horse is charging its way into a man, either the horse is going to run away, men are going to be knocked down, or there’s going to be a “ripple effect” causing open pockets on each side of the horse. So it’s actually quite extraordinary to consider how the French got their asses kicked with such a clear advantage. Well, the British did this with stakes, which impaled the horses. And the threat of this obstacle caused the French to retreat with their backs to the British, resulting in archers lobbing arrows into their vertebrae.

Keegan informs us that “the force of unavoidable circumstances” sealed the fate of the French and allowed Henry V to win at Agincourt. When Keegan gets to Waterloo, we see a similar approach adopted by Napoleon near the end. Large crowds of French infantry rushed towards the British line, closing to within mere yards. The two armies exchanged fire and the French, at a loss for what to do, turned around and fled. This was not an altogether smart strategy, given the dwindling reserves that Napoleon had at his disposal. But it does eloquently demonstrate that battles tend to crumble once one side has been forced into an unavoidable choice. The rush of men from the trenches on both sides at the Somme in 1916, of course, not only escalated this to an unprecedented scale of atrocity, but essentially laid down the flagstones for the 20th century’s practice of mutually assured destruction.

These are vital ideas to understand. Still, I’m not going to lie. Keegan was, in many ways, dull and soporific — even for a patient reader like me. I learned more about Henry V’s campaign by reading Juliet Barker’s excellent volume Agincourt, which not only unpacked the incredible logistics of invading northwestern France with engrossing aplomb but also juxtaposed this campaign against history and many vital realities about 15th century life. And a deep dive into various World War I volumes (I especially recommend Richard Aldington’s surprisingly ribald novel, Death of a Hero) unveiled a lot of unanticipated sonic transcriptions that inspired me to draft an audio drama script that I hope to produce in a few years. Keegan is certainly helpful in a dry intellectual manner — the equivalent of being served a dull dish of desiccated biscuits when you haven’t eaten anything for days; I mean, there’s a certain point at which you’ll gorge on anything — but he’s not the man who inspired me about battle. Hell, when one of the most boring and pretentious New Yorker contributors of all time praises Keegan’s “matchlessly vivid pen,” you know there’s a reason to hide beneath your blanket. Keegan is undoubtedly on this list because nobody before him had quite unpacked war from the bottom up rather than from the general’s top-down viewpoint. But like most military historians, he didn’t have enough of a heart for my tastes. There’s a way to present a detailed fact-driven truth without being such a detached fussbucket about it. And we shall explore and exuberantly praise such virtuosic historians in future Modern Library installments!

Next Up: Erwin Panofsky’s Studies in Iconology!

The Strange Death of Liberal England (Modern Library Nonfiction #82)

(This is the nineteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Vermeer.)

It was a picnic-perfect summer in 1914. The rich flaunted their wealth with all the subtlety of rats leaping onto a pristine wedding dress. The newspapers steered their coverage away from serious events to pursue lurid items about sports and celebrity gossip. A comic double act by the name of Collins & Harlan recorded an absurd ditty called “Aba Daba Honeymoon,” which Thomas Pynchon was to describe fifty years later as “the nadir of all American expression.” Few human souls twirling their canes and parasols in these conditions of unbridled frivolity could have anticipated that an archduke’s assassination in late June would plunge Europe into a gruesome war that would leave twenty million dead, permanently altering notions of honor, bloodshed, and noblesse oblige.

But even a few years before the July Crisis, there were strong signs in England that something was amiss. Politicians demonstrated a cataclysmic failure to read or address the natural trajectory of human progress. Women justly demanded the right to vote and were very willing to starve themselves in prison and burn down many buildings for it. Workers fought violently for fair wages, often locked into stalemates with greedy mining companies. They were intoxicated by a new militant brand of syndicalism from France then popularized by Georges Sorel. The atmosphere was one of increasing upheaval and escalated incoherence, even among the most noble-minded revolutionaries. The influx of gold from Africa inspired both lavish spending and an inflated currency. The liberals in power were supposed to stand up for the working stiffs who couldn’t quite meet the rising prices for boots and food and clothes with their take home pay. And much like today’s Democratic Party in the States, these tepid Parliamentary wafflers past their Fabian prime revealed a commitment to ineptitude over nuts-and-bolts pragmatism. They allowed the Tories to play them like rubes losing easy games of three-card monte. Amidst such madness, England became a place of oblivious tension not dissimilar to the nonstop nonsense that currently plagues both sides of the Atlantic. With the middle and upper classes keeping their heads in the clouds and their spirits saturated in moonbeam dreams and a bubble gum aura, is it any wonder that people were willing to incite war and violence for the most impulsive reasons?

George Dangerfield’s The Strange Death of Liberal England examines this crazed period between 1910 and 1914 with an exacting and quite entertaining poetic eye. Dangerfield, an erudite journalist who parlayed his zingy word-slinging into a teaching career, is somewhat neglected today, but his remarkable knack for knowing when to suggest and when to stick with the facts is worthy of careful study, a summation of the beautifully mordant touch he brought as a historian. He describes, for example, the “dismal, rattling sound” of Liberalism refusing to adjust to the times, and eloquently sends up the out-of-touch movement in a manner that might also apply to today’s neoliberals, who stubbornly refuse to consider the lives and needs of the working class even as they profess to know what’s best for them:

[I]t was just as if some unfortunate miracle had been performed upon its contents, turning them into nothing more than bits of old iron, fragments of intimate crockery, and other relics of a domestic past. What could be the matter? Liberalism was still embodied in a large political party; it enjoyed the support of philosophy and religion; it was intelligible, and it was English. But it was also slow; and it so far transcended politics and economics as to impose itself upon behaviour as well. For a nation which wanted to revive a sluggish blood by running very fast and in any direction, Liberalism was clearly an inconvenient burden.

Dangerfield knew when to let other people hang themselves by their own words. The infamous Margot Asquith, the starry-eyed socialite married to the Prime Minister who led England into World War I, is quoted at length from her letters to Robert Smillie, the brave union organizer who fought on behalf of the Miners’ Federation of Great Britain. Asquith, so fundamentally clueless about diplomacy, could not understand why meeting Smillie might be a bad idea given the tense negotiations.

I did feel that Dangerfield was unduly harsh on Sylvia Pankhurst, one of the key organizers behind the suffragette movement. His wry fixation upon Pankhurst’s indomitable commitment — what he styles “the fantastic Eden of militant exaltation” — to starvation and brutality from the police, all in the brave and honorable fight for women, may very well be a product of the 1930s boys’ club mentality, but it seems slightly cheap given how otherwise astute Dangerfield is in heightening just the right personality flaws among other key figures of the time. The Pankhurst family was certainly eccentric, but surely they were deserving of more than just cheap quips, such as the volley Dangerfield lobs as Christabel announces the Pankhurst withdrawal from the WSPU (“She made this long-expected remark quite casually — she might almost have been talking to the little Pomeranian dog which she was nursing.”).

Still, Dangerfield was the master of the interregnum history. His later volume, The Era of Good Feelings, examined the period between Jefferson and Jackson and is almost as good as The Strange Death. One reads the book and sees the model for Christopher Hitchens’s biting erudite style. (The book was a favorite of Hitch’s and frequently cited in his essays.)

But it is clear that Dangerfield’s heart and his mischievous vivacity resided with his homeland rather than the nation he emigrated to later in life. In all of his work, especially the material dwelling on the United Kingdom, Dangerfield knew precisely what years to hit, the pivotal moments that encapsulated specific actions that triggered political movements. As he chronicles the repercussions of the June 14, 1911 strike in Southampton, he is careful to remark upon how “it is impossible not to be surprised at the little physical violence that was done — only a few men killed, in Wales in 1912, and two or three in Dublin in 1913; in England itself not a death. Is this the effect of revolutionary methods, and, if so, do the methods deserve the word?” He then carries on speculating about the pros and cons of peaceful revolution and ties this into the “spiritual death and rebirth” of English character. And we see that Dangerfield isn’t just a smartypants funnyman, but a subtle philosopher who leaves human possibilities open to the reader. He is a welcome reminder that seeing the real doesn’t necessarily emerge when you lock eyes on an alluring Twitch stream or a hypnotic Instagram feed. It comes when you take the time to step away, to focus on the events that are truly important, and to ruminate upon the incredible progress that human beings still remain quite capable of making.

Next Up: John Keegan’s The Face of Battle!

Vermeer (Modern Library Nonfiction #83)

(This is the eighteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: A Bright Shining Lie.)

Johannes Vermeer was the Steph Curry of 17th century painters: a dazzling mack daddy who spent lengthy periods of his choppy forty-three-year life layering lapis lazuli and ultramarine and madder lake onto some of the most beautiful paintings ever created in human history. To ask how he perfected the glowing pour of his domestic scenes through painstaking brush strokes is to court trouble. Did he do so through mirrors and lenses? Does the Hockney-Falco theory have any real bearing on appreciating his work? Vermeer famously left no record of how he achieved his elegant handwrought touch, which has led many to become obsessed with the question, even taking the trouble (as Tim Jenison, subject of the controversial Penn and Teller documentary, did) to learn Dutch, which is a maddening language by all reasonable standards.

The great mystery of how this genius mastered light purely by eye and through no apparent line work, all two centuries before the camera’s invention, has been taken up by such feverishly committed investigators as Philip Steadman, an architect who meticulously measured Vermeer’s interiors and constructed a one-sixth scale model of his room to uphold the theory that Vermeer used a camera obscura. For now, our attentions are with Lawrence Gowing, a self-taught art historian whose Vermeer obsession resulted in a highly useful and slyly passionate book, a short but smart volume bizarrely downplayed in The New York Times‘s Gowing obituary, but a title that the Modern Library judges were at least munificent enough to rank above the likes of Robert Caro eight years after Gowing kicked the proverbial bucket of paint.

Gowing frames Vermeer’s achievements by observing that this painter, unlike his 17th century Dutch peers Gabriël Metsu and Jan Steen, eschewed line and overt modelling work. Vermeer’s purity as an artist emerged with his curious pursuit of diffuse light at all costs. He remained quite impartial about how light spilled into his scenes. As Gowing notes, even details such as The Lacemaker’s cushion tassels “have an enticing and baffling bluntness of focus.” In an age when anyone can instantly snap a picture to memorialize how light drifts into a room, this revolutionary approach cannot be overstated, especially because Vermeer was confident enough in his aesthetic to push against the mercantile herd even as he served as head of the Guild of Saint Luke. In the seventeenth century, painters wanted to be noticed. They were, after all, artists with constantly grumbling bellies. So they tended to emphasize particular objects, even if it meant exaggerating the look, in an attempt to stand out. They might approach a patron and say, “Ha ha! I am Hendrik Van de Berg, the greatest painter of Maastricht! I have fifty thousand followers on…well, just imagine a world, preposterous as this may sound, in which short text messages determine your stature among peers and, yup, that would be me! Art King of Maastricht! Anyway, that nifty apple in the far right corner may look a little unnatural, but, dude, I think we can both agree that it really pops! And it will look good in your study while your starving servant polishes your boots and dreams of something to eat! Oh, I know you can’t pay your servants and that you are, in fact, fond of flogging them. But I am an artist and surely you can pay me! I’ll even throw in a complimentary whipping if you buy my work! Think of it as a patron reward!” Vermeer, by contrast, willfully blurred the apple. Vermeer’s peers in his hometown of Delft understood what he was doing, but the cost of being an artist was, alas, premature death due to crushing financial stress.

Gowing’s gushing critical distinctions are a welcome reminder that it’s sometimes more important to know why art stands out rather than how it is created. The “No haters” crowd, fed on the soothing alfalfa sprouts of director’s commentaries and lengthy pop culture oral histories, would rather view Vermeer as a magician or a technical wizard than an artist. If Vermeer did use a camera obscura, he was certainly not the only Dutch painter doing so at the time. Gowing emphasizes that Vermeer’s style went above and beyond merely accumulating details. What should concern us is why he was so committed to the optical. What counts is Vermeer’s commitment to the visual experience: commonplace scenes that are somehow both radiant and persuasive depictions of reality. Gowing helpfully points out that Vermeer’s investigation of life was never direct. The paintings were often composed at an oblique angle. He singles out Vermeer’s “inhuman fineness of temper,” a tranquility that is quite extraordinary given that Vermeer was working with ten kids running around and the financial turmoil he had to endure.

Gowing is also very good at only drawing upon Vermeer’s biography when it is pertinent. Vermeer’s detachment and his slow output certainly hinge upon the disappointments and setbacks he contended with during the last years of his life. Still, one only needs to look at Vermeer’s paintings to feel their somewhat passive but stirring view of humanity. Gowing distinguishes Vermeer from other painters by observing that “with the passivity characteristic of his thought, he accepted this part of his nature as the basis of the expressive content of his style.” Somehow Vermeer could inject his view on humanity purely through style. And somehow in this stylistic transformation, what seems passive is actually carefully rendered depth. Although Vermeer confined his paintings to two rooms, Gowing finds enough common qualities within these limitations for us to get a sense of what Vermeer was up to:

In only three of the twenty-six interiors that we have is the space between painter and sitter at all uninterrupted. In five of the others passage is considerably encumbered, in eight more the heavy objects interposed amount to something like a barrier and in the remaining ten they are veritable fortifications. It is hard to think that this preference tells us nothing about the painter’s nature. In it the whole of his dilemma is conveyed.

The book’s second part is more akin to descriptive liner notes for a must-have box set and doesn’t quite match the first part’s perspicacity. But Gowing does provide several useful antecedents (such as Jan Van Bronkhorst’s The Procuress) that allow us to track Vermeer’s development as an artist. Again, because Vermeer didn’t leave much behind on his life or methods, it has been left for us to speculate on how he cultivated his exquisite style. But Gowing is too sharp a critic to be seduced by gossip and thankfully confines his findings to other paintings, showing us several paths leading to Utrecht Caravaggism and trompe l’oeil.

I must warn you, however, that Gowing’s Vermeer, despite its ostensibly breezy length, will likely have you losing many hours studying Vermeer. What Gowing could not have foreseen is that his ruminations would be even more vital in a climate where some otherwise smart people believe that an ire-inducing and ill-considered think piece cobbled together in an hour constitutes serious thought.

Next Up: George Dangerfield’s The Strange Death of Liberal England!

A Bright Shining Lie (Modern Library Nonfiction #84)

(This is the seventeenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: West with the Night.)

Young scrappy soldiers came to walk the villages and the jungles and the rice paddies from all hopeful parts of America, itching to step into boots that matched the size and the bravery of their heroic fathers. They hungered to prove their manhood and their patriotism even as their spirits dwindled and their moral core dissipated as it became common knowledge that Vietnam was an unwinnable war. They came home in dishonor and disgrace, losers who had sacrificed their bodies and minds and souls in the name of failed American exceptionalism, and they were left to rot by their government and sometimes by their fellow citizens.

Much as the shellshocked men in World War I returned to their native soil facing similar indifference to their trauma and their pain, as memorably chronicled in Richard Aldington’s brutally mordant novel Death of a Hero, the men who served in Vietnam learned that the best years of their lives had been little more than a cruel joke, even when they defended napalm-soaking sorties that burned vast horrifying holes into villages and hospitals and fields and homes and schools that happened to be situated near a hopped up Huey often manned by a pilot who was losing his mind. Their collective shellshock was as commonplace as heartbreak and many dozen times more devastating. The Vietnam vets, who were all very brave and worthy of the same valor afforded the Greatest Generation (but never received their due), suffered PTSD and traumatic injuries and severe psychological damage. Every day of their lives after the war was a new battle against painful inner turmoil that spread to their families and their friends and their loved ones, stretching well beyond the poisoned polyester of the flapping American flag itself. It seemed that nobody wanted to hear their stories, much less any news about the one million civilians and Viet Cong soldiers who were slaughtered above the 17th parallel or the estimated 741,000 who died below it or the 312,000 people who died by direct order of various governments or the 273,000 Cambodians and the 62,000 Laotians.

They all died, and none of them needed to, because the conflict had escalated through the foggy hubris of war and the dogged jingoism of three U.S. Presidents and the exacting Pentagon number crunchers who believed they could will their analytical acumen into a guaranteed victory even when the truth was fudged and altered and far too frequently ignored and contemned. For all the Pentagon’s professed understanding, the imperious powers that be could not comprehend that the massive influx of American supplies would be plundered and reused by resourceful Viet Cong soldiers with a very long memory of history who learned how to take out Bell UH-1 helicopters and M-113 armored personnel carriers from the ground. They carried out the strategic hamlet program without providing basic needs to the very villagers who were supposed to be their allies. Most disastrously, the American interventionists severely underestimated the damage that the Ngo Dinh Diem regime was doing to South Vietnamese loyalty, culminating in the Buddhist Crisis of 1963, in which religion was persecuted in a manner shockingly similar to ongoing present-day American indignities against Muslims.

Somewhere between 1.5 million and 2.5 million people died in the Vietnam War. That’s close to the entire population of Chicago or the total population of Jamaica. It is the entire population of Nebraska. It is the combined population of Wyoming, Vermont, Washington D.C., and Alaska. It is the combined population of Iceland, Fiji, and Cyprus. It is a staggering and heartbreaking sum by any stretch of the imagination that should cause any human being to stop in his tracks and ponder how so much bloodshed could happen. Those who would blithely dismiss the study of all this as a priapic man’s game, preferring to keep close tabs on some completely insignificant item of celebrity gossip, usually cannot comprehend the full scale of such unfathomable devastation and our duty to closely examine history so that such a bewildering bloodbath never happens again. And yet, even with the strong reception of Ken Burns’s recent documentary, the Vietnam War remains one of those subjects that Americans do not want to talk about, even when it epitomizes the toxic mix of Yankee Doodle Vanity, bureaucratic shortsightedness, savage masculinity, unchecked hypocrisy, credibility gaps, imperialist dishonesty, and cartoonish escalation of resources — all pernicious checkboxes that still mark American policy today.

We wouldn’t know of this American complicity without the invaluable work of reporters like Neil Sheehan and David Halberstam, who were raw and young and brash and sometimes foolhardy in their dispatches. It was undoubtedly their dogged free-wheeling approach, a fierce pursuit of journalistic truth that is unthinkable to such useless and unfathomably gullible New York Times company men as Richard Fausset and Peter Baker today, that caused Americans to ask questions about the war and eventually led Daniel Ellsberg to release the Pentagon Papers (which Sheehan himself would later acquire for the New York Times in 1971). The quest for understanding, especially in the conflict’s early years, proved just as intoxicating to these sleep-deprived and overworked journos as it did to the soldiers who kept coming back for further tours of duty. All wondered why common sense had been so rashly and cheaply surrendered.

Sheehan and Halberstam followed in the footsteps of such famous war reporters as Francois Sully, Homer Bigart, Malcolm Browne, and Horst Faas. (William Prochnau’s book Once Upon a Distant War is an excellent and vivacious account of this period, although not without its minor liberties. A 1988 Neil Sheehan profile that Prochnau wrote for The Washington Post, offering some useful carryover material for his book, is also available online.) The two men arrived in Vietnam separately in 1962. They had both attended Harvard, but had arrived at the hallowed university through altogether different routes. Sheehan came from a working-class Irish background and lucked out with a scholarship. By the time Sheehan arrived in Saigon, he was a reformed alcoholic and a tortured man who had learned the fine art of carving extra hours out of any day, a talent he had honed while running a dairy farm as a kid. Sheehan worked for the penny-pinching UPI wire service and, much as a contemporary journalist is expected to write, shoot and cut video, and preserve his crisp telegenic form if he wishes to hold onto his job, he was often responsible for logistics extending well beyond the writing and transmission of copy.

Halberstam was a tall and lanky man from a middle-class Jewish background, but decidedly brasher than Sheehan. His trenchant reporting of civil rights struggles in the South attracted the notice of The New York Times‘s James Reston. Halberstam was a formidable if slipshod workhorse, banging out thousands of words per day that often had to be shoehorned into coherent shape by the exasperated Times team. But Halberstam’s reporting in the Congo was strong and gallant enough to land him in Saigon.

Sheehan and Halberstam would become friends and roommates, working very long days and often falling asleep at their typewriters. They chased any source that led them to demystify the war, but they were both seduced by a man named John Paul Vann, who became the subject of Sheehan’s journalistic masterpiece, A Bright Shining Lie. Halberstam would write two books from his Vietnam experience: The Making of a Quagmire, a short and useful 1965 volume that faded into obscurity within a decade, and The Best and the Brightest, a juicy and detailed top-down account of bureaucratic blunder that Stephen Bannon even pushed onto every member of the Trump transition team in February 2017 (as reported by the New York Times‘s Marc Tracy). But Neil Sheehan, who carried on with a quieter and more methodical approach than Halberstam’s gigantic and flagrant “us vs. them” style, rightly decided that more time and considerable rumination and careful reporting was the way in. He wisely believed that John Vann was the key to understanding American involvement and the mentality behind it. The book would consume sixteen years of Sheehan’s life. And for all the anguish that Sheehan suffered through that long and painful period, we are incredibly lucky to have it.

John Vann was a wildly energetic lieutenant colonel from Norfolk, Virginia who could survive on four hours of sleep and sometimes none at all. He had built a military career on the “Vann luck.” He would willfully fly aircraft through a suicidal fusillade of fire and drive down dangerous roads that were known to be mined and patrolled by the Viet Cong. He would miraculously survive. Like Robert McNamara, he was very certain of how to win the war. But unlike McNamara, Vann did not rely on problematic data, but rather on a knack for knowing people and the pragmatic logistics that he picked up from his experience on the battlefield, often talking with and distributing candy to the South Vietnamese citizens suffering under the Diem regime. It was through such gestures that Vann avoided a few attempts on his life. Vann was savvy enough to court the trust and admiration of reporters like Sheehan and Halberstam pining for a few dependable truth bombs, to the point where the reporters pooled their resources to buy him an engraved cigarette box when Vann left Vietnam the first time. But Vann would find a way back a few years later as an Agency for International Development official. He portrayed himself as a scrappy underdog whose candid bluster had prevented him from advancing to general, whose near twenty years of service and bravery and experience had simply not been heeded. But the truth of his checkered life, carefully concealed from many who knew him, told the real story.

Sheehan is both sensitive and meticulous in telling Vann’s tale. We cannot help but admire Vann’s dogged work ethic and charisma in the book’s first section, as we see Vann attempting to bring the ARVN (the Army of the Republic of Vietnam, the South Vietnamese army known to recklessly attack insurgents under Diem) together with the then comparatively diminutive American presence in an attempt to win the war. Vann hoped to train the ARVN to better fight against the guerillas, but faced indifference from Huỳnh Văn Cao, an ARVN colonel to whom Vann was assigned as an adviser. Cao often liked to don the bluster of a general. We see Vann being kind to the common soldiers, whether peasants or seasoned regulars, but we also see Vann as an egomaniac willing to overstep his rank to get results. One of Vann’s guides to negotiating the tricky turmoil of Vietnam was a 1958 novel called The Ugly American, which depicted American diplomats in a fictitious nation named Sarkhan who proved incredibly arrogant towards the culture, customs, and language of the people. The book would inspire Kennedy so much that he sent copies of the book to every American Senator. (The Peace Corps would later become a Kennedy campaign talking point turned into a reality.) Vann would take an altogether different lesson from the book in attempting to turn Cao to his side by appealing to his ego and by flattering him. But in practice, Vann’s benign puppeteering of military command could result in disaster, such as a July 20, 1962 battle in the lower delta, in which Cao flatly declined Vann’s request to load helicopters with a second reserve that would have prevented Viet Cong soldiers from escaping. Such stalling allowed the Viet Cong more opportunities to pluck American ordnance, transforming .50 caliber machine guns into antiaircraft weapons through tireless ingenuity.

This communicative combativeness between the Americans and the ARVN would reach its nadir with the Battle of Ap Bac, which is one of the most gripping sections of Sheehan’s book. Vann would watch helplessly from an L-19 Bird Dog surveilling the battlefield as the ARVN delayed sending troops, not knowing that the Viet Cong had intercepted radio transmissions using stolen American equipment. This allowed the Viet Cong to strike hard and accurately against task forces that were effectively separated and caught adrift, leaving them open to attack. The American Hueys disregarded Vann’s orders and were hit by the Viet Cong. Vann, whose domineering tone could be off-putting, was unable to send M-113 carriers across the canals to save the remaining soldiers and reinforce the territory. Vann, increasingly desperate and flustered by the ARVN’s recalcitrance in advancing, approached Captain Ly Tong Ba, the ARVN man holding up support who said that he refused to take Americans, and ordered Robert Bays to “shoot that rotten, cowardly son of a bitch right now and move out.” The battle became the Viet Cong’s first major victory.

By presenting the facts in this manner, Sheehan leaves us with many lingering questions. Was Vann a somewhat more informed version of American interventionist arrogance? Was American might, in Vann’s obdurate form, needed to atone for serious deficiencies from Diem and the ARVN? Even if the ARVN had permitted the Americans to have more of a commanding hand, would not the Viet Cong have eventually secured a victory comparable to Ap Bac? Even at this stage in the book, Vann remains strangely heroic and we can sympathize with his frustration. But in allowing us to vicariously identify with Vann, Sheehan slyly implicates the reader in the desire to win by any means necessary.

And then Sheehan does something rather amazing in his portrait of Vann. In a section entitled “Taking on the System,” he broadens the scope to the soldiers and the command contending with Vann’s aggressiveness (while likewise exposing the hubris of civilian leadership under McNamara, along with the bomb-happy pacification strategy of Victor Krulak and the foolhardy optimism of MACV commander Paul Harkins). And we begin to see that the Vietnam quagmire, like any intense battle for victory and power, was absolutely influenced by strong and truculent personalities, which young reporters like Halberstam and Sheehan were rightfully challenging. Unable to get the top dogs to understand through meetings and communiques, Vann began to weaponize the press against Harkins’s reality distortion field — this as the Diem regime’s increasing persecution of the Buddhists revealed the vast fissures cracking into South Vietnamese unity. Sheehan begins to insert both Halberstam and himself more into the narrative. With Vann now retired from the Army, we are rightly left to wonder if he was indeed as indispensable as many believed him to be.

But then Sheehan backtracks to Vann’s past. And we begin to see that he had been living a lie. He pulled himself from an impoverished Virginian upbringing, where he was an illegitimate child raised by a wanton alcoholic mother, and married a respectable woman named Mary Jane. But while stationed as an Army officer, he cultivated a taste for underage girls and hushed up both his numerous affairs and the allegations, even persuading Mary Jane to lie for him during a court-martial for statutory rape and adultery while also training himself to pass a lie detector test. While stationed in Vietnam the second time, Vann could not control his sexual appetite. He carried on numerous affairs, devoting his attentions quite ardently to two mistresses who were half his age, one of whom had his child, and keeping the two women largely in the dark about each other for a sustained period. His predatory behavior presents itself as a bigger lie, more unsettling than the Harkins-style prevarications that resulted in needless deaths.

In the end, the “Vann luck” could not hold out. His death in 1972, at least as portrayed by Sheehan, is almost anticlimactic: the result of a helicopter crashing into a series of trees. As Vietnam changed and the American presence grew with unmitigated enormity, Vann’s apparent know-how could no longer penetrate in his role as an AID official, even though Sheehan depicts Vann having many adventures.

A Bright Shining Lie isn’t just an epic history of Vietnam. It also reveals the type of conflicted and deeply flawed American personality that has traditionally been allowed to rise to the top, influencing key American decisions, for better or worse. I read the book twice in the last year and, particularly in relation to Vann’s obstinacy and his abuse of women, I could not help but see Donald Trump as a more cartoonish version of Vann’s gruff and adamantine bluster. But the present landscape, as I write these words near the end of 2017, a year that has carried on with an endless concatenation of prominent names revealed as creeps and abusers of power, is now shifting to one where a masculine, wanton, and ultimatum-oriented approach to command is no longer being tolerated. And yet, even after war has devastated a nation through such a temperament, it is possible for those who are ravaged by violence to be forgiving. In 1989, Sheehan returned to Vietnam for two profiles published in The New Yorker (these are collected in the volume After the War Was Over). In his trip to North Vietnam, Sheehan is baffled by the farmers and the villagers showing no bad blood to Americans:

I encountered this lack of animosity everywhere we went in the North and kept asking for an explanation. The first offered was that the Vietnamese had never regarded the entire American people as their enemy. The American government — “the imperialists” — had been the enemy; other Americans, particularly the antiwar protesters, had been on the Vietnamese side. This did not seem explanation enough for people like the farmer on the road to Lang Son. He had suffered dearly at the hands of Americans who had not been an abstract “imperialist” entity. One afternoon in a village near Haiphong, when Susan and I were with Tran Le Tien, our other guide-interpreter, we were received with kindness by a family who lost a son in the South. On the way back to Hanoi I said to Tien that there had to be more to this attitude than good Americans versus bad Americans. “It’s the wars with China,” Tien said. I decided he was right.

In other words, the enemy in war is the one that has most recently caused the greatest devastation. While the North Vietnamese’s forgiving character is quite remarkable in light of the casualties, perhaps it’s also incumbent upon all nations to be on the lookout for the character flaws in failed men who lead us into failed wars so that nothing like this ever has to happen again. Men do not have all the answers they often claim to possess, even when they look great on paper.

Next Up: Lawrence Gowing’s Vermeer!

West with the Night (Modern Library Nonfiction #85)

(This is the sixteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: This Boy’s Life.)

She remains a bold and inspiring figure, a galvanizing tonic shimmering into the empty glass of a bleak political clime. She was bright and uncompromising and had piercingly beautiful eyes. She was a stratospheric human spire who stood tall and tough and resolute above a patriarchal sargasso. Three decades after her death, she really should be better known. Her name is Beryl Markham and this extraordinary woman has occupied my time and attentions for many months. She has even haunted my dreams. Forget merely persisting, which implies a life where one settles for the weaker hand. Beryl Markham existed, plowing through nearly every challenge presented to her with an exquisite equipoise as coolly resilient as the Black Lives Matter activist fearlessly approaching thuggish cops in a fluttering dress. I have now read her memoir West with the Night three times. There is a pretty good chance I will pore through its poetic commitment to fate and feats again before the year is up. If you are seeking ways to be braver, West with the Night is your guidebook.

She grew up in Kenya, became an expert horse trainer, and befriended the hunters of her adopted nation, where she smoothly communed with dangerous animals. For Markham, the wilderness was something to be welcomed rather than dreaded. Her natural panorama provided “silences that can speak” that were pregnant with natural wonder even while being sliced up by the cutting whirl of a propeller blade. But Markham believed in being present well before mindfulness became a widely adopted panacea. She cultivated a resilient and uncanny prescience as her instinct galvanized her to live with beasts and brethren of all types. It was a presence mastered through constant motion. “Never turn your back and never believe that an hour you remember is a better hour because it is dead,” wrote Markham when considering how to leave a place where one has lived and loved. This sentiment may no longer be possible in an era where one’s every word and move is monitored, exhumed by the easily outraged and the unadventurous for even the faintest malfeasance, but it is still worth holding close to one’s heart.

In her adult life, Markham carried on many scandalous affairs with prominent men (including Denys Finch Hatton, whom Markham wooed away from Karen Blixen, the Danish author best known for Out of Africa (to be chronicled in MLNF #58)) and fell almost by accident into a life commanding planes, often scouting landscapes from above for safari hunts. Yet Markham saw the butcherous brio for game as an act of impudence, even as she viewed elephant hunting as no “more brutal than ninety per cent of all other human activities.” This may seem a pessimistic observation, although Markham’s memoir doesn’t feel sour because it always considers the world holistically. At one point, Markham writes, “Nothing is more common than birth: a million creatures are born in the time it takes to turn this page, and another million die.” And this grander vantage point, which would certainly be arrived at by someone who viewed the earth so frequently from the sky, somehow renders Markham’s more brusque views pragmatic. She preferred the company of men to women, perhaps because her own mother abandoned her at a very young age. Yet I suspect that this fierce lifelong grudge was likely aligned with Markham’s drive to succeed with a carefully honed and almost effortlessly superhuman strength.

Markham endured pain and fear and discomfort without complaint, even when she was attacked by a lion, and somehow remained casual about her vivacious life, even after she became the first person to fly solo without a radio in a buckling plane across the Atlantic from east to west, where she soldiered on through brutal winds and reputational jeers from those who believed she could not make the journey. But she did. Because her habitually adventurous temperament, which always recognized the importance of pushing forward with your gut, would not let her stop. And if all this were not enough, Markham wrote a masterpiece so powerful that even the macho egotist Ernest Hemingway was forced to prostrate himself to editor Maxwell Perkins in a letter: “She has written so well, and marvelously well, that I was completely ashamed of myself as a writer.” (Alas, this did not stop Hemingway from undermining her in the same paragraph as “a high-grade bitch” and “very unpleasant” with his typically sexist belittlement, a passage conveniently elided from most citations. Still, there’s something immensely satisfying in knowing that the bloated and overly imitated impostor, who plundered Martha Gellhorn’s column inches in Collier’s because he couldn’t handle his own wife being a far superior journalist, could get knocked off his peg by a woman who simply lived.)

In considering the human relationship to animals, Markham writes, “You cannot discredit truth merely because legend has grown out of it.” She details the beauty of elephants going out of their way to hide their dead, dragging corpses well outside the gaze of ape-descended midgets and other predators. And there is something about Markham’s majestic perspective that causes one to reject popular legends, creating alternative stories about the earth that are rooted in the more reliable soil of intuitive and compassionate experience. For Markham, imagination arrived through adventure rather than dreams. She declares that she “seldom dreamed a dream worth dreaming again, or at least none worth recording,” yet the fatigue of flying does cause her to perceive a rock as “a crumpled plane or a mass of twisted metal.”

Yet this considerable literary accomplishment (to say nothing of Markham’s significant aviation achievements) has been sullied by allegations of plagiarism. It was a scandal that caused even The Rumpus‘s Megan Mayhew Bergman to lose faith in Markham’s bravery. Raoul Schumacher, Markham’s third husband, was an alcoholic and a largely mediocre ghost writer who, much like Derek Stanford to Muriel Spark, could not seem to countenance that his life and work would never measure up to the woman he was with. Fragile male ego is a most curious phenomenon that one often finds when plunging into the lives of great women: not only are these women attracted to dissolute losers who usually fail to produce any noteworthy work of their own, but these men attempt to make up for their failings by installing or inventing themselves as collaborators, later claiming to be the indispensable muse or the true author all along, which is advantageously announced only after a great woman has secured her success. Biographers and critics who write about these incidents years later often accept the male stories (one rarely encounters this in reverse), even when the details contain the distinct whiff of a football field mired in bullshit.

I was not satisfied with the superficial acceptance of these rumors by Wikipedia, Robert O’Brien, and Michiko Kakutani. So I took it upon myself to read two Markham biographies (Mary S. Lovell’s Straight on Till Morning and Errol Trzebinski’s The Lives of Beryl Markham), where I hoped that the sourcing would offer a more reliable explanation.

I discovered that Trzebinski was largely conjectural, distressingly close to the infamous Kitty Kelley with her scabrous insinuations (accusations of illiteracy, suggestions that Markham could not pronounce words), and that Lovell was by far the more doggedly reliable and diligent source. Trzebinski also waited until many of the players were dead before publishing her biography, which is rather convenient timing, given that she relies heavily on conversations she had with them for sources.

The problem with Schumacher’s claim is that one can’t easily resolve the issue by going to a handwritten manuscript. West with the Night’s manuscript was typed, dictated to Schumacher by Markham. The only photograph I have found (from the Lovell biography) shows Markham offering clear handwritten edits. So there is little physical evidence to suggest that Schumacher was the secret pilot. We have only his word for it and that of the friends he told, who include Scott O’Dell. Trzebinski, who is the main promulgator of these rumors, is slipshod with her sources, relying only upon a nebulous “Fox/Markham/Schumacher data” cluster (with references to “int. the late Scott O’Dell/James Fox, New York, April 1987” and “15/5/87” — presumably the same material drawn upon for James Fox’s “The Beryl Markham Mystery,” which appeared in the March 1987 issue of Vanity Fair, as well as a Scott O’Dell letter that was also published in the magazine) that fails to cite anything specific and relies on hearsay. When one factors in an implausible story that Trzebinski spread about her own son’s death that the capable detectives at Scotland Yard were unable to corroborate, along with Trzebinski’s insistence on camera in the 1986 documentary World Without Walls that only a woman could have written West with the Night, one gets the sense that Trzebinski is the more unreliable and gossipy biographer. And Lovell offers definitive evidence that casts doubt on Trzebinski’s notion that Markham was something of a starry-eyed cipher:

But this proof of editing by Raoul, which some see as evidence that Beryl might not have been the sole author of the book, surely proved only that he acted as editor. Indeed his editing may have been responsible for the minor errors such as the title arap appearing as Arab. Together with the Americanization of Beryl’s Anglicized spelling, such changes could well have been standard editorial conversions (by either Raoul or Lee Barker – Houghton Mifflin’s commissioning editor) for a work aimed primarily at an American readership.

The incorrect spelling of Swahili words has an obvious explanation. In all cases they were written as Beryl pronounced them. She had learned the language as a child from her African friends but had probably never given much thought to the spelling. Neither Raoul nor anyone at Houghton Mifflin would have known either way.

In his letter to Vanity Fair, and in two subsequent telephone conversations with me, Scott O’Dell claimed that after he introduced Beryl and Raoul “they disappeared and surfaced four months later,” when Raoul told him that Beryl had written a memoir and asked what they should do with it. This is at odds with the surviving correspondence and other archived material which proves that the book was in production from early 1941 to January 1942, and that almost from the start Beryl was in contact with Lee Barker of Houghton Mifflin.

When Raoul told his friend that it was he who had written the book, could the explanation not be that he was embittered by his own inability to write without Beryl’s inspiration? That he exaggerated his editorial assistance into authorship to cover his own lack of words as a writer?

From the series of letters between Beryl and Houghton Mifflin, it is clear that Beryl had sent regular batches of work to the publishers before Raoul came into the picture. As explained earlier, Dr. Warren Austin lived in the Bahamas from 1942 to 1944, was physician to HRH the Duke of Windsor and became friends with Major Gray Phillips. Subsequently Dr. Austin lived for a while with Beryl and Raoul whilst he was looking for a house in Santa Barbara. The two often discussed their mutual connections in Raoul’s presence. Dr. Austin is certain that Raoul had never visited the Bahamas, reasoning that it would certainly have been mentioned during these conversations if he had. This speaks for itself. If Raoul was not even present when such a significant quantity of work was produced, then that part – at the very least – must have been written by Beryl.

Lovell’s supportive claims have not gone without challenge. James Fox claimed in The Spectator that he had seen “photostated documents, from the trunk since apparently removed as ‘souvenirs’ and thus not available to Lovell, which show that Schumacher took part in the earliest planning of the contents and the draft outline for the publisher and show whole passages written by Schumacher in handwriting.” But even he is forced to walk the ball back and claim that this “proves nothing in terms of authorship.” Since Fox is so fixated on “seeing” evidence rather than producing it, he may as well declare that he visited Alaska and could see Russia from his AirBnB or that he once observed giant six-legged wombats flying from the deliquescent soup he had for supper. If this is the “Fox/Markham/Schumacher data” that Trzebinski relied upon, then the plagiarism charge is poor scholarship and poor journalism indeed.

So I think it’s safe for us to accept Markham’s authorship unless something provable and concrete appears, and still justifiably admire a woman who caused Hemingway to stop in his tracks, a woman who outmatched him in insight and words, a woman – who like many incredible women – was belittled by a sloppy, gossip-peddling, and opportunistic biographer looking to make a name for herself (and the puff piece hack who enabled her) rather than providing us with the genuine and deserved insight on a truly remarkable figure of the 20th century.

Next Up: Neil Sheehan’s A Bright Shining Lie!

A Mathematician’s Apology (Modern Library Nonfiction #87)

(This is the fourteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Six Easy Pieces.)

Clocking in at a mere ninety pages in very large type, G.H. Hardy’s A Mathematician’s Apology is that rare canapé plucked from a small salver between all the other three-course meals and marathon banquets in the Modern Library series. It is a book so modest that you could probably read it in its entirety while waiting for the latest Windows 10 update to install. And what a bleak and despondent volume it turned out to be! I read the book twice and, each time I finished it, I wanted to seek out some chalk-scrawling magician and offer a hug.

G.H. Hardy was a robust mathematician just over the age of sixty who had made some serious contributions to number theory and population genetics. He was a cricket-loving man who had brought the Indian autodidact Srinivasa Ramanujan to academic prominence by personally vouching for and mentoring him. You would think that a highly accomplished dude who went about the world with such bountiful and generous energies would be able to ride out his eccentric enthusiasm into his autumn years. But in 1939, Hardy survived a heart attack and felt that he was as useless as an ashtray on a motorcycle, possessing nothing much in the way of nimble acumen or originality. So he decided to memorialize his depressing thoughts about “useful” contributions to knowledge in A Mathematician’s Apology (in one of the book’s most stupendous understatements, Hardy observed that “my apology is bound to be to some extent egotistical”), and asked whether mathematics, the field that he had entered into because he “wanted to beat other boys, and this seemed to be the way in which I could do so most decisively,” was worthwhile.

You can probably guess how it all turned out:

It is indeed rather astonishing how little practical value scientific knowledge has for ordinary man, how dull and commonplace such of it as has value is, and how its value seems almost to vary inversely to reputed utility….We live either by rule of thumb or other people’s professional knowledge.

If only Hardy could have lived about sixty more years to discover the 21st century thinker’s parasitic relationship to Google and Wikipedia! The question is whether Hardy is right to be this cynical. While snidely observing “It is quite true that most people can do nothing well,” he isn’t a total sourpuss. He writes, “A man’s first duty, a young man’s at any rate, is to be ambitious,” and points out that ambition has been “the driving force behind nearly all the best work of the world.” What he fails to see, however, is that youthful ambition, whether in a writer or a scientist, often morphs into a set of routines that become second-nature. At a certain point, a person becomes comfortable enough with himself to simply go on with his work, quietly evolving, where the ambition becomes more covert and subconscious and mysterious.

Hardy never quite confronts what it is about youth that frightens him, but he is driven by a need to justify his work and his existence, pointing to two reasons why people do what they do: (1) they work at something because they know they can do it well and (2) they work at something because a particular vocation or specialty came their way. But this seems too pat and Gladwellian to be a persuasive dichotomy. It doesn’t really account for the journey we all must take in working out why we do what we do, which generally includes the vital people you meet at certain places in your life who point you down certain directions. Either they recognize some talent in you and give you a leg up or they are smart and generous enough to recognize that one essential part of human duty is to help others find their way, to seek out your people — ideally a group of eclectic and vastly differing perspectives — and to work with each other to do the best damn work and live the best damn lives you can. Because what’s the point of geeking out about Fermat’s “two squares” theorem, which really is, as Hardy observes, a nifty mathematical theorem of pure beauty, if you can’t share it with others?
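
(For the curious, the result Hardy admires can be stated in a single line, in a standard formulation rather than Hardy’s own wording: an odd prime p can be written as the sum of two squares, p = x² + y², precisely when p leaves a remainder of 1 on division by 4; so 13 = 2² + 3², while 7 admits no such expression.)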

But let’s return to Hardy’s fixation on youth. Hardy makes the claim that “mathematics, more than any other art or science, is a young man’s game,” yet this staggering statement is easily debunked by such late bloomers as prime number ninja Zhang Yitang and by Andrew Wiles, who solved Fermat’s Last Theorem at the age of 41. Even in Hardy’s own time, Henri Poincaré was making innovations to topology and Lorentz transformations well into middle age. (And Hardy explicitly references Poincaré in § 26 of his Apology. So it’s not like he didn’t know!) Perhaps some of the more recent late life contributions have much to do with forty now being the new thirty (or even the new twenty among a certain Jaguar-buying midlife crisis type) and many men in Hardy’s time believing themselves to be superannuated in body and soul around the age of thirty-five, but it does point to the likelihood that Hardy’s sentiments were less the result of serious thinking and more the result of crippling depression.

Where Richard Feynman saw chess as a happy metaphor for the universe, “a great game played by the gods” in which we humans are mere observers who “do not know what the rules of the game are,” merely allowed to watch the playing (and yet find marvel in this all the same), Hardy believed that any chess problem was “simply an exercise in pure mathematics…and everyone who calls a problem ‘beautiful’ is applauding mathematical beauty, even if it is a beauty of a comparatively lowly kind.” Hardy was so sour that he compared a chess problem to a newspaper puzzle, claiming that it merely offered an “intellectual kick” for the clueless educated rabble. As someone who enjoys solving the Sunday New York Times crossword in full and a good chess game (it’s the street players I have learned the most from; for they often have the boldest and most original moves), I can’t really argue against Hardy’s claim that such pastimes are “trivial” or “unimportant” in the grand scheme of things. But Hardy seems unable to remember the possibly apocryphal tale of Archimedes discovering the principle of displacement while in the bathtub or the more reliable story of Otto Loewi’s dream leading the great Nobel-winning physiologist to discover that nerve impulses are transmitted chemically. Great minds often need to be restfully thinking or active on other fronts in order to come up with significant innovations. And while Hardy may claim that “no chess problem has ever affected the development of scientific thought,” I feel compelled to note that Pythagoras played the lyre (and even inspired a form of tuning), Newton had his meandering apple moment, and Einstein enjoyed hiking and sailing. These were undoubtedly “trivial” practices by Hardy’s austere standards, but would these great men have given us their contributions if they hadn’t had such downtime?

It’s a bit gobsmacking that Hardy never mentions how Loewi was fired up by his dreams. He seems only to see value in Morpheus’s prophecies if they are dark and melancholic:

I can remember Bertrand Russell telling me of a horrible dream. He was in the top floor of the University Library, about A.D. 2100. A library assistant was going round the shelves carrying an enormous bucket, taking down book after book, glancing at them, restoring them to the shelves or dumping them into the bucket. At last he came to three large volumes which Russell could recognize as the last surviving copy of Principia mathematica. He took down one of the volumes, turned over a few pages, seemed puzzled for a moment by the curious symbolism, closed the volume, balanced it in his hand and hesitated….

One of an author’s worst nightmares is to have his work rendered instantly obsolete not long after his death, even though there is a very strong likelihood that, in about 150 years, few people will care about the majority of books published today. (Hell, few people care about anything I have to write today, much less this insane Modern Library project. There is a high probability that I will be dead in five decades and that nobody will read the many millions of words or listen to the countless hours of radio I have put out into the universe. It may seem pessimistic to consider this salient truth, but, if anything, it motivates me to make as much as I can in the time I have, which I suppose is an egotistical and foolishly optimistic approach. But what else can one do? Deposit one’s head in the sand, smoke endless bowls of pot, wolf down giant bags of Cheetos, and binge-watch insipid television that will also not be remembered?) You can either accept this reality, reach the few people you can, and find happiness and gratitude in doing so. Or you can deny it, let your ego get in the way of your achievements, and embrace supererogatory anxieties that force you to spend too much time feeling needlessly morose.

I suppose that in articulating this common neurosis, Hardy is performing a service. He seems to relish “mathematical fame,” which he calls “one of the soundest and steadiest of investments.” Yet fame is a piss-poor reason to go about making art or formulating theorems. Most of the contributions to human advancement are rendered invisible. These are often small yet subtly influential rivulets that unknowingly pass into the great river that future generations will wade in. We fight for virtues and rigor and intelligence and truth and justice and fairness and equality because this will be the legacy that our children and grandchildren will latch onto. And we often make unknowing waves. Would we, for example, be enjoying Hamilton today if Lin-Manuel Miranda’s school bus driver had not drilled him with Geto Boys lyrics? And if we surrender those standards, if we gainsay the “trivial” inspirations that cause others to offer their greatness, then we say to the next generation, who are probably not going to be listening to us, that fat, drunk, and stupid is the absolute way to go through life, son.

A chair may be a collection of whirling electrons, or an idea in the mind of God: each of these accounts of it may have its merits, but neither conforms at all closely to the suggestions of common sense.

This is Hardy suggesting some church and state-like separation between pure and applied mathematics. He sees physics as fitting into some idealistic philosophy while identifying pure mathematics as “a rock on which all idealism founders.” But might not one fully inhabit common sense if the chair exists in some continuum beyond this either-or proposition? Is not the chair’s perceptive totality worth pursuing?

It is at this point in the book that Hardy’s argument really heads south and he makes an astonishingly wrongheaded claim, one whose refutation he could not have entirely foreseen: “Real mathematics has no effects on war.” This was only a few years before Los Alamos was to prove him wrong. And that’s not all:

It can be maintained that modern warfare is less horrible than the warfare of pre-scientific times; that bombs are probably more merciful than bayonets; that lachrymatory gas and mustard gas are perhaps the most humane weapons yet devised by military science; and that the orthodox view rests solely on loose-thinking sentimentalism.

Oh Hardy! Hiroshima, Nagasaki, Agent Orange, Nick Ut’s famous napalm girl photo from Vietnam, Saddam Hussein’s chemical gas massacre in Halabja, the use of Sarin-spreading rockets in Syria. Not merciful. Not humane. And nothing to be sentimental about!

Nevertheless, I was grateful to argue with this book on my second read, which occurred a little more than two weeks after the shocking 2016 presidential election. I had thought myself largely divested of hope and optimism, with the barrage of headlines and frightening appointments (and even Trump’s most recent Taiwan call) doing nothing to summon my natural spirits. But Hardy did force me to engage with his points. And his book, while possessing many flawed arguments, is nevertheless a fascinating insight into a man who gave up: a worthwhile and emotionally true Rorschach test you may wish to try if you need to remind yourself why you’re still doing what you’re doing.

Next Up: Tobias Wolff’s This Boy’s Life!

Six Easy Pieces (Modern Library Nonfiction #88)

(This is the thirteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Pilgrim at Tinker Creek.)

Richard Feynman, exuberant Nobel laureate and formidable quantum mechanics man, may have been energetic in his lectures and innovatively performative in the classroom, but I’m not sure he was quite the great teacher that many have pegged him to be. James Gleick’s biography Genius informs us that students dropped out of his high-octane, info-rich undergraduate physics classes at a remarkable rate, replaced by Caltech faculty members and grad students who took to the Queens-born superstar much like baryons make up the visible matter of the universe. The extent to which Feynman was aware of this cosmic shift has been disputed by his chroniclers, but it is important to be aware of this shortcoming, especially if you’re bold enough to dive into the famed three-volume Feynman Lectures on Physics, which are all thankfully available online. Six Easy Pieces represents an abridged version of Feynman’s full pedagogical oeuvre. And even though the many YouTube videos of Feynman reveal an undeniably magnetic and indefatigably passionate man of science who must have been an incredible dynamo to experience in person, one wonders whether barraging a hot room of young nervous twentysomethings with hastily delivered information is the right way to popularize science, much less inspire a formidable army of physicists.

Watch even a few minutes of Feynman firing on all his robust cylinders and it becomes glaringly apparent how difficult it is to contend with Feynman’s teaching legacy in book form. One wonders why the Modern Library nonfiction judges, who were keen to unknowingly bombard this devoted reader with such massive multivolume works as The Golden Bough, Dumas Malone’s Jefferson and His Time, and Principia Mathematica, didn’t give this spot to the full three-volume Lectures. Did they view Feynman’s complete lesson plan as a failure?

Judging from the sextet that I sampled in this deceptively slim volume, I would say that, while Feynman was undeniably brilliant, he was, like many geniuses, someone who often got lost within his own metaphors. While his analogy of two corks floating in a pool of water, with one cork jiggling in place to create motion in the pool that causes indirect motion for the other cork, is a tremendously useful method of conveying the “unseen” waves of the electromagnetic field (one that galvanized me to do the same in a saucepan after I had finished two bottles of wine over a week and a half), he is nowhere near as on-the-nose with his other analogies. The weakest lesson in the book, “Conservation of Energy,” trots out what seems to be a reliably populist metaphor with a child named “Dennis the Menace” playing with 28 blocks, somehow always ending up with 28 of these at the end of the day. Because Feynman wants to illustrate conserved constants, he shoehorns another element into the narrative whereby Dennis’s mother is, for no apparent reason, not allowed to open up the toy box revealing the number of blocks and thus must calculate how many blocks reside within. The mother has conveniently weighed the box at some unspecified time in advance back when it contained all 28 blocks.
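
To be fair to Feynman, the arithmetic underneath the blocks story is genuinely simple, which is presumably why he reached for it. Here is a minimal sketch of that bookkeeping in Python; the specific weights are my own illustrative assumptions rather than figures taken from the lecture:

```python
# A sketch of the conservation bookkeeping behind Feynman's blocks analogy.
# The weights below are assumed for illustration; the point is only that a
# hidden quantity can be inferred from a measurable one, so the visible and
# inferred blocks always sum to the same constant (28 for Dennis).

EMPTY_BOX_WEIGHT = 16.0  # ounces (assumed)
BLOCK_WEIGHT = 3.0       # ounces per block (assumed)


def blocks_hidden_in_box(box_weight: float) -> float:
    """Infer how many blocks sit inside the closed box from its weight alone."""
    return (box_weight - EMPTY_BOX_WEIGHT) / BLOCK_WEIGHT


def total_blocks(blocks_in_view: int, box_weight: float) -> float:
    """Blocks the mother can see plus blocks she infers from the scale."""
    return blocks_in_view + blocks_hidden_in_box(box_weight)


# 25 blocks visible and a box weighing 25 ounces still add up to 28.
print(total_blocks(blocks_in_view=25, box_weight=25.0))  # -> 28.0
```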

This is bad teaching, in large part because it is bad storytelling that makes no sense. I became less interested in conservation of energy, with Feynman’s convoluted parallel clearly becoming more trouble than it was worth, and more interested in knowing why the mother was so fixated on remembering the number of blocks. Was she truly so starved for activity in her life that she spent all day at work avoiding all the juicy water cooler gossip about co-workers, much less kvetching about the boss, so that she might scheme a plan to at long last show her son that she would always know the weight of a single block? When Dennis showed resistance to opening the toy box, why didn’t the mother stand her ground and tell him to buzz off and stream an episode of Project Mc²?

Yet for all these defects in method, there is an indisputable poetic beauty in the way in which Feynman reminds us that we live in a vast world composed of limitless particles, a world in which we still aren’t aware of all the rules and in which even the particles contained within seemingly “fixed” solids remain in motion. Our universe is always moving, even when we can’t see it or completely comprehend it. Feynman is quick to observe throughout his lessons that “The test of all knowledge is experiment,” which again points to my theory that Feynman’s teachings, often accentuated by experiment, were probably better experienced than read. Nevertheless, even in book form, it is truly awe-inspiring to understand that we still cannot accurately predict the precise mass, form, and force of all the cascading droplets from a mighty river once it hits the precipice of a waterfall. Such mysteries capture our imagination and, when Feynman is committed to encouraging our inventiveness through open and clear-eyed examples from our world, he is very much on point. Thanks in part to Feynman reminding me just how little we silly humans now know, I began to feel my heart open more for Tycho Brahe, that poor Dane who spent many years of his life amassing painstakingly precise observations of the planets. Brahe worked entirely without a telescope, and it was Johannes Kepler who later sifted through his invaluable measurements to forge the laws of planetary motion, elliptical orbits included, that all contemporary astronomers now rely on to determine where a planet might be in the sky on any given night of the year. Heisenberg’s uncertainty principle hasn’t even been around a century and it’s nothing less than astounding to consider how our great-grandparents had a completely different understanding of atoms and motion in their early lifetime than we do today.

Feynman did have me wanting to know more about the origins of many scientific discoveries, causing me to contemplate how each and every dawning realization altered human existence (an inevitable buildup for Thomas Kuhn and paradigms, which I will take up in ML Nonfiction #69). But unlike such contemporary scientists as Neil deGrasse Tyson, Alan Guth, or Brian Greene, Feynman did not especially inspire me to plunge broadly into my own experiments or make any further attempts to grapple with physics-based complexities. This may very well be more my failing than Feynman’s, but there shall be many more stabs at science as we carry on with this massive reading endeavor!

Next Up: G.H. Hardy’s A Mathematician’s Apology!

Pilgrim at Tinker Creek (Modern Library Nonfiction #89)

(This is the twelfth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Golden Bough.)

“Either this world, my mother, is a monster, or I myself am a freak.” — Annie Dillard, Pilgrim at Tinker Creek

I was a sliver-thin, stupefyingly shy, and very excitable boy who disguised his bruises under the long sleeves of his shirt not long before the age of five. I was also a freak.

I had two maps pinned to the wall of my drafty bedroom, which had been hastily constructed into the east edge of the garage in a house painted pink (now turquoise, according to Google Maps). The first map was of Tolkien’s Middle-earth, in which I followed the quests of Bilbo and Frodo by finger as I wrapped my precocious, word-happy head around sentences that I’d secretly study from the trilogy I had purloined from the living room, a well-thumbed set that I was careful to put back on the shelves before my volatile and often sour father returned home from the chemical plant. In some of his rare calm moments, my father read aloud from The Lord of the Rings if he wasn’t too drunk, irascible, or violent. His voice led me to imagine Shelob’s thick spidery thistles and Smeagol’s slithering corpus, and to crack open my eyes the next morning for any other surprises I might divine in my daily journeys to school. The second map was of Santa Clara County, a very real region that everyone now knows as Silicon Valley but that used to be a sweeping swath of working and lower middle-class domiciles. This was one of several dozen free maps of Northern California that I had procured from AAA with my mother’s help. One of the nice perks of being an AAA member was the ample sample of rectangular geographical foldouts. I swiftly memorized all of the streets, held spellbound by the floral and butterfly patterns of freeway intersections seen from a majestic bird’s eye view in an errant illustrated sky. My mother became easily lost while driving and I knew the avenues and the freeways in more than a dozen counties so well that I could always provide an easy cure for her confusion. It is a wonder that I never ended up working as a cab driver, although my spatial acumen has remained so keen over the years that, to this day, I can still pinpoint the precise angle at which you need to slide a thick unruly couch into the tricky recesses of a small Euclidean-angled apartment even when I am completely exhausted.

These two maps seemed to be the apotheosis of cartographic art at the time, filling me with joy and wonder and possibility. They helped me cope with the many problems I lived with at home. I understood that there were other regions beyond my bedroom where I could wander in peace, where I could meet kinder people or take in the beatific comforts of a soothing lake (Vasona Lake, just west of Highway 17 in Los Gatos, had a little railroad spiraling around its southern tip and was my real-life counterpart to Lake Evendim), where the draw of Rivendell’s elvish population or the thrill of smoky Smaug stewing inside the Lonely Mountain collided against visions of imagined mountain dwellers I might meet somewhere within the greens and browns of Santa Teresa Hills and the majestic observatories staring brazenly into the cosmos at the end of uphill winding roads. I would soon start exploring on my bike the world I had espied from my improvised bedroom study, pedaling unfathomable miles into vicinities I had only dreamed about, always seeking parallels to what the Oxford professor had whipped up. I once ventured as far south as Gilroy down the Monterey Highway, which Google Maps now informs me is a thirty-six mile round trip, because my neglectful parents never kept tabs on how long I was out of the house or where I was going. They didn’t seem to care. As shameful as this was, I’m glad they didn’t. I needed an uncanny dominion, a territory to flesh out, in order to stay happy, humble, and alive.

The maps opened up my always hungry eyes to books, which contained equally bountiful spaces devoted to the real and the imaginary, unspooling further marks and points for me to find in the palpable world and, most importantly, within my heart. I always held onto this strange reverence for place to beat back the sadness after serving as my father’s punching bag. To this day, I remain an outlier, a nomad, a lifelong exile, a wanderer even as I sit still, a renegade hated for what people think I am, a black sheep who will never belong no matter how kind I am. I won’t make the mistake of painting myself as some virtuous paragon, but I’ve become so accustomed to being condemned on illusory cause, to having all-too-common cruelties inflicted upon me (such as the starry-eyed bourgie Burning Man sybarite I recently opened my heart to, who proceeded to deride the city that I love, along with the perceived deficiencies of my hard-won apartment, this after I had told her tales, not easily summoned, about what it was like to be rootless and without family and how home and togetherness remain sensitive subjects for me) that the limitless marvels of the universe parked in my back pocket or swiftly summoned from my shelves or my constant peregrinations remain reliable, life-affirming balms that help heal the scars and render the wounds invisible. Heartbreak and its accompanying gang of thugs often feel like a mob bashing in your ventricles in a devastatingly distinct way, even though the great cosmic joke is that everyone experiences it and we have to love anyway.

So when Annie Dillard’s poetic masterpiece Pilgrim at Tinker Creek entered my reading life, its ebullient commitment to finding grace and gratitude in a monstrous world reminded me that seeing and perceiving and delving and gaping awestruck at Mother Earth’s endless glories is one of the best survival skills you can cultivate and one that I may have accidentally stumbled upon. As I said, I’m a freak. But Dillard is one too. And there’s a good chance you may walk away from this book, which I highly urge you to read, feeling a kinship with Dillard comparable to the one I felt. Even if you already have a formidable arsenal of boundless curiosity ready to be summoned at a moment’s notice, this shining 1974 volume remains vital and indispensable and will stir your soul for the better, whether you’re happy or sad. Near the end of a disastrous year, we need these inspirational moments now more than ever.

* * *

“Our life is a faint tracing on the surface of mystery.” – Pilgrim at Tinker Creek

Annie Dillard was only 28 when she wrote this stunning 20th century answer to Thoreau (the subject of her master’s thesis), which is both a perspicacious journal of journeying through the immediately accessible wild near her bucolic Southwestern Virginia perch and a daringly honest entreaty for consciousness and connection. Dillard’s worldview is so winningly inclusive that she can find wonder in such savage tableaux as a headless praying mantis clutching onto its mate or the larval creatures contained within a rock barnacle. The Washington Post claimed not long after Pilgrim‘s publication that the book was “selling so well on the West Coast and hipsters figure Annie Dillard’s some kind of female Castaneda, sitting up on Dead Man’s Mountain smoking mandrake roots and looking for Holes in the Horizon her guru said were there.” But Pilgrim, inspired in part by Colette’s Break of Day, is far from New Age nonsense. The book’s wise and erudite celebration of nature and spirituality was open and inspiring enough to charm even this urban-based secular humanist, who desperately needed a pick-me-up and a mandate to rejoin the world after a rapid-fire series of personal and political and romantic and artistic setbacks that occurred during the last two weeks.

For all of the book’s concerns with divinity, or what Dillard identifies as “a divine power that exists in a particular place, or that travels about over the face of the earth as a man might wander,” explicit gods don’t enter this meditation until a little under halfway through the book, where she jokes that gods are often found on mountaintops and notes that God is an igniter as well as a destroyer, one that seeks invisibility for cover. And as someone who does not believe in a god and who would rather deposit his faith in imaginative storytelling and myth than in the superstitions of religious ritual, I could nevertheless feel and accept the spiritual idea of being emotionally vulnerable while traversing into some majestic terrain. Or as Pascal wrote in Pensées 584 (quoted in part by Dillard), “God being thus hidden, every religion which does not affirm that God is hidden, is not true, and every religion which does not give the reason of it, is not instructive.”

Much of this awe comes through the humility of perceiving, of devoting yourself to sussing out every conceivable kernel that might present itself and uplift you on any given day and using this as the basis to push beyond the blinkered cage of your own self-consciousness. Dillard uses a metaphor of loose change throughout Pilgrim that neatly encapsulates this sentiment:

It is dire poverty indeed when a man is so malnourished and fatigued that he won’t stoop to pick up a penny. But if you cultivate a healthy poverty and simplicity, so that finding a penny will literally make your day, then, since the world is in fact planted in pennies, you have with your poverty bought a lifetime of days. It is that simple. What you see is what you get.

This is not too far removed from Thoreau’s faith in seeds: “Convince me that you have a seed there, and I am prepared to expect wonders.” The smug and insufferable Kathryn Schulzes of our world gleefully misread this great tradition of discovering possibilities in the small as arrogance, little realizing how their own blind and unimaginative hubris glows with crass Conde Nast entitlement as they fail to observe that Thoreau and Dillard were also acknowledging the ineluctable force of a bigger and fiercer world that will carry on with formidable complexity long after our dead bodies push up daisies. Faced with the choice of sustaining a sour Schulz-like apostasy or receiving every living day as a gift, I’d rather risk the arrogance of dreaming from the collected riches of what I have and what I can give than settle for the gutless timidity of a prescriptive rigidity that fails to consider that we are all steeped in foolish and inconsistent behavior which, in the grand scheme of things, is ultimately insignificant.

Dillard is guided just as much by Heisenberg’s uncertainty principle as she is by religious and philosophical texts. The famous 1927 principle, which articulates how you can never precisely know a particle’s position and momentum at the same time, is very much comparable to chasing down some hidden deity or contending with some experiential palpitations when you understand that there simply is no answer, for one can feel but never fully comprehend the totality in a skirmish with Nature. It accounts for Dillard frequently noting that the towhee chirping on a treetop or the muskrat she observes chewing grass on a bank for forty minutes never sees her. In seeing these amazing creatures, completely oblivious to her own human vagaries, carry on with their lives, Dillard reminds us that this is very much the state of Nature, whether human or animal. If it is indeed arrogance to find awe and humility in this state of affairs, as Dillard and Thoreau clearly both did, then one’s every breath may as well be a Napoleonic puff of the chest.
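
For readers who want the principle itself rather than the metaphor, the standard textbook statement fits in one line; this is the conventional modern form, not anything quoted from Pilgrim:

```latex
% Heisenberg uncertainty principle: the uncertainties in a particle's
% position (x) and momentum (p) cannot both be made arbitrarily small.
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```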

Dillard is also smart and expansive enough to show us that, no matter where we reside, we are fated to brush up against the feral. She points to how arboreal enthusiasts discovered a fifteen-foot ailanthus tree growing from a lower Bronx garage and how New York must spend countless dollars each year to rid its underground water pipes of roots. Such realities are often contended with out of sight and out of mind, even as the New York apartment dweller battles cockroaches, but the reminder is another useful point for why we must always find the pennies and dare to dream and wander and take in, no matter what part of the nation we dwell in.

Another refreshing aspect of Pilgrim is the way in which Dillard confronts her own horror at fecundity. Yes, even this graceful ruminator has the decency to confess her hangups about the unsettling rapidity with which moths lay their eggs in vast droves. She stops short of truly confronting “the pressure of birth and growth” that appalls her, shifting to plants as a way of evading animals and then returning to the blood-pumping phylum to take in blood flukes and aphid reproduction more as panorama than as something to be felt. This volte-face isn’t entirely satisfying. On the other hand, Dillard is also bold enough to scoop up a cup of duck-pond water and peer at monostyla under a microscope. What this tells us is that there are clear limits to how far any of us are willing to delve, yet I cannot find it within me to chide Dillard too harshly for a journey she was not quite willing to take, for this is an honest and heartfelt chronicle.

While I’ve probably been “arrogant” in retreating at length to my past in an effort to articulate how Dillard’s book so moved me, I would say that Pilgrim at Tinker Creek represents a third map for my adult years. It is a true work of art that I am happy to pin to the walls of my mind, which seems more reliable than any childhood bedroom. This book has caused me to wonder why I have ignored so much and has demanded that I open myself up to any penny I could potentially cherish and to ponder what undiscovered terrain I might dare to take in as I continue to walk this earth. I do not believe in a god, but I do feel with all my heart that one compelling reason to live is to fearlessly approach all that remains hidden. There is no way that you’ll ever know or find everything, but Dillard’s magnificent volume certainly gives you many good reasons to try.

Next Up: Richard Feynman’s Six Easy Pieces!

The Golden Bough (Modern Library Nonfiction #90)

(This is the eleventh entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Shadow and Act.)

It is somewhat difficult to obtain a decent print edition of the Third Edition of The Golden Bough without getting fleeced by some squirrelly operator working out of a shady storage unit in the middle of nowhere. For nobody seems to read the whole enchilada anymore. This is hardly surprising in an age, abundantly cemented last week, when most people are more inclined to celebrate regressive stupidity, melting their minds in any self-immolating pastime rather than opening a book. But I was able to find an affordable edition with the help of a British antiquarian. I had no idea what I was in for, but some initial research suggested very strongly that I should not settle for the abridged edition that is much easier to acquire. Certainly the sheer time-sucking insanity of the Modern Library Reading Challenge, one of the many dependable bastions I have left in a bleak epoch, demands that I go the distance on all entries, even if it means becoming ensnared by a particular title for several weeks, often answering texts from pals checking in on me with fun little snippets from Estonian folklore quebraditaing somewhere within a “Fine. How are you?” Such is the life of a book-loving eccentric with a ridiculous self-imposed mandate that involves refusing to let terrible setbacks get in the way of happy rumination. We find hope and courage and new ideas and fierce fortitude in remembering that not a single authoritarian entity or pernicious individual can ever crush the possibilities contained within our minds, our hearts, and our souls.

The thirteen-volume set landed at my feet with a promising thud after a month-long voyage by boat across the Atlantic Ocean; it then occupied my reading time for many months and proceeded to change my life. I realize that such a claim may sound trite in light of the devastating and life-altering results of the 2016 U.S. presidential election, but, if there’s anything we can learn from Stefan Zweig’s suicide, we must never forget the importance of patience, of working long and hard to fight and endure while steering humanity’s promising galleon back on the right course even as we look to culture’s power to sustain our spirits in the darkest times.1

James George Frazer’s The Golden Bough proved so galvanizing that I began to marvel more at trees and desired to spend more time beneath their magnificent branches. I began picking up the junk that other New Yorkers had so thoughtlessly deposited under their glorious leafy crowns. I began naming some of the trees I liked, saying “Hello, Balder!” (styled after the Norse god) to a beloved maple near the southwestern edge of Central Park. I started paying closer attention to the modest superstitious rituals that most of us take for granted, wanting to know why we feared black cats crossing our path (it started in the 1560s and originated with the idea that black cats were witches who had transformed their corporeal state) or worried ourselves into years of bad luck from walking under a ladder (it goes back to the Egyptians, who believed that walking under any pyramid would attenuate its mystical power). And, of course, I began to wonder if other superstitious rituals, such as voting for a vicious sociopathic demagogue to make a nation “great” again, originated from similar irrational fears. Despite being a secular humanist, I was stunned to discover that I had modest pagan proclivities and started to ask friends to engage in rather goofball offshoots of established rites in a somewhat libertine manner, much of which is unreportable. And if you think such a reaction is idiosyncratic (and it is), consider the strange takeaway that D.H. Lawrence memorialized in a December 8, 1915 letter:

I have been reading Frazer’s Golden Bough and Totemism and Exogamy. Now I am convinced of what I believed when I was about twenty — that there is another seat of consciousness than the brain and the nerve system: there is a blood-consciousness which exists in us independently of the ordinary mental consciousness, which depends on the eye as its source or connector. There is the blood-consciousness, with the sexual connection, holding the same relation as the eye, in seeing, holds to the mental consciousness. One lives, knows, and has one’s being in blood, without any references to nerves and brain. This is one half of life, belonging to the darkness. And the tragedy of this our life, and of your life, is that the mental and nerve consciousness exerts a tyranny over the blood-consciousness, and that your will has gone completely over to the mental consciousness, and is engaged with the destruction of your blood-being or blood-consciousness, the final liberating of the one, which is only death in result.

When I finished Frazer’s final volume, I certainly wasn’t prepared to suggest that any part of my consciousness was tyrannizing the others because of some eternal human connection to myths and rites enacted to answer and make sense of the presently inexplicable. But Lawrence did have a point about the way humans are naturally drawn to unusual ceremonies and celebrations that go well beyond Carolina Panthers coach Ron Rivera wearing the same shoes on game days, with the impulse often defying any steely rationalism we may use to make sense of our inherently animalistic nature, which any glance at a newspaper reveals to be quite frighteningly present.


More importantly, The Golden Bough changed everything I thought I knew about storytelling and myth. It forced me to see commonalities within many cultures. To cite one of Frazer’s countless comparative examples, consider the way that humans have approached the bear hunt. After the Kamtchatkans had killed a bear and eaten its flesh, the host of the subsequent dinner party would bring the bear’s head before the assembled guests, wrap it in grass, and then conduct a panel of sorts where the host, serving as a moderator only slightly less ballistic than Bill O’Reilly, would ask the bear if he had been well-treated. Much like many wingnut “journalists” irresponsibly publishing claims in Slate today without robust evidence (and failing to tender corrections when pwned), the Kamtchatkan host would blame the Russians. The American Indians likewise implored the dead bear not to be angry for being hunted and would hang the bear’s head on a post, painting it red and blue rather than adorning it with vegetation. They also addressed it, much in the manner that dog owners chat with their uncomprehending pets when nobody’s around. The Assiniboins likewise held feasts after a hunt and mounted the bear’s head, shrouding it in strips of scarlet cloth, and respected the bear so much that they offered the head a pipe to smoke, not unlike the poor dog who sits outside Mets games with a pipe in his mouth. And looking beyond Frazer, one finds in Alanson Skinner’s Notes on the Eastern Cree and Northern Saulteaux a similar bear’s head ceremony that involved sharing a pipe before the participants took a bite from the bear’s flesh and, with the old Finnish custom of karhunpeijaiset, a bear’s head mounted upon a young tree, venerated and addressed as a relative or the son of a god. And according to the Russian ethnographer Vladimir Arsenyev (and I found this bit by sifting through James H. Grayson’s Myths and Legends from Korea), the Udegey people of the Russian Far East also had a bear ceremony and believed, “To throw the head away is a great sin….The cooked bear’s head is wrapped in its own skin with its fur outwards and tied up with a thin rope,” with a communal ceremony quite similar to that of the Finns.

I could go on (and indeed Frazer often rambles for pages), but there’s an undeniable awe in learning that something so specific about bears (head mounted, party organized, head covered, bear respected), much less anything else, arose independently in so many different parts of the world. It proves very conclusively, and perhaps this is especially essential for us to understand as we reconcile a vast and seemingly incurable national division, that humans share more in common with each other than we’re willing to confess and that the seemingly unique rituals that we believe define “us” are quite likely shared many times over in other parts of the nation, much less the world.

The reason it took me so long to read The Golden Bough was not because of its many thousand pages (aside from some sloggish parts in the Spirits of the Corn and of the Wild volumes, the books are surprisingly readable2), but because my imagination would become so captivated by some tale of trees esteemed above human life or a crazed orgiastic release (see Saturnalia) that I would lose many hours in the library seeing how much of this was still practiced. It has been more than a century since Frazer published the Third Edition, but his remarkable observations about shared rituals still invite us to dream and believe and to perceive that, Frazer’s regrettable references to “savages” and “primitives” notwithstanding, we are not so different from each other.

Frazer’s explanation for these common qualities — epitomized by the famous J.M.W. Turner painting (pictured above) sharing the same title as Frazer’s capacious opus — rests in the sylvan lake of Nemi and an ancient tale in which a priest-king defended a singular tree. The priest-king, who was an incarnation of a god wedded to the world, could only be overpowered by a fight to the death and, if he was slain, he would be replaced by his victor, with the cycle perpetuating ad infinitum. Frazer believed that nearly every story in human history could be traced back to this unifying myth, with most of the tales triggered by our imagination arising out of what he called “sympathetic magic,” whereby humans often imitate what they cannot understand. So if this meant building effigies or participating in elaborate and often unusual rituals that explain why the sun scorched the crops to an unsustainable crisp in the last harvest or helped more animals to multiply for grand feasts next season, magical thinking provided both the bond and the panacea well before Robert B. Thomas came up with the Old Farmer’s Almanac.

There are two components to sympathetic magic: Similarity, whereby like produces like and imitating a thing is believed to bring it about, and Contagion, which involves a transfer of “essence” through physical contact (among other things, this would account for why humans have been especially careful with bears’ heads, as described above). Or, as Frazer puts it, from the first principle “the magician infers that he can produce any effect he desires merely by imitating it: from the second he infers that whatever he does to a material object will affect equally the person with whom the object was once in contact, whether it formed part of his body or not.”

One of The Golden Bough‘s most fascinating volumes, The Scapegoat, reveals how a human belief in “essence” may be the root of our most irrational fears. Contagion often led to humans trying to transfer their disease and miseries to other people, if not reinforcing their own biases about people or groups that they disliked. I am indebted to the terrific podcast Imaginary Worlds for steering me to the work of Carol Nemeroff, whose psychological considerations of Frazer’s findings are especially constructive in understanding disgust. Nemeroff and her colleagues conducted a series of studies in which they placed a sterilized dead roach in a glass of juice and asked subjects to eat fudge that resembled dog feces. The natural reactions (recoiling at the roach and the shit-shaped fudge) showed that sympathetic magic is still very much a mainstay of our culture.

Indeed, sympathetic magic drives most of our cherished rituals today. In one of his most controversial (but nevertheless true) claims, Frazer observes in Adonis Attis Osiris that, although the Gospels never cited a hard date for Jesus Christ’s birthday bash, Christians have adhered to their churchgoing rituals with the same practiced regularity one sees in fundamentalist homophobes holding up cardboard signs that misquote the Bible to reinforce their hate. The original celebration date of Christ’s alleged birth was January 6th. But because heathens celebrated the birthday of the Sun on December 25th, and this was often a draw for the Christians because the heathens were more fun, the Church agreed to set December 25th as the official day. If Christmas did not exist, it would be necessary for humankind to invent it. For such useful observations, The Golden Bough is incredibly handy to have in one’s library, if only to remind us that most of our beliefs, the recurring rituals we are expected to adhere to, are predicated upon some ancient explanation that we failed to shake from the Magic 8-Ball of our daily existence. So Colin Kaepernick really doesn’t need to stand for the national anthem. While this conformist rite is admittedly improved from the Nazi-like Bellamy salute, standing for The Star-Spangled Banner is little more than a quaint superstition that one is pressured to participate in to “belong.”

Frazer’s scholarship, while impressive, is sometimes inexact in the effort to find a Theory of Everything. Midway through putting together the Third Edition, Frazer was challenged by Dr. Edward Westermarck, who pointed out that fire festivals did not originate from fire reinforcing the sun’s light and heat, but rather from a need to celebrate purification. Frazer did correct his views in Balder the Beautiful, but it does leave one contemplating whether sympathetic magic served as Frazer’s knee-jerk go-to point in his noble quest to reconcile several folkloric strands.

Still, one cannot disavow the conclusion that much of our behavior is similarly ceremonial across cultures, which would indeed suggest a common source. Frazer managed one last volume, the Aftermath, in 1937, just four years before his death. While this volume is little more than a collection of B-sides, it does leave one wondering what Frazer would have made of Nuremberg rallies or even our current default mode of walking like zombies in the streets, heads down, eyes bulging at the prospect of another chapter in a Snapchat story. The gods and the sympathetic magic may be a tad more secular these days, but we still remain seduced. Myths and stories and rituals are as old as the Chauvet Cave paintings. One cannot imagine being human without them.

Next Up: Annie Dillard’s Pilgrim at Tinker Creek!

Shadow and Act (Modern Library Nonfiction #91)

(This is the tenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Power Broker.)

When I first made my bold belly flop into the crisp waters of Ralph Ellison’s deep pool earlier this year, I felt instantly dismayed that it would be a good decade before I could perform thoughtful freestyle in response to his masterpiece Invisible Man (ML Fiction #19). As far as I’m concerned, that novel’s vivid imagery, beginning with its fierce and intensely revealing Battle Royal scene and culminating in its harrowing entrapment of the unnamed narrator, stands toe-to-toe with Saul Bellow’s The Adventures of Augie March as one of the most compelling panoramas of mid-20th century American life ever put to print, albeit one presented through a more hyperreal lens.

But many of today’s leading writers, ranging from Ta-Nehisi Coates to Jacqueline Woodson, have looked more to James Baldwin as their truth-telling cicerone, that fearless sage whose indisputably hypnotic energy was abundant enough to help any contemporary humanist grapple with the nightmarish realities that America continues to sweep under its bright plush neoliberal rug. At a cursory glance, largely because Ellison’s emphasis was more on culture than overt politics, it’s easy to see Ellison as a complacent “Maybe I’m Amazed” to Baldwin’s gritty “Cold Turkey,” especially when one considers the risk-averse conservatism which led to Ellison being attacked as an Uncle Tom during a 1968 panel at Grinnell College along with his selfish refusal to help emerging African-American authors after his success. But according to biographer Arnold Rampersad, Baldwin believed Ralph Ellison to be the angriest person he knew. And if one dives into Ellison’s actual words, Shadow and Act proves an essential volume, one that includes one of the most thrilling Molotov cocktails ever pitched into the face of a clueless literary critic and that is often just as potent and as lapel-grabbing as Baldwin’s The Fire Next Time.

For it would seem that while Negroes have been undergoing a process of “Americanization” from a time preceding the birth of this nation — including the fusing of their blood lines with other non-African strains, there has been a stubborn confusion as to their American identity. Somehow it was assumed that the Negroes, of all the diverse American peoples, would remain unaffected by the climate, the weather, the political circumstances — from which not even slaves were exempt — the social structures, the national manners, the modes of production and the tides of the market, the national ideals, the conflicts of values, the rising and falling of national morale, or the complex give and take of acculturalization which was undergone by all others who found their existence within the American democracy.

This passage, taken from an Ellison essay on Amiri Baraka’s Blues People, is not altogether different from Baldwin’s view of America as “a paranoid color wheel” in The Evidence of Things Not Seen, where Baldwin posited that a retreat into the bigoted mystique of Southern pride represented the ultimate denial of “Americanization” and thus African-American identity. Yet the common experiences that cut across racial lines, recently investigated with comic perspicacity on a “Black Jeopardy” Saturday Night Live sketch, may very well be a humanizing force to counter the despicable hate and madness that inspires uneducated white males to desecrate a Mississippi black church or a vicious demagogue to call one of his supporters “a thug” for having the temerity to ask him to be more respectful and inclusive.

Ellison, however, was too smart and too wide of a reader to confine these sins of dehumanization to their obvious targets. Like Baldwin and Coates and Richard Wright, Ellison looked to France for answers and, while never actually residing there, he certainly counted André Malraux and Paul Valéry among his hard influences. In writing about Richard Wright’s Black Boy, Ellison wisely singled out critics who failed to consider the full extent of African-American humanity even as they simultaneously demanded an on-the-nose and unambiguous “explanation” of who Wright was. (And it’s worth noting that Ellison himself, who was given his first professional writing gig by Wright, was also just as critical of Wright’s ideological propositions as Baldwin.) Ellison described how “the prevailing mood of American criticism has so thoroughly excluded the Negro that it fails to recognize some of the most basic tenets of Western democratic thought when encountering them in a black skin” and deservedly excoriated whites for seeing Paul Robeson and Marian Anderson merely as the ne plus ultra of African-American artistic innovation rather than the beginning of a great movement.

At issue, in Ellison’s time and today, is the degree to which any individual voice is allowed to express itself. And Ellison rightly resented any force that would stifle this, whether it be the lingering dregs of Southern slavery telling the African-American how he must act or who he must be in telling his story or the myopic critics who would gainsay any voice by way of their boxlike assumptions about other Americans. One sees this unthinking lurch towards authoritarianism today with such white supremacists as Jonathan Franzen, Lionel Shriver, and the many Brooklyn novelists who, despite setting their works in gentrified neighborhoods still prominently populated by African-Americans, fail to include, much less humanize, black people who still live there.

“White supremacist” may seem like a harshly provocative label for any bumbling white writer who lacks the democratic bonhomie to leave the house and talk with other people and consider that those who do not share his skin color may indeed share more common experience than presumed. But if these writers are going to boast about how their narratives allegedly tell the truth about America while refusing to accept challenge for their gaping holes and denying the complexity of vital people who make up this great nation, then it seems apposite to bring a loaded gun to a knife fight. If we accept Ellison’s view of race as “an irrational sea in which Americans flounder like convoyed ships in a gale,” then it is clear that these egotistical, self-appointed seers are buckling on damaged vessels hewing to shambling sea routes mapped out by blustering navigators basking in white privilege, hitting murky ports festooned with ancient barnacles that they adamantly refuse to remove.

Franzen, despite growing up in a city in which half the population is African-American, recently told Slate‘s Isaac Chotiner that he could not countenance writing about other races because he has not loved them or gone out of his way to know them and thus excludes non-white characters from his massive and increasingly mediocre novels. Shriver wrote a novel, The Mandibles, in which the only black characters are (1) Luella, bound to a chair and walked with a leash, and (2) Selma, who speaks in a racist Mammy patois (“I love the pitcher of all them rich folk having to cough they big piles of gold”). She then had the effrontery to deliver a keynote speech at the Brisbane Writers Festival arguing for the right to “try on other people’s hats,” failing to understand that creating dimensional characters involves a great deal more than playing dress-up at the country club. She quoted from a Margot Kaminski review of Chris Cleave’s Little Bee that offered the perfectly reasonable consideration, one that doesn’t deny an author’s right to cross boundaries, that an author may wish to take “special care…with a story that’s not implicitly yours to tell.” Such forethought clearly means constructing an identity that is more human than crassly archetypal, an eminently pragmatic consideration of how any work of contemporary art should probably reflect the many identities that make up our world. But for Shriver, a character should be manipulated at an author’s whim, even if her creative vagaries represent an impoverishment of imagination. For Shriver, inserting another nonwhite, non-heteronormative character into The Mandibles represented “issues that might distract from my central subject matter of apocalyptic economics.” Which brings us back to Ellison’s question of “Americanization” and how “the diverse American peoples” are indeed regularly affected by the decisions of those who uphold the status quo, whether overtly or covertly.

Writer Maxine Beneba Clarke bravely confronted Shriver with the full monty of this dismissive racism and Shriver responded, “When I come to your country. I expect. To be treated. With hospitality.” And with that vile and shrill answer, devoid of humanity and humility, Shriver exposed the upright incomprehension of her position, stepping from behind the arras as a kind of literary Jan Smuts for the 21st century.3

If this current state of affairs represents a bristling example of Giambattista Vico’s corsi e ricorsi, and I believe it does, then Ellison’s essay, “Twentieth-Century Fiction and the Black Mask of Humanity,” astutely demonstrates how this cultural amaurosis went down before, with 20th century authors willfully misreading Mark Twain, failing to see that Huck’s release of Jim represented a moment that not only involved recognizing Jim as a human being, but admitting “the evil implicit in his ’emancipation'” as well as Twain accepting “his personal responsibility in the condition of society.” With great creative power comes great creative responsibility. Ellison points to Ernest Hemingway scouring The Adventures of Huckleberry Finn merely for its technical accomplishments rather than its moral candor and how William Faulkner, despite being “the greatest artist the South has produced,” may not have been quite the all-encompassing oracle, given that The Unvanquished‘s Ringo is, despite his loyalty, devoid of humanity. In another essay on Stephen Crane, Ellison reaffirms that great art involves “the cost of moral perception, of achieving an informed sense of life, in a universe which is essentially hostile to man and in which skill and courage and loyalty are virtues which help in the struggle but by no means exempt us from the necessary plunge into the storm-sea-war of experience.” And in the essays on music that form the book’s second section (“Sound and the Mainstream”), Ellison cements this ethos with his personal experience growing up in the South. If literature might help us to confront the complexities of moral perception, then the lyrical, floating tones of a majestic singer or a distinctive cat shredding eloquently on an axe might aid us in expressing it. And that quest for authentic expression is forever in conflict with audience assumptions, as seen with such powerful figures as Charlie Parker, whom Ellison describes as “a sacrificial figure whose struggles against personal chaos…served as entertainment for a ravenous, sensation-starved, culturally disoriented public which had but the slightest notion of its real significance.”

What makes Ellison’s demands for inclusive identity quite sophisticated is the vital component of admitting one’s own complicity, an act well beyond the superficial expression of easily forgotten shame or white guilt that none of the 20th or the 21st century writers identified here have had the guts to push past. And Ellison wasn’t just a writer who pointed fingers. He held himself just as accountable, as seen in a terrific 1985 essay called “An Extravagance of Laughter” (not included in Shadow and Act, but found in Going to the Territory), in which Ellison writes about how he went to the theatre to see Jack Kirkland’s adaptation of Erskine Caldwell’s Tobacco Road. (I wrote about Tobacco Road in 2011 as part of this series and praised the way that this still volatile novel pushes its audience to confront its own prejudices against the impoverished through remarkably flamboyant characters.) Upon seeing wanton animal passion among poor whites on the stage, Ellison burst into an uncontrollable paroxysm of laughter, which emerged as he was still negotiating the rituals of New York life shortly after arriving from the South. Ellison compared his reaction, which provoked outraged leers from the largely white audience, to an informal social ceremony he observed while he was a student at Tuskegee that involved a set of enormous whitewashed barrels labeled FOR COLORED placed in public space. If an African-American felt an overwhelming desire to laugh, he would thrust his head into the pit of the barrel and do so. Ellison observes that these were African-Americans “who in light of their social status and past condition of servitude were regarded as having absolutely nothing in their daily experience which could possibly inspire rational laughter.” And the expression of this inherently human quality, despite being a cathartic part of reckoning with identity and one’s position in the world, was nevertheless positioned out of sight and thus out of mind.

When I took an improv class at UCB earlier this year, I had an instructor who offered rather austere prohibitions against any strain of humor considered “too dark” or “punching down,” which would effectively disqualify both Tobacco Road and the Tuskegee barrel ritual that Ellison describes.2 These restrictions greatly frustrated me and a few of my classmates, who didn’t necessarily see the exploration of edgy comic terrain as a default choice, but merely one part of asserting an identity inclusive of many perspectives. I challenged the notion of confining behavior to obvious choices and ended up getting a phone call from the registrar, who was a smart and genial man and with whom I ended up having a friendly and thoughtful volley about comedy. I had apparently been ratted out by one student, who claimed that I was “disrupting” the class when I was merely inquiring about my own complicity in establishing base reality. In my efforts to further clarify my position, I sent a lengthy email to the instructor, one referencing “An Extravagance of Laughter,” and pointed out that delving into the uncomfortable was a vital part of reckoning with truth and ensuring that you grew your voice and evolved as an artist. I never received a reply. I can’t say that I blame him.

Ellison’s inquiry into the roots of how we find common ground with others suggests that we may be able to do so if we (a) acknowledge the completeness of other identities and (b) allow enough room for necessary catharsis and the acknowledgment of our feelings and our failings as we take baby steps towards better understanding each other.

The most blistering firebomb in the book is, of course, the infamous essay “The World and the Jug,” which demonstrates just what happens when you assume rather than take the time to know another person. It is a refreshingly uncoiled response that one could not imagine being published in this age of “No haters” reviewing policies and genial retreat from substantive subjects in today’s book review sections. Reacting to Irving Howe’s “Black Boys and Native Sons,” Ellison condemns Howe for not seeing “a human being but an abstract embodiment of living hell” and truly hammers home the need for all art to be considered on the basis of its human experience rather than the spectator’s constricting inferences. Howe’s great mistake was to view all African-American novels through the prism of a “protest novel” and this effectively revealed his own biases against what black writers had to say and very much for certain prerigged ideas that Howe expected them to say. “Must I be condemned because my sense of Negro life was quite different?” writes Ellison in response to Howe roping him in with Richard Wright and James Baldwin. And Ellison pours on the vinegar by not only observing how Howe self-plagiarized passages from previous reviews, but how his intractable ideology led him to defend the “old-fashioned” violence contained in Wright’s The Long Dream, which, whatever its merits, clearly did not keep current with the changing dialogue at the time.

Shadow and Act, with its inclusion of interviews and speeches and riffs on music (along with a sketch of a struggling mother), may be confused with a personal scrapbook. But it is, first and foremost, one man’s effort to assert his identity and his philosophy in the most cathartic and inclusive way possible. We still have much to learn from Ellison more than fifty years after these essays first appeared. And while I will always be galvanized by James Baldwin (who awaits our study in a few years), Ralph Ellison offers plentiful flagstones to face the present and the future.

SUPPLEMENT: One of the great mysteries that has bedeviled Ralph Ellison fans for decades is the identity of the critic who attacked Invisible Man as a “literary race riot.” In a Paris Review interview included in Shadow and Act, Ellison had this to say about the critic:

But there is one widely syndicated critical bankrupt who made liberal noises during the thirties and has been frightened ever since. He attacked my book as a “literary race riot.”

With the generous help of Ellison’s biographer Arnold Rampersad (who gave me an idea of where the quote might be found in an email volley) and the good people at the New York Public Library, I have tracked down the “widely syndicated critical bankrupt” in question.

His name is Sterling North, best known for the children’s novel Rascal in 1963. He wrote widely popular (and rightly forgotten) children’s books while writing book reviews for various newspapers. North was such a vanilla-minded man that he called comics “a poisonous mushroom growth” and seemed to have it in for any work of art that dared to do something different — or that didn’t involve treacly narratives about raising baby raccoons.

And then, in the April 16, 1952 issue of the New York World-Telegram, he belittled Ellison’s masterpiece, writing these words:

This is one of the most tragic and disturbing books I have ever read. For the most part brilliantly written and deeply sincere, it is, at the same time, bitter, violent and unbalanced. Except for a few closing pages in which the author tries to express something like a sane outlook on race relations, it is composed largely of such scenes of interracial strife that it achieves the effect of one continuous literary race riot. Ralph Ellison is a Negro with almost as much writing talent as Richard Wright. Like his embittered hero (known only as “I” throughout the book), Mr. Ellison received scholarships to help him through college, one from the State of Oklahoma which made possible three years at the Tuskegee Institute, and one from the Rosenwald Foundation.

If Mr. Ellison is as scornful and bitter about this sort of assistance as he lets his “hero” be, those who made the money available must wonder if it was well spent.

North’s remarkably condescending words offer an alarming view of the cultural oppression that Ellison was fighting against and serve as further justification for Ellison’s views in Shadow and Act. Aside from his gross mischaracterization of Ellison’s novel, there are North’s troubling assumptions that Ellison should be grateful in the manner of an obsequious and servile stereotype, that he only deserves a scholarship if he writes a novel that fits North’s limited idea of what African-American identity should be, and that future white benefactors should think twice about granting opportunities for future uppity Ellisons.

It’s doubtful that The Sterling North Society will recognize this calumny, but this is despicable racism by any measure. A dive into North’s past also reveals So Dear to My Heart, a 1948 film adaptation of North’s Midnight and Jeremiah that reveled in Uncle Tom representations of African-Americans.

North’s full review of Invisible Man can be read below:

[Image: scan of Sterling North’s review]

Next Up: James George Frazer’s The Golden Bough!

The American Political Tradition (Modern Library Nonfiction #93)

(This is the eighth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Contours of American History.)

Before he became famous for delineating “the paranoid style in American politics” and honing every principled bone against the feverish anti-intellectualism one now sees embodied in everything from long-standing philistine Dan Kois decrying “eating his cultural vegetables” to lunatic presidential candidate Ted Cruz declaring gluten-free meals a politically correct “social experiment,” historian Richard Hofstadter spent four years on a fiercely independent book that would go on to sell close to a million copies. The American Political Tradition was a Plutarchian overview of illustrious American figures ranging from vivacious abolitionist Wendell Phillips to Woodrow Wilson as closeted conservative. It was aimed at winning over a high-minded American public. Like William Appleman Williams, Hofstadter was very much following in Charles Beard’s footsteps, although this historian hoped to march to his own interpretive drum. Reacting to the toxic McCarthyism of the time, Hofstadter’s cautious defense of old school American liberalism, with the reluctant bulwark hoisted as he poked holes into the foibles of celebrated icons, saddled him with the label of “consensus historian.” With each subsequent volume (most notably The Age of Reform), Hofstadter drifted further away from anything close to a scorching critique of our Founders as hardliners enforcing their economic interests toward a more vociferous denouncement of agrarian Populists and numbnuts standing in the way of erudite democratic promise. Yet even as he turned more conservative in later years, Hofstadter insisted that his “assertion of consensus history in 1948 had its sources in the Marxism of the 1930s.”

Such adamantine labels really aren’t fair to Hofstadter’s achievements in The American Political Tradition. The book is by no means perfect, but its Leatherman Wave-like dissection of American history unfolds with some sharp and handy blades. While Hofstadter is strangely reluctant to out Andrew Jackson as a demagogue (“He became a favorite of the people, and might easily come to believe that the people chose well.”) and far too forgiving of John C. Calhoun, a rigid bloviator with a harsh voice who was one of slavery’s biggest cheerleaders and whose absolutist stance against tariffs under the guise of moderation would later inspire the South to consider secession as a legitimate nuclear option3, Hofstadter at his best slices with a necessary critical force into many hallowed patriarchs. For it is the sum of their variegated and contradictory parts that has caused some to view the American trajectory in Manichean terms.

One of the book’s standout chapters is Hofstadter’s shrewd analysis of Lincoln as an exceptionally formidable man who dialed down his egalitarian ardor to zero the meter for his calculated and very rapid political rise. In just four years, Lincoln advanced from an obscure attorney in Illinois to a prominent party leader in that same state’s House of Representatives. But Hofstadter cogently argues that Lincoln was far from the outspoken abolitionist who would later lay down some very strong words against those who would deny other people freedom. Lincoln not only kept his enemies closer than his friends, but he was exceptionally careful with his rhetoric, even though one eye-popping 1836 declaration proposed extending suffrage to women.2 Much as Franklin D. Roosevelt was very savvy about letting his political opponents make the first move before he acted, Lincoln used the Declaration of Independence’s very text as ammunition and inspiration for his justification for abolition, which came much later — Lincoln’s first public condemnation of slavery arrived when he was forty-five — than Lincoln’s many admirers are often willing to admit.

Hofstadter points out that Lincoln’s seeming contradiction between revolutionary politics and pragmatic interpretation of the law was not especially peculiar, but part of a nuts-and-bolts perpetuation of an ongoing political tradition, one that can be seen with Lincoln’s hard maneuvering with the 1851 conditional loan he issued to his stepbrother John D. Johnston. Lincoln’s famous House Divided speech was masterful rhetoric urging national reconciliation of the slavery issue, but he didn’t exactly go out of his way to out himself as an abolitionist. Hofstadter notes that in 1858, seemingly Honest Abe spoke in two entirely different manners about racial equality in Chicago and in Charleston (see the second paragraph of his first speech). Yet these observations not only illustrate Lincoln’s political genius, but invite parallels to Lyndon Johnson’s brilliant and equally contradictory engineering in passing the 1957 Civil Rights Act (perhaps best chronicled in a gripping 100-page section of Robert A. Caro’s excellent Master of the Senate). The American political tradition, which Hofstadter identifies as a continuity with capitalist democratic principles, is seen today with Hillary Clinton struggling against a young population hungry for progressive change unlikely to happen overnight, despite Bernie Sanders’s valiant plans and the immediate need to rectify corporate America’s viselike hold on the very democratic principles that have sustained this nation for more than two hundred years.

Yet this is the same tradition that has given us long years without a stabilizing central bank, the Trail of Tears, the Civil War, the Credit Mobilier scandal, robber barons, and Hoover’s unshakable faith that “prosperity was just around the corner,” among many other disgraces. Hofstadter is thankfully not above condemning laissez-faire absolutism, such as Grover Cleveland’s unrealistic assumption that “things must work out smoothly without government action, or the whole system, coherent enough in theory, would fall from the weakness of its premises” or the free silver campaign that propelled the bombastic William Jennings Bryan into an improbable presidential candidacy. On Bryan, Hofstadter describes his intellect as that of “a boy who never left home,” and one can see some of Bryan’s regrettable legacy in the red-faced fulminations of a certain overgrown boy who currently pledges to make America great again. A careless and clumsy figure like Bryan was the very antithesis of Lincoln. Bryan failed to see difficult political tasks through to their necessary end. He would adopt principles that he once decried. His well-meaning efforts amounted to practically nothing. Think of Bryan as Fargo‘s Jerry Lundegaard to Lincoln’s Joe Girard. Hofstadter suggests that “steadfast and self-confident intelligence,” perhaps more important than courage and sincerity, was the very quality that Bryan and this nation so desperately needed. Yet in writing about Teddy Roosevelt and pointing to the frequency of “manly” and “masterful” in his prose, Hofstadter implies that these “more perfect” personal qualities for the political tradition “easily became transformed into the imperial impulse.”

This is, at times, a very grumpy book. One almost bemoans the missed opportunity to enlist the late Andy Rooney to read aloud the audio version. But it is not without its optimism. Hofstadter places most of his faith in abolitionist agitator Wendell Phillips. But even after defending Phillips from numerous historical condemnations and pointing to Phillips’s “higher level of intellectual self-awareness,” Hofstadter sees the agitator as merely “the counterweight to sloth and indifference.” But Hofstadter, at this young stage of his career, isn’t quite willing to write off agitators. He does point to why Phillips was a necessary and influential force providing equilibrium:

But when a social crisis or revolutionary period at last matures, the sharp distinctions that govern the logical and doctrinaire mind of the agitator become at one with the realities, and he appears overnight to the people as a plausible and forceful thinker. The man who has maintained that all history is the history of class struggles and has appeared so wide of the mark in times of class collaboration may become a powerful leader when society is seething with unresolved class conflict; the man who has been valiantly demanding the abolition of slavery for thirty years may become a vital figure when emancipation makes its appearance as a burning issue of practical politics. Such was the experience of Wendell Phillips: although he never held office, he became one of the most influential Americans during the few years after the fall of Fort Sumter.

The question of whether you believe Hofstadter to be a consensus historian or not may depend on how much you believe that he viewed the American political tradition much like two Lazaruses forever duking it out for existence in the old Star Trek episode “The Alternative Factor.” He certainly sees a nation of political pragmatists and obdurate agitators caught in an eternal deadlock, which is not too far from the progressive historians who styled their interpretations on class conflict. But his fine eye for ferreting out the Burkean undertow within Woodrow Wilson’s putative liberalism or exposing how Hoover’s faith in unregulated business had him quivering with disbelief after Black Thursday suggests a historian who is interested in countering ideological bromides. Perhaps if Hofstadter had stretched some of his chapters across a massive book, his reputation as a consensus historian wouldn’t have been the subject of so many heated arguments among political wonks.

Fortunately, the next Modern Library essay in this series will investigate how one man shifted his politics to serve his own ends and reshaped a major metropolis through the iron will of his personality. That very long and very great book may be the key that turns the consensus lock. It will certainly tell us a lot more about political power.

Next Up: Robert A. Caro’s The Power Broker!

The Contours of American History (Modern Library Nonfiction #94)

(This is the seventh entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Promise of American Life.)


History is never the thrilling Zapcat powerboat ride it can and should be when we remain committed to oaring through the same exhausted legends about American exceptionalism and bogus democratic promise. Much as we may find new insights into human existence by tilting our canoes to the ripples contained within a storyteller’s cadences, so too may we discover more complete ways of contending with our historical contradictions through the viewpoint of a responsible revisionist armed with the facts and rejecting the hard establishment line.

The revisionist historian, that charming and sometimes infuriating rabble-rouser never to be confused with some creepy Holocaust denier flailing in a sea of empty Cheetos bags and crackpot pamphlets, often gets needlessly maligned in America. Before Annette Gordon-Reed offered conclusive evidence of Thomas Jefferson’s relationship with Sally Hemings (upheld by a 1998 DNA test), Fawn Brodie was attacked two decades earlier by vanilla-minded legacy holders for pushing beyond James Callender’s tawdry trolling, daring to suggest that there was good reason to believe that our much heralded champion of the rights of man had skeletons in his closet that were vital to understanding his philosophy. Brodie’s book, despite its psychobiographical failings, led to a reckoning with our myths and assumptions about the Sage of Monticello, one that continues to this very day with college students demanding the removal of Jefferson statues on campuses.

Provided that their efforts do not involve going out of their way to Bowdlerize troubling if incontrovertible parts of the story and the results are as expansive and as rigorous as those of their more timorous mainstream counterparts, revisionists are often vital reconcilers of the public record. It is the facile propagandist who ignores Rosa Parks’s radicalism to paint a roseate image of a meek and tired seamstress who refused to give up her seat on a bus (“small,” “delicate,” and “little,” as belittled by Bill Clinton in 2005) or who upholds the lie that Abner Doubleday created baseball.

In recent decades, many young students have ardently clutched their copies of Howard Zinn’s A People’s History of the United States with the taut adamantine grip of a Fallout 4 junkie reluctant to relinquish her controller. Zinn’s thoughtful volume has been vehemently denounced by some establishment historians who have questioned the perceived polemical emphasis on class conflict at the expense of other issues. But before Zinn, there was William Appleman Williams, a brash energetic troublemaker who was arguably a more rigorous scholar than Zinn and who was among the best and the boldest of the firebrand 20th century historians who emerged from a Charles Beard afterglow with ass to kick once the bubble gum supply ran out.

William Appleman Williams unpacked the economic motivations of American expansion and foreign policy in The Tragedy of American Diplomacy and broadened this scholarship further with The Contours of American History, a punchy volume examining how imperialism and liberalism became a sordid double stitch intertwined in the American quilt well before the Sons of Liberty spilled massive chests of desperately offloaded tea into Boston Harbor. Yet Williams’s often nimble analysis, riddled as it sometimes is with conceptual overreach, robustly articulates the ever-changing and contradictory American Weltanschauung that has motivated nearly every governmental decision since. He documents a worldview that started off with the relatively benign goal of creating and sustaining an economic nation that provided for everyone, but devolved under the autocratic yoke of Jacksonian democracy and Gilded Age greed to the corporate capitalist nightmare we are all trying to awaken from today. And because Williams’s challenge to the so-called “American experiment” was so unprecedented in the mid-20th century, this historian was tarnished, besmirched, and condemned by other putative progressives who might have enlarged their rigid notions of national identity if they had been more willing to dive into the subtle words and actions directing the unshakable financial impetus.

Williams was harassed by the House Committee on Un-American Activities, that despicably despotic body that ruined the lives of so many, with a demand to produce the unfinished Contours manuscript. The HUAC would order Williams to testify in Washington and then cancel the appearance by telegram once he’d hopped on a train to the Beltway. Even after he testified for ten minutes and the HUAC abandoned its witch hunt, the IRS harassed him in various forms for nearly twenty years. Williams was hounded by the neoliberalism critic Arthur Schlesinger, Jr., who dutifully condemned Williams as “pro-communist” to the American Historical Association’s president. Even as late as 2009, an academic called Williams an “idiot” before a Society of Historians of American Foreign Relations panel, decrying Williams’s approach to history as a crude retooling of Charles Beard’s infamous assault upon our Founding Fathers’ pecuniary predispositions.3

But Williams was far from a typical progressive. He was a registered Republican when he first came to Wisconsin. He voted for Nixon as the lesser evil in 1960. And even in Contours, he defended Herbert Hoover’s hands-off Depression era policies, seeing this as a necessary tactic to prevent property holders from creating a business-friendly fascism that could have had a more diabolical effect on our clime than the many Hoovervilles that had mushroomed across the nation. Williams argued that Hoover’s perceived failure to do anything represented a more active resistance against special interests than the Progressive Movement was willing to acknowledge or act upon at the time. And that’s the way this jazz-loving Midwestern historian rolled. As Williams was to write in a 1973 essay, the revisionist’s duty was to “see basic facts in a different way and as interconnected in new relationships. He is a sister and a brother to those who use old steel to make a zipper, as contrasted with those who add new elements to make a better steel.”

In my previous Modern Library essay, I castigated Herbert Croly for the historical developments that he could not see ahead of him, for erring too much in his perfervid belief in a central government and for diminishing the justifiable grievances of protesters. William Appleman Williams may very well represent the opposite problem: a historian who could see the implications of any action all too well, one who was willing to articulate any interpretation of the facts even if it meant being alienated by the jingoistic minds who needed to reconsider the other fateful historical trajectories upholding the status quo.

Williams’s highly specific examples very much allow him to sell us on his interpretation. In Tragedy, for example, Williams’s deductive prowess is in high gear when he examines how Woodrow Wilson’s March 1913 decision to refuse a government loan to China, one long coveted by American industrialists at the time (and later attempted privately), actually fell within the framework of the Open Door Policy. Many historians have interpreted Wilson’s pushback as a betrayal of American expansionism at the time, but Williams points to the lack of private capital available to fulfill the job as well as the possibility that any governmental loan, even one secured with the help of other financiers, may have been perceived as a very clear threat to neighboring Japan. The Open Door Policy, for all of its flaws and its needless sullying of China, was intended to provide a peacefully imperialist framework for a burgeoning American empire: a GATT or IMF before its time, though regrettably without much in the way of homegrown protest. (Rebellion would come later in Beijing with the May Fourth movement.) The ostensible goal was to strengthen China with fresh influxes of low-risk private capital so that it could withstand troublesome neighbors looking for a fight, even as the new obligations to American entrepreneurs forged hot rivulets of cash rolling back to the imperialist homeland. Wilson’s decision was, as discerned by Williams, a canny chesslike stratagem to avoid war and conflict, one that would keep China a servant to America’s riches. From the vantage point of the 21st century, this useful historical interpretation reveals Wilson to be a pioneer in the kind of venal and now all too commonplace globalization that morally bankrupt neoliberals like Thomas Friedman have no problem opening their old steel zippers for. Their free trade fantasies possess all the out-of-sight, out-of-mind justification of a revenge porn junkie ignoring another person’s real world humiliation for fleeting sociopathic pleasure.

It was with Contours that Williams blew the lid off the great American lie, exposing the American liberal’s failure to confront his own implication in much of the laissez nous faire madness. Williams traced the origins of our mercantilist approach to Anthony Ashley Cooper, the Earl of Shaftesbury. In the 17th century, Shaftesbury was a political figure who opposed harsh penalties and absolutist government. He stood up for the nonconformists and called for regular parliaments, and would go on to found and lead the early Whig party in the wake of the British Exclusion Crisis. While traveling to Oxford for treatment of a liver abscess, he hit it off with a young doctor by the name of John Locke. (There weren’t as many cafes back then as there are today. In the 1600s, you had to take whatever mingling opportunities you could get.) Locke, of course, would later have many ideas about the social contract, a scheme about inalienable natural rights that would eventually find its way into a number one ditty penned by Jefferson that would become known as the Declaration of Independence.

But there was a twist to this tale. As Williams points out, Locke’s ideas were a corruption of Shaftesbury’s more inclusive and democratic efforts. Where Shaftesbury was willing to rebel against the King to ensure that courts and alternative political parties were in place to prevent the government from becoming an absolute tyranny, even going to the trouble of building a coalition that extended across all classes to fight for these safeguards when not putting together the Habeas Corpus Act of 1679, it was Locke who limited Shaftesbury’s remarkably liberal contributions by undercutting individual rights. Locke believed that those who owned property were perfectly justified in protesting their government, for they were the ones who had entered into a social contract. But the rabble who didn’t own property could more or less buzz off.2 As Williams put it, “[I]ndividualism was a right and a liberty reserved to those who accepted a status quo defined by a certain set of natural truths agreed upon a majority. Within such a framework, and it is a far narrower set of limits than it appears at first glance, the natural laws of property and labor were deemed sufficient to guide men’s pursuit of happiness.”

Yet those who subscribed to these early mercantilist standards believed that this classically liberal idea of “corporate structure” involved a basic responsibility to provide for everyone. And the way of sustaining such a benevolent national juggernaut was through the establishment of an empire: a Pax Americana predicated upon the promise of a democracy promulgated by patriarchs who not so quietly believed that the people were incapable of it.3 Williams observes how the Quakers in Philadelphia, who opposed expansion and much of the onslaughts against Native Americans, were very much committed to noblesse oblige, setting up hospitals, education, and philanthropic endeavors to take care of everyone. But this generous spirit was no match for the free trade nabobs or the hard-hearted Calvinists who increasingly shifted such solicitude to the propertied class (one can easily imagine Alec Baldwin’s Glengarry Glen Ross “Always be closing” speech spouted by a Calvinist), leading the great theologian Jonathan Edwards to offer righteous pushback against “fraud and trickishness in trade.”

Against this backdrop, post-Revolutionary expansion and the Monroe Doctrine allowed mercantilism to transmute into an idea that was more about the grab than the munificent results, with visions of empire dancing in many heads. By the time Frederick Jackson Turner tendered his Frontier Thesis in 1893, mercantilism was no longer about providing for the commonweal, but about any “self-made man” looking out after his interests. Williams points to Chief Justice John Marshall’s efforts to enforce safeguards, such as his Gibbons v. Ogden decision regulating interstate commerce, against the monopolies that would come to dominate America near the turn of the century. Marshall’s immediate successor, Chief Justice Taney, expanded the flexibility of the Constitution’s Contract Clause with his 1837 Charles River Bridge v. Warren Bridge decision, permitting states to alter any contract as they saw fit. While Taney’s decision seemed to sound the death knell for monopolies, it was no match against the consolidated trusts that were to come with the railroads and the robber barons. Rather curiously, for all of his sharp observations about free trade and expansionist dangers during this time, Williams devotes little more than a paragraph to the 1836 closing of the Second Bank of the United States:

[Nicholas Biddle] did a better job than the directors of the Bank of England. Under his leadership the bank not only established a national system of credit balancing which assisted the west as much as the east, and probably more, but sought with considerable success to save smaller banks from their own inexperience and greed. It was ultimately his undoing, for what the militant advocates of laissez nous faire came to demand was help without responsibilities. In their minds, at any rate, that was the working definition of democratic freedom.

Talk about sweeping one of the greatest financial calamities in American history under the rug! I don’t want to get too deep into Andrew Jackson in this installment; I believe him to be nothing less than an abhorrent, reckless, and self-destructive maniac who claimed “liberalism” using the iron fist of tyranny. I shall preserve my apparently unquenchable ire for Old Hickory when I tackle Arthur Schlesinger, Jr.’s The Age of Jackson in a few years (Modern Library Nonfiction #36). But Jackson’s imperious and irresponsible battle with Biddle, complete with his Specie Circular, undoubtedly led to the Panic of 1837, in which interest rates spiked, the rich got richer, a fixable financial mess spiraled out of control and became needlessly dangerous, and buyers could not come up with the hard cash to invest in land. Considering Williams’s defense of Hoover in both Contours and Tragedy, it is extremely curious that he would shy away from analyzing why some form of central bank might be necessary to mitigate volatility, even though he adopted some fascinating counterpoints to the “too big to fail” theory decades before Bernanke and Krugman.

This oversight points to the biggest issue I have with Williams. His solution to the great imperialist predicament was democratic socialism, which he called “the only real frontier available to Americans in the second half of the 20th century.” While this is a clever way of inverting Turner’s thesis, Williams upholds it with only a few examples, such as the courage of Wendell Phillips, a few throwaway references to social property, and a late 19th century return with Edward Bellamy and Henry Demarest Lloyd to the Quaker-like notion of “a commonwealth in which men were brothers first and economic men second.” But while Williams is often a master of synthesis, he falls somewhat short in delineating how his many historical examples can aid us to correct our ongoing ills. If the American Weltanschauung is so steeped in our culture, how then can democratic socialism uproot it? This vital question remains at the root of any progressive-minded conversation. But now that we have a presidential race in which socialism is no longer a dirty word and the two leading Democratic candidates bicker over who is the greater progressive, perhaps the answer might arrive as naturally as Williams anticipated.

Next Up: Richard Hofstadter’s The American Political Tradition!

The Promise of American Life (Modern Library Nonfiction #95)

(This is the sixth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: In Cold Blood.)

Before The New Republic devolved under Chris Hughes into a half-worthy husk of knee-jerk platitudes just a few histrionic clickbait headlines shy of wily Slate reductionism, it was a formidable liberal magazine for many decades, courageous enough to take real stands while sustaining vital dialogue about how and when government should intercede in important affairs. The source of this philosophical thrust, as duly documented by Franklin Foer, was the greatly diffident son of a prominent newspaperman, an unlikely progenitor who entered and exited Harvard many times without ever finishing, someone who suffered from severe depression and who, for a time, didn’t know what to do with his life other than play bridge and tennis and write about obscure architecture. But Croly found it in him to spill his views about democracy’s potential, what he called the “New Nationalism,” into a 1909 book called The Promise of American Life, which served as something of a manifesto for the early 20th century Progressives and became a cult hit among political wonks at the time. It partially inspired Theodore Roosevelt, who was proudly name-checked by Croly as “a Hamiltonian with a difference,” to initiate his ill-fated 1912 Bull Moose campaign as an outsider presidential candidate. (Historians have argued over the palpable influence of Croly’s book on Roosevelt, but it’s possible that, had Croly not confirmed what Roosevelt had already been thinking about, Roosevelt might not have entered the 1912 race as ardently as he did. With a more united Republican coalition against Wilson, America may very well have carried on with a second Taft term, with an altogether different involvement in World War I. Taft’s notable rulings as Chief Justice of the Supreme Court, which included extending executive power and broadening the scope of police evidence, may not have been carried out in the 1920s. A book is often more of a Molotov shattering upon history’s turf than we are willing to accept.)

Croly’s book touched a nerve among a small passionate group. One couple ended up reading Croly’s book aloud to each other during their honeymoon (leaving this 21st century reader, comparing Croly’s thick “irremediable”-heavy prose style against now all too common sybaritic options, to imagine other important activities that this nubile pair may have missed out on). The newly married couple was Willard Straight and Dorothy Whitney. They had money. They invited Croly to lunch. The New Republic was formed.

So we are contending with a book that not only created an enduring magazine and possibly altered the course of American history, but one that had a profound impact on the right elite at the right time. So it was a tremendous surprise to discover a book that greatly infuriated me during the two times I read it, at one point causing me to hurl it with high indignant velocity against a wall, for reasons that have much to do with this gushing early 20th century idealist failing to foresee the rise of Nazism, the despicable marriage of racism and police brutality, growing income inequality, corporate oligarchy, draconian Common Core educational standards, and dangerous demagogues like George Wallace and Donald Trump.

But it is also important to remember that Croly wrote this book before radio, television, the Internet, women’s suffrage, two world wars, the Great Depression, smartphones, outrage culture, and 9/11. And it is never a good idea to read an older book, especially one of a political nature, without considering the time that it was written. I did my best to curb my instincts to loathe Croly for what he could not anticipate, for his larger questions of how power aligns itself with the democratic will of the people are still very much worth considering. Croly is quite right to identify the strange Frankenstein monster of Alexander Hamilton’s pragmatic central government and Thomas Jefferson’s rights of man — the uniquely American philosophical conflict that has been the basis of nearly every national conflict and problem that has followed — as a “double perversion” of our nation’s potential, even if Croly seems unwilling to consider that some “perversions” are necessary for an evolving democratic republic, and he is often too trusting of executive authority and the general public’s obeisance to it. That these inquiries still remain irreconcilable (and are perverted further still by crass politicians who bellow about how to “make America great again” as they eject those who challenge them from the room) some 107 years after the book’s publication speaks to both the necessity and the difficulty of the question.

I’ve juxtaposed Croly’s meek-looking law clerk mien against George Bellows’s famous boxing painting (unveiled two years before Croly’s book) because there really is no better way to visualize the American individual’s relationship to its lumbering, venal, and often futile government. Croly’s solution is to call for all Americans to be actively engaged in a collaborative and faithful relationship with the nation: “to accept a conception of democracy which provides for the substantial integrity of his country, not only as a nation with an exclusively democratic mission, but as a democracy with an essentially national career.” On its face, this seems like a reasonable proposition. We all wish to belong in a democracy, to maintain fidelity to our country, and to believe that the Lockean social contract in which the state provides for the commonweal is a workable and reasonable quid pro quo. But it is also the kind of orgiastic meat and potatoes mantra that led both Kennedy and Reagan to evoke mythical American exceptionalism with the infamous “shining city upon a hill” metaphor. Dulcet words may make us feel better about ourselves and our nation, but we have seen again and again how government inaction on guns and a minimum wage that does not reflect contemporary living standards demands a Black Lives Matter movement and a “fight for $15.” And when one begins to unpack just what Croly wants us to give up for this roseate and wholly unrealistic Faustian bargain, we begin to see someone who may be more of a thoughtful and naive grandstander than a vital conceptual pragmatist.

Croly is right to demand that America operate with a larger administrative organ in place, some highly efficient Hamiltonian body that mitigates “the evil effects of a loose union.” He smartly points out that such evils as slavery resulted from the American contradictions originating in the strange alliance between our poetic Jeffersonian call for Constitutional democracy and individualistic will and the many strains of populism and nationalism that followed. In his insistence on “the transformation of Hamiltonianism into a thoroughly democratic political principle,” Croly is suspicious of reformers, many of whom he singles out in a manner strikingly similar to Norman Mailer’s “Quick and Expensive Comments on the Talent in the Room.” He calls William Jennings Bryan an “ill conceived” reformer, claims the now nearly forgotten William Travers Jerome to be “lulled into repose” by traditional Jeffersonian democracy (never mind Jerome’s successful crusades against Tammany Hall corruption, regrettably overshadowed by his prosecution of Harry K. Thaw during the Stanford White murder trial), interestingly pegs William Randolph Hearst as someone motivated by endless “proclamation[s] of a rigorous interpretation of the principle of equal rights,” and holds up Teddy Roosevelt as “more novel and more radical” in his calls for a Square Deal than “he himself has probably proclaimed.”

But Croly’s position on reform is quite problematic, deeply unsettling, and often contradictory. He believes that citizens “should be permitted every opportunity to protest in the most vigorous and persistent manner,” yet he states that such protests “must conform to certain conditions” enforced by the state. While we are certainly far removed from the 1910 bombing of the Los Angeles Times building that galvanized the labor movement, as we saw with the appalling free speech cages during the 2004 Republican Convention, muzzling protesters not only attenuated their message but allowed the NYPD to set up traps for the activists, which ensured their arrest and detention — a prototype for the exorbitant enforcement used to diminish and belittle the Occupy Wall Street movement a few years later. Croly believes that the job of sustaining democratic promise should, oddly enough, be left to legislators and executives granted all the power required and sees state and municipal governments as largely unsuccessful:

The interest of individual liberty in relation to the organization of democracy demands simply that the individual officeholder should possess an amount of power and independence adequate to the efficient performance of his work. The work of a justice of the Supreme Court demands a power that is absolute for its own special work, and it demands technically complete independence. An executive should, as a rule, serve for a longer term, and hold a position of greater independence than a legislator, because his work of enforcing the laws and attending to the business details of government demands continuity, complete responsibility within its own sphere, and the necessity occasionally of braving adverse currents of public opinion. The term of service and the technical independence of a legislator might well be more restricted than that of an executive; but even a legislator should be granted as much power and independence as he may need for the official performance of his public duty. The American democracy has shown its enmity to individual political liberty, not because it has required its political favorites constantly to seek reëlection, but because it has since 1800 tended to refuse to its favorites during their official term as much power and independence as is needed for administrative, legislative, and judicial efficiency. It has been jealous of the power it delegated, and has tried to take away with one hand what it gave with the other.

There is no room for “Act locally, think globally” in Croly’s vision. This is especially ungenerous given the many successful progressive movements that flourished decades after Croly’s death, such as the civil rights movement beginning with local sit-ins and developing into a more cogent and less ragged strain of the destructive Jacksonian populism that Croly rightly calls out, especially in relation to the cavalier obliteration of the Second Bank of the United States and the Nullification Crisis of 1832, which required Henry Clay to clean up Jackson’s despotic absolutism with a compromise. On the Nullification point, Croly identifies Daniel Webster, a man who became treacherously committed to holding the Union together, as “the most eloquent and effective expositor of American nationalism,” who “taught American public opinion to consider the Union as the core and crown of the American political system,” even as he offers a beautifully stinging barb on Webster’s abolitionist betrayal with the 1850 speech endorsing the Fugitive Slave Act: “He was as much terrorized by the possible consequences of any candid and courageous dealing with the question as were the prosperous business men of the North; and his luminous intelligence shed no light upon a question, which evaded his Constitutional theories, terrified his will, and clouded the radiance of his patriotic visions.”

But Croly also promulgates a number of loopy schemes, including making representative legislatures at any level beholden to an executive who is armed with a near tyrannical ability to scuttle laws, even as he claims that voters removing representatives through referendum “will obtain and keep a much more complete and direct control over the making of their laws than that which they have exerted hitherto; and the possible desirability of the direct exercise of this function cannot be disputed by any loyal democrat.” Well, this loyal democrat, immediately summoning Lord Acton’s famous quote, calls bullshit on giving any two-bit boss that kind of absolute power. Because Croly’s baffling notion of “democracy” conjures up the terrifying image of a sea of hands raised in a Bellamy salute. On one hand, Croly believes that a democracy must secure and exercise individual rights, even as he rightly recognizes that, when people exercise these rights, they cultivate the “tendency to divide the community into divergent classes.” On the other hand, he believes that individuals should be kept on a restrictive leash:

[T]hey should not, so far as possible, be allowed to outlast their own utility. They must continue to be earned. It is power and opportunity enjoyed without being earned which help to damage the individual — both the individuals who benefit and the individuals who consent — and which tend to loosen the ultimate social bond. A democracy, no less than a monarchy or an aristocracy, must recognize political, economic, and social discriminations, but it must also manage to withdraw its consent whenever these discriminations show any tendency to excessive endurance. The essential wholeness of the community depends absolutely on the ceaseless creation of a political, economic, and social aristocracy and their equally incessant replacement.

There’s certainly something to be said about how many Americans fail to appreciate the rights that they have. Reminding all citizens of their duties to flex their individual rights may be a very sound idea. (Perhaps one solution to American indifference and political disillusion is the implementation of a compulsory voting policy with penalties, similar to what goes on in Australia.) But with a middling door prize like this handed out at the democratic dance party, why on earth would any individual want to subscribe to the American promise? Aristocrats, by their very nature, wish to hold onto their power and privilege and not let go. Croly’s pact is thus equally unappealing for the struggling individual living paycheck to paycheck, the career politician, or the business tycoon.

Moreover, in addition to opposing the Sherman Antitrust Act, Croly nearly succumbs to total Taylorism in his dismissal of labor unions: “They seek by the passage of eight-hour and prevailing rate-of-wages laws to give an official sanction to the claims of the unions, and they do so without making any attempt to promote the parallel public interest in an increasing efficiency of labor. But these eight-hour and other similar laws are frequently being declared unconstitutional by the state courts, and for the supposed benefit of individual liberty.” Granted, Croly’s words came seven years before the passage of the Adamson Act, the first federal law enforcing a mandatory eight-hour day. But Croly’s failure to see the social benefits of well-rested workers better positioned to exercise their individual liberty for a democratic promise is one of his more outrageous and myopic pronouncements, even as he also avers that the conditions that create unrestricted economic opportunities also spawn individual bondage. But if Croly wants Americans to “[keep] his flag flying at any personal cost or sacrifice,” then he really needs to have more sympathy for the travails of the working stiff.

Despite all my complaints, I still believe some 21st century thinker should pick up from Croly’s many points and make an equally ambitious attempt to harmonize Hamilton and Jefferson with more recent developments. American politics has transformed into a cartoonish nightmare from which we cannot seem to escape, one that causes tax absolutist lunatics like Grover Norquist to appear remotely sane. That we are seeing a strange replay of the 1912 election with the 2016 presidential race, with Trump stepping in as an unlikely Roosevelt and Bernie Sanders possibly filling in for Eugene Debs, and that so many Americans covet an “outsider” candidate who will fix a government that they perceive as a broken system speaks to a great need for some ambitious mind to reassess our history and the manner in which we belong to our nation, while also observing the many ways in which Americans come together well outside of the political bear trap. For the American individual is no longer boxing George Bellows-style with her government. She is now engaged in a vicious MMA match unfurling inside a steel cage. Whether this ugly pugilism can be tempered with peace and tolerance is anyone’s guess, but, if we really believe in democracy, the least we can do is try to find some workaround in which people feel once again that they’re part of the process.

Next Up: William Appleman Williams’s The Contours of American History!

In Cold Blood (Modern Library Nonfiction #96)

(This is the fifth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Journalist and the Murderer.)

Truman Capote was a feverish liar and a frenzied opportunist from the first moment his high voice pierced the walls of a literary elite eager to filth up its antimacassars with gossip. He used his looks to present himself as a child prodigy, famously photographed in languorous repose by Harold Halma to incite intrigue and controversy. He claimed to win national awards for his high school writing that no scholar has ever been able to turn up. He escorted the nearly blind James Thurber to his dalliances with secretaries and deliberately put on Thurber’s socks inside out so that his wife would notice, later boasting that one secretary was “the ugliest thing you’ve ever seen.” Biographer Gerald Clarke chronicled how Capote befriended New Yorker office manager Daise Terry, who was feared and disliked by many at the magazine, because he knew she could help him. (Capote’s tactics paid off. Terry gave him the easiest job on staff: copyboy in the art department.) If Capote wanted to know you, he wanted to use you. But the beginnings of a man willing to do just about anything to get ahead can be found in his early childhood.

Capote’s cousin Jennings Faulk Carter once described young Truman coming up with the idea of charging admission for a circus. Capote had heard a story in the local paper about a two-headed chicken. Lacking the creative talent to build a chicken himself, he enlisted Carter and Harper Lee for this faux poultry con. The two accomplices never saw any of the money. Decades later, Capote would escalate this tactic on a grander scale, earning millions of dollars and great renown for hoisting a literary big top over a small Kansas town after reading a 300 word item about a family murder in The New York Times. Harper Lee would be dragged into this carnival as well.

The tale of how two frightening men murdered four members of the Clutter family for a pittance and created a climate of fear in the surrounding rural area (and later the nation) is very familiar to nearly anyone who reads, buttressed by the gritty 1967 film (featuring a pre-The Walking Dead Scott Wilson as Dick Hickock and a pre-Bonnie Lee Bakley murder Robert Blake as Perry Smith) and a deservedly acclaimed 2005 film featuring the late great Philip Seymour Hoffman as Capote. But what is not so often discussed is the rather flimsy foundation on which this “masterpiece” has been built.

Years before “based on a true story” became a risible cliche, Capote and his publicists framed In Cold Blood‘s authenticity around Capote’s purported accuracy. Yet the book itself contains many gaping holes in which we have only Smith and Hickock’s words, twisted further by Capote. What are we to make of Bill and Johnny — a boy and his grandfather whom Smith and Hickock pick up for a roadside soda bottle-collecting adventure to make a few bucks? In our modern age, we would demand that a competent journalist track down these two side characters and compare their accounts with those of Smith and Hickock. Capote claims that these two had once lived with the boy’s aunt on a farm near Shreveport, Louisiana, yet no independent party appears to have corroborated their identities. Did Capote (or Hickock and Smith) make them up? Does the episode really contribute to our understanding of the killers’ pathology? One doesn’t need to be aware of a recent DNA test that disproved Hickock and Smith’s involvement with the quadruple murder of the Walker family in Sarasota County, Florida, taking place one month after the Clutter murders, to see that Capote is more interested in holding up the funhouse mirror to impart specious complicity:

Hickock consented to take the [polygraph] test and so did Smith, who told Kansas authorities, “I remarked at the time, I said to Dick, I’ll bet whoever did this must be somebody that read about what happened out here in Kansas. A nut.” The results of the test, to the dismay of Osprey’s sheriff as well as Alvin Dewey, who does not believe in exceptional circumstances, were decisively negative.

Never mind that polygraph tests are inaccurate. It isn’t so much Hickock and Smith’s motivations that Capote was interested in. He was more concerned with stretching out a sense of amorphous terror on a wide canvas. As Hickock and Smith wait to be hanged, they encounter Lowell Lee Andrews in the adjacent cell. He is a fiercely intelligent, corpulent eighteen-year-old boy who fulfilled his dormant dreams of murdering his family, but Capote’s portrait leaves little room for subtlety:

For the secret Lowell Lee, the one concealed inside the shy churchgoing biology student, fancied himself an ice-hearted master criminal: he wanted to wear gangsterish silk shirts and drive scarlet sports cars; he wanted to be recognized as no mere bespectacled, bookish, overweight, virginal schoolboy; and while he did not dislike any member of his family, at least not consciously, murdering them seemed the swiftest, most sensible way of implementing the fantasies that possessed him.

We have modifiers (“shy,” “ice-hearted,” “gangsterish,” “silk,” “scarlet,” “bespectacled,” “bookish,” “virginal,” “swiftest,” and “sensible”) that conjure up a fantasy atop the fantasy, that suggest relativism to the two main heavies, but there is little room for subtlety or for any doubt in the reader’s mind. Capote does bring up the fact that Andrews suffered from schizophrenia, but diminishes this mental illness by calling it “simple” before dredging up the M’Naghten Rule, which was devised in 1843 (well before modern psychiatry existed and predicated upon a 19th century standard, yet still on the books) to exclude any insanity defense whereby the accused recognizes right from wrong. But he has already tarnished Andrews with testimony from Dr. Joseph Satten: “He considered himself the only important, only significant person in the world. And in his own seclusive world it seemed to him just as right to kill his mother as to kill an animal or a fly.” I certainly don’t want to defend Andrews’s crime (much less the Clutter family murders), but this conveniently pat assessment does ignore more difficult and far more interesting questions that Capote lacks the coherence, the empathy, or the candor to pursue, much less the willingness to confess his own contradictions. Many pages before, in relation to Hickock, Capote calls M’Naghten “a formula quite color-blind to any gradations between black and white.” In other words, Capote is the worst kind of journalist: a cherry-picking sensationalist who applies standards as he sees fit, heavily steering the reader’s opinion even as he feigns objectivity. The ethical reader of In Cold Blood in the 21st century wants Katherine Boo to emerge from the future through a wormhole, if only to open up a can of whoopass on Capote for these egregious and often thoughtless indiscretions.

Capote’s decision to remove himself from the crisp, lurid story was commended by many during In Cold Blood‘s immediate reception as a feat of unparalleled objectivity, with the “nonfiction novel” label sticking to the book like a trendy hashtag that hipsters refuse to surrender, but I think Cynthia Ozick described the thorny predicament best in her infamous drive-by on Capote (collected in Art & Ardor): “Essence without existence; to achieve the alp of truth without the risk of the footing.” If we accept any novel — whether “nonfiction” or fully imaginative — as some sinister or benign cousin to the essay, as a reasonably honest attempt to reckon with the human experience through invention, then In Cold Blood is a failure: the work of a man who sat idly in his tony Manhattan spread with cadged notebooks and total recall of aggressively acquired conversations even as his murderous subjects begged their “friend” to help them escape the hangman’s noose.

In 2013, Slate‘s Ben Yagoda described numerous factual indiscretions, revealing that editor William Shawn had penciled in “How know?” on the New Yorker galley proofs of Capote’s four part opus (In Cold Blood first appeared in magazine form). That same year, the Wall Street Journal uncovered new evidence from the Kansas Bureau of Investigation, which revealed that the KBI did not, upon receiving intelligence from informant Floyd Wells, swiftly dispatch agent Harold Nye to the farmhouse where Richard Hickock had lodged. (“It was as though some visitor were expected,” writes Capote. Expected by Hickock’s father or an author conveniently tampering with his narrative like a subway commuter feverishly filling in a sudoku puzzle?) As Jack de Bellis has observed, Capote’s revisions from New Yorker articles to book form revealed Capote’s feeble command of time, directions, and even specific places. But de Bellis’s examination revealed more descriptive imprudence, such as Capote shifting a line on how Perry “couldn’t stand” another prisoner to “could have boiled him in oil” (“How know?” we can ask today), along with many efforts to coarsen the language and tweak punctuation for a sensationalist audience.

And then there is the propped-up hero Alvin Dewey, presented by Capote as a tireless investigator who consumes almost nothing but coffee and who loses twenty pounds: a police procedural stereotype if ever there was one. Dewey disputed the claim that he closed his eyes during the execution, and the closing scene of Dewey meeting Nancy Clutter’s best friend, Susan Kidwell, in a cemetery is not only invented, but heavily mimics the belabored ending of Capote’s 1951 novel, The Grass Harp. But then “Foxy” Dewey and Capote were tighter than a pair of frisky lovers holed up for a week in a seedy motel.

Capote was not only granted unprecedented access to internal documents, but his papers reveal that Dewey provided Capote with stage directions in the police interview transcripts. (One such annotation reads “Perry turns white. Looked at the ceiling. Swallows.”) There is also the highly suspect payola of Columbia Pictures offering Dewey’s wife a job as a consultant on the 1965 film for a fairly substantial fee. Harold Nye, another investigator whose contributions have been smudged out of the history, told Charles J. Shields in a December 30, 2002 interview (quoted in Mockingbird), “I really got upset when I know that Al [Dewey] gave them a full set of the reports. That was like committing the largest sin there was, because the bureau absolutely would not stand for that at all. If it would have been found out, he would have been discharged immediately from the bureau.”

In fact, Harold Nye and other KBI agents did much of the footwork that Capote attributes to Dewey. Nye was so incensed by Capote’s prevarications that he read 115 pages of In Cold Blood before hurling the book across the living room. And in the last few years, the Nye family has been fighting to reveal the details inside two tattered notebooks that contain revelations about the Clutter killings that may drastically challenge Capote’s narrative.

Yet even before this, Capote’s magnum opus was up for debate. In June 1966, Esquire published an article by Phillip K. Tompkins challenging Capote’s alleged objectivity. Tompkins journeyed to Kansas and discovered that Nancy Clutter’s boyfriend was hardly the ace athlete (“And now, after helping clear the dining table of all its holiday dishes, that was what he decided to do — put on a sweatshirt and go for a run.”) that Capote presented him as, that Nancy’s horse was sold for a higher sum to the father of the local postmaster rather than “a Mennonite farmer who said he might use her for plowing,” and that the undersheriff’s wife disputed Capote’s account:

During our telephone conversation, Mrs. Meier repeatedly told me that she never heard Perry cry; that on the day in question she was in her bedroom, not the kitchen; that she did not turn on the radio to drown out the sound of crying; that she did not hold Perry’s hand; that she did not hear Perry say, ‘I’m embraced by shame.’ And finally – that she had never told such things to Capote. Ms. Meier told me repeatedly and firmly, in her gentle way, that these things were not true.

(For more on Capote’s libertine liberties, see Chapter 4 of Ralph F. Voss’s Truman Capote and the Legacy of In Cold Blood.)

Confronted by these many disgraceful distortions, we are left to ignore the “journalist” and assess the execution. On a strictly showboating criterion, In Cold Blood succeeds and captures our imagination, even if one feels compelled to take a cold shower knowing that Capote’s factual indiscretions were committed with a blatant disregard for the truth, not unlike two psychopaths murdering a family because they believed the Clutters possessed a safe brimming with riches. One admires the way that Capote describes newsmen “[slapping] frozen ears with ungloved, freezing hands,” even as one winces at the way Capote plays into patriarchal shorthand when Nye “visits” Barbara Johnson (Perry Smith’s only surviving sister: the other two committed suicide), describing her father as a “real man” who had once “survived a winter alone in the Alaskan wilderness.” The strained metaphor of two gray tomcats — “thin, dirty strays with strange and clever habits” — wandering around Garden City during the Smith-Hickock trial allows Capote to pad out his narrative after he has exhausted his supply of “flat,” “dull,” “dusty,” “austere,” and “stark” to describe Kansas in the manner of some sheltered socialite referencing the “flyover states.” Yet for all these cliches, In Cold Blood contains an inexplicably hypnotic allure, a hold upon our attention even as the book remains aggressively committed to the facile conclusion that the world is populated by people capable of murdering a family over an amount somewhere “between forty and fifty dollars.” As Jimmy Breslin put it (quoted in M. Thomas Inge’s Conversations with Truman Capote), “This Capote steps in with flat, objective, terrible realism. And suddenly there is nothing else you want to read.”

That the book endures — and is even being adapted into a forthcoming “miniseries event” by playwright Kevin Hood — speaks to an incurable gossipy strain in Western culture, one reinforced by the recent success of the podcast Serial and the television series The Jinx. It isn’t so much the facts that fuel our preoccupation with true crime as the sense that we are vicariously aligned with the fallible journalist pursuing the story, whom we can entrust to dig up scandalous dirt as we crack open our peanuts waiting for the next act. If the investigator is honest about her inadequacies, as Serial’s Sarah Koenig most certainly was, the results can provide breathtaking insight into the manner in which we incriminate other people with our emotional assumptions, our fallible memories, and our superficially examined evidence. But if the “journalist” removes himself from culpability, presenting himself as some demigod beyond question or reproach (Capote’s varying percentages of total recall certainly feel like some newly wrangled whiz kid bragging about his chops before the knowledge bowl), then the author is not so much a sensitive artist seeking new ways inside the darkest realm of humanity as a crude huckster occupying an outsize stage, waiting to grab his lucrative check and attentive accolades while the real victims of devastation weep over concerns that are far more human and far more deserving of our attention. We can remove the elephants from the lineup, but the circus train still rolls on.

Next Up: The Promise of American Life by Herbert Croly!

The Journalist and the Murderer (Modern Library Nonfiction #97)

(This is the fourth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Taming of Chance.)

One of the mistakes often made by those who immerse themselves in Janet Malcolm’s The Journalist and the Murderer is believing that Jeffrey MacDonald’s guilt or innocence is what matters most. But Malcolm is really exploring how journalistic opportunity and impetuous judgment can lead any figure to be roundly condemned in the court of public opinion. Malcolm’s book was written before the Internet blew apart much of the edifice separating advertising and editorial with native advertising and sponsored articles, but this ongoing ethical dilemma matters ever more in our age of social media and citizen journalism, especially when Spike Lee impulsively tweets the wrong address of George Zimmerman (and gets sued because of the resultant harassment) and The New York Post publishes a front page cover of two innocent men (also resulting in a lawsuit) because Reddit happened to believe they were responsible for the 2013 Boston Marathon bombing.

Yet it is important to approach anything concerning the Jeffrey MacDonald murder case with caution. It has caused at least one documentary filmmaker to go slightly mad. It is an evidential involution that can ensnare even the most disciplined mind, a permanently gravid geyser gushing out books and arguments and arguments about books, with more holes within the relentlessly regenerating mass than the finest mound of Jarlsberg. But here are the underlying facts:

On February 17, 1970, Jeffrey MacDonald reported a stabbing to the military police. Four officers found MacDonald’s wife, Colette, and their two children, Kimberley and Kristen, all dead in their respective bedrooms. MacDonald went to trial and was found guilty of one count of first-degree murder and two counts of second-degree murder. He was sentenced to three life sentences. Only two months before this conviction, MacDonald hired the journalist Joe McGinniss — the author of The Selling of the President 1968, then looking for a comeback — to write a book about the case, under the theory that any money generated by MacDonald’s percentage could be used to seed a defense fund. MacDonald placed total trust in McGinniss, opening the locks to all his papers and letting him stay in his condominium. McGinniss’s book, Fatal Vision, was published in the spring of 1983. It was a bestseller and spawned a popular television miniseries, largely because MacDonald was portrayed as a narcissist and a sociopath, fitting the entertainment needs of a bloodthirsty public. MacDonald didn’t know the full extent of this depiction. Indeed, as he was sitting in jail, McGinniss refused to send him a galley or an advance copy. (“At no time was there ever any understanding that you would be given an advance look at the book six months prior to publication,” wrote McGinniss to MacDonald on February 16, 1983. “As Joe Wambaugh told you in 1975, with him you would not even see a copy before it was published. Same with me. Same with any principled and responsible author.” Malcolm chronicles the “principled and responsible” conduct of McGinniss quite well, which includes speaking with MacDonald in misleading and ingratiating tones, often pretending to be a friend — anything to get MacDonald to talk.)


On 60 Minutes, roughly around the time of the book’s publication, Mike Wallace revealed to MacDonald what McGinniss was up to:

Mike Wallace (narrating): Even government prosecutors couldn’t come up with a motive or an explanation of how a man like MacDonald could have committed so brutal a crime. But Joe McGinniss thinks he’s found the key. New evidence he discovered after the trial. Evidence he has never discussed with MacDonald. A hitherto unrevealed account by the doctor himself of his activities in the period just before the murders.

Joe McGinniss: In his own handwriting, in notes prepared for his own attorneys, he goes into great detail about his consumption of a drug called Eskatrol, which is no longer on the market. It was voluntarily withdrawn in 1980 because of dangerous side effects. Among the side effects of this drug are, when taken to excess by susceptible individuals, temporary psychosis, often manifested as a rage reaction. Here we have somebody under enormous pressure and he’s taking enough of this Eskatrol, enough amphetamines, so that by his own account, he’s lost 15 pounds in the three weeks leading up to the murders.

Wallace: Now wait. According to the note which I’ve seen, three to five Eskatrol he has taken. We don’t know if he’s taken it over a period of several weeks or if he’s taken three to five Eskatrol a day or a week or a month.

McGinniss: We do know that if you take three to five Eskatrol over a month, you’re not going to lose 15 pounds in doing so.

Jeffrey MacDonald: I never stated that to anyone and I did not in fact lose fifteen pounds. I also wasn’t taking Eskatrol.

Wallace (reading MacDonald’s note): “We ate dinner together at 5:45 PM. It is possible I had one diet pill at this time. I do not remember and do not think I had one. But it is possible. I had lost 12 to 15 pounds in the prior three to four weeks in the process, using three to five capsules of Eskatrol Spansule. I was also…”

MacDonald: Three to five capsules for the three weeks.

Wallace: According to this.

MacDonald: Right.

Wallace: According to this.

MacDonald: And that’s a possibility.

Wallace: Then why would you put down here that…that there was even a possibility?

MacDonald: These are notes given to an attorney, who has told me to bare my soul as to any possibility so we could always be prepared. So I…

Wallace: Mhm. But you’ve already told me that you didn’t lose 15 pounds in the three weeks prior…

MacDonald: I don’t think that I did.

Wallace: It’s in your notes. “I had lost 12-15 lbs. in the prior 3-4 weeks, in the process using 3-5 capsules of Eskatrol Spansules.” That’s speed. And compazine. To counteract the excitability of speed. “I was losing weight because I was working out with a boxing team and the coach told me to lose weight.” — 60 Minutes

One of McGinniss’s exclusive contentions was that MacDonald had murdered his family because he was high on Eskatrol. Or, as he wrote in Fatal Vision:

It is also fact that if Jeffrey MacDonald were taking three to five Eskatrol Spansules daily, he would have been consuming 75 mg. of dextroamphetamine — more than enough to precipitate an amphetamine psychosis.

Note the phrasing. Even though McGinniss does not know for a fact whether or not MacDonald took three to five Eskatrol (and MacDonald himself is also uncertain: both MacDonald and McGinniss prevaricate enough to invite the justifiably hot and bothered grilling of Mike Wallace), he establishes the possibility as factual — even though it is pure speculation. The prognostication becomes a varnished truth, one that wishes to prop up McGinniss’s melodramatic thesis.

* * *

Malcolm was sued for libel by Jeffrey Masson over her depiction of him in her book, In the Freud Archives. In The Journalist and the Murderer, she has called upon all journalists to feel “some compunction about the exploitative character of the journalist-subject relationship,” yet claims, in the book’s afterword, that her own separate lawsuit was not the driving force behind the book. Yet even Malcolm, a patient and painstaking practitioner, could not get every detail of MacDonald’s appearance on 60 Minutes right:

As Mike Wallace — who had received an advance copy of Fatal Vision without difficulty or a lecture — read out loud to MacDonald passages in which he was portrayed as a psychopathic killer, the camera recorded his look of shock and utter discomposure.

Wallace was reading MacDonald’s own notes to his attorney back to him, not McGinniss’s book. These were not McGinniss’s passages in which MacDonald was “portrayed as a psychopathic killer,” but passages from MacDonald’s own words that attempted to establish his Eskatrol use. Did Malcolm have, back in 1990, a transcript of the 60 Minutes segment that is now readily available online? Or is it possible that MacDonald’s notes to his attorney had fused so perfectly with McGinniss’s book that the two became indistinguishable?

This raises important questions over whether any journalist can ever get the facts entirely right, no matter how fair-minded the intentions. It is one thing to be the hero of one’s own story, but it is quite another to know that, even if she believes herself to be morally or factually in the clear, the journalist is doomed to twist the truth to serve her purposes.

It obviously helps to be transparent about one’s bias. At one point in The Journalist and the Murderer, Malcolm is forthright enough to confess that she is struck by MacDonald’s physical grace as he breaks off pieces of tiny powdered sugar doughnuts. This is the kind of observational detail often inserted in lengthy celebrity profiles to “humanize” a Hollywood actor uttering the same calcified boilerplate rattled off to every roundtable junketeer. But if such a flourish is fluid enough to apply to MacDonald, we are left to wonder how Malcolm’s personal connection interferes with her purported journalistic objectivity. In the same paragraph, Malcolm neatly notes the casual abuse MacDonald received in his mailbox after McGinniss’s book was published — in particular, a married couple who read Fatal Vision while on vacation and took the time to write a hateful letter while sunbathing at the Sheraton Waikiki Hotel. This casual cruelty illustrates how the reader can be just as complicit as the opportunistic journo in perpetuating an incomplete or slanted portrait.

The important conundrum that Malcolm imparts in her short and magnificently complicated volume is why we bother to read or write journalism at all if we know the game is rigged. The thorny morality can extend to biography (Malcolm’s The Silent Woman is another excellent book which sets forth the inherent and surprisingly cyclical bias in writing about Sylvia Plath). And even when the seasoned journalist is aware of ethical discrepancies, the judgmental pangs will still crop up. In “A Girl of the Zeitgeist” (contained in the marvelous collection, Forty-One False Starts), Malcolm confessed her own disappointment in how Ingrid Sischy failed to live up to her preconceptions as a bold and modern woman. Malcolm’s tendentiousness may very well be as incorrigible as McGinniss’s, but is it more forgivable because she’s open about it?

* * *

It can be difficult for Janet Malcolm’s most ardent advocates to detect the fine grains of empathy carefully lining the crisp and meticulous forms of her svelte and careful arguments, which are almost always sanded against venal opportunists. Malcolm’s responsive opponents, who have recently included Esquire’s Tom Junod, Errol Morris, and other middling men who are inexplicably intimidated by women who are smarter, have attempted to paint Malcolm as a hypocrite, an opportunist, and a self-loathing harpy of the first order. Junod wrote that “it’s clear to anyone who reads her work that very few journalists are more animated by malice than Janet Malcolm” and described her as “a self-hater whose work has managed to speak for the self-hatred” of journalism. Yet Junod cannot cite any examples of this self-hate and malice, save for the purported Henny Youngman-like sting of her one-liners (Malcolm is not James Wolcott; she is considerably more thoughtful and interesting) and for pointing out, in Iphigenia in Forest Hills, how trials “offer unique opportunities for journalistic heartlessness,” failing to observe that Malcolm also showed how words or evidence lifted out of context could be used to condemn or besmirch the innocent until proven guilty (while owning up to her own biases and her desire to interfere).

Malcolm is not as relentless as her generational peer Renata Adler, but she is just as refreshingly formidable. She is as thorough with her positions and almost as misunderstood. She has made many prominent enemies for her controversial positions — even fighting a ten-year legal battle against Jeffrey Masson over the authenticity of his quotations (dismissed initially by a federal judge in California on the grounds that there was an absence of malice). Adler was ousted from The New Yorker, but Malcolm was not. In the last few years, both have rightfully found renewed attention for their work among a new generation.

One origin for the anti-Malcolm assault is John Taylor’s 1989 New York Magazine article, “Holier than Thou,” which is perhaps singularly responsible for making it mandatory for any mention of The Journalist and the Murderer to include its infamous opening line: “Every journalist who is not too stupid or too full of himself to notice what is going on knows that what he does is morally indefensible.” Taylor excoriated Malcolm for betraying McGinniss as a subject, dredged up the Masson claims, and claimed that Malcolm used Masson much as McGinniss had used MacDonald. It does not occur to Taylor that Malcolm herself may be thoroughly familiar with what went down and that the two lengthy articles which became The Journalist and the Murderer might indeed be an attempt to reckon with the events that caused the fracas:

“Madame Bovary, c’est moi,” Flaubert said of his famous character. The characters of nonfiction, no less than those of fiction, derive from the writer’s most idiosyncratic desires and deepest anxieties; they are what the writer wishes he was and worries that he is. Masson, c’est moi.

Similarly, Evan Hughes had difficulty grappling with this idea, caviling over the “bizarre stance” of Malcolm not wanting to be “oppressed by the mountain of documents that formed in my office.” He falsely infers that Malcolm has claimed that “it is pointless to learn the facts to try to get to the bottom of a crime,” not parsing Malcolm’s clear distinction between evidence and the journalist’s ineluctable need to realize characters on the page. No matter how faithfully the journalist sticks with the facts, a journalistic subject becomes a character because the narrative exigencies demand it. Errol Morris can find Malcolm’s stance “disturbing and problematic” as much as he likes, but he is the one who violated the journalistic taboo of paying subjects for his 2008 film, Standard Operating Procedure, without full disclosure. One of Morris’s documentary subjects, Joyce McKinney, claimed that she was tricked into giving an interview for what became Tabloid, alleging that one of Morris’s co-producers broke into her home with a release form. Years before Morris proved triumphant in an appellate court, he tweeted his objections.

The notion of something “unvarnished” attached to a personal account may have originated with Shakespeare:

And therefore little shall I grace my cause
In speaking for myself. Yet, by your gracious patience,
I will a round unvarnished tale deliver
Of my whole course of love. What drugs, what charms,
What conjuration and what mighty magic—
For such proceeding I am charged withal—
I won his daughter.
— Othello, Act 1, Scene 3

Othello hoped that in telling “a round unvarnished tale,” he would be able to come clean with Brabantio over why he had eloped with the senator’s daughter Desdemona. He wishes to be straightforward. It’s an extremely honorable and heartfelt gesture that has us very much believing in Othello’s eloquence. Othello was very lucky not to be speaking with a journalist, who surely would have used his words against him.

Next Up: Truman Capote’s In Cold Blood!

The Taming of Chance (Modern Library Nonfiction #98)

(This is the third entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Operating Instructions.)

In the bustling beginnings of the twentieth century, the ferociously independent mind who forever altered the way in which we look at the universe was living in poverty.* His name was Charles Sanders Peirce and he’d anticipated Heisenberg’s uncertainty principle by a few decades. In 1892, Peirce examined what he called the doctrine of necessity, which held that every single fact of the universe was determined by law. Before Peirce came along, there were several social scientists who were determined to find laws in everything — whether it be an explanation for why you parted your hair at a certain angle with a comb, felt disgust towards specific members of the boy band One Direction, or ran into an old friend at a restaurant one hundred miles away from where you both live. Peirce declared that absolute chance — that is, spontaneity or anything we cannot predict before an event, such as the many fish that pelted the heads of puzzled citizens in Shasta County, California on a January night in 1903 — is a fundamental part of the universe. He concluded that even the careful rules discovered by scientists only come about because, to paraphrase Autolycus from The Winter’s Tale, although humans are not always naturally honest, chance sometimes makes them so.

The story of how Peirce’s brave stance was summoned from the roiling industry of men with abaci and rulers is adeptly set forth in Ian Hacking’s The Taming of Chance, a pleasantly head-tingling volume that I was compelled to read twice to ken the fine particulars. It’s difficult to articulate how revolutionary this idea was at the time, especially since we now live in an epoch in which much of existence feels preordained by statistics. We have witnessed Nate Silver’s demographic models anticipate election results and, as chronicled in Moneyball, player performance analysis has shifted the way in which professional baseball teams select their roster and steer their lineup into the playoffs, adding a strange computational taint that feels as squirmy as performance enhancing drugs.

But there was a time in human history in which chance was considered a superstition of the vulgar, even as Leibniz, seeing that a number of very smart people were beginning to chatter quite a bit about probability, argued that the true measure of a Prussian state resided in how you tallied the population. Leibniz figured that if Prussia had a central statistic office, it would not only be possible to gauge the nation’s power but perhaps lead to certain laws and theories about the way these resources worked.

This was obviously an idea that appealed to chin-stroking men in power. One does not rule an empire without keeping the possibility of expansion whirling in the mind. It didn’t take long for statistics offices to open and enthusiasts to start counting heads in faraway places. (Indeed, much like the early days of computers, the opening innovations originated from amateurs and enthusiasts.) These early statisticians logged births, deaths, social status, the number of able-bodied men who might be able to take up weapons in a violent conflict, and many other categories suggested by Leibniz (and others that weren’t). And they didn’t just count in Prussia. In 1799, Sir John Sinclair published a 21-volume Statistical Account of Scotland that undoubtedly broke the backs of many of the poor working stiffs who were forced to carry these heavy tomes to the guys determined to count it all. Some of the counters became quite obsessive in their efforts. Hacking reports that Sinclair, in particular, became so sinister in his efforts to get each minister of the Church of Scotland to provide a detailed congregation schedule that he began making threats shrouded in a jocose tone. Perhaps the early counters needed wild-eyed, dogged advocates like Sinclair to establish an extremely thorough baseline.

The practice of heavy-duty counting resulted, as Hacking puts it, in a bona-fide “avalanche of numbers.” Yet the intersection of politics and statistics created considerable fracas. Hacking describes the bickering and backbiting that went down in Prussia. What was a statistical office? Should we let the obsessive amateurs run it? Despite all the raging egos, bountiful volumes of data were published. And because there was a great deal of paper being shuffled around, cities were compelled by an altogether different doctrine of necessity to establish central statistical hubs. During the 1860s, statistical administrations were set up in Berlin, New York, Stockholm, Vienna, Rome, Leipzig, Frankfurt-am-Main, and many other cities. But from these central offices emerged an East/West statistics turf war, with France and England playing the role of Biggie on the West and Prussia as Tupac on the East. The West believed that a combination of individual competition and natural welfare best served society, while the East created the welfare state to solve these problems. And these attitudes, which Hacking is good enough to confess are caricaturish even as he illustrates a large and quite important point, affected the way in which statistics were perceived. If you believe in a welfare state, you’re probably not going to see laws forged from the printed numbers, because numbers are all about individual action. And if you believe in the Hobbesian notion of free will, you’re going to look for statistical laws in the criminal numbers, because laws are formed by individuals. This created new notions of statistical fatalism. It’s worth observing that science at the time was also expected to account for morality.

Unusual experiments ensued. What, for example, could the chest circumference of a Scotsman tell us about the stability of the universe? (Yes, the measurement of Scottish chests was seriously considered by a Belgian guy named Adolphe Quetelet, who was trying to work out theories about the average man. When we get to Stephen Jay Gould’s The Mismeasure of Man several years from now, #21 in the Modern Library Nonfiction canon, I shall explore more pernicious measurement ideas promulgated as “science.” Stay tuned!) More nefariously, if you could chart the frequency of how often the working classes called in sick, perhaps you could establish laws to determine who was shirking duty, track the unruly elements, and punish the agitators interfering with the natural law. (As we saw with William Lamb Melbourne’s story, the British government was quite keen to crack down on trade unions during the 1830s. So just imagine what a rabid ideologue armed with a set of corrupted and unproven “laws” could do. In fact, we don’t even have to jump that far back in time. Aside from the obvious Hollerith punch card example, one need only observe the flawed radicalization model presently used by the FBI and the DHS to crack down on Muslim “extremists.” Arun Kundnani’s recent book, The Muslims Are Coming, examines this issue further. And a future Bat Segundo episode featuring Kundnani will discuss this dangerous approach at length.)

Throughout all these efforts to establish laws from numbers (Newton’s law of gravity had inspired a league of scientists to seek a value for the gravitational constant G, a process that took more than a century), Charles Babbage, Johann Christian Poggendorff, and many others began publishing tables of constants. It is one thing to publish atomic weights. It is quite another to measure the height, weight, pulse, and breath of humans by gender and ethnicity (along with animals). The latter constant sets are clearly not as objective as Babbage would like to believe. And yet the universe does adhere to certain undeniable principles, especially when you have a large data set.

It took juries for mathematicians to understand how to reconcile large numbers with probability theory. In 1808, Pierre-Simon Laplace became extremely concerned with the French jury system. At the time, twelve-member juries convicted an accused citizen by a simple majority. He calculated that a seven-to-five majority had a chance of error of one in three. The French code had adopted the unusual method of creating a higher court of five judges to step in if there was a disagreement with a majority verdict in the lower court. In other words, if the majority of the judges in the higher court agreed with the minority of jurors in the lower court that an accused person should be acquitted, then the accused person would be acquitted. Well, this complicated system bothered Laplace. Accused men often faced execution in the French courts. So if there was a substantial chance of error, then the system needed to be reformed. Laplace began to consider juries composed of different sizes and verdicts ranging from total majority (12:0) to partial majority (9:3, 8:4), and he computed the following odds (which I have reproduced from a very helpful table in Hacking’s book):

[Table reproduced from Hacking: Laplace’s calculated odds of jury error for various jury sizes and majority splits]

The problems here become self-evident. You can’t have 1,001 people on a jury arguing over the fate of one man. On the other hand, you can’t have a 2/7 chance of error with a jury of twelve. (One of Laplace’s ideas was a 144-member jury delivering a 90:54 verdict. This involved a 1/773 chance of error. But that’s nowhere near as extreme as the proposal of a Russian mathematician named M.V. Ostrogradsky, who wasted much ink arguing that a 212:200 majority was more reliable than a 12:0 verdict. Remember all this the next time you receive a jury duty notice. Had some of Laplace’s understandable concerns been more seriously considered, there’s a small chance that societies could have adopted larger juries in the interest of a fair trial.)
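Curious readers can get a rough feel for the kind of arithmetic Laplace was wrestling with through a quick Monte Carlo sketch. The toy model below is my own illustration rather than Laplace’s actual analytic method: it assumes each juror independently votes correctly with an unknown reliability drawn from a uniform prior between 0.5 and 1, then asks how often a given majority split turns out to be mistaken.

```python
import random

def estimate_error_rate(jurors=12, majority=7, trials=200_000, seed=42):
    """Toy Laplace-style jury model (an assumption-laden sketch, not Laplace's math).

    Each trial draws a juror reliability p uniformly from [0.5, 1.0]; every juror
    then votes correctly with probability p. Among trials that produce exactly a
    majority-to-(jurors - majority) split, count how often the majority is wrong.
    """
    rng = random.Random(seed)
    wrong = observed = 0
    for _ in range(trials):
        p = rng.uniform(0.5, 1.0)                      # unknown juror reliability
        correct_votes = sum(rng.random() < p for _ in range(jurors))
        if max(correct_votes, jurors - correct_votes) == majority:
            observed += 1
            if correct_votes < majority:               # the majority got it wrong
                wrong += 1
    return wrong / observed if observed else float("nan")

if __name__ == "__main__":
    for split in (7, 8, 9, 12):
        rate = estimate_error_rate(majority=split)
        print(f"{split}:{12 - split} verdict -> estimated chance of error {rate:.3f}")
```

The exact figures depend on the prior you assume, but the shape of Laplace’s worry survives any reasonable tweak: slim majorities carry an uncomfortably high chance of error, and the only ways to shrink it are to demand a wider margin or to empanel an absurdly large jury.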

French law eventually changed the minimum conviction from 7:5 to 8:4. But it turned out that there was a better method to allow for a majority jury verdict. It was a principle that extended beyond mere frequency and juror reliability, taking into account Bernoulli’s ideas on drawing black and white balls from an urn to determine a probability value. It was called the law of large numbers. And the great thing is that you can observe this principle in action through a very simple experiment.

Here’s a way of seeing the law of large numbers in action. Take a quarter and flip it. Write down whether the results are heads or tails. Do it again. Keep doing this and keep a running tally of how many times the outcome is heads and how many times the coin comes up tails. For readers who are too lazy to try this at home, I’ve prepared a video and a table of my coin toss results:

[Video and running tally of the author’s coin tosses]

The odds of a fair coin toss are 1:1. On average, the coin will turn up heads 50% of the time and tails 50% of the time. As you can see, while my early tosses leaned heavily towards heads, by the time I had reached the eighteenth toss, the law of large numbers ensured that my results skewed closer to 1:1 (in this case, 5:4). Had I kept tossing the coin, the running ratio would, on average, have drifted ever closer to 1:1.
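For those who would rather not wear out a thumb, the same demonstration can be run in a few lines of Python. This is just a sketch of my own (the number of tosses and the checkpoints are arbitrary choices), but it shows the share of heads wobbling early on and then settling ever closer to 0.5 as the tally grows:

```python
import random

def running_heads_ratio(n_tosses=10_000, seed=7):
    """Toss a simulated fair coin and report the running share of heads."""
    rng = random.Random(seed)
    heads = 0
    checkpoints = {10, 100, 1_000, 10_000}
    for toss in range(1, n_tosses + 1):
        heads += rng.random() < 0.5        # True counts as 1, False as 0
        if toss in checkpoints:
            print(f"after {toss:>6} tosses: heads share = {heads / toss:.4f}")

running_heads_ratio()
```

Run it a few times with different seeds and the early checkpoints will bounce around while the later ones cluster near 0.5000, which is the law of large numbers doing quietly what my eighteen hand tosses did noisily.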

[Photo: a working replica of a Galton box]

The law of large numbers offered the solution to Laplace’s predicament. It also accounts for the mysterious picture at the head of this essay. That image is a working replica of a Galton box (also known as a quincunx). (If you’re ever in Boston, go to the Museum of Science and you can see a very large working replica of a Galton box in action.) Sir Francis Galton needed a very visual method of showing off the central limit theorem. So he designed a box, not unlike a pachinko machine, in which beans are dropped from the top and work their way down through a series of wooden pins, which cause them to fall along a random path. Most of the beans land in the center. Drop more beans and you will see a natural bell curve form, illustrating the law of large numbers and the central limit theorem.
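The Galton box is also easy to fake in software. The sketch below is my own illustration (bean and row counts picked arbitrarily): each pin becomes a fair left-or-right bounce, the landing bin is simply the number of rightward bounces, and a crude text histogram of the bins traces out the binomial bell curve that Galton’s beans produce in wood and glass.

```python
import random
from collections import Counter

def galton_box(beans=5_000, rows=12, seed=3):
    """Simulate beans falling through `rows` of pins and print the bin counts."""
    rng = random.Random(seed)
    # Each bean's final bin = number of rightward bounces out of `rows` pins.
    bins = Counter(sum(rng.random() < 0.5 for _ in range(rows)) for _ in range(beans))
    peak = max(bins.values())
    for slot in range(rows + 1):
        bar = "#" * round(40 * bins.get(slot, 0) / peak)
        print(f"{slot:>2} | {bar}")

galton_box()
```

The central bins fill up fastest and the extremes stay sparse, which is the central limit theorem asserting itself one bounce at a time.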

Despite all this, there was still the matter of statistical fatalism to iron out, along with an understandable distrust of statistics among artists and the general population, which went well beyond Disraeli’s infamous “There are three kinds of lies: lies, damned lies, and statistics” quote. Hacking is a rigorous enough scholar to reveal how Dickens, Dostoevsky, and Balzac were skeptical of utilitarian statistics. Balzac, in particular, delved into “conjugal statistics” in his Physiology of Marriage to deduce the number of virtuous women. They had every reason to be, given how heavily philosophers leaned on determinism. (See also William James’s “The Dilemma of Determinism.”) A German philosopher named Ernst Cassirer was a big determinism booster, pinpointing its beginnings in 1872. Hacking challenges Cassirer by pointing out that determinism incorporated the doctrine of necessity earlier, in the 1850s, an important distinction in returning to Peirce’s idea of absolute chance.

I’ve been forced to elide a number of vital contributors to probability and some French investigations into suicide in an attempt to convey Hacking’s intricate narrative. But the one word that made Peirce’s contributions so necessary was “normality.” This was the true danger of statistical ideas being applied to the moral sciences. When “normality” became the ideal, it was greatly desirable to extirpate anything “abnormal” or “aberrant” from the grand human garden, even though certain crime rates were indeed quite normal. We see similar zero tolerance measures practiced today by certain regressive members of law enforcement or, more recently, New York Mayor Bill de Blasio’s impossible pledge to rid New York City of all traffic deaths by 2024. As the law of large numbers and Galton’s box demonstrate, some statistics are inevitable. Yet it was also important for Peirce to deny the doctrine of necessity. Again, without chance, Peirce pointed out that we could not have had all these laws in the first place.

It was strangely comforting to learn that, despite all the nineteenth century innovations in mathematics and probability, chance remains very much a part of life. Yet when one begins to consider stock market algorithms (and the concomitant flash crashes), as well as our collective willingness to impart voluminous personal data to social media companies who are sharing these numbers with other data brokers, I cannot help but ponder whether we are willfully submitting to another “law of large numbers.” Chance may favor the prepared mind, as Pasteur once said. So why court predictability?

* Peirce’s attempts to secure academic employment and financial succor were thwarted by a Canadian scientist named Simon Newcomb. (A good overview of the correspondence between the two men can be found at the immensely helpful “Peirce Gateway” website.)

Next Up: Janet Malcolm’s The Journalist and the Murderer!

Operating Instructions (Modern Library Nonfiction #99)

(This is the second entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Melbourne.)

It is easy to forget, as brave women document their battles with cancer and callous columnists bully them for their candor, that our online confessional age didn’t exist twenty years ago. I suspect this collective amnesia is one of the reasons why Anne Lamott’s Operating Instructions — almost an urtext for mommy blogs and much of the chick lit that followed — has been needlessly neglected by snobbish highbrow types, even when hungry young writers rushed to claim transgressive land in the Oklahoma LiveJournal Run of 2006.

Lamott’s book, which is a series of honed journal entries penned from the birth of her son Sam to his first birthday, was ignored by the New York Times Book Review upon its release in 1993 (although Ruth Reichl interviewed her for the Home & Garden section after the book, labeled “an eccentric baby manual” by Reichl, became a bestseller). Since then, aside from its distinguished inclusion on the Modern Library list, it has not registered a blip among those who profess to reach upward. Yet if we can accept Karl Ove Knausgaard’s honesty about fatherhood in the second volume of his extraordinary autobiographical novel, My Struggle, why then do we not honor Anne Lamott? It is true that, like Woody Allen in late career, Lamott has put out a few too many titles. It is also true that she attracts a large reading audience, a sin as unpardonable to hoity-toity gasbags as a man of the hoi polloi leaving the toilet seat up. Much as the strengths of Jennifer Weiner’s fiction are often dwarfed by her quest for superfluous respect, Anne Lamott’s acumen for sculpting the familiar through smart and lively prose doesn’t always get the credit it deserves.

Operating Instructions — with its breezy pace, its populist humor, and its naked sincerity — feels at first to be a well-honed machine guaranteed to attract a crowd of likeminded readers. But once you start looking under the hood, you begin to understand how careful Lamott is with what she doesn’t reveal. It begins with the new baby’s name. We are informed that Samuel John Stephen Lamott’s name has been forged from Lamott’s brothers, John and Steve. But where does the name Samuel come from? And why is Lamott determined to see Sams everywhere? (A one-armed Sam, the son of a friend named Sam, et al.) There are murky details about Sam’s father, who flits in and out of the narrative like some sinister figure with a twirling moustache. He is six foot four and two hundred pounds. He is in his mid-fifties, an older man with whom Lamott had a fling. We learn later in the book that he “filed court papers today saying that we never fucked and that he therefore cannot be the father.” Even so, what’s his side of the story?

This leaves Lamott, struggling for cash and succor, raising Sam on her own with a dependable “pit crew” of friends. Yet one is fascinated not only by Lamott’s unshakable belief that she will remain a single parent for the rest of her natural life (“there is nothing I can do or say that will change the fact that his father chooses not to be his father. I can’t give him a dad, I can’t give him a nuclear family”), but by how the absence of this unnamed father causes her to dwell on her own father’s final days.

Lamott’s father was a writer who “died right as I crossed the threshold into publication.” His brain cancer was so bad that he could barely function in his final days. Lamott describes leaving her father in the car with a candy bar as she hits the bank. Her father escapes the car, becoming a “crazy old man pass[ing] by, his face smeared with chocolate, his blue jeans hanging down in back so you could see at least two inches of his butt, like a little boy’s.” It is a horrifying image of a man Lamott looked up to regressing into childhood before the grave, leaving one to wonder if this has ravaged Lamott’s view of men — especially since she repeatedly chides the apparent male relish of peeing standing up — and what idea of manhood she will pass along to her growing boy.

Part of me loves and respects men so desperately, and part of me thinks they are so embarrassingly incompetent at life and in love. You have to teach them the very basics of emotional literacy. You have to teach them how to be there for you, and part of me feels tender toward them and gentle, and part of me is so afraid of them, afraid of any more violation. I want to clean out some of these wounds, though, with my therapist, so Sam doesn’t get poisoned by all my fear and anger.

This is an astonishing confession for a book that also has Lamott tending to Sam’s colic, describing the wonders of Sam’s first sounds and movement, and basking in the joys of a human soul emerging in rapid increments. Motherhood has long been compared to war, to the point where vital discussions about work-family balance have inspired their own “mommy wars.” Operating Instructions features allusions to heroes, Nagasaki, Vietnam, and other language typically muttered by priapic military historians. Yet Lamott’s take also reveals a feeling that has become somewhat dangerous to express in an era of mansplaining, Lulu hashtags, and vapid declarations of “the end of men.” Men are indeed embarrassing, but are they an ineluctable part of motherhood? It is interesting that Lamott broaches this question long after Sam’s father has become a forgettable presence in the book. And yet months later, Lamott is more grateful for the inherited attributes of the “better donor” in “the police lineup of my ex-boyfriends”:

He’s definitely got his daddy’s thick, straight hair, and, God, am I grateful for that. It means he won’t have to deal with hat hair as he goes through life.

Throughout all this, Lamott continues to take in Sam. He is at first “just a baby,” some human vehicle that has just left the garage of Lamott’s belly:

The doctor looked at the baby’s heartbeat on the monitor and said dully, “The baby’s flat,” and I immediately assumed it meant he was dead or at least retarded from lack of oxygen. I don’t think a woman would say anything like that to a mother. “Flat?” I said incredulously. “Flat?” Then he explained that this meant the baby was in a sleep cycle.

But as Sam occupies a larger space in Lamott’s life, there is an innate ecstasy in the way she describes his presence. Sam is “a breathtaking collection of arms and knees,” “unbelievably pretty, with long, thin, Christlike feet,” and “an angel today…all eyes and thick dark hair.” We’re all familiar with the way that new parents gush about their babies, yet Lamott is surprisingly judicious in tightening the pipe valve. Even as she declares the inevitable epithets of frustration (“Go back to sleep, you little shit”) and trivializes Sam (“I thought it would be more like getting a cat”), Sam’s beauty is formidable enough to spill elsewhere, such as this description of a mountain near Bolinas:

So we were driving over the mountain, and on our side it was blue and sunny, but as soon as we crested, I could see the thickest blanket of fog I’ve ever seen, so thick it was quilted with the setting sun shining upward from underneath it, and it shimmered with reds and roses, and above were radiant golden peach colors. I am not exaggerating this. I haven’t seen a sky so stunning and bejeweled and shimmering with sunset colors and white lights since the last time I took LSD, ten years ago.

Being a mother may be akin to a heightened narcotic experience, but that doesn’t have to stop you from feeling.

* * *

I suggested earlier that Operating Instructions serves as a precursor to the mommyblog, but this doesn’t just extend to the time-stamp. There is something about setting down crisp observations while the baby is napping that inspires an especially talented writer to find imaginative similes, especially in commonplace activities which those who are not mothers willfully ignore or take for granted. Compare Lamott and Dooce‘s Heather Armstrong (perhaps the best-known of the mommy bloggers) as they describe contending with a breast pump:

“You feel plugged into a medieval milking machine that turns your poor little gumdrop nipples into purple slugs with the texture of rhinoceros hide.” — Anne Lamott, 1993

“I end up lying on my back completely awake as my breasts harden like freshly poured cement baking in the afternoon sun.” — Dooce, February 23, 2004

Armstrong has sedulously avoided invoking Lamott’s name in more than a decade of blogging, but, in both cases, we see just enough imagery squirted into the experience for the reader to feel the struggle. Both Lamott and Armstrong have rightly earned a large readership for describing ordinary situations in slightly surreal (and often profane) terms. (Both writers are also marked in ways by religion. Lamott came to Christianity after a wild life that involved alcoholism. Armstrong escaped Mormonism, alluding to a period as an “unemployed drunk” before meeting her husband, whom she subsequently divorced.)

Next Up: Ian Hacking’s The Taming of Chance!

The Modern Library Nonfiction Challenge

Just under three years ago, I began the Modern Library Reading Challenge. It was an ambitious alternative to a spate of eccentric reading challenges then making the rounds. These included such gallant reading missions as the Chunkster, the Three Card Monte/Three Sisters Bronte, the Read All of Shakespeare While Blindfolded Challenge, and the Solzhenitsyn Russian Roulette Challenge. It took a fairly eccentric person to place the literary embouchure ever so nobly to one’s lips and fire off a fusillade of euphonic Prince Pless bliss into the trenchant air. But I was game.

In my case, the idea was to write at least 1,000 words on each title after reading it. The hope was to fill in significant reading gaps while also cutting an idiosyncratic course across the great works of 20th century literature, with other intrepid readers walking along with me.

Over the next twenty-three months, I steadily worked my way through twenty-three works of fiction. Some of the books were easy to find. Some required elaborate trips to exotic bookstores in far-off states. When I checked out related critical texts and biographies from the New York Public Library, I was often informed by the good librarians at the Mid-Manhattan branch that I was the first soul to take these tomes home in sixteen years. This surprised me. New York was a city with eight million people. Surely there had to be more curiosity seekers investigating these authors. But I discovered that some of these prominent authors had been severely neglected. When I got to The Old Wives’ Tale, Arnold Bennett was so overlooked that I appeared to be the first person to upload a photo of reasonable resolution (which I had taken from a public domain image published in a biography) onto the Internet.

There were other surprises. I became an Iris Murdoch obsessive. I was finally able to overcome my youthful indiscretions and appreciate The Adventures of Augie March as the masterpiece that it was. My mixed feelings on Brideshead Revisited proved controversial in some circles and caused at least one academic to condemn me. On the other hand, I also sparked an online friendship with Stephen Wood, who was also working his way through the Mighty 100, and was put into contact with an extremely small yet determined group of enthusiasts making similar reading attempts in various corners of the world.

Yet when I told some people about my project, it was considered strange or sinister. When I mentioned the Modern Library Reading Challenge to a much older writer, she was stunned that anyone my age would go to the trouble of reading Lawrence Durrell. (And she liked Durrell!) Her quizzical look led me to wonder if she was going to send me to some shady authority to administer a second opinion.

One of the project’s appeals was its methodical approach: I was determined to read all the books from #100 to #1. This not only provided a healthy discipline, but it ensured that I wouldn’t push the least desired books to the end. Much as life provides you with mostly happy and sometimes unpleasant tasks to fulfill as they arrive, I felt that my reading needed to maintain a similar commitment. This strategy also created a vicarious trajectory for others to follow.

Everything was going well. Very well indeed. Henry Green. Jean Rhys. The pleasant surprise of The Ginger Man. With these authors, how could it not go well? I was poised to read to the finish line. I was zooming my Triumph TR6 down a hilly two-lane highway with a full tank of gas. Cranking loud music. Not a care in the world.

And then I hit Finnegans Wake.

To call Finnegans Wake “difficult” is a woefully insufficient description. This is a book that requires developing an ineluctably Talmudic approach. But I am not easily fazed. Finnegans Wake is truly a book of grand lexical riches, even if I remain permanently stalled within the voluble tendrils of its first 50 pages. I have every intention of making my way through Finnegans Wake. I have reread Dubliners, A Portrait of the Artist as a Young Man, and Ulysses. I have consulted numerous reference texts. I have even listened to all 176 episodes of Frank Delaney’s excellent podcast, Re: Joyce. These have all been pleasant experiences, but I am still not sure if any of this significantly contributes to my understanding of Finnegans Wake. However, when you do something difficult, it is often best to remain somewhat clueless. If you become more aware of how “impossible” something may be, then you may not see it through to the end. Joyce remains the Everest of English literature. I am determined to scale the peak, even if I’m not entirely sure how reliable my gear is.

The regrettable Finnegans Wake business has also meant that the Modern Library Reading Challenge has been stuck on pause. It has been eleven months since I published a new Modern Library installment on these pages. And while I have certainly stayed busy during this time (I have made a documentary about Gary Shteyngart’s blurbs, attempted to fund a national walk that I intend to fulfill one day, canceled and brought back The Bat Segundo Show, and created a new thematic radio program, among other industries), I have long felt that persistent progress — that is, an efflorescent commitment to a regular fount of new material — is the best way to stay in shape and keep any project alive.

I have also had a growing desire to read more nonfiction, especially as the world revealed itself to be a truly maddening and perilous place as the reading challenge unfolded. Some have sought to keep their heads planted in the sand like quavering ostriches about all this. There are far too many adults I know, now well into their thirties, who remain distressingly committed to the “La la la I can’t hear you!” school of taking in bad news. But I feel that understanding how historical and social cycles (Vico, natch) cause human souls to saunter down dark and treacherous roads also allows us to comprehend certain truths about our present age. To carry on in the world without a sense of history, without even a cursory understanding of ideas and theories that have been attempted or considered before, is to remain a rotten vegetable during the more important moments of life.

It turns out that the Modern Library has another list of one hundred titles devoted to nonfiction. And the nonfiction list is, to my great surprise, more closely aligned to the fiction list than I anticipated.

In 1998, the Washington Post‘s David Streitfeld revealed that the Modern Library fiction list was plagued by modest scandal. The ten august Modern Library board members behind the fiction list had no knowledge of who had voted for what, why the books were ranked the way they were, or how the list had been composed, with many of the rankings more perfunctory than anyone knew. Brave New World, for example, had muscled its way up to #5, but only because many of the judges believed that it needed to be somewhere on the list.

So when the Modern Library gang devoted its attention to a nonfiction list, it was, as Salon‘s Craig Offman reported, determined not to repeat many of the same mistakes. University of Chicago statistics professor Albert Madansky was signed on to ensure a more dutiful ranking process. Younger authors and more women were included among the board. Madansky went to the trouble of creating a computer algorithm so that there could be no ties. But the new iron-fist approach had some drawbacks. There was a new rule that an author could only have one title on the list, which meant that Edmund Wilson’s To the Finland Station didn’t make the cut. And when the top choice was announced — The Education of Henry Adams — the crowd stayed silent. It was rumored that one board member scandalously played with his dentures as the titles were called.

Perhaps the Modern Library’s second great experiment reveals the unavoidable pointlessness behind these lists. As novelist Richard Powers recently observed in a National Book Critics Circle post, “The reading experiences I value most are the ones that shake me out of my easy aesthetic preferences and make the favorites game feel like a talent show in the Iroquois Theater just before the fire. Give me the not-yet likable, the unhousebroken, something that is going to throw my tastes in a tizzy and make my self-protecting Tops of the Pops slink away in shame.”

On the other hand, if it takes anywhere from five to ten years to get through a hundred titles, then the reader is inured to this problem. Today’s favorites may be tomorrow’s dogs, and yesterday’s lackluster choices may be tomorrow’s crown jewels. As the Modern Library reader grows older, there’s nearly a built-in guarantee that these preordained tastes will become passe at some point. (To wit: Lord David Cecil’s Melbourne, the first book I will be reading for this new challenge, is now out of print.)

So I have decided to take up the second challenge, reading the nonfiction list from #100 to #1. Modern Library Nonfiction Challenge titles shall flow from these pages as I slowly make my way through Finnegans Wake during the first challenge. Hopefully, once the disparity between the two challenges has been worked out, I will eventually make steady progress on the fiction and nonfiction fronts. But the nonfiction challenge won’t be a walk in the park either. It has its own Finnegans Wake at #23. I am certain that Principia Mathematica will come close to destroying my brain. But as I said three years ago, I plan to read forever or die trying.

To prevent confusion for longtime readers, the fiction challenge will be separated from the nonfiction challenge by color. Fiction titles shall carry on in red. Nonfiction titles will be in gold.

I’ve started to read Melbourne and I’m hoping to have the first Modern Library Nonfiction Challenge essay up before the end of the month. This page, much like the fiction list, will serve as an index. I will add the links and the dates as I read the books. I hope that these efforts will inspire more readers to take up the challenge. (And if you do end up reading along, don’t be a stranger!)

Now let’s get this party started. Here are the titles:

100. Melbourne, Lord David Cecil (December 27, 2013)
99. Operating Instructions, Anne Lamott (January 14, 2014)
98. The Taming of Chance, Ian Hacking (March 23, 2014)
97. The Journalist and the Murderer, Janet Malcolm (July 17, 2014)
96. In Cold Blood, Truman Capote (November 11, 2015)
95. The Promise of American Life, Herbert Croly (January 21, 2016)
94. The Contours of American History, William Appleman Williams (February 7, 2016)
93. The American Political Tradition, Richard Hofstadter (February 18, 2016)
92. The Power Broker, Robert A. Caro (May 11, 2016)
91. Shadow and Act, Ralph Ellison (November 3, 2016)
90. The Golden Bough, James George Frazer (13 volumes, Third Edition) (November 14, 2016)
89. Pilgrim at Tinker Creek, Annie Dillard (November 23, 2016)
88. Six Easy Pieces, Richard P. Feynman (November 30, 2016)
87. A Mathematician’s Apology, G.H. Hardy (December 3, 2016)
86. This Boy’s Life, Tobias Wolff (June 15, 2017)
85. West with the Night, Beryl Markham (August 21, 2017)
84. A Bright Shining Lie, Neil Sheehan (December 6, 2017)
83. Vermeer, Lawrence Gowing (February 22, 2018)
82. The Strange Death of Liberal England, George Dangerfield (September 16, 2018)
81. The Face of Battle, John Keegan (December 26, 2018)
80. Studies in Iconology, Erwin Panofsky (January 23, 2019)
79. The Rise of Theodore Roosevelt, Edmund Morris (February 20, 2019)
78. Why We Can’t Wait, Martin Luther King Jr. (February 28, 2019)
77. Battle Cry of Freedom, James M. McPherson (September 11, 2020)
76. The City in History, Lewis Mumford (September 12, 2020)
75. The Great War and Modern Memory, Paul Fussell (March 3, 2022)
74. Florence Nightingale, Cecil Woodham-Smith (June 14, 2022)
73. James Joyce, Richard Ellmann (December 22, 2023)
72. The Gnostic Gospels, Elaine Pagels (March 16, 2024)
71. The Rise of the West, William H. McNeill
70. The Strange Career of Jim Crow, C. Vann Woodward
69. The Structure of Scientific Revolutions, Thomas S. Kuhn
68. The Gate of Heavenly Peace, Jonathan D. Spence
67. A Preface to Morals, Walter Lippmann
66. Religion and the Rise of Capitalism, R.H. Tawney
65. The Art of Memory, Frances A. Yates
64. The Open Society and Its Enemies, Karl Popper
63. The Sweet Science, A.J. Liebling
62. The House of Morgan, Ron Chernow
61. Cadillac Desert, Marc Reisner
60. In the American Grain, William Carlos Williams
59. Jefferson and His Time, Dumas Malone (6 volumes)
58. Out of Africa, Isak Dinesen
57. The Second World War, Winston Churchill (6 volumes)
56. The Liberal Imagination, Lionel Trilling
55. Darkness Visible, William Styron
54. Working, Studs Terkel
53. Eminent Victorians, Lytton Strachey
52. The Right Stuff, Tom Wolfe
51. The Autobiography of Malcolm X, Alex Haley and Malcolm X
50. Samuel Johnson, Walter Jackson Bate
49. Patriotic Gore, Edmund Wilson
48. The Great Bridge, David McCullough
47. Present at the Creation, Dean Acheson
46. The Affluent Society, John Kenneth Galbraith
45. A Study of History, Arnold J. Toynbee (12 volumes)**
44. Children of Crisis, Robert Coles (5 volumes)
43. The Autobiography of Mark Twain, Mark Twain (3 volumes)
42. Homage to Catalonia, George Orwell
41. Goodbye to All That, Robert Graves
40. Science and Civilisation in China, Joseph Needham (5 volumes, abridged)*
39. Autobiographies, W.B. Yeats
38. Black Lamb and Grey Falcon, Rebecca West
37. The Making of the Atomic Bomb, Richard Rhodes
36. The Age of Jackson, Arthur Schlesinger, Jr.
35. Ideas and Opinions, Albert Einstein
34. On Growth and Form, D’Arcy Thompson
33. Philosophy and Civilization, John Dewey
32. Principia Ethica, G.E. Moore
31. The Souls of Black Folk, W.E.B. Du Bois
30. The Making of the English Working Class, E.P. Thompson
29. Art and Illusion, Ernest H. Gombrich
28. A Theory of Justice, John Rawls
27. The Ants, Bert Hoelldobler and Edward O. Wilson
26. The Art of the Soluble, Peter B. Medawar
25. The Mirror and the Lamp, Meyer Howard Abrams
24. The Mismeasure of Man, Stephen Jay Gould
23. Principia Mathematica, Alfred North Whitehead and Bertrand Russell (3 volumes)
22. An American Dilemma, Gunnar Myrdal
21. The Elements of Style, William Strunk and E.B. White
20. The Autobiography of Alice B. Toklas, Gertrude Stein
19. Notes of a Native Son, James Baldwin
18. The Nature and Destiny of Man, Reinhold Niebuhr
17. The Proper Study of Mankind, Isaiah Berlin
16. The Guns of August, Barbara Tuchman
15. The Civil War, Shelby Foote (Three volumes: Fort Sumter to Perryville, Fredericksburg to Meridian, Red River to Appomattox)
14. Aspects of the Novel, E.M. Forster
13. Black Boy, Richard Wright
12. The Frontier in American History, Frederick Jackson Turner
11. The Lives of a Cell, Lewis Thomas
10. The General Theory of Employment, Interest, and Money, John Maynard Keynes
9. The American Language, H.L. Mencken
8. Speak, Memory, Vladimir Nabokov
7. The Double Helix, James D. Watson
6. Selected Essays, 1917-1932, T.S. Eliot
5. Silent Spring, Rachel Carson
4. A Room of One’s Own, Virginia Woolf
3. Up from Slavery, Booker T. Washington
2. The Varieties of Religious Experience, William James
1. The Education of Henry Adams, Henry Adams

* December 15, 2018 Update: While I am striving to read the unabridged versions of all works for this project, upon further reflection, I’ve realized that the cost of obtaining the full 27-volume set of Needham’s opus is well beyond my price range. Each volume ranges from $40 to $200, in part due to the extortionate pricing of Cambridge University Press, a publisher that has proven deaf to my inquiries about obtaining a review copy. This effectively makes the purchase equal to the price of a used car. In addition, it is rather insane for any reader, even one who possesses my ridiculous ambitions, to devote some 8,000 pages of reading to a single author. I have reluctantly opted to substitute the five-volume Shorter Science and Civilisation in China when I get to this particular essay. As of now, I do have the unabridged The Golden Bough under my belt. And I will spring for the unabridged Toynbee. I hope readers following along can forgive me for cutting corners on one entry. But I do want to complete this project before I depart this earth. And pragmatically speaking, this is the only way to do it.

** August 3, 2022 Update: Four years ago, it was possible to get a copy of the full twelve-volume set of Toynbee for under $200. But in a testament to how rapidly these books are going out of print, a copy of the full set has become increasingly difficult to find. I may have to tackle the abridged version, with great reluctance.