
A Mathematician’s Apology (Modern Library Nonfiction #87)

(This is the fourteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Six Easy Pieces.)

Clocking in at a mere ninety pages in very large type, G.H. Hardy’s A Mathematician’s Apology is that rare canapé plucked from a small salver between all the other three-course meals and marathon banquets in the Modern Library series. It is a book so modest that you could probably read it in its entirety while waiting for the latest Windows 10 update to install. And what a bleak and despondent volume it turned out to be! I read it twice and, each time I finished, I wanted to seek out some chalk-scrawling magician and offer a hug.

G.H. Hardy was a robust mathematician just over the age of sixty who had made some serious contributions to number theory and population genetics. He was a cricket-loving man who had brought the Indian autodidact Srinivasa Ramanujan to academic prominence by personally vouching for and mentoring him. You would think that a highly accomplished dude who went about the world with such bountiful and generous energies would be able to ride out his eccentric enthusiasm into his autumn years. But in 1939, Hardy survived a heart attack and felt that he was as useless as an ashtray on a motorcycle, possessing nothing much in the way of nimble acumen or originality. So he decided to memorialize his depressing thoughts about “useful” contributions to knowledge in A Mathematician’s Apology (in one of the book’s most stupendous understatements, Hardy observed that “my apology is bound to be to some extent egotistical”), and asked whether mathematics, the field that he had entered into because he “wanted to beat other boys, and this seemed to be the way in which I could do so most decisively,” was worthwhile.

You can probably guess how it all turned out:

It is indeed rather astonishing how little practical value scientific knowledge has for ordinary man, how dull and commonplace such of it as has value is, and how its value seems almost to vary inversely to reputed utility….We live either by rule of thumb or other people’s professional knowledge.

If only Hardy could have lived about sixty more years to discover the 21st century thinker’s parasitic relationship to Google and Wikipedia! The question is whether Hardy is right to be this cynical. While snidely observing “It is quite true that most people can do nothing well,” he isn’t a total sourpuss. He writes, “A man’s first duty, a young man’s at any rate, is to be ambitious,” and points out that ambition has been “the driving force behind nearly all the best work of the world.” What he fails to see, however, is that youthful ambition, whether in a writer or a scientist, often morphs into a set of routines that become second nature. At a certain point, a person becomes comfortable enough with himself to simply go on with his work, quietly evolving, where the ambition becomes more covert and subconscious and mysterious.

Hardy never quite confronts what it is about youth that frightens him, but he is driven by a need to justify his work and his existence, pointing to two reasons why people do what they do: (1) they work at something because they know they can do it well and (2) they work at something because a particular vocation or specialty came their way. But this seems too pat and Gladwellian to be a persuasive dichotomy. It doesn’t really account for the question we all must face of why one does something, a journey that generally includes the vital people you meet at certain points in your life who point you down certain directions. Either they recognize some talent in you and give you a leg up or they are smart and generous enough to recognize that one essential part of human duty is to help others find their way, to seek out your people — ideally a group of eclectic and vastly differing perspectives — and to work with each other to do the best damn work and live the best damn lives you can. Because what’s the point of geeking out about Fermat’s “two squares” theorem, which really is, as Hardy observes, a nifty mathematical theorem of pure beauty, if you can’t share it with others?
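For the curious, here is a minimal brute-force sketch in Python (mine, not Hardy’s — he proves the result properly) of what the “two squares” theorem asserts: an odd prime can be written as a sum of two squares exactly when it leaves remainder 1 on division by 4.

```python
# Fermat's "two squares" theorem, checked by brute force:
# an odd prime p can be written as a^2 + b^2 iff p % 4 == 1.
import math

def is_prime(n):
    """Trial-division primality check, fine for small n."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def two_squares(p):
    """Return (a, b) with a*a + b*b == p, or None if no such pair exists."""
    for a in range(math.isqrt(p) + 1):
        b = math.isqrt(p - a * a)
        if a * a + b * b == p:
            return (a, b)
    return None

# Primes of the form 4k+1 split into two squares; primes of the form 4k+3 never do.
for p in [5, 13, 17, 29, 3, 7, 11, 19]:
    print(p, p % 4, two_squares(p))
```

Run it and the pattern jumps out: 5, 13, 17, and 29 all decompose, while 3, 7, 11, and 19 stubbornly refuse.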

But let’s return to Hardy’s fixation on youth. Hardy makes the claim that “mathematics, more than any other art or science, is a young man’s game,” yet this staggering statement is easily debunked by such late bloomers as prime number ninja Yitang Zhang and Andrew Wiles, who solved Fermat’s Last Theorem at the age of 41. Even in Hardy’s own time, Henri Poincaré was making innovations to topology and Lorentz transformations well into middle age. (And Hardy explicitly references Poincaré in § 26 of his Apology. So it’s not like he didn’t know!) Perhaps some of the more recent late life contributions have much to do with forty now being the new thirty (or even the new twenty among a certain Jaguar-buying midlife crisis type) and many men in Hardy’s time believing themselves to be superannuated in body and soul around the age of thirty-five, but it does point to the likelihood that Hardy’s sentiments were less the result of serious thinking and more the result of crippling depression.

Where Richard Feynman saw chess as a happy metaphor for the universe, “a great game played by the gods” in which we humans are mere observers who “do not know what the rules of the game are,” merely allowed to watch the playing (and yet find marvel in this all the same), Hardy believed that any chess problem was “simply an exercise in pure mathematics…and everyone who calls a problem ‘beautiful’ is applauding mathematical beauty, even if it is a beauty of a comparatively lowly kind.” Hardy was so sour that he compared a chess problem to a newspaper puzzle, claiming that it merely offered an “intellectual kick” for the clueless educated rabble. As someone who enjoys solving the Sunday New York Times crossword in full and a good chess game (it’s the street players I have learned the most from, for they often have the boldest and most original moves), I can’t really argue against Hardy’s claim that such pastimes are “trivial” or “unimportant” in the grand scheme of things. But Hardy seems unable to remember the possibly apocryphal tale of Archimedes discovering water displacement while in the bathtub or the more reliable story of Otto Loewi’s dream leading the great Nobel-winning physiologist to discover that nerve impulses were transmitted chemically. Great minds often need to be restfully thinking or active on other fronts in order to come up with significant innovations. And while Hardy may claim that “no chess problem has ever affected the development of scientific thought,” I feel compelled to note that Pythagoras played the lyre (and even inspired a form of tuning), Newton had his meandering apple moment, and Einstein enjoyed hiking and sailing. These were undoubtedly “trivial” practices by Hardy’s austere standards, but would these great men have given us their contributions if they hadn’t had such downtime?

It’s a bit gobsmacking that Hardy never mentions how Loewi was fired up by his dreams. He seems only to see value in Morpheus’s prophecies if they are dark and melancholic:

I can remember Bertrand Russell telling me of a horrible dream. He was in the top floor of the University Library, about A.D. 2100. A library assistant was going round the shelves carrying an enormous bucket, taking down book after book, glancing at them, restoring them to the shelves or dumping them into the bucket. At last he came to three large volumes which Russell could recognize as the last surviving copy of Principia mathematica. He took down one of the volumes, turned over a few pages, seemed puzzled for a moment by the curious symbolism, closed the volume, balanced it in his hand and hesitated….

One of an author’s worst nightmares is to have his work rendered instantly obsolete not long after his death, even though there is a very strong likelihood that, in about 150 years, few people will care about the majority of books published today. (Hell, few people care about anything I have to write today, much less this insane Modern Library project. There is a high probability that I will be dead in five decades and that nobody will read the many millions of words or listen to the countless hours of radio I have put out into the universe. It may seem pessimistic to consider this salient truth, but, if anything, it motivates me to make as much as I can in the time I have, which I suppose is an egotistical and foolishly optimistic approach. But what else can one do? Deposit one’s head in the sand, smoke endless bowls of pot, wolf down giant bags of Cheetos, and binge-watch insipid television that will also not be remembered?) You can either accept this reality, reach the few people you can, and find happiness and gratitude in doing so. Or you can let your ego get in the way of your achievements, embracing supererogatory anxieties and spending too much time feeling needlessly morose.

I suppose that in articulating this common neurosis, Hardy is performing a service. He seems to relish “mathematical fame,” which he calls “one of the soundest and steadiest of investments.” Yet fame is a piss-poor reason to go about making art or formulating theorems. Most of the contributions to human advancement are rendered invisible. These are often small yet subtly influential rivulets that unknowingly pass into the great river that future generations will wade in. We fight for virtues and rigor and intelligence and truth and justice and fairness and equality because this will be the legacy that our children and grandchildren will latch onto. And we often make unknowing waves. Would we, for example, be enjoying Hamilton today if Lin-Manuel Miranda’s school bus driver had not drilled him with Geto Boys lyrics? And if we abandon those standards, if we gainsay the “trivial” inspirations that cause others to offer their greatness, then we say to the next generation, who are probably not going to be listening to us, that fat, drunk, and stupid is the absolute way to go through life, son.

A chair may be a collection of whirling electrons, or an idea in the mind of God: each of these accounts of it may have its merits, but neither conforms at all closely to the suggestions of common sense.

This is Hardy suggesting a church-and-state-like separation between pure and applied mathematics. He sees physics as fitting into some idealistic philosophy while identifying pure mathematics as “a rock on which all idealism founders.” But might not one fully inhabit common sense if the chair exists in some continuum beyond this either-or proposition? Is not the chair’s perceptive totality worth pursuing?

It is at this point in the book where Hardy’s argument really heads south and he makes an astonishingly wrongheaded claim, one whose refutation he could not have entirely foreseen, noting that “Real mathematics has no effects on war.” This was only a few years before Los Alamos was to prove him wrong. And that’s not all:

It can be maintained that modern warfare is less horrible than the warfare of pre-scientific times; that bombs are probably more merciful than bayonets; that lachrymatory gas and mustard gas are perhaps the most humane weapons yet devised by military science; and that the orthodox view rests solely on loose-thinking sentimentalism.

Oh Hardy! Hiroshima, Nagasaki, Agent Orange, Nick Ut’s famous napalm girl photo from Vietnam, Saddam Hussein’s chemical gas massacre in Halabja, the use of Sarin-spreading rockets in Syria. Not merciful. Not humane. And nothing to be sentimental about!

Nevertheless, I was grateful to argue with this book on my second read, which occurred a little more than two weeks after the shocking 2016 presidential election. I had thought myself largely divested of hope and optimism, with the barrage of headlines and frightening appointments (and even Trump’s most recent Taiwan call) doing nothing to summon my natural spirits. But Hardy did force me to engage with his points. And his book, while possessing many flawed arguments, is nevertheless a fascinating insight into a man who gave up: a worthwhile and emotionally true Rorschach test you may wish to try if you need to remind yourself why you’re still doing what you’re doing.

Next Up: Tobias Wolff’s This Boy’s Life!


Six Easy Pieces (Modern Library Nonfiction #88)

(This is the thirteenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Pilgrim at Tinker Creek.)

Richard Feynman, exuberant Nobel laureate and formidable quantum mechanics man, may have been energetic in his lectures and innovatively performative in the classroom, but I’m not sure he was quite the great teacher that many have pegged him to be. James Gleick’s biography Genius informs us that students dropped out of his high-octane, info-rich undergraduate physics classes at a remarkable rate, replaced by Caltech faculty members and grad students who flocked to the Queens-born superstar, soon making up the bulk of his audience much as baryons make up the visible matter of the universe. The extent to which Feynman was aware of this cosmic shift has been disputed by his chroniclers, but it is important to be aware of this shortcoming, especially if you’re bold enough to dive into the famed three-volume Feynman Lectures on Physics, which are all thankfully available online. Six Easy Pieces represents an abridged version of Feynman’s full pedagogical oeuvre. And even though the many YouTube videos of Feynman reveal an undeniably magnetic and indefatigably passionate man of science who must have been an incredible dynamo to experience in person, one wonders whether barraging a hot room of young nervous twentysomethings with hastily delivered information is the right way to popularize science, much less inspire a formidable army of physicists.

Watch even a few minutes of Feynman firing on all his robust cylinders and it becomes glaringly apparent how difficult it is to contend with Feynman’s teaching legacy in book form. One wonders why the Modern Library nonfiction judges, who were keen to unknowingly bombard this devoted reader with such massive multivolume works as The Golden Bough, Dumas Malone’s Jefferson and His Time, and Principia Mathematica, didn’t give this spot to the full three-volume Lectures. Did they view Feynman’s complete lesson plan as failed?

Judging from the sextet that I sampled in this deceptively slim volume, I would say that, while Feynman was undeniably brilliant, he was, like many geniuses, someone who often got lost within his own metaphors. While his analogy of two corks floating in a pool of water, with one cork jiggling in place to create motion in the pool that causes indirect motion for the other cork, is a tremendously useful method of conveying the “unseen” waves of the electromagnetic field (one that galvanized me to do the same in a saucepan after I had finished two bottles of wine over a week and a half), he is nowhere near as on-the-nose with his other analogies. The weakest lesson in the book, “Conservation of Energy,” trots out what seems to be a reliably populist metaphor with a child named “Dennis the Menace” playing with 28 blocks, somehow always ending up with 28 of them at the end of the day. Because Feynman wants to illustrate conservational constants, he shoehorns another element into the narrative whereby Dennis’s mother is, for no apparent reason, not allowed to open up the toy box revealing the number of blocks and thus must calculate how many blocks reside within. The mother has conveniently weighed the box at some unspecified time in advance back when it contained all 28 blocks.

This is bad teaching, in large part because it is bad storytelling that makes no sense. I became less interested in conservation of energy, with Feynman’s convoluted parallel clearly becoming more trouble than it was worth, and more interested in knowing why the mother was so fixated on remembering the number of blocks. Was she truly so starved for activity in her life that she spent all day at work avoiding all the juicy water cooler gossip about co-workers, much less kvetching about the boss, so that she might scheme a plan to at long last show her son that she would always know the weight of a single block? When Dennis showed resistance to opening the toy box, why didn’t the mother stand her ground and tell him to buzz off and stream an episode of Project Mc²?
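Stripped of Dennis and his mother, the bookkeeping Feynman is after is easy to state: the blocks you can count directly plus the blocks you can only infer (here, from the box’s weight) always total the same constant. A quick Python sketch, with weights invented purely for illustration (they are not Feynman’s figures):

```python
# Feynman's conservation bookkeeping, minus the convoluted narrative:
# visible blocks + blocks inferred from the sealed toy box = constant.
# The specific weights below are invented for illustration.

BOX_EMPTY_WEIGHT = 16.0   # ounces, weighed in advance
BLOCK_WEIGHT = 3.0        # ounces per block
TOTAL_BLOCKS = 28

def blocks_in_box(box_weight):
    """Infer the number of hidden blocks from the box's measured weight."""
    return round((box_weight - BOX_EMPTY_WEIGHT) / BLOCK_WEIGHT)

# Three moments in the day: (blocks visible on the floor, measured box weight)
observations = [(28, 16.0), (20, 40.0), (5, 85.0)]
for visible, box_weight in observations:
    hidden = blocks_in_box(box_weight)
    total = visible + hidden
    print(f"visible={visible} hidden={hidden} total={total}")
    assert total == TOTAL_BLOCKS  # the conserved quantity never changes
```

The "energy" here is the invariant total; the indirect weighing stands in for the fact that some forms of energy can only ever be measured indirectly.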

Yet for all these defects in method, there is an indisputable poetic beauty in the way in which Feynman reminds us that we live in a vast world composed of limitless particles, a world in which we still aren’t aware of all the rules and in which even the particles contained within solids, though “fixed” in place, remain in motion. Our universe is always moving, even when we can’t see it or completely comprehend it. Feynman is quick to observe throughout his lessons that “The test of all knowledge is experiment,” which again points to my theory that Feynman’s teachings, often accentuated by experiment, were probably better experienced than read. Nevertheless, even in book form, it is truly awe-inspiring to understand that we still cannot accurately predict the precise mass, form, and force of all the cascading droplets from a mighty river once it hits the precipice of a waterfall. Such mysteries capture our imagination and, when Feynman is committed to encouraging our inventiveness through open and clear-eyed examples from our world, he is very much on point. Thanks in part to Feynman reminding me just how little we silly humans now know, I began to feel my heart open more for Tycho Brahe, that poor Dane who spent many years of his life refining Copernicus’s details and compiling painstakingly precise planetary measurements. Brahe worked out his observations entirely without a telescope, which allowed Johannes Kepler to sift through his invaluable measurements, determine the elliptical patterns of planetary orbits, and forge the laws that all contemporary astronomers now rely on to determine where a planet might be in the sky on any given night of the year. Heisenberg’s uncertainty principle hasn’t even been around a century and it’s nothing less than astounding to consider how our great grandparents had a completely different understanding of atoms and motion in their early lifetime than we do today.

Feynman did have me wanting to know more about the origins of many scientific discoveries, causing me to contemplate how each and every dawning realization altered human existence (an inevitable buildup for Thomas Kuhn and paradigms, which I will take up in ML Nonfiction #69). But unlike such contemporary scientists as Neil deGrasse Tyson, Alan Guth, or Brian Greene, Feynman did not especially inspire me to plunge broadly into my own experiments or make any further attempts to grapple with physics-based complexities. This may very well be more my failing than Feynman’s, but there shall be many more stabs at science as we carry on with this massive reading endeavor!

Next Up: G.H. Hardy’s A Mathematician’s Apology!


Pilgrim at Tinker Creek (Modern Library Nonfiction #89)

(This is the twelfth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Golden Bough.)

“Either this world, my mother, is a monster, or I myself am a freak.” — Annie Dillard, Pilgrim at Tinker Creek

I was a sliver-thin, stupefyingly shy, and very excitable boy who disguised his bruises under the long sleeves of his shirt not long before the age of five. I was also a freak.

I had two maps pinned to the wall of my drafty bedroom, which had been hastily constructed into the east edge of the garage in a house painted pink (now turquoise, according to Google Maps). The first map was of Tolkien’s Middle-earth, in which I followed the quests of Bilbo and Frodo by finger as I wrapped my precocious, word-happy head around sentences that I’d secretly study from the trilogy I had purloined from the living room, a well-thumbed set that I was careful to put back on the shelves before my volatile and often sour father returned home from the chemical plant. In some of his rare calm moments, my father read aloud from The Lord of the Rings if he wasn’t too drunk, irascible, or violent. His voice led me to imagine Shelob’s thick spidery bristles and Smeagol’s slithering corpus, and to crack open my eyes the next morning for any other surprises I might divine in my daily journeys to school. The second map was of Santa Clara County, a very real region that everyone now knows as Silicon Valley but that used to be a sweeping swath of working and lower middle-class domiciles. This was one of several dozen free maps of Northern California that I had procured from AAA with my mother’s help. One of the nice perks of being an AAA member was the ample sample of rectangular geographical foldouts. I swiftly memorized all of the streets, held spellbound by the floral and butterfly patterns of freeway intersections seen from a majestic bird’s eye view in an errant illustrated sky. My mother became easily lost while driving and I knew the avenues and the freeways in more than a dozen counties so well that I could always provide an easy cure for her confusion.
It is a wonder that I never ended up working as a cab driver, although my spatial acumen has remained so keen over the years that, to this day, I can still pinpoint the precise angle in which you need to slide a thick unruly couch into the tricky recesses of a small Euclidean-angled apartment even when I am completely exhausted.

These two maps seemed to be the apotheosis of cartographic art at the time, filling me with joy and wonder and possibility. They helped me cope with the many problems I lived with at home. I understood that there were other regions beyond my bedroom where I could wander in peace, where I could meet kinder people or take in the beatific comforts of a soothing lake (Vasona Lake, just west of Highway 17 in Los Gatos, had a little railroad spiraling around its southern tip and was my real-life counterpart to Lake Evendim), where the draw of Rivendell’s elvish population or the thrill of smoky Smaug stewing inside the Lonely Mountain collided against visions of imagined mountain dwellers I might meet somewhere within the greens and browns of Santa Teresa Hills and the majestic observatories staring brazenly into the cosmos at the end of uphill winding roads. I would soon start exploring the world I had espied from my improvised bedroom study on my bike, pedaling unfathomable miles into vicinities I had only dreamed about, always seeking parallels to what the Oxford professor had whipped up. I once ventured as far south as Gilroy down the Monterey Highway, which Google Maps now informs me is a thirty-six mile round trip, because my neglectful parents never kept tabs on how long I was out of the house or where I was going. They didn’t seem to care. As shameful as this was, I’m glad they didn’t. I needed an uncanny dominion, a territory to flesh out, in order to stay happy, humble, and alive.

The maps opened up my always hungry eyes to books, which contained equally bountiful spaces devoted to the real and the imaginary, unspooling further marks and points for me to find in the palpable world and, most importantly, within my heart. I always held onto this strange reverence for place to beat back the sadness after serving as my father’s punching bag. To this day, I remain an outlier, a nomad, a lifelong exile, a wanderer even as I sit still, a renegade hated for what people think I am, a black sheep who will never belong no matter how kind I am. I won’t make the mistake of painting myself as some virtuous paragon, but I’ve become so accustomed to being condemned on illusory grounds, to having all-too-common cruelties inflicted upon me (such as the starry-eyed bourgie Burning Man sybarite I recently opened my heart to, who proceeded to deride the city that I love, along with the perceived deficiencies of my hard-won apartment, this after I had told her tales, not easily summoned, about what it was like to be rootless and without family and how home and togetherness remain sensitive subjects for me) that the limitless marvels of the universe parked in my back pocket or swiftly summoned from my shelves or my constant peregrinations remain reliable, life-affirming balms that help heal the scars and render the wounds invisible. Heartbreak and its accompanying gang of thugs often feel like a mob bashing in your ventricles in a devastatingly distinct way, even though the great cosmic joke is that everyone experiences it and we have to love anyway.

So when Annie Dillard’s poetic masterpiece Pilgrim at Tinker Creek entered my reading life, its ebullient commitment to finding grace and gratitude in a monstrous world reminded me that seeing and perceiving and delving and gaping awestruck at Mother Earth’s endless glories is one of the best survival skills you can cultivate, one that I may have accidentally stumbled upon. As I said, I’m a freak. But Dillard is one too. And there’s a good chance you may walk away from this book, which I highly urge you to read, feeling a kinship with Dillard comparable to mine. Even if you already have a formidable arsenal of boundless curiosity ready to be summoned at a moment’s notice, this shining 1974 volume remains vital and indispensable and will stir your soul for the better, whether you’re happy or sad. Near the end of a disastrous year, we need these inspirational moments now more than ever.

* * *

“Our life is a faint tracing on the surface of mystery.” – Pilgrim at Tinker Creek

Annie Dillard was only 28 when she wrote this stunning 20th century answer to Thoreau (the subject of her master’s thesis), which is both a perspicacious journal of journeying through the immediately accessible wild near her bucolic Southwestern Virginia perch and a daringly honest entreaty for consciousness and connection. Dillard’s worldview is so winningly inclusive that she can find wonder in such savage tableaux as a headless praying mantis clutching onto its mate or the larval creatures contained within a rock barnacle. The Washington Post claimed not long after Pilgrim‘s publication that the book was “selling so well on the West Coast and hipsters figure Annie Dillard’s some kind of female Castaneda, sitting up on Dead Man’s Mountain smoking mandrake roots and looking for Holes in the Horizon her guru said were there.” But Pilgrim, inspired in part by Colette’s Break of Day, is far from New Age nonsense. The book’s wise and erudite celebration of nature and spirituality was open and inspiring enough to charm even this urban-based secular humanist, who desperately needed a pick-me-up and a mandate to rejoin the world after a rapid-fire series of personal and political and romantic and artistic setbacks that occurred during the last two weeks.

For all of the book’s concerns with divinity, or what Dillard identifies as “a divine power that exists in a particular place, or that travels about over the face of the earth as a man might wander,” explicit gods don’t enter this meditation until a little under halfway through the book, where she jokes that gods are often found on mountaintops and observes that God is an igniter as well as a destroyer, one that seeks invisibility for cover. And as someone who does not believe in a god and who would rather deposit his faith in imaginative storytelling and myth than in the superstitions of religious ritual, I could nevertheless feel and accept the spiritual idea of being emotionally vulnerable while traversing into some majestic terrain. Or as Pascal wrote in Pensées 584 (quoted in part by Dillard), “God being thus hidden, every religion which does not affirm that God is hidden, is not true, and every religion which does not give the reason of it, is not instructive.”

Much of this awe comes through the humility of perceiving, of devoting yourself to sussing out every conceivable kernel that might present itself and uplift you on any given day and using this as the basis to push beyond the blinkered cage of your own self-consciousness. Dillard uses a metaphor of loose change throughout Pilgrim that neatly encapsulates this sentiment:

It is dire poverty indeed when a man is so malnourished and fatigued that he won’t stoop to pick up a penny. But if you cultivate a healthy poverty and simplicity, so that finding a penny will literally make your day, then, since the world is in fact planted in pennies, you have with your poverty bought a lifetime of days. It is that simple. What you see is what you get.

This is not too far removed from Thoreau’s faith in seeds: “Convince me that you have a seed there, and I am prepared to expect wonders.” The smug and insufferable Kathryn Schulzes of our world gleefully misread this great tradition of discovering possibilities in the small as arrogance, little realizing how their own blind and unimaginative hubris glows with crass Conde Nast entitlement as they fail to observe that Thoreau and Dillard were also acknowledging the ineluctable force of a bigger and fiercer world that will carry on with formidable complexity long after our dead bodies push up daisies. Faced with the choice of sustaining a sour Schulz-like apostasy or receiving every living day as a gift, I’d rather risk the arrogance of dreaming from the collected riches of what I have and what I can give rather than the gutless timidity of a prescriptive rigidity that fails to consider that we are all steeped in foolish and inconsistent behavior which, in the grand scheme of things, is ultimately insignificant.

Dillard is guided just as much by Heisenberg’s uncertainty principle as she is by religious and philosophical texts. The famous 1927 principle, which articulates how you can never simultaneously know both a particle’s position and its momentum, is very much comparable to chasing down some hidden deity or contending with some experiential palpitations when you understand that there simply is no answer, for one can feel but never fully comprehend the totality in a skirmish with Nature. It accounts for Dillard frequently noting that the towhee chirping on a treetop or the muskrat she observes chewing grass on a bank for forty minutes never see her. In seeing these amazing creatures, completely oblivious to her own human vagaries, carry on with their lives, Dillard reminds us that this is very much the state of Nature, whether human or animal. If it is indeed arrogance to find awe and humility in this state of affairs, as Dillard and Thoreau clearly both did, then one’s every breath may as well be a Napoleonic puff of the chest.
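For the record, the standard statement of Heisenberg’s 1927 result can be written compactly (this formulation is the textbook one, not Dillard’s): the product of the uncertainties in a particle’s position and its momentum can never fall below a fixed quantum of fuzziness.

```latex
% Heisenberg's uncertainty principle: the uncertainties in position (x)
% and momentum (p) are jointly bounded below by the reduced Planck constant.
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]
```

Squeeze one uncertainty toward zero and the other must balloon, which is precisely the hide-and-seek quality Dillard finds in her muskrats and her hidden God.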

Dillard is also smart and expansive enough to show us that, no matter where we reside, we are fated to brush up against the feral. She points to how arboreal enthusiasts discovered a fifteen-foot ailanthus tree growing from a garage in the lower Bronx and how New York must spend countless dollars each year to rid its underground water pipes of roots. Such realities are often contended with out of sight and out of mind, even as the New York apartment dweller battles cockroaches, but the reminder is another useful point for why we must always find the pennies and dare to dream and wander and take in, no matter what part of the nation we dwell in.

Another refreshing aspect of Pilgrim is the way in which Dillard confronts her own horror of fecundity. Yes, even this graceful ruminator has the decency to confess her hangups about the unsettling rapidity with which moths lay their eggs in vast droves. She stops short of truly confronting “the pressure of birth and growth” that appalls her, shifting to plants as a way of evading animals and then retreating back to the blood-pumping phylum to take in blood flukes and aphid reproduction as panorama rather than as something to be felt. This volte-face isn’t entirely satisfying. On the other hand, Dillard is also bold enough to scoop up a cup of duck-pond water and peer at monostyla under a microscope. What this tells us is that there are clear limits to how far any of us are willing to delve, yet I cannot find it within me to chide Dillard too harshly for a journey she was not quite willing to take, for this is an honest and heartfelt chronicle.

While I’ve probably been “arrogant” in retreating at length to my past in an effort to articulate how Dillard’s book so moved me, I would say that Pilgrim at Tinker Creek represents a third map for my adult years. It is a true work of art that I am happy to pin to the walls of my mind, which seems more reliable than any childhood bedroom. This book has caused me to wonder why I have ignored so much and has demanded that I open myself up to any penny I could potentially cherish and ponder what undiscoverable terrain I might deign to take in as I continue to walk this earth. I do not believe in a god, but I do feel with all my heart that one compelling reason to live is to fearlessly approach all that remains hidden. There is no way that you’ll ever know or find everything, but Dillard’s magnificent volume certainly gives you many good reasons to try.

Next Up: Richard Feynman’s Six Easy Pieces!

The Golden Bough exhibited 1834 Joseph Mallord William Turner 1775-1851 Presented by Robert Vernon 1847 http://www.tate.org.uk/art/work/N00371

The Golden Bough (Modern Library Nonfiction #90)

(This is the eleventh entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Shadow and Act.)

It is somewhat difficult to obtain a decent print edition of the Third Edition of The Golden Bough without getting fleeced by some squirrely operator working out of a shady storage unit in the middle of nowhere. For nobody seems to read the whole enchilada anymore. This is hardly surprising in an age, abundantly cemented last week, when most people are more inclined to celebrate regressive stupidity, melting their minds in any self-immolating pastime rather than opening a book. But I was able to find an affordable edition with the help of a British antiquarian. I had no idea what I was in for, but some initial research suggested very strongly that I should not settle for the abridged edition that is much easier to acquire. Certainly the sheer time-sucking insanity of the Modern Library Reading Challenge, one of the many dependable bastions I have left in a bleak epoch, demands that I go the distance on all entries, even if it means becoming ensnared by a particular title for several weeks, often answering texts from pals checking in on me with fun little snippets from Estonian folklore quebraditaing somewhere within a “Fine. How are you?” Such is the life of a book-loving eccentric with a ridiculous self-imposed mandate that involves refusing to let terrible setbacks get in the way of happy rumination. We find hope and courage and new ideas and fierce fortitude in remembering that not a single authoritarian entity or pernicious individual can ever crush the possibilities contained within our minds, our hearts, and our souls.

The thirteen-volume set landed at my feet with a promising thud after a month-long voyage by boat across the Atlantic Ocean. It occupied my reading time for many months and proceeded to change my life. I realize that such a claim may sound trite in light of the devastating and life-altering results of the 2016 U.S. presidential election, but, if there’s anything we can learn from Stefan Zweig’s suicide, it is that we must never forget the importance of patience, of working long and hard to fight and endure while steering humanity’s promising galleon back on the right course even as we look to culture’s power to sustain our spirits in the darkest times.1

James George Frazer’s The Golden Bough proved so galvanizing that I began to marvel more at trees and desired to spend more time beneath their magnificent branches. I began picking up the junk that other New Yorkers had so thoughtlessly deposited under their glorious leafy crowns. I began naming some of the trees I liked, saying “Hello, Balder!” (styled after the Norse god) to a beloved maple near the southwestern edge of Central Park. I started paying closer attention to the modest superstitious rituals that most of us take for granted, wanting to know why we feared black cats crossing our path (it started in the 1560s and originated with the idea that black cats were witches who had transformed their corporeal state) or worried ourselves into years of bad luck from walking under a ladder (it goes back to the Egyptians, who believed that walking under any pyramid would attenuate its mystical power). And, of course, I began to wonder if other superstitious rituals, such as voting for a vicious sociopathic demagogue to make a nation “great” again, originated from similar irrational fears. Despite being a secular humanist, I was stunned to discover that I had modest pagan proclivities and started to ask friends to engage in rather goofball offshoots of established rites in a somewhat libertine manner, much of which is unreportable. And if you think such a reaction is idiosyncratic (and it is), consider the strange takeaway that D.H. Lawrence memorialized in a December 8, 1915 letter:

I have been reading Frazer’s Golden Bough and Totemism and Exogamy. Now I am convinced of what I believed when I was about twenty — that there is another seat of consciousness than the brain and the nerve system: there is a blood-consciousness which exists in us independently of the ordinary mental consciousness, which depends on the eye as its source or connector. There is the blood-consciousness, with the sexual connection, holding the same relation as the eye, in seeing, holds to the mental consciousness. One lives, knows, and has one’s being in blood, without any references to nerves and brain. This is one half of life, belonging to the darkness. And the tragedy of this our life, and of your life, is that the mental and nerve consciousness exerts a tyranny over the blood-consciousness, and that your will has gone completely over to the mental consciousness, and is engaged with the destruction of your blood-being or blood-consciousness, the final liberating of the one, which is only death in result.

When I finished Frazer’s final volume, I certainly wasn’t prepared to suggest that any part of my consciousness was tyrannizing the others because of some eternal human connection to myths and rites enacted to answer and make sense of the presently inexplicable. But Lawrence did have a point about the way humans are naturally drawn to unusual ceremonies and celebrations that go well beyond Carolina Panthers coach Ron Rivera wearing the same shoes on game days, with the impulse often defying any steely rationalism we may use to make sense of our inherently animalistic nature, which any glance at a newspaper reveals to be quite frighteningly present.


More importantly, The Golden Bough changed everything I thought I knew about storytelling and myth. It forced me to see commonalities within many cultures. To cite one of Frazer’s countless comparative examples, consider the way that humans have approached the bear hunt. After the Kamtchatkans had killed a bear and eaten its flesh, the host of the subsequent dinner party would bring the bear’s head before the assembled guests, wrap it in grass, and then conduct a panel of sorts where the host, serving as a moderator only slightly less ballistic than Bill O’Reilly, would ask the bear if he had been well-treated. Much like many wingnut “journalists” irresponsibly publishing claims in Slate today without robust evidence (and failing to tender corrections when pwned), the Kamtchatkan host would blame the Russians. The American Indians likewise implored the dead bear not to be angry for being hunted and would hang the bear’s head on a post, painting it red and blue rather than donning it with vegetation. They also addressed it, much in the manner that dog owners chat with their uncomprehending pets when nobody’s around. The Assiniboins also held feasts after a hunt and also mounted the bear’s head, shrouding it in strips of scarlet cloth, and respected the bear so much that they offered the head a pipe to smoke, not unlike the poor dog who sits outside Mets games with a pipe in his mouth. And looking beyond Frazer, one finds in Alanson Skinner’s Notes on the Eastern Cree and Northern Saulteaux a similar bear’s head ceremony that involved sharing a pipe before the participants took a bite from the bear’s flesh and, with the old Finnish custom of karhunpeijaiset, a bear’s head mounted upon a young tree, venerated and addressed as a relative or the son of a god. And according to the Russian ethnographer Vladimir Arsenyev (I found this bit by sifting through James H. Grayson’s Myths and Legends from Korea), the Udegey people of the Russian Far East also had a bear ceremony and believed, “To throw the head away is a great sin….The cooked bear’s head is wrapped in its own skin with its fur outwards and tied up with a thin rope,” with a communal ceremony quite similar to the Finns’.

I could go on (and indeed Frazer often rambles for pages), but there’s an undeniable awe in learning that something so specific about bears (head mounted, party organized, head covered, bear respected), much less anything else, arose independently in so many different parts of the world. It proves very conclusively, and perhaps this is especially essential for us to understand as we reconcile a vast and seemingly incurable national division, that humans share more in common with each other than we’re willing to confess and that the seemingly unique rituals that we believe define “us” are quite likely shared many times over in other parts of the nation, much less the world.

The reason it took me so long to read The Golden Bough was not because of its many thousand pages (aside from some sloggish parts in the Spirits of the Corn and of the Wild volumes, the books are surprisingly readable2), but because my imagination would become so captivated by some tale of trees esteemed above human life or a crazed orgiastic release (see Saturnalia) that I would lose many hours in the library seeing how much of this was still practiced. It has been more than a century since Frazer published the Third Edition, but his remarkable observations about shared rituals still invite us to dream and believe and to perceive that, Frazer’s regrettable references to “savages” and “primitives” notwithstanding, we are not so different from each other.

Frazer’s explanation for these common qualities — epitomized by the famous J.M.W. Turner painting (pictured above) sharing the same title as Frazer’s capacious opus — rests in the sylvan lake of Nemi and an ancient tale in which a priest-king defended a singular tree. The priest-king, who was an incarnation of a god wedded to the world, could only be overpowered in a fight to the death and, if he was slain, he would be replaced by his victor, with the cycle perpetuating ad infinitum. Frazer believed that nearly every story in human history could be traced back to this unifying myth, with most of the tales triggered by our imagination arising out of what he called “sympathetic magic,” whereby humans often imitate what they cannot understand. So if this meant building effigies or participating in elaborate and often unusual rituals to explain why the sun scorched the crops to an unsustainable crisp in the last harvest or to help more animals multiply for grand feasts next season, magical thinking provided both the bond and the panacea well before Robert B. Thomas came up with the Old Farmer’s Almanac.

There are two components to sympathetic magic. The first is Contagion, which involves a transfer of “essence” by physical contact (among other things, this would account for why humans have been especially careful with bears’ heads, as described above). The second is Similarity, whereby, in Frazer’s words, “the magician infers that he can produce any effect he desires merely by imitating it”; from the principle of Contagion, “he infers that whatever he does to a material object will affect equally the person with whom the object was once in contact, whether it formed part of his body or not.”

One of The Golden Bough‘s most fascinating volumes, The Scapegoat, reveals how a human belief in “essence” may be the root of our most irrational fears. Contagion often led to humans trying to transfer their diseases and miseries to other people, if not reinforcing their own biases about people or groups that they disliked. I am indebted to the terrific podcast Imaginary Worlds for steering me to the work of Carol Nemeroff, whose psychological considerations of Frazer’s findings are especially constructive in understanding disgust. Nemeroff and her colleagues conducted a series of studies in which they placed a sterilized dead roach in a glass of juice and asked subjects to eat fudge that resembled dog feces. The natural reactions (recoiling at the roach and the shit-shaped fudge) showed that sympathetic magic is still very much a mainstay of our culture.

Indeed, sympathetic magic drives most of our cherished rituals today. In one of his most controversial (but nevertheless true) observations, Frazer observes in Adonis Attis Osiris that, although the Gospels never cited a hard date for Jesus Christ’s birthday bash, Christians have adhered to their churchgoing rituals with the same practiced regularity one sees in fundamentalist homophobes holding up cardboard signs that misquote the Bible to reinforce their hate. The original celebration date of Christ’s alleged birth was January 6th. But because heathens celebrated the birthday of the Sun on December 25th, and this was often a draw for the Christians because the heathens were more fun, the Church agreed to set December 25th as the official day. If Christmas did not exist, it would be necessary for humankind to invent it. For such useful observations, The Golden Bough is incredibly handy to have in one’s library, if only to remind us that most of our beliefs, the recurring rituals we are expected to adhere to, are predicated upon some ancient explanation that we failed to shake from the Magic 8-Ball of our daily existence. So Colin Kaepernick really doesn’t need to stand for the national anthem. While this conformist rite is admittedly improved from the Nazi-like Bellamy salute, standing for The Star-Spangled Banner is little more than a quaint superstition that one is pressured to participate in to “belong.”

Frazer’s scholarship, while impressive, is sometimes inexact in the effort to find a Theory of Everything. Midway through putting together the Third Edition, Frazer was challenged by Dr. Edward Westermarck, who pointed out that fire festivals did not originate from fire reinforcing the sun’s light and heat, but rather from a need to celebrate purification. Frazer did correct his views in Balder the Beautiful, but it does leave one contemplating whether sympathetic magic served as Frazer’s knee-jerk go-to in his noble quest to reconcile several folkloric strands.

Still, one cannot disavow the conclusion that much of our behavior is similarly ceremonial across cultures, which would indeed suggest a common source. Frazer managed one last volume, the Aftermath, in 1937, just four years before his death. While this volume is little more than a collection of B-sides, it does leave one wondering what Frazer would have made of Nuremberg rallies or even our current default mode of walking like zombies in the streets, heads down, eyes bulging at the prospect of another chapter in a Snapchat story. The gods and the sympathetic magic may be a tad more secular these days, but we still remain seduced. Myths and stories and rituals are as old as the Chauvet Cave paintings. One cannot imagine being human without them.

Next Up: Annie Dillard’s Pilgrim at Tinker Creek!


Shadow and Act (Modern Library Nonfiction #91)

(This is the tenth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Power Broker.)

When I first made my bold belly flop into the crisp waters of Ralph Ellison’s deep pool earlier this year, I felt instantly dismayed that it would be a good decade before I could perform thoughtful freestyle in response to his masterpiece Invisible Man (ML Fiction #19). As far as I’m concerned, that novel’s vivid imagery, beginning with its fierce and intensely revealing Battle Royal scene and culminating in its harrowing entrapment of the unnamed narrator, stands toe-to-toe with Saul Bellow’s The Adventures of Augie March as one of the most compelling panoramas of mid-20th century American life ever put to print, albeit one presented through a more hyperreal lens.

But many of today’s leading writers, ranging from Ta-Nehisi Coates to Jacqueline Woodson, have looked more to James Baldwin as their truth-telling cicerone, that fearless sage whose indisputably hypnotic energy was abundant enough to help any contemporary humanist grapple with the nightmarish realities that America continues to sweep under its bright plush neoliberal rug. At a cursory glance, largely because Ellison’s emphasis was more on culture than overt politics, it’s easy to see Ellison as a complacent “Maybe I’m Amazed” to Baldwin’s gritty “Cold Turkey,” especially when one considers the risk-averse conservatism that led to Ellison being attacked as an Uncle Tom during a 1968 panel at Grinnell College, along with his selfish refusal to help emerging African-American authors after his success. But according to biographer Arnold Rampersad, Baldwin believed Ralph Ellison to be the angriest person he knew. And if one dives into Ellison’s actual words, Shadow and Act is an essential volume, one that includes perhaps the most thrilling Molotov cocktail ever pitched into the face of a clueless literary critic, and it is often just as potent and as lapel-grabbing as Baldwin’s The Fire Next Time.

For it would seem that while Negroes have been undergoing a process of “Americanization” from a time preceding the birth of this nation — including the fusing of their blood lines with other non-African strains — there has been a stubborn confusion as to their American identity. Somehow it was assumed that the Negroes, of all the diverse American peoples, would remain unaffected by the climate, the weather, the political circumstances — from which not even slaves were exempt — the social structures, the national manners, the modes of production and the tides of the market, the national ideals, the conflicts of values, the rising and falling of national morale, or the complex give and take of acculturalization which was undergone by all others who found their existence within the American democracy.

This passage, taken from an Ellison essay on Amiri Baraka’s Blues People, is not altogether different from Baldwin’s view of America as “a paranoid color wheel” in The Evidence of Things Not Seen, where Baldwin posited that a retreat into the bigoted mystique of Southern pride represented the ultimate denial of “Americanization” and thus African-American identity. Yet the common experiences that cut across racial lines, recently investigated with comic perspicacity on a “Black Jeopardy” Saturday Night Live sketch, may very well be a humanizing force to counter the despicable hate and madness that inspires uneducated white males to desecrate a Mississippi black church or a vicious demagogue to call one of his supporters “a thug” for having the temerity to ask him to be more respectful and inclusive.

Ellison, however, was too smart and too wide a reader to confine these sins of dehumanization to their obvious targets. Like Baldwin and Coates and Richard Wright, Ellison looked to France for answers and, while never actually residing there, he certainly counted André Malraux and Paul Valéry among his chief influences. In writing about Richard Wright’s Black Boy, Ellison wisely singled out critics who failed to consider the full extent of African-American humanity even as they simultaneously demanded an on-the-nose and unambiguous “explanation” of who Wright was. (And it’s worth noting that Ellison himself, who was given his first professional writing gig by Wright, was just as critical of Wright’s ideological propositions as Baldwin.) Ellison described how “the prevailing mood of American criticism has so thoroughly excluded the Negro that it fails to recognize some of the most basic tenets of Western democratic thought when encountering them in a black skin” and deservedly excoriated whites for seeing Paul Robeson and Marian Anderson merely as the ne plus ultra of African-American artistic innovation rather than the beginning of a great movement.

At issue, in Ellison’s time and today, is the degree to which any individual voice is allowed to express itself. And Ellison rightly resented any force that would stifle this, whether it be the lingering dregs of Southern slavery telling the African-American how he must act or who he must be in telling his story, or the myopic critics who would gainsay any voice by way of their boxlike assumptions about other Americans. One sees this unthinking lurch towards authoritarianism today with such white supremacists as Jonathan Franzen, Lionel Shriver, and the many Brooklyn novelists who, despite setting their works in gentrified neighborhoods still prominently populated by African-Americans, fail to include, much less humanize, the black people who still live there.

“White supremacist” may seem like a harshly provocative label for any bumbling white writer who lacks the democratic bonhomie to leave the house and talk with other people and consider that those who do not share his skin color may indeed share more common experience than presumed. But if these writers are going to boast about how their narratives allegedly tell the truth about America while refusing to accept challenge for their gaping holes and denying the complexity of vital people who make up this great nation, then it seems apposite to bring a loaded gun to a knife fight. If we accept Ellison’s view of race as “an irrational sea in which Americans flounder like convoyed ships in a gale,” then it is clear that these egotistical, self-appointed seers are buckling on damaged vessels hewing to shambling sea routes mapped out by blustering navigators basking in white privilege, hitting murky ports festooned with ancient barnacles that they adamantly refuse to remove.

Franzen, despite growing up in a city in which half the population is African-American, recently told Slate‘s Isaac Chotiner that he could not countenance writing about other races because he has not loved them or gone out of his way to know them and thus excludes non-white characters from his massive and increasingly mediocre novels. Shriver wrote a novel, The Mandibles, in which the only black characters are (1) Luella, bound to a chair and walked with a leash, and (2) Selma, who speaks in a racist Mammy patois (“I love the pitcher of all them rich folk having to cough they big piles of gold”). She then had the effrontery to deliver a keynote speech at the Brisbane Writers Festival arguing for the right to “try on other people’s hats,” failing to understand that creating dimensional characters involves a great deal more than playing dress-up at the country club. She quoted from a Margot Kaminski review of Chris Cleave’s Little Bee that offered the perfectly reasonable consideration, one that doesn’t deny an author’s right to cross boundaries, that an author may wish to take “special care…with a story that’s not implicitly yours to tell.” Such forethought clearly means constructing an identity that is more human rather than crassly archetypal, an eminently pragmatic consideration on how any work of contemporary art should probably reflect the many identities that make up our world. But for Shriver, a character should be manipulated at an author’s whim, even if her creative vagaries represent an impoverishment of imagination. For Shriver, inserting another nonwhite, non-heteronormative character into The Mandibles represented “issues that might distract from my central subject matter of apocalyptic economics.” Which brings us back to Ellison’s question of “Americanization” and how “the diverse American peoples” are indeed regularly affected by the decisions of those who uphold the status quo, whether overtly or covertly.

Writer Maxine Beneba Clarke bravely confronted Shriver with the full monty of this dismissive racism and Shriver responded, “When I come to your country. I expect. To be treated. With hospitality.” And with that vile and shrill answer, devoid of humanity and humility, Shriver exposed the upright incomprehension of her position, stepping from behind the arras as a kind of literary Jan Smuts for the 21st century.1

If this current state of affairs represents a bristling example of Giambattista Vico’s corsi e ricorsi, and I believe it does, then Ellison’s essay, “Twentieth-Century Fiction and the Black Mask of Humanity,” astutely demonstrates how this cultural amaurosis went down before, with 20th century authors willfully misreading Mark Twain, failing to see that Huck’s release of Jim represented a moment that not only involved recognizing Jim as a human being, but admitting “the evil implicit in his ’emancipation'” as well as Twain accepting “his personal responsibility in the condition of society.” With great creative power comes great creative responsibility. Ellison points to Ernest Hemingway scouring The Adventures of Huckleberry Finn merely for its technical accomplishments rather than this moral candor and how William Faulkner, despite being “the greatest artist the South has produced,” may not have been quite the all-encompassing oracle, given that The Unvanquished‘s Ringo is, despite his loyalty, devoid of humanity. In another essay on Stephen Crane, Ellison reaffirms that great art involves “the cost of moral perception, of achieving an informed sense of life, in a universe which is essentially hostile to man and in which skill and courage and loyalty are virtues which help in the struggle but by no means exempt us from the necessary plunge into the storm-sea-war of experience.” And in the essays on music that form the book’s second section (“Sound and the Mainstream”), Ellison cements this ethos with his personal experience growing up in the South. If literature might help us to confront the complexities of moral perception, then the lyrical, floating tones of a majestic singer or a distinctive cat shredding eloquently on an axe might aid us in expressing it.
And that quest for authentic expression is forever in conflict with audience assumptions, as seen with such powerful figures as Charlie Parker, whom Ellison describes as “a sacrificial figure whose struggles against personal chaos…served as entertainment for a ravenous, sensation-starved, culturally disoriented public which had but the slightest notion of its real significance.”

What makes Ellison’s demands for inclusive identity quite sophisticated is the vital component of admitting one’s own complicity, an act well beyond the superficial expression of easily forgotten shame or white guilt that none of the 20th or the 21st century writers identified here have had the guts to push past. And Ellison wasn’t just a writer who pointed fingers. He held himself just as accountable, as seen in a terrific 1985 essay called “An Extravagance of Laughter” (not included in Shadow and Act, but found in Going with the Territory), in which Ellison writes about how he went to the theatre to see Jack Kirkland’s adaptation of Erskine Caldwell’s Tobacco Road. (I wrote about Tobacco Road in 2011 as part of this series and praised the way that this still volatile novel pushes its audience to confront its own prejudices against the impoverished through remarkably flamboyant characters.) Upon seeing wanton animal passion among poor whites on the stage, Ellison burst into an uncontrollable paroxysm of laughter, which emerged as he was still negotiating the rituals of New York life shortly after arriving from the South. Ellison compared his reaction, which provoked outraged leers from the largely white audience, to an informal social ceremony he observed while he was a student at Tuskegee that involved a set of enormous whitewashed barrels labeled FOR COLORED placed in public space. If an African-American felt an overwhelming desire to laugh, he would thrust his head into the pit of the barrel and do so. Ellison observes that African-Americans “in light of their social status and past condition of servitude were regarded as having absolutely nothing in their daily experience which could possibly inspire rational laughter.” And the expression of this inherently human quality, despite being a cathartic part of reckoning with identity and one’s position in the world, was nevertheless positioned out of sight and thus out of mind.

When I took an improv class at UCB earlier this year, I had an instructor who offered rather austere prohibitions to any strain of humor considered “too dark” or “punching down,” which would effectively disqualify both Tobacco Road and the Tuskegee barrel ritual that Ellison describes.2 These restrictions greatly frustrated me and a few of my classmates, who didn’t necessarily see the exploration of edgy comic terrain as a default choice, but merely one part of asserting an identity inclusive of many perspectives. I challenged the notion of confining behavior to obvious choices and ended up getting a phone call from the registrar, a smart and genial man with whom I ended up having a friendly and thoughtful volley about comedy. I had apparently been ratted out by one student, who claimed that I was “disrupting” the class when I was merely inquiring about my own complicity in establishing base reality. In my efforts to further clarify my position, I sent a lengthy email to the instructor, one referencing “An Extravagance of Laughter,” and pointed out that delving into the uncomfortable was a vital part of reckoning with truth and ensuring that you grew your voice and evolved as an artist. I never received a reply. I can’t say that I blame him.

Ellison’s inquiry into the roots of how we find common ground with others suggests that we may be able to do so if we (a) acknowledge the completeness of other identities and (b) allow enough room for necessary catharsis and the acknowledgment of our feelings and our failings as we take baby steps towards better understanding each other.

The most blistering firebomb in the book is, of course, the infamous essay “The World and the Jug,” which demonstrates just what happens when you assume rather than take the time to know another person. It is a refreshingly uncoiled response that one could not imagine being published in this age of “No haters” reviewing policies and genial retreat from substantive subjects in today’s book review sections. Reacting to Irving Howe’s “Black Boys and Native Sons,” Ellison condemns Howe for not seeing “a human being but an abstract embodiment of living hell” and truly hammers home the need for all art to be considered on the basis of its human experience rather than the spectator’s constricting inferences. Howe’s great mistake was to view all African-American novels through the prism of a “protest novel,” and this effectively revealed his own biases against what black writers had to say and very much for certain prerigged ideas that Howe expected them to express. “Must I be condemned because my sense of Negro life was quite different?” writes Ellison in response to Howe roping him in with Richard Wright and James Baldwin. And Ellison pours on the vinegar by not only observing how Howe self-plagiarized passages from previous reviews, but how his intractable ideology led him to defend the “old-fashioned” violence contained in Wright’s The Long Dream, which, whatever its merits, clearly did not keep current with the changing dialogue at the time.

Shadow and Act, with its inclusion of interviews and speeches and riffs on music (along with a sketch of a struggling mother), may be confused with a personal scrapbook. But it is, first and foremost, one man’s effort to assert his identity and his philosophy in the most cathartic and inclusive way possible. We still have much to learn from Ellison more than fifty years after these essays first appeared. And while I will always be galvanized by James Baldwin (who awaits our study in a few years), Ralph Ellison offers plentiful flagstones to face the present and the future.

SUPPLEMENT: One of the great mysteries that has bedeviled Ralph Ellison fans for decades is the identity of the critic who attacked Invisible Man as a “literary race riot.” In a Paris Review interview included in Shadow and Act, Ellison had this to say about the critic:

But there is one widely syndicated critical bankrupt who made liberal noises during the thirties and has been frightened ever since. He attacked my book as a “literary race riot.”

With the generous help of Ellison’s biographer Arnold Rampersad (who gave me an idea of where the quote might be found in an email volley) and the good people at the New York Public Library, I have tracked down the “widely syndicated critical bankrupt” in question.

His name is Sterling North, best known for the 1963 children’s novel Rascal. He wrote widely popular (and rightly forgotten) children’s books while writing book reviews for various newspapers. North was such a vanilla-minded man that he called comics “a poisonous mushroom growth” and seemed to have it in for any work of art that dared to do something different — or that didn’t involve treacly narratives about raising baby raccoons.

And then, in the April 16, 1952 issue of the New York World-Telegram, he belittled Ellison’s masterpiece, writing these words:

This is one of the most tragic and disturbing books I have ever read. For the most part brilliantly written and deeply sincere, it is, at the same time, bitter, violent and unbalanced. Except for a few closing pages in which the author tries to express something like a sane outlook on race relations, it is composed largely of such scenes of interracial strife that it achieves the effect of one continuous literary race riot. Ralph Ellison is a Negro with almost as much writing talent as Richard Wright. Like his embittered hero (known only as “I” throughout the book), Mr. Ellison received scholarships to help him through college, one from the State of Oklahoma which made possible three years at the Tuskegee Institute, and one from the Rosenwald Foundation.

If Mr. Ellison is as scornful and bitter about this sort of assistance as he lets his “hero” be, those who made the money available must wonder if it was well spent.

North’s remarkably condescending words offer an alarming view of the cultural oppression that Ellison was fighting against and serve as further justification for Ellison’s views in Shadow and Act. Aside from his gross mischaracterization of Ellison’s novel, there are North’s troubling assumptions that Ellison should be grateful in the manner of an obsequious and servile stereotype, that he only deserves a scholarship if he writes a novel fitting North’s limited idea of what African-American identity should be, and that future white benefactors should think twice about granting opportunities to future uppity Ellisons.

It’s doubtful that The Sterling North Society will recognize this calumny, but this is despicable racism by any measure. A dive into North’s past also reveals So Dear to My Heart, a 1948 film adaptation of North’s Midnight and Jeremiah that reveled in Uncle Tom representations of African-Americans.

North’s full review of Invisible Man can be read below:

[scan of Sterling North’s review]

Next Up: James George Frazer’s The Golden Bough!


The American Political Tradition (Modern Library Nonfiction #93)

(This is the eighth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Contours of American History.)

Before he became famous for delineating “the paranoid style in American politics” and honing every principled bone against the feverish anti-intellectualism one now sees embodied in everything from long-standing philistine Dan Kois decrying “eating his cultural vegetables” to lunatic presidential candidate Ted Cruz declaring gluten-free meals a politically correct “social experiment,” historian Richard Hofstadter spent four years on a fiercely independent book that would go on to sell close to a million copies. The American Political Tradition was a Plutarchian overview of illustrious American figures ranging from the vivacious abolitionist Wendell Phillips to Woodrow Wilson as closeted conservative, and it was aimed at winning over a high-minded American public. Like William Appleman Williams, Hofstadter was very much following in Charles Beard’s footsteps, although this historian hoped to march to his own interpretive drum. Reacting to the toxic McCarthyism of the time, Hofstadter’s cautious defense of old school American liberalism, with the reluctant bulwark hoisted as he poked holes into the foibles of celebrated icons, saddled him with the label of “consensus historian.” With each subsequent volume (most notably The Age of Reform), Hofstadter drifted further away from anything close to a scorching critique of our Founders as hardliners enforcing their economic interests and toward a more vociferous denouncement of agrarian Populists and numbnuts standing in the way of erudite democratic promise. Yet even as he turned more conservative in later years, Hofstadter insisted that his “assertion of consensus history in 1948 had its sources in the Marxism of the 1930s.”

Such adamantine labels really aren’t fair to Hofstadter’s achievements in The American Political Tradition. The book is by no means perfect, but its Leatherman Wave-like dissection of American history unfolds with some sharp and handy blades. While Hofstadter is strangely reluctant to out Andrew Jackson as a demagogue (“He became a favorite of the people, and might easily come to believe that the people chose well.”) and far too forgiving of John C. Calhoun, a rigid bloviator with a harsh voice who was one of slavery’s biggest cheerleaders and whose absolutist stance against tariffs under the guise of moderation would later inspire the South to consider secession as a legitimate nuclear option1, Hofstadter at his best slices with a necessary critical force into many hallowed patriarchs. For it is the sum of their variegated and contradictory parts that has caused some to view the American trajectory in Manichean terms.

One of the book’s standout chapters is Hofstadter’s shrewd analysis of Lincoln as an exceptionally formidable man who dialed down his egalitarian ardor to near zero during his canny and very rapid political rise. In just four years, Lincoln advanced from an obscure attorney in Illinois to a prominent party leader in that same state’s House of Representatives. But Hofstadter cogently argues that Lincoln was far from the outspoken abolitionist who would later lay down some very strong words against those who would deny other people freedom. Lincoln not only kept his enemies closer than his friends, but he was exceptionally careful with his rhetoric, even though one eye-popping 1836 declaration proposed extending suffrage to women.2 Much as Franklin D. Roosevelt was very savvy about letting his political opponents make the first move before he acted, Lincoln used the Declaration of Independence’s very text as ammunition and inspiration for his justification for abolition, which came much later — Lincoln’s first public condemnation of slavery arrived when he was forty-five — than Lincoln’s many admirers are often willing to admit.

Hofstadter points out that Lincoln’s seeming contradiction between revolutionary politics and a pragmatic interpretation of the law was not especially peculiar, but part of a nuts-and-bolts perpetuation of an ongoing political tradition, one that can be seen in Lincoln’s hard maneuvering over the 1851 conditional loan he issued to his stepbrother John D. Johnston. Lincoln’s famous House Divided speech was masterful rhetoric urging national reconciliation of the slavery issue, but he didn’t exactly go out of his way to out himself as an abolitionist. Hofstadter points out that in 1858, the seemingly Honest Abe spoke in two entirely different manners about racial equality in Chicago and in Charleston (see the second paragraph of his first speech). Yet these observations not only illustrate Lincoln’s political genius, but invite parallels to Lyndon Johnson’s brilliant and equally contradictory engineering in passing the 1957 Civil Rights Act (perhaps best chronicled in a gripping 100-page section of Robert A. Caro’s excellent Master of the Senate). The American political tradition, which Hofstadter identifies as a continuity of capitalist democratic principles, is seen today in Hillary Clinton struggling against a young population hungry for progressive change unlikely to happen overnight, despite Bernie Sanders’s valiant plans and the immediate need to rectify corporate America’s viselike hold on the very democratic principles that have sustained this nation for more than two hundred years.

Yet this is the same tradition that has given us long years without a stabilizing central bank, the Trail of Tears, the Civil War, the Credit Mobilier scandal, robber barons, and Hoover’s unshakable faith that “prosperity was just around the corner,” among many other disgraces. Hofstadter is thankfully not above condemning laissez-faire absolutism, such as Grover Cleveland’s unrealistic assumption that “things must work out smoothly without government action, or the whole system, coherent enough in theory, would fall from the weakness of its premises” or the free silver campaign that vaulted the bombastic William Jennings Bryan into an improbable presidential candidacy. Hofstadter describes Bryan as “a boy who never left home,” and one can see some of Bryan’s regrettable legacy in the red-faced fulminations of a certain overgrown boy who currently pledges to make America great again. A careless and clumsy figure like Bryan was the very antithesis of Lincoln. Bryan failed to see difficult political tasks through to their necessary end. He would adopt principles that he had once decried. His well-meaning efforts amounted to practically nothing. Think of Bryan as Fargo‘s Jerry Lundegaard to Lincoln’s Joe Girard. Hofstadter suggests that “steadfast and self-confident intelligence,” perhaps more important than courage and sincerity, was the very quality that Bryan and this nation so desperately needed. Yet in writing about Teddy Roosevelt and pointing to the frequency of “manly” and “masterful” in his prose, Hofstadter shows how these “more perfect” personal qualities for the political tradition “easily became transformed into the imperial impulse.”

This is, at times, a very grumpy book. One almost bemoans the missed opportunity to enlist the late Andy Rooney to read aloud the audio version. But it is not without its optimism. Hofstadter places most of his faith in the abolitionist agitator Wendell Phillips. Even after defending Phillips from numerous historical condemnations and pointing to Phillips’s “higher level of intellectual self-awareness,” Hofstadter sees the agitator as merely “the counterweight to sloth and indifference.” Yet Hofstadter, at this young stage of his career, isn’t quite willing to write off agitators. He does point to why Phillips was a necessary and influential force providing equilibrium:

But when a social crisis or revolutionary period at last matures, the sharp distinctions that govern the logical and doctrinaire mind of the agitator become at one with the realities, and he appears overnight to the people as a plausible and forceful thinker. The man who has maintained that all history is the history of class struggles and has appeared so wide of the mark in times of class collaboration may become a powerful leader when society is seething with unresolved class conflict; the man who has been valiantly demanding the abolition of slavery for thirty years may become a vital figure when emancipation makes its appearance as a burning issue of practical politics. Such was the experience of Wendell Phillips: although he never held office, he became one of the most influential Americans during the few years after the fall of Fort Sumter.

The question of whether you believe Hofstadter to be a consensus historian or not may depend on how much you believe that he viewed the American political tradition much like the two Lazaruses forever duking it out for existence in the old Star Trek episode “The Alternative Factor.” He certainly sees a nation of political pragmatists and obdurate agitators caught in an eternal deadlock, which is not too far from the progressive historians who styled their interpretations on class conflict. But his fine eye for ferreting out the Burkean undertow within Woodrow Wilson’s putative liberalism or exposing how Hoover’s faith in unregulated business had him quivering with disbelief after Black Thursday suggests a historian who is interested in countering ideological bromides. Perhaps if Hofstadter had stretched some of his chapters across a massive book, his reputation as a consensus historian wouldn’t have been the subject of so many heated arguments among political wonks.

Fortunately, the next Modern Library essay in this series will investigate how one man fluctuated his politics to serve his own ends and reshaped a major metropolis through the iron will of his personality. That very long and very great book may be the key that turns the consensus lock. It will certainly tell us a lot more about political power.

Next Up: Robert A. Caro’s The Power Broker!


The Contours of American History (Modern Library Nonfiction #94)

(This is the seventh entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Promise of American Life.)


History is never the thrilling Zapcat powerboat ride it can and should be when we remain committed to oaring through the same exhausted legends about American exceptionalism and bogus democratic promise. Much as we may find new insights into human existence by tilting our canoes to the ripples contained within a storyteller’s cadences, so too may we discover more complete ways of contending with our historical contradictions through the viewpoint of a responsible revisionist armed with the facts and rejecting the hard establishment line.

The revisionist historian, that charming and sometimes infuriating rabble-rouser never to be confused with some creepy Holocaust denier flailing in a sea of empty Cheetos bags and crackpot pamphlets, often gets needlessly maligned in America. Two decades before Annette Gordon-Reed offered conclusive evidence of Thomas Jefferson’s relationship with Sally Hemings (upheld by a 1998 DNA test), Fawn Brodie was attacked by vanilla-minded legacy holders for pushing beyond James Callender’s tawdry trolling and daring to suggest that there was good reason to believe that our much heralded champion of the rights of man had skeletons in his closet that were vital to understanding his philosophy. Brodie’s book, despite its psychobiographical failings, led to a reckoning with our myths and assumptions about the Sage of Monticello, one that continues to this very day with college students demanding the removal of Jefferson statues on campuses.

Provided that their efforts do not involve going out of their way to Bowdlerize troubling if incontrovertible parts of the story, and that the results are as expansive and as rigorous as those of their more timorous mainstream counterparts, revisionists are often vital reconcilers of the public record. It is the facile propagandist who ignores Rosa Parks’s radicalism to paint a roseate image of a meek and tired seamstress who refused to give up her seat on a bus (“small,” “delicate,” and “little,” as belittled by Bill Clinton in 2005) or who upholds the lie that Abner Doubleday created baseball.

In recent decades, many young students have ardently clutched their copies of Howard Zinn’s A People’s History of the United States with the taut adamantine grip of a Fallout 4 junkie reluctant to relinquish her controller. Zinn’s thoughtful volume has been vehemently denounced by some establishment historians who have questioned its perceived polemical emphasis on class conflict at the expense of other issues. But before Zinn, there was William Appleman Williams, a brash energetic troublemaker who was arguably a more rigorous scholar than Zinn and who was among the best and the boldest of the firebrand 20th century historians who emerged from a Charles Beard afterglow with ass to kick once the bubble gum supply ran out.

William Appleman Williams unpacked the economic motivations of American expansion and foreign policy in The Tragedy of American Diplomacy and broadened this scholarship further with The Contours of American History, a punchy volume examining how imperialism and liberalism became a sordid double stitch intertwined in the American quilt well before the Sons of Liberty spilled massive chests of desperately offloaded tea into Boston Harbor. Yet Williams’s often nimble analysis, riddled as it sometimes is with conceptual overreach, robustly articulates the ever-changing and contradictory American Weltanschauung that has motivated nearly every governmental decision since. He documents a worldview that started off with the relatively benign goal of creating and sustaining an economic nation that provided for everyone, but devolved under the autocratic yoke of Jacksonian democracy and Gilded Age greed into the corporate capitalist nightmare we are all trying to awake from today. And because Williams’s challenge to the so-called “American experiment” was so unprecedented in the mid-20th century, this historian was tarnished, besmirched, and condemned by other putative progressives who might have enlarged their rigid notions of national identity had they been more willing to dive into the subtle words and actions directing the unshakable financial impetus.

Williams was harassed by the House Committee on Un-American Activities, that despicably despotic body that ruined the lives of so many, with a demand to produce the unfinished Contours manuscript. The HUAC would order Williams to testify in Washington and then cancel the appearance by telegram once he’d hopped on a train to the Beltway. Even after he testified for ten minutes and the HUAC abandoned its witch hunt, the IRS harassed him in various forms for nearly twenty years. Williams was hounded by the neoliberalism critic Arthur Schlesinger, Jr., who dutifully condemned Williams as “pro-communist” to the American Historical Association’s president. Even as late as 2009, an academic called Williams an “idiot” before a Society for Historians of American Foreign Relations panel, decrying Williams’s approach to history as a crude retooling of Charles Beard’s infamous assault upon our Founding Fathers’ pecuniary predispositions.1

But Williams was far from a typical progressive. He was a registered Republican when he first came to Wisconsin. He voted for Nixon as the lesser evil in 1960. And even in Contours, he defended Herbert Hoover’s hands-off Depression era policies, seeing this as a necessary tactic to forestall property holders from creating a business-friendly fascism that could have had a more diabolical effect on our clime than the many Hoovervilles that had mushroomed across the nation. Williams argued that Hoover’s perceived failure to do anything represented a more active resistance against special interests than the Progressive Movement was willing to acknowledge or act upon at the time. And that’s the way this jazz-loving Midwestern historian rolled. As Williams was to write in a 1973 essay, the revisionist’s duty was to “see basic facts in a different way and as interconnected in new relationships. He is a sister and a brother to those who use old steel to make a zipper, as contrasted with those who add new elements to make a better steel.”

In my previous Modern Library essay, I castigated Herbert Croly for the historical developments that he could not see ahead of him, for erring too much in his perfervid belief in a central government and for diminishing the justifiable grievances of protesters. William Appleman Williams may very well represent the opposite problem: a historian who could see the implications of any action all too well, one who was willing to articulate any interpretation of the facts even if it meant being alienated by the jingoistic minds who needed to reconsider the other fateful historical trajectories upholding the status quo.

Williams’s highly specific examples very much allow him to sell us on his interpretation. In Tragedy, for example, Williams’s deductive prowess is in high gear when he examines how Woodrow Wilson’s March 1913 decision to refuse a government loan to China, one long coveted by American industrialists at the time (and later attempted privately), actually fell within the framework of the Open Door Policy. Many historians have interpreted Wilson’s pushback as a betrayal of American expansionism at the time, but Williams points to the lack of private capital available to fulfill the job as well as the possibility that any governmental loan, even one secured with the help of other financiers, may have been perceived as a very clear threat to neighboring Japan. The Open Door Policy, for all of its flaws and its needless sullying of China, was intended to provide a peacefully imperialist framework for a burgeoning American empire: a GATT or IMF before its time, though regrettably without much in the way of homegrown protest. (Rebellion would come later in Beijing with the May Fourth movement.) The ostensible goal was to strengthen China with fresh influxes of low-risk private capital so that it could withstand troublesome neighbors looking for a fight, even as the new obligations to American entrepreneurs forged hot rivulets of cash rolling back to the imperialist homeland. Wilson’s decision was, as discerned by Williams, a canny chesslike stratagem to avoid war and conflict, one that would keep China a servant to America’s riches. From the vantage point of the 21st century, this useful historical interpretation reveals Wilson to be a pioneer in the kind of venal and now all too commonplace globalization that morally bankrupt neoliberals like Thomas Friedman have no problem opening their old steel zippers for. Their free trade fantasies possess all the out-of-sight, out-of-mind justification of a revenge porn junkie ignoring another person’s real world humiliation for fleeting sociopathic pleasure.

It was with Contours that Williams blew the lid off the great American lie, exposing the American liberal’s failure to confront his own implication in much of the laissez nous faire madness. Williams traced the origins of our mercantilist approach to Anthony Ashley Cooper, the Earl of Shaftesbury. In the 17th century, Shaftesbury was a political figure who opposed harsh penalties and absolutist government. He stood up for the nonconformists and called for regular parliaments, and would go on to found and lead the early Whig party in the wake of the British Exclusion Crisis. While traveling to Oxford to have an abscess removed from his liver, he hit it off with a young doctor by the name of John Locke. (There weren’t as many cafes back then as there are today. In the 1600s, you had to take whatever mingling opportunities you could get.) Locke, of course, would later have many ideas about the social contract, a scheme of inalienable natural rights that would eventually find its way into a number one ditty penned by Jefferson that would become known as the Declaration of Independence.

But there was a twist to this tale. As Williams points out, Locke’s ideas were a corruption of Shaftesbury’s more inclusive and democratic efforts. Where Shaftesbury was willing to rebel against the King to ensure that courts and alternative political parties were in place to prevent the government from becoming an absolute tyranny, even going to the trouble of building a coalition that extended across all classes to fight for these safeguards when not putting together the Habeas Corpus Act of 1679, it was Locke who limited Shaftesbury’s remarkably liberal contributions by undercutting individual rights. Locke believed that those who owned property were perfectly justified in protesting their government, for they were the ones who had entered into a social contract. But the rabble who didn’t own property could more or less buzz off.2 As Williams put it, “[I]ndividualism was a right and a liberty reserved to those who accepted a status quo defined by a certain set of natural truths agreed upon a majority. Within such a framework, and it is a far narrower set of limits than it appears at first glance, the natural laws of property and labor were deemed sufficient to guide men’s pursuit of happiness.”

Yet those who subscribed to these early mercantilist standards believed that this classically liberal idea of “corporate structure” involved a basic responsibility to provide for everyone. And the way of sustaining such a benevolent national juggernaut was through the establishment of an empire: a Pax Americana predicated upon the promise of a democracy promulgated by patriarchs who not so quietly believed that the people were incapable of it.3 Williams observes how the Quakers in Philadelphia, who opposed expansion and much of the onslaught against Native Americans, were very much committed to noblesse oblige, setting up hospitals, schools, and philanthropic endeavors to take care of everyone. But this generous spirit was no match for the free trade nabobs or the hard-hearted Calvinists who increasingly shifted such solicitude to the propertied class (one can easily imagine Alec Baldwin’s “Always be closing” speech from Glengarry Glen Ross spouted by a Calvinist), leading the great theologian Jonathan Edwards to offer righteous pushback against “fraud and trickishness in trade.”

Against this backdrop, post-Revolutionary expansion and the Monroe Doctrine allowed mercantilism to transmute into an idea that was more about the grab than the munificent results, with visions of empire dancing in many heads. By the time Frederick Jackson Turner tendered his Frontier Thesis in 1893, mercantilism was no longer about providing for the commonweal, but about any “self-made man” looking out for his interests. Williams points to Chief Justice John Marshall’s efforts to enforce safeguards, such as his Gibbons v. Ogden decision regulating interstate commerce, against the monopolies that would come to dominate America near the turn of the century. Marshall’s immediate successor, Chief Justice Taney, expanded the flexibility of the Constitution’s Contract Clause with his 1837 Charles River Bridge v. Warren Bridge decision, permitting states to alter any contract as they saw fit. While Taney’s decision seemed to sound the death knell for monopolies, it was no match for the consolidated trusts that were to come with the railroads and the robber barons. Rather curiously, for all of his sharp observations about free trade and expansionist dangers during this time, Williams devotes little more than a paragraph to the 1836 closing of the Second Bank of the United States:

[Nicholas Biddle] did a better job than the directors of the Bank of England. Under his leadership the bank not only established a national system of credit balancing which assisted the west as much as the east, and probably more, but sought with considerable success to save smaller banks from their own inexperience and greed. It was ultimately his undoing, for what the militant advocates of laissez nous faire came to demand was help without responsibilities. In their minds, at any rate, that was the working definition of democratic freedom.

Talk about sweeping one of the greatest financial calamities in American history under the rug! I don’t want to get too much into Andrew Jackson, whom I believe to be nothing less than an abhorrent, reckless, and self-destructive maniac who claimed “liberalism” while wielding the iron fist of tyranny, in this installment. I shall preserve my apparently unquenchable ire for Old Hickory until I tackle Arthur Schlesinger, Jr.’s The Age of Jackson in a few years (Modern Library Nonfiction #36). But Jackson’s imperious and irresponsible battle with Biddle, complete with his Specie Circular, undoubtedly led to the Panic of 1837, in which interest rates spiked, the rich got richer, a fixable financial mess spiraled out of control and became needlessly dangerous, and buyers could not come up with the hard cash to invest in land. Considering Williams’s defense of Hoover in both Contours and Tragedy, it is extremely curious that he would shy away from analyzing why some form of central bank might be necessary to mitigate volatility, even though he adopted some fascinating counterpoints to the “too big to fail” theory decades before Bernanke and Krugman.

This oversight points to the biggest issue I have with Williams. His solution to the great imperialist predicament was democratic socialism, which he called “the only real frontier available to Americans in the second half of the 20th century.” While this is a clever way of inverting Turner’s thesis, to uphold this, Williams cites a few examples such as the courage of Wendell Phillips, a few throwaway references to social property, and a late 19th century return with Edward Bellamy and Henry Demarest Lloyd to the Quaker-like notion of “a commonwealth in which men were brothers first and economic men second.” But while Williams is often a master of synthesis, he falls somewhat short in delineating how his many historical examples can aid us to correct our ongoing ills. If the American Weltanschauung is so steeped in our culture, how then can democratic socialism uproot it? This vital question remains at the root of any progressive-minded conversation. But now that we have a presidential race in which socialism is no longer a dirty word and the two leading Democratic candidates bicker over who is the greater progressive, perhaps the answer might arrive as naturally as Williams anticipated.

Next Up: Richard Hofstadter’s The American Political Tradition!


The Promise of American Life (Modern Library Nonfiction #95)

(This is the sixth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: In Cold Blood.)

Before The New Republic devolved under Chris Hughes into a half-worthy husk of knee-jerk platitudes just a few histrionic clickbait headlines shy of wily Slate reductionism, it was a formidable liberal magazine for many decades, courageous enough to take real stands while sustaining vital dialogue about how and when government should intercede in important affairs. The source of this philosophical thrust, as duly documented by Franklin Foer, was the greatly diffident son of a prominent newspaperman, an unlikely progenitor who entered and exited Harvard many times without ever finishing, someone who suffered from severe depression and who, for a time, didn’t know what to do with his life other than play bridge and tennis and write about obscure architecture. But Croly found it in him to spill his views about democracy’s potential, what he called the “New Nationalism,” into a 1909 book called The Promise of American Life, which served as something of a manifesto for the early 20th century Progressives and became a cult hit among political wonks at the time. It partially inspired Theodore Roosevelt, who was proudly name-checked by Croly as “a Hamiltonian with a difference,” to initiate his ill-fated 1912 Bull Moose campaign as an outsider presidential candidate. (Historians have argued over the palpable influence of Croly’s book on Roosevelt, but it’s possible that, had Croly not confirmed what Roosevelt had already been thinking about, Roosevelt may not have entered the 1912 race as ardently as he did. With a more united Republican coalition against Wilson, America may very well have carried on with a second Taft term, with an altogether different involvement in World War I. Taft’s notable rulings as Chief Justice of the Supreme Court, which included extending executive power and broadening the scope of police evidence, may not have been carried out in the 1920s. A book is often more of a Molotov shattering upon history’s turf than we are willing to accept.)

Croly’s book touched a nerve among a small passionate group. One couple ended up reading Croly’s book aloud to each other during their honeymoon (leaving this 21st century reader, comparing Croly’s thick “irremediable”-heavy prose style against now all too common sybaritic options, to imagine other important activities that this nubile pair may have missed out on). The newly married couple was Willard Straight and Dorothy Whitney. They had money. They invited Croly to lunch. The New Republic was formed.

So we are contending with a book that not only created an enduring magazine and possibly altered the course of American history, but one that had a profound impact on the right elite at the right time. It was thus a tremendous surprise to discover a book that greatly infuriated me during the two times I read it, at one point causing me to hurl it with high indignant velocity against a wall, for reasons that have more to do with this gushing early 20th century idealist’s failure to foresee the rise of Nazism, the despicable marriage of racism and police brutality, growing income inequality, corporate oligarchy, draconian Common Core educational standards, and dangerous demagogues like George Wallace and Donald Trump than with any flaw in his reasoning.

But it is also important to remember that Croly wrote this book before radio, television, the Internet, women’s suffrage, two world wars, the Great Depression, smartphones, outrage culture, and 9/11. And it is never a good idea to read an older book, especially one of a political nature, without considering the time that it was written. I did my best to curb my instincts to loathe Croly for what he could not anticipate, for his larger questions of how power aligns itself with the democratic will of the people are still very much worth considering. Croly is quite right to identify the strange Frankenstein monster of Alexander Hamilton’s pragmatic central government and Thomas Jefferson’s rights of man — the uniquely American philosophical conflict that has been the basis of nearly every national conflict and problem that has followed — as a “double perversion” of our nation’s potential, even if Croly seems unwilling to consider that some “perversions” are necessary for an evolving democratic republic and he is often too trusting of executive authority and the general public’s obeisance to it. That these inquiries still remain irreconcilable (and are perverted still further by crass politicians who bellow about how to “make America great again” as they eject those who challenge them from the room) some 107 years after the book’s publication speaks to both the necessity and the difficulty of the question.

I’ve juxtaposed Croly’s meek-looking law clerk mien against George Bellows’s famous boxing painting (unveiled two years before Croly’s book) because there really is no better way to visualize the American individual’s relationship to its lumbering, venal, and often futile government. Croly’s solution is to call for all Americans to be actively engaged in a collaborative and faithful relationship with the nation: “to accept a conception of democracy which provides for the substantial integrity of his country, not only as a nation with an exclusively democratic mission, but as a democracy with an essentially national career.” On its face, this seems like a reasonable proposition. We all wish to belong in a democracy, to maintain fidelity to our country, and to believe that the Lockean social contract in which the state provides for the commonweal is a workable and reasonable quid pro quo. But it is also the kind of orgiastic meat and potatoes mantra that led both Kennedy and Reagan to evoke mythical American exceptionalism with the infamous “shining city upon a hill” metaphor. Dulcet words may make us feel better about ourselves and our nation, but we have seen again and again how government inaction on guns and a minimum wage that does not reflect contemporary living standards demands a Black Lives Matter movement and a “fight for $15.” And when one begins to unpack just what Croly wants us to give up for this roseate and wholly unrealistic Faustian bargain, we begin to see someone who may be more of a thoughtful and naive grandstander than a vital conceptual pragmatist.

Croly is right to demand that America operate with a larger administrative organ in place, some highly efficient Hamiltonian body that militates against “the evil effects of a loose union.” He smartly points out that such evils as slavery resulted from the American contradictions originating in the strange alliance between our poetic Jeffersonian call for Constitutional democracy and individualistic will and the many strains of populism and nationalism that followed. In his insistence on “the transformation of Hamiltonianism into a thoroughly democratic political principle,” Croly is suspicious of reformers, many of whom he singles out in a manner strikingly similar to Norman Mailer’s “Quick and Expensive Comments on the Talent in the Room.” He calls William Jennings Bryan an “ill conceived” reformer, claims the now nearly forgotten William Travers Jerome to be “lulled into repose” by traditional Jeffersonian democracy (never mind Jerome’s successful crusades against Tammany Hall corruption, regrettably overshadowed by his prosecution of Harry K. Thaw during the Stanford White murder trial), interestingly pegs William Randolph Hearst as someone motivated by endless “proclamation[s] of a rigorous interpretation of the principle of equal rights,” and holds up Teddy Roosevelt as “more novel and more radical” in his calls for a Square Deal than “he himself has probably proclaimed.”

But Croly’s position on reform is quite problematic, deeply unsettling, and often contradictory. He believes that citizens “should be permitted every opportunity to protest in the most vigorous and persistent manner,” yet he states that such protests “must conform to certain conditions” enforced by the state. While we are certainly far removed from the 1910 bombing of the Los Angeles Times building that galvanized the labor movement, as we saw with the appalling free speech cages during the 2004 Republican Convention, muzzling protesters not only attenuated their message but allowed the NYPD to set up traps for the activists, which ensured their arrest and detention — a prototype for the exorbitant enforcement used to diminish and belittle the Occupy Wall Street movement a few years later. Croly believes that the job of sustaining democratic promise should, oddly enough, be left to legislators and executives granted all the power required and sees state and municipal governments as largely unsuccessful:

The interest of individual liberty in relation to the organization of democracy demands simply that the individual officeholder should possess an amount of power and independence adequate to the efficient performance of his work. The work of a justice of the Supreme Court demands a power that is absolute for its own special work, and it demands technically complete independence. An executive should, as a rule, serve for a longer term, and hold a position of greater independence than a legislator, because his work of enforcing the laws and attending to the business details of government demands continuity, complete responsibility within its own sphere, and the necessity occasionally of braving adverse currents of public opinion. The term of service and the technical independence of a legislator might well be more restricted than that of an executive; but even a legislator should be granted as much power and independence as he may need for the official performance of his public duty. The American democracy has shown its enmity to individual political liberty, not because it has required its political favorites constantly to seek reëlection, but because it has since 1800 tended to refuse to its favorites during their official term as much power and independence as is needed for administrative, legislative, and judicial efficiency. It has been jealous of the power it delegated, and has tried to take away with one hand what it gave with the other.

There is no room for “Act locally, think globally” in Croly’s vision. This is especially ungenerous given the many successful progressive movements that flourished decades after Croly’s death, such as the civil rights movement beginning with local sit-ins and developing into a more cogent and less ragged strain of the destructive Jacksonian populism that Croly rightly calls out, especially in relation to the cavalier obliteration of the Second Bank of the United States and the Nullification Crisis of 1832, which required Henry Clay to clean up Jackson’s despotic absolutism with a compromise. On the Nullification point, Croly identifies Daniel Webster, a man who became treacherously committed to holding the Union together, as “the most eloquent and effective expositor of American nationalism,” who “taught American public opinion to consider the Union as the core and crown of the American political system,” even as he offers a beautifully stinging barb on Webster’s abolitionist betrayal with the 1850 speech endorsing the Fugitive Slave Act: “He was as much terrorized by the possible consequences of any candid and courageous dealing with the question as were the prosperous business men of the North; and his luminous intelligence shed no light upon a question, which evaded his Constitutional theories, terrified his will, and clouded the radiance of his patriotic visions.”

But Croly also promulgates a number of loopy schemes, including making representative legislatures at any level beholden to an executive who is armed with a near tyrannical ability to scuttle laws, even as he claims that voters removing representatives through referendum “will obtain and keep a much more complete and direct control over the making of their laws than that which they have exerted hitherto; and the possible desirability of the direct exercise of this function cannot be disputed by any loyal democrat.” Well, this loyal democrat, immediately summoning Lord Acton’s famous quote, calls bullshit on giving any two-bit boss that kind of absolute power. Because Croly’s baffling notion of “democracy” conjures up the terrifying image of a sea of hands raised in a Bellamy salute. On one hand, Croly believes that a democracy must secure and exercise individual rights, even as he rightly recognizes that, when people exercise these rights, they cultivate the “tendency to divide the community into divergent classes.” On the other hand, he believes that individuals should be kept on a restrictive leash:

[T]hey should not, so far as possible, be allowed to outlast their own utility. They must continue to be earned. It is power and opportunity enjoyed without being earned which help to damage the individual — both the individuals who benefit and the individuals who consent — and which tend to loosen the ultimate social bond. A democracy, no less than a monarchy or an aristocracy, must recognize political, economic, and social discriminations, but it must also manage to withdraw its consent whenever these discriminations show any tendency to excessive endurance. The essential wholeness of the community depends absolutely on the ceaseless creation of a political, economic, and social aristocracy and their equally incessant replacement.

There’s certainly something to be said about how many Americans fail to appreciate the rights that they have. Reminding all citizens of their duties to flex their individual rights may be a very sound idea. (Perhaps one solution to American indifference and political disillusion is the implementation of a compulsory voting policy with penalties, similar to what goes on in Australia.) But with such a middling door prize handed out at the democratic dance party, why on earth would any individual want to subscribe to the American promise? Aristocrats, by their very nature, wish to hold onto their power and privilege and not let go. Croly’s pact is thus equally unappealing for the struggling individual living paycheck to paycheck, the career politician, or the business tycoon.

Moreover, in addition to opposing the Sherman Antitrust Act, Croly nearly succumbs to total Taylorism in his dismissal of labor unions: “They seek by the passage of eight-hour and prevailing rate-of-wages laws to give an official sanction to the claims of the unions, and they do so without making any attempt to promote the parallel public interest in an increasing efficiency of labor. But these eight-hour and other similar laws are frequently being declared unconstitutional by the state courts, and for the supposed benefit of individual liberty.” Granted, Croly’s words came seven years before the passage of the Adamson Act, the first federal law enforcing a mandatory eight-hour day. But Croly’s failure to see the social benefits of well-rested workers better positioned to exercise their individual liberty for a democratic promise is one of his more outrageous and myopic pronouncements, even as he also avers that the conditions that create unrestricted economic opportunities also spawn individual bondage. If Croly wants Americans to “[keep] his flag flying at any personal cost or sacrifice,” then he really needs to have more sympathy for the travails of the working stiff.

Despite all my complaints, I still believe some 21st century thinker should pick up from Croly’s many points and make an equally ambitious attempt to harmonize Hamilton and Jefferson with more recent developments. American politics has transformed into a cartoonish nightmare from which we cannot seem to escape, one that causes tax absolutist lunatics like Grover Norquist to appear remotely sane. That we are seeing a strange replay of the 1912 election with the 2016 presidential race, with Trump stepping in as an unlikely Roosevelt and Bernie Sanders possibly filling in for Eugene Debs, and that so many Americans covet an “outsider” candidate who will fix a government that they perceive as a broken system speaks to a great need for some ambitious mind to reassess our history and the manner in which we belong to our nation, while also observing the many ways in which Americans come together well outside of the political bear trap. For the American individual is no longer boxing George Bellows-style with her government. She is now engaged in a vicious MMA match unfurling inside a steel cage. Whether this ugly pugilism can be tempered with peace and tolerance is anyone’s guess, but, if we really believe in democracy, the least we can do is try to find some workaround in which people feel once again that they’re part of the process.

Next Up: William Appleman Williams’s The Contours of American History!


In Cold Blood (Modern Library Nonfiction #96)

(This is the fifth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Journalist and the Murderer.)

Truman Capote was a feverish liar and a frenzied opportunist from the first moment his high voice pierced the walls of a literary elite eager to filth up its antimacassars with gossip. He used his looks to present himself as a child prodigy, famously photographed in languorous repose by Harold Halma to incite intrigue and controversy. He claimed to have won national awards for his high school writing that no scholar has ever been able to turn up. He escorted the nearly blind James Thurber to his dalliances with secretaries and deliberately put on Thurber’s socks inside out so that his wife would notice, later boasting that one secretary was “the ugliest thing you’ve ever seen.” Biographer Gerald Clarke chronicled how Capote befriended New Yorker office manager Daise Terry, who was feared and disliked by many at the magazine, because he knew she could help him. (Capote’s tactics paid off. Terry gave him the easiest job on staff: copyboy in the art department.) If Capote wanted to know you, he wanted to use you. But the beginnings of a man willing to do just about anything to get ahead can be found in his early childhood.

Capote’s cousin Jennings Faulk Carter once described young Truman coming up with the idea of charging admission for a circus. Capote had heard a story in the local paper about a two-headed chicken. Lacking the creative talent to build a chicken himself, he enlisted Carter and Harper Lee for this faux poultry con. The two accomplices never saw any of the money. Decades later, Capote would repeat this tactic on a grander scale, earning millions of dollars and great renown for hoisting a literary big top over a small Kansas town after reading a 300-word item about a family murder in The New York Times. Harper Lee would be dragged into this carnival as well.

The tale of how two frightening men murdered four members of the Clutter family for a pittance and created a climate of fear in the surrounding rural area (and later the nation) is very familiar to nearly anyone who reads, buttressed by the gritty 1967 film (featuring a pre-The Walking Dead Scott Wilson as Dick Hickock and a pre-Bonnie Lee Bakley murder Robert Blake as Perry Smith) and a deservedly acclaimed 2005 film featuring the late great Philip Seymour Hoffman as Capote. But what is not so often discussed is the rather flimsy foundation on which this “masterpiece” has been built.

Years before “based on a true story” became a risible cliche, Capote and his publicists framed In Cold Blood’s authenticity around Capote’s purported accuracy. Yet the book itself contains many gaping holes in which we have only Smith and Hickock’s words, twisted further by Capote. What are we to make of Bill and Johnny — a boy and his grandfather whom Smith and Hickock pick up for a roadside soda bottle-collecting adventure to make a few bucks? In our modern age, we would demand that a competent journalist track down these two side characters and compare their accounts with those of Smith and Hickock. Capote claims that these two had once lived with the boy’s aunt on a farm near Shreveport, Louisiana, yet no independent party appears to have corroborated their identities. Did Capote (or Hickock and Smith) make them up? Does the episode really contribute to our understanding of the killers’ pathology? One doesn’t need to be aware of a recent DNA test that disproved Hickock and Smith’s involvement with the quadruple murder of the Walker family in Sarasota County, Florida, taking place one month after the Clutter murders, to see that Capote is more interested in holding up the funhouse mirror to impart specious complicity:

Hickock consented to take the [polygraph] test and so did Smith, who told Kansas authorities, “I remarked at the time, I said to Dick, I’ll bet whoever did this must be somebody that read about what happened out here in Kansas. A nut.” The results of the test, to the dismay of Osprey’s sheriff as well as Alvin Dewey, who does not believe in exceptional circumstances, were decisively negative.

Never mind that polygraph tests are inaccurate. It isn’t so much Hickock and Smith’s motivations that Capote was interested in. He was more concerned with stretching out a sense of amorphous terror on a wide canvas. As Hickock and Smith await their hanging, they encounter Lowell Lee Andrews in the adjacent cell. He is a fiercely intelligent, corpulent eighteen-year-old boy who fulfilled his dormant dreams of murdering his family, but Capote’s portrait leaves little room for subtlety:

For the secret Lowell Lee, the one concealed inside the shy church going biology student, fancied himself an ice-hearted master criminal: he wanted to wear gangsterish silk shirts and drive scarlet sports cars; he wanted to be recognized as no mere bespectacled, bookish, overweight, virginal schoolboy; and while he did not dislike any member of his family, at least not consciously, murdering them seemed the swiftest, most sensible way of implementing the fantasies that possessed him.

We have modifiers (“shy,” “ice-hearted,” “gangsterish,” “silk,” “scarlet,” “bespectacled,” “bookish,” “virginal,” “swiftest,” and “sensible”) that conjure up a fantasy atop the fantasy, that suggest relativism to the two main heavies, but they leave no room for nuance or for any doubt in the reader’s mind. Capote does bring up the fact that Andrews suffered from schizophrenia, but diminishes this mental illness by calling it “simple” before dredging up the M’Naghten Rule, which was devised in 1843 — well before modern psychiatry existed, predicated upon a 19th century standard, yet still on the books — to exclude any insanity defense whereby the accused recognizes right from wrong. But he has already tarnished Andrews with testimony from Dr. Joseph Satten: “He considered himself the only important, only significant person in the world. And in his own seclusive world it seemed to him just as right to kill his mother as to kill an animal or a fly.” I certainly don’t want to defend Andrews’s crime (much less the Clutter family murders), but this conveniently pat assessment does ignore more difficult and far more interesting questions that Capote, lacking the coherence, the empathy, or the candor to confess his own contradictions, declines to pursue. Many pages before, in relation to Hickock, Capote calls M’Naghten “a formula quite color-blind to any gradations between black and white.” In other words, Capote is the worst kind of journalist: a cherry-picking sensationalist who applies standards as he sees fit, heavily steering the reader’s opinion even as he feigns objectivity. The ethical reader of In Cold Blood in the 21st century wants Katherine Boo to emerge from the future through a wormhole, if only to open up a can of whoopass on Capote for these egregious and often thoughtless indiscretions.

Capote’s decision to remove himself from the crisp, lurid story was commended by many during In Cold Blood’s immediate reception as a feat of unparalleled objectivity, with the “nonfiction novel” label sticking to the book like a trendy hashtag that hipsters refuse to surrender, but I think Cynthia Ozick described the thorny predicament best in her infamous drive-by on Capote (collected in Art & Ardor): “Essence without existence; to achieve the alp of truth without the risk of the footing.” If we accept any novel — whether “nonfiction” or fully imaginative — as some sinister or benign cousin to the essay, as a reasonably honest attempt to reckon with the human experience through invention, then In Cold Blood is a failure: the work of a man who sat idly in his tony Manhattan spread with cadged notebooks and total recall of aggressively acquired conversations even as his murderous subjects begged their “friend” to help them escape the hangman’s noose.

In 2013, Slate’s Ben Yagoda described numerous factual indiscretions, revealing that editor William Shawn had penciled in “How know?” on the New Yorker galley proofs of Capote’s four-part opus (In Cold Blood first appeared in magazine form). That same year, the Wall Street Journal uncovered new evidence from the Kansas Bureau of Investigation, which revealed that the KBI did not, upon receiving intelligence from informant Floyd Wells, swiftly dispatch agent Harold Nye to the farmhouse where Richard Hickock had lodged. (“It was as though some visitor were expected,” writes Capote. Expected by Hickock’s father or an author conveniently tampering with his narrative like a subway commuter feverishly filling in a sudoku puzzle?) As Jack de Bellis has observed, Capote’s revisions from New Yorker articles to book form exposed Capote’s feeble command of time, directions, and even specific places. But de Bellis’s examination turned up more descriptive imprudence, such as Capote shifting a line on how Perry “couldn’t stand” another prisoner to “could have boiled him in oil” (“How know?” we can ask today), along with many efforts to coarsen the language and tweak punctuation for a sensationalist audience.

And then there is the propped-up hero Alvin Dewey, presented by Capote as a tireless investigator who consumes almost nothing but coffee and who loses twenty pounds: a police procedural stereotype if ever there was one. Dewey disputed that he closed his eyes during the execution, and the closing scene of Dewey meeting Nancy Clutter’s best friend, Susan Kidwell, in a cemetery is not only invented but heavily mimics the belabored ending of Capote’s 1951 novel, The Grass Harp. But then “Foxy” Dewey and Capote were tighter than a pair of frisky lovers holed up for a week in a seedy motel.

Not only was Capote granted unprecedented access to internal documents, but his papers reveal that Dewey provided him with stage directions in the police interview transcripts. (One such annotation reads “Perry turns white. Looked at the ceiling. Swallows.”) There is also the highly suspect payola of Columbia Pictures offering Dewey’s wife a job as a consultant on the 1967 film for a fairly substantial fee. Harold Nye, another investigator whose contributions have been smudged out of the history, told Charles J. Shields in a December 30, 2002 interview (quoted in Mockingbird), “I really got upset when I know that Al [Dewey] gave them a full set of the reports. That was like committing the largest sin there was, because the bureau absolutely would not stand for that at all. If it would have been found out, he would have been discharged immediately from the bureau.”

In fact, Harold Nye and other KBI agents did much of the footwork that Capote attributes to Dewey. Nye was so incensed by Capote’s prevarications that he read 115 pages of In Cold Blood before hurling the book across the living room. And in the last few years, the Nye family has been fighting to reveal the details inside two tattered notebooks that contain revelations about the Clutter killings that may drastically challenge Capote’s narrative.

Yet even before this, Capote’s magnum opus was up for debate. In June 1966, Esquire published an article by Phillip K. Tompkins challenging Capote’s alleged objectivity. He journeyed to Kansas and discovered that Nancy Clutter’s boyfriend was hardly the ace athlete (“And now, after helping clear the dining table of all its holiday dishes, that was what he decided to do — put on a sweatshirt and go for a run.”) that Capote presented him as, that Nancy’s horse was sold for a higher sum to the father of the local postmaster rather than “a Mennonite farmer who said he might use her for plowing,” and that the undersheriff’s wife disputed Capote’s account:

During our telephone conversation, Mrs. Meier repeatedly told me that she never heard Perry cry; that on the day in question she was in her bedroom, not the kitchen; that she did not turn on the radio to drown out the sound of crying; that she did not hold Perry’s hand; that she did not hear Perry say, ‘I’m embraced by shame.’ And finally – that she had never told such things to Capote. Ms. Meier told me repeatedly and firmly, in her gentle way, that these things were not true.

(For more on Capote’s libertine liberties, see Chapter 4 of Ralph F. Voss’s Truman Capote and the Legacy of In Cold Blood.)

Confronted by these many disgraceful distortions, we are left to ignore the “journalist” and assess the execution. On a strictly showboating criterion, In Cold Blood succeeds and captures our imagination, even if one feels compelled to take a cold shower knowing that Capote’s factual indiscretions were committed with a blatant disregard for the truth, not unlike two psychopaths murdering a family because they believed the Clutters possessed a safe brimming with riches. One admires the way that Capote describes newsmen “[slapping] frozen ears with ungloved, freezing hands,” even as one winces at the way Capote plays into patriarchal shorthand when Nye “visits” Barbara Johnson (Perry Smith’s only surviving sister: the other two committed suicide), describing her father as a “real man” who had once “survived a winter alone in the Alaskan wilderness.” The strained metaphor of two gray tomcats — “thin, dirty strays with strange and clever habits” — wandering around Garden City during the Smith-Hickock trial allows Capote to pad out his narrative after he has exhausted his supply of “flat,” “dull,” “dusty,” “austere,” and “stark” to describe Kansas in the manner of some sheltered socialite referencing the “flyover states.” Yet for all these cliches, In Cold Blood contains an inexplicably hypnotic allure, a hold upon our attention even as the book remains aggressively committed to the facile conclusion that the world is populated by people capable of murdering a family over an amount somewhere “between forty and fifty dollars.” As Jimmy Breslin put it (quoted in M. Thomas Inge’s Conversations with Truman Capote), “This Capote steps in with flat, objective, terrible realism. And suddenly there is nothing else you want to read.”

That the book endures — and is even being adapted into a forthcoming “miniseries event” by playwright Kevin Hood — speaks to an incurable gossipy strain in Western culture, one reinforced by the recent success of the podcast Serial and the television series The Jinx. It isn’t so much the facts that drive our preoccupation with true crime as the sense that we are vicariously aligned with the fallible journalist pursuing the story, whom we can entrust to dig up scandalous dirt as we crack open our peanuts waiting for the next act. If the investigator is honest about her inadequacies, as Serial’s Sarah Koenig most certainly was, the results can provide breathtaking insight into the manner in which we incriminate other people with our emotional assumptions, our faulty memories, and superficially examined evidence. But if the “journalist” removes himself from culpability, presenting himself as some demigod beyond question or reproach (Capote’s varying percentages of total recall memory certainly feel like some newly wrangled whiz kid bragging about his chops before the knowledge bowl), then the author is not so much a sensitive artist seeking new ways inside the darkest realm of humanity, but a crude huckster occupying an outsize stage, waiting to grab his lucrative check and attentive accolades while the real victims of devastation weep over concerns that are far more human and far more deserving of our attention. We can remove the elephants from the lineup, but the circus train still rolls on.

Next Up: The Promise of American Life by Herbert Croly!


The Journalist and the Murderer (Modern Library Nonfiction #97)

(This is the fourth entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: The Taming of Chance.)

One of the mistakes often made by those who immerse themselves in Janet Malcolm’s The Journalist and the Murderer is believing that Jeffrey MacDonald’s guilt or innocence is what matters most. But Malcolm is really exploring how journalistic opportunity and impetuous judgment can lead any figure to be roundly condemned in the court of public opinion. Malcolm’s book was written before the Internet blew apart much of the edifice separating advertising and editorial with native advertising and sponsored articles, but this ongoing ethical dilemma matters ever more in our age of social media and citizen journalism, especially when Spike Lee impulsively tweets the wrong address of George Zimmerman (and gets sued because of the resultant harassment) and The New York Post publishes a front-page cover of two innocent men (also resulting in a lawsuit) because Reddit happened to believe they were responsible for the 2013 Boston Marathon bombing.

Yet it is important to approach anything concerning the Jeffrey MacDonald murder case with caution. It has caused at least one documentary filmmaker to go slightly mad. It is an evidential involution that can ensnare even the most disciplined mind, a permanently gravid geyser gushing out books and arguments and arguments about books, with more holes within the relentlessly regenerating mass than the finest mound of Jarlsberg. But here are the underlying facts:

On February 17, 1970, Jeffrey MacDonald reported a stabbing to the military police. Four officers found MacDonald’s wife, Colette, and their two children, Kimberley and Kristen, all dead in their respective bedrooms. MacDonald went to trial and was found guilty of one count of first-degree murder and two counts of second-degree murder. He was sentenced to three life sentences. Only two months before this conviction, MacDonald hired the journalist Joe McGinniss — the author of The Selling of the President 1968, then looking for a comeback — to write a book about the case, under the theory that any money generated by MacDonald’s percentage could be used to seed a defense fund. MacDonald placed total trust in McGinniss, opening the locks to all his papers and letting him stay in his condominium. McGinniss’s book, Fatal Vision, was published later in 1983. It was a bestseller and spawned a popular television miniseries, largely because MacDonald was portrayed as a narcissist and a sociopath, fitting the entertainment needs of a bloodthirsty public. MacDonald didn’t know the full extent of this depiction. Indeed, as he was sitting in jail, McGinniss refused to send him a galley or an advance copy. (“At no time was there ever any understanding that you would be given an advance look at the book six months prior to publication,” wrote McGinniss to MacDonald on February 16, 1983. “As Joe Wambaugh told you in 1975, with him you would not even see a copy before it was published. Same with me. Same with any principled and responsible author.” Malcolm copiously chronicles the “principled and responsible” conduct of McGinniss quite well, which includes speaking with MacDonald in misleading and ingratiating tones, often pretending to be a friend — anything to get MacDonald to talk.)


On 60 Minutes, roughly around the book’s publication, Mike Wallace revealed to MacDonald what McGinniss was up to:

Mike Wallace (narrating): Even government prosecutors couldn’t come up with a motive or an explanation of how a man like MacDonald could have committed so brutal a crime. But Joe McGinniss thinks he’s found the key. New evidence he discovered after the trial. Evidence he has never discussed with MacDonald. A hitherto unrevealed account by the doctor himself of his activities in the period just before the murders.

Joe McGinniss: In his own handwriting, in notes prepared for his own attorneys, he goes into great detail about his consumption of a drug called Eskatrol, which is no longer on the market. It was voluntarily withdrawn in 1980 because of dangerous side effects. Among the side effects of this drug are, when taken to excess by susceptible individuals, temporary psychosis, often manifested as a rage reaction. Here we have somebody under enormous pressure and he’s taking enough of this Eskatrol, enough amphetamines, so that by his own account, he’s lost 15 pounds in the three weeks leading up to the murders.

Wallace: Now wait. According to the note which I’ve seen, three to five Eskatrol he has taken. We don’t know if he’s taken it over a period of several weeks or if he’s taken three to five Eskatrol a day or a week or a month.

McGinniss: We do know that if you take three to five Eskatrol over a month, you’re not going to lose 15 pounds in doing so.

Jeffrey MacDonald: I never stated that to anyone and I did not in fact lose fifteen pounds. I also wasn’t taking Eskatrol.

Wallace (reading MacDonald’s note): “We ate dinner together at 5:45 PM. It is possible I had one diet pill at this time. I do not remember and do not think I had one. But it is possible. I had lost 12 to 15 pounds in the prior three to four weeks in the process, using three to five capsules of Eskatrol Spansule. I was also…”

MacDonald: Three to five capsules for the three weeks.

Wallace: According to this.

MacDonald: Right.

Wallace: According to this.

MacDonald: And that’s a possibility.

Wallace: Then why would you put down here that…that there was even a possibility?

MacDonald: These are notes given to an attorney, who has told me to bare my soul as to any possibility so we could always be prepared. So I…

Wallace: Mhm. But you’ve already told me that you didn’t lose 15 pounds in the three weeks prior…

MacDonald: I don’t think that I did.

Wallace: It’s in your notes. “I had lost 12-15 lbs. in the prior 3-4 weeks, in the process using 3-5 capsules of Eskatrol Spansules.” That’s speed. And compazine. To counteract the excitability of speed. “I was losing weight because I was working out with a boxing team and the coach told me to lose weight.” — 60 Minutes

One of McGinniss’s exclusive contentions was that MacDonald had murdered his family because he was high on Eskatrol. Or, as he wrote in Fatal Vision:

It is also fact that if Jeffrey MacDonald were taking three to five Eskatrol Spansules daily, he would have been consuming 75 mg. of dextroamphetamine — more than enough to precipitate an amphetamine psychosis.

Note the phrasing. Even though McGinniss does not know for a fact whether MacDonald took three to five Eskatrol (and MacDonald himself is uncertain: both men prevaricate enough to invite Mike Wallace’s justifiably hot and bothered grilling), he establishes the possibility as factual — even though it is pure speculation. The prognostication becomes a varnished truth, one that props up McGinniss’s melodramatic thesis.

* * *

Malcolm was sued for libel by Jeffrey Masson over her depiction of him in her book, In the Freud Archives. In The Journalist and the Murderer, she calls upon all journalists to feel “some compunction about the exploitative character of the journalist-subject relationship,” yet she claims in the book’s afterword that her own lawsuit was not the driving force behind writing it. Even Malcolm, a patient and painstaking practitioner, could not get every detail of MacDonald’s appearance on 60 Minutes right:

As Mike Wallace — who had received an advance copy of Fatal Vision without difficulty or a lecture — read out loud to MacDonald passages in which he was portrayed as a psychopathic killer, the camera recorded his look of shock and utter discomposure.

Wallace was reading MacDonald’s own notes to his attorney back to him, not McGinniss’s book. These were not McGinniss’s passages in which MacDonald was “portrayed as a psychopathic killer,” but MacDonald’s own words, set down in an attempt to establish his Eskatrol use. Did Malcolm have a transcript of the 60 Minutes segment, now readily available online, back in 1990? Or is it possible that MacDonald’s notes to his attorney had fused so perfectly with McGinniss’s book that the two became indistinguishable?

This raises important questions over whether any journalist can ever get the facts entirely right, no matter how fair-minded the intentions. It is one thing to be the hero of one’s own story, but it is quite another to know that, even if she believes herself to be morally or factually in the clear, the journalist is doomed to twist the truth to serve her purposes.

It obviously helps to be transparent about one’s bias. At one point in The Journalist and the Murderer, Malcolm is forthright enough to confess that she is struck by MacDonald’s physical grace as he breaks off pieces of tiny powdered sugar doughnuts. This is the kind of observational detail often inserted in lengthy celebrity profiles to “humanize” a Hollywood actor uttering the same calcified boilerplate rattled off to every roundtable junketeer. But if such a flourish is fluid enough to apply to MacDonald, we are left to wonder how Malcolm’s personal connection interferes with her purported journalistic objectivity. In the same paragraph, Malcolm neatly notes the casual abuse MacDonald received in his mailbox after McGinniss’s book was published — in particular, a married couple who read Fatal Vision while on vacation and took the time to write a hateful letter while sunbathing at the Sheraton Waikiki Hotel. This casual cruelty illustrates how the reader can be just as complicit as the opportunistic journo in perpetuating an incomplete or slanted portrait.

The important conundrum that Malcolm imparts in her short and magnificently complicated volume is why we bother to read or write journalism at all if we know the game is rigged. The thorny morality can extend to biography (Malcolm’s The Silent Woman is another excellent book which sets forth the inherent and surprisingly cyclical bias in writing about Sylvia Plath). And even when the seasoned journalist is aware of ethical discrepancies, the judgmental pangs will still crop up. In “A Girl of the Zeitgeist” (contained in the marvelous collection, Forty-One False Starts), Malcolm confessed her own disappointment in how Ingrid Sischy failed to live up to her preconceptions as a bold and modern woman. Malcolm’s tendentiousness may very well be as incorrigible as McGinniss’s, but is it more forgivable because she’s open about it?

* * *

It can be difficult for Janet Malcolm’s most ardent advocates to detect the fine grains of empathy carefully lining the crisp and meticulous forms of her svelte and careful arguments, which are almost always sanded against venal opportunists. Malcolm’s opponents, who have recently included Esquire‘s Tom Junod, Errol Morris, and other middling men inexplicably intimidated by smarter women, have attempted to paint Malcolm as a hypocrite, an opportunist, and a self-loathing harpy of the first order. Junod wrote that “it’s clear to anyone who reads her work that very few journalists are more animated by malice than Janet Malcolm” and described her as “a self-hater whose work has managed to speak for the self-hatred” of journalism. Yet Junod cannot cite any examples of this self-hate and malice, save for the purported Henny Youngman-like sting of her one-liners (Malcolm is not James Wolcott; she is considerably more thoughtful and interesting) and for pointing out, in Iphigenia in Forest Hills, how trials “offer unique opportunities for journalistic heartlessness,” while failing to observe that Malcolm showed how words or evidence lifted out of context can be used to condemn or besmirch those who are innocent until proven guilty, even as she owns up to her own biases and her desire to interfere.

Malcolm is not as relentless as her generational peer Renata Adler, but she is just as refreshingly formidable. She is as thorough with her positions and almost as misunderstood. She has made many prominent enemies for her controversial positions — even fighting a ten-year legal battle against Jeffrey Masson over the authenticity of his quotations (dismissed initially by a federal judge in California on the grounds that there was an absence of malice). Adler was ousted from The New Yorker, but Malcolm was not. In the last few years, both have rightfully found renewed attention among a new generation.

One origin for the anti-Malcolm assault is John Taylor’s 1989 New York Magazine article, “Holier than Thou,” which is perhaps singularly responsible for making it mandatory for any mention of The Journalist and the Murderer to include its infamous opening line: “Every journalist who is not too stupid or too full of himself to notice what is going on knows that what he does is morally indefensible.” Taylor excoriated Malcolm for betraying McGinniss as a subject, dredged up the Masson claims, and argued that Malcolm used Masson much as McGinniss had used MacDonald. It does not occur to Taylor that Malcolm herself may be thoroughly familiar with what went down and that the two lengthy articles which became The Journalist and the Murderer might indeed be an attempt to reckon with the events that caused the fracas:

“Madame Bovary, c’est moi,” Flaubert said of his famous character. The characters of nonfiction, no less than those of fiction, derive from the writer’s most idiosyncratic desires and deepest anxieties; they are what the writer wishes he was and worries that he is. Masson, c’est moi.

Similarly, Evan Hughes had difficulty grappling with this idea, caviling over the “bizarre stance” of Malcolm not wanting to be “oppressed by the mountain of documents that formed in my office.” He falsely infers that Malcolm has claimed that “it is pointless to learn the facts to try to get to the bottom of a crime,” not parsing Malcolm’s clear distinction between evidence and the journalist’s ineluctable need to realize characters on the page. No matter how faithfully the journalist sticks with the facts, a journalistic subject becomes a character because the narrative exigencies demand it. Errol Morris can find Malcolm’s stance “disturbing and problematic” as much as he likes, but he is the one who violated the journalistic taboo of paying subjects for his 2008 film, Standard Operating Procedure, without full disclosure. One of Morris’s documentary subjects, Joyce McKinney, claimed that she was tricked into giving an interview for what became Tabloid, alleging that one of Morris’s co-producers broke into her home with a release form. Years before Morris proved triumphant in an appellate court, he took to Twitter to air his grievances.

The notion of something “unvarnished” attached to a personal account may have originated with Shakespeare:

And therefore little shall I grace my cause
In speaking for myself. Yet, by your gracious patience,
I will a round unvarnished tale deliver
Of my whole course of love. What drugs, what charms,
What conjuration and what mighty magic—
For such proceeding I am charged withal—
I won his daughter.
Othello, Act 1, Scene 3

Othello hoped that in telling “a round unvarnished tale,” he would be able to come clean with Brabantio over why he had eloped with the senator’s daughter Desdemona. He wished to be straightforward. It’s an extremely honorable and heartfelt gesture that has us very much believing in Othello’s eloquence. Othello was very lucky not to be speaking with a journalist, who surely would have used his words against him.

Next Up: Truman Capote’s In Cold Blood!

[Photo: a working replica of a Galton box, or quincunx]

The Taming of Chance (Modern Library Nonfiction #98)

(This is the third entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Operating Instructions.)

In the bustling beginnings of the twentieth century, the ferociously independent mind who forever altered the way in which we look at the universe was living in poverty.* His name was Charles Sanders Peirce and he’d anticipated Heisenberg’s uncertainty principle by a few decades. In 1892, Peirce examined what he called the doctrine of necessity, which held that every single fact of the universe was determined by law. Before Peirce came along, there were several social scientists who were determined to find laws in everything — whether it be an explanation for why you parted your hair at a certain angle with a comb, felt disgust towards specific members of the boy band One Direction, or ran into an old friend at a restaurant one hundred miles away from where you both live. Peirce declared that absolute chance — that is, spontaneity or anything we cannot predict before an event, such as the many fish that pelted upon the heads of puzzled citizens in Shasta County, California on a January night in 1903 — is a fundamental part of the universe. He concluded that even the careful rules discovered by scientists only come about because, to paraphrase Autolycus from The Winter’s Tale, although humans are not always naturally honest, chance sometimes makes them so.

The story of how Peirce’s brave stance was summoned from the roiling industry of men with abaci and rulers is adeptly set forth in Ian Hacking’s The Taming of Chance, a pleasantly head-tingling volume that I was compelled to read twice to ken the fine particulars. It’s difficult to articulate how revolutionary this idea was at the time, especially since we now live in an epoch in which much of existence feels preordained by statistics. We have witnessed Nate Silver’s demographic models anticipate election results and, as chronicled in Moneyball, player performance analysis has shifted the way in which professional baseball teams select their roster and steer their lineup into the playoffs, adding a strange computational taint that feels as squirmy as performance enhancing drugs.

But there was a time in human history in which chance was considered a superstition of the vulgar, even as Leibniz, seeing that a number of very smart people were beginning to chatter quite a bit about probability, argued that the true measure of a Prussian state resided in how you tallied the population. Leibniz figured that if Prussia had a central statistic office, it would not only be possible to gauge the nation’s power but perhaps lead to certain laws and theories about the way these resources worked.

This was obviously an idea that appealed to chin-stroking men in power. One does not rule an empire without keeping the possibility of expansion whirling in the mind. It didn’t take long for statistics offices to open and enthusiasts to start counting heads in faraway places. (Indeed, much like the early days of computers, the opening innovations originated from amateurs and enthusiasts.) These early statisticians logged births, deaths, social status, the number of able-bodied men who might be able to take up weapons in a violent conflict, and many other categories suggested by Leibniz (and others that weren’t). And they didn’t just count in Prussia. In 1799, Sir John Sinclair published a 21-volume Statistical Account of Scotland that undoubtedly broke the backs of many of the poor working stiffs who were forced to carry these heavy tomes to the guys determined to count it all. Some of the counters became quite obsessive in their efforts. Hacking reports that Sinclair, in particular, became so sinister in his efforts to get each minister of the Church of Scotland to provide a detailed congregation schedule that he began making threats shrouded in a jocose tone. Perhaps the early counters needed wild-eyed, dogged advocates like Sinclair to establish an extremely thorough baseline.

The practice of heavy-duty counting resulted, as Hacking puts it, in a bona fide “avalanche of numbers.” Yet the intersection of politics and statistics created a considerable fracas. Hacking describes the bickering and backbiting that went down in Prussia. What was a statistical office? Should we let the obsessive amateurs run it? Despite all the raging egos, bountiful volumes of data were published. And because there was a great deal of paper being shuffled around, cities were compelled by an altogether different doctrine of necessity to establish central statistical hubs. During the 1860s, statistical administrations were set up in Berlin, New York, Stockholm, Vienna, Rome, Leipzig, Frankfurt-am-Main, and many other cities. But from these central offices emerged an East/West statistics turf war, with France and England playing the role of Biggie on the West and Prussia as Tupac on the East. The West believed that a combination of individual competition and natural welfare best served society, while the East created the welfare state to solve these problems. And these attitudes, which Hacking is good enough to confess as caricaturish even as he illustrates a large and quite important point, affected the way in which statistics were perceived. If you believe in a welfare state, you’re probably not going to see laws forged from the printed numbers, because numbers are all about individual action. And if you believe in the Hobbesian notion of free will, you’re going to look for statistical laws in the criminal numbers, because laws are formed by individuals. This created new notions of statistical fatalism. It’s worth observing that science at the time was also expected to account for morality.

Unusual experiments ensued. What, for example, could the chest circumference of a Scotsman tell us about the stability of the universe? (Yes, the measurement of Scottish chests was seriously considered by a Belgian guy named Adolphe Quetelet, who was trying to work out theories about the average man. When we get to Stephen Jay Gould’s The Mismeasure of Man several years from now, #21 in the Modern Library Nonfiction canon, I shall explore more pernicious measurement ideas promulgated as “science.” Stay tuned!) More nefariously, if you could chart the frequency of how often the working classes called in sick, perhaps you could establish laws to determine who was shirking duty, track the unruly elements, and punish the agitators interfering with the natural law. (As we saw with William Lamb Melbourne’s story, the British government was quite keen to crack down on trade unions during the 1830s. So just imagine what a rabid ideologue armed with a set of corrupted and unproven “laws” could do. In fact, we don’t even have to jump that far back in time. Aside from the obvious Hollerith punch card example, one need only observe the flawed radicalization model presently used by the FBI and the DHS to crack down on Muslim “extremists.” Arun Kundnani’s recent book, The Muslims Are Coming, examines this issue further. And a future Bat Segundo episode featuring Kundnani will discuss this dangerous approach at length.)

Throughout all these efforts to establish laws from numbers (Newton’s law of gravity had inspired a league of scientists to seek a value for this new G constant, a process that took more than a century), Charles Babbage, Johann Christian Poggendorf, and many others began publishing tables of constants. It is one thing to publish atomic weights. It is quite another to measure the height, weight, pulse, and breath of humans by gender and ethnicity (along with animals). The latter constant sets are clearly not as objective as Babbage would like to believe. And yet the universe does adhere to certain undeniable principles, especially when you have a large data set.

It took juries for mathematicians to understand how to reconcile large numbers with probability theory. In 1808, Pierre-Simon Laplace became extremely concerned with the French jury system. At the time, twelve-member juries convicted an accused citizen by a simple majority. He calculated that a seven-to-five majority had a chance of error of one in three. The French code had adopted the unusual method of creating a higher court of five judges to step in if there was a disagreement with a majority verdict in the lower court. In other words, if the majority of the judges in the higher court agreed with the minority of jurors in the lower court that an accused person should be acquitted, then the accused person would be acquitted. Well, this complicated system bothered Laplace. Accused men often faced execution in the French courts. So if there was a substantial chance of error, then the system needed to be reformed. Laplace began to consider juries composed of different sizes and verdicts ranging from total majority (12:0) to partial majority (9:3, 8:4), and he computed the following odds (which I have reproduced from a very helpful table in Hacking’s book):

[Table: Laplace’s computed chances of jury error for various jury sizes and majorities, reproduced from Hacking]

The problems here become self-evident. You can’t have 1,001 people on a jury arguing over the fate of one man. On the other hand, you can’t have a 2/7 chance of error with a jury of twelve. (One of Laplace’s ideas was a 144 member jury delivering a 90:54 verdict. This involved a 1/773 chance of error. But that’s nowhere nearly as extreme as a Russian mathematician named M.V. Ostrogradsky, who wasted much ink arguing that a 212:200 majority was more reliable than a 12:0 verdict. Remember all this the next time you receive a jury duty notice. Had some of Laplace’s understandable concerns been more seriously considered, there’s a small chance that societies could have adopted larger juries in the interest of a fair trial.)
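Laplace’s arithmetic can be reconstructed under his standard assumptions: each juror is independently correct with some probability p, and p itself is taken as uniformly distributed between 1/2 and 1. The sketch below is mine, not Hacking’s (his table may rest on slightly different assumptions), and the function name `error_probability` is invented for illustration; it computes the exact posterior chance that a given majority got it wrong.

```python
from fractions import Fraction
from math import comb

def error_probability(majority: int, minority: int) -> Fraction:
    """Chance that a majority verdict is wrong, assuming each juror is
    independently correct with probability p, with p uniform on [1/2, 1]."""
    def integral(a: int, b: int) -> Fraction:
        # Exact value of the integral of p^a * (1-p)^b over [1/2, 1],
        # computed via the binomial expansion of (1-p)^b.
        return sum(
            Fraction(comb(b, k) * (-1) ** k, a + k + 1)
            * (1 - Fraction(1, 2) ** (a + k + 1))
            for k in range(b + 1)
        )

    right = integral(majority, minority)  # weight on "the majority is correct"
    wrong = integral(minority, majority)  # weight on "the majority is wrong"
    return wrong / (right + wrong)

print(float(error_probability(7, 5)))   # roughly 0.29, near the 2/7 quoted above
print(error_probability(12, 0))         # a unanimous twelve: 1/8192
```

Under this model a split 7:5 verdict is wrong nearly a third of the time, while unanimity shrinks the error to one in thousands, which is exactly why Laplace fretted over simple-majority convictions.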

French law eventually changed the minimum conviction from 7:5 to 8:4. But it turned out that there was a better method to allow for a majority jury verdict. It was a principle that extended beyond mere frequency and juror reliability, taking into account Bernoulli’s ideas on drawing black and white balls from an urn to determine a probability value. It was called the law of large numbers. And the great thing is that you can observe this principle in action through a very simple experiment.

Here’s a way of seeing the law of large numbers in action. Take a quarter and flip it. Write down whether the results are heads or tails. Do it again. Keep doing this and keep a running tally of how many times the outcome is heads and how many times the coin comes up tails. For readers who are too lazy to try this at home, I’ve prepared a video and a table of my coin toss results:

[Video and table: my coin toss results]

The odds of a fair coin toss are 1:1. On average, the coin will turn up heads 50% of the time and tails 50% of the time. As you can see, while my early tosses leaned heavily towards heads, by the time I had reached the eighteenth toss, my results had skewed closer to 1:1 (in this case, 5:4) as I continued to toss the coin. Had I continued tossing, the ratio would have drifted ever closer to 1:1.
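For readers who would rather not flip a quarter a million times, the experiment above can be simulated in a few lines of Python. This is a sketch of my own; the helper name and the fixed seed are mine, chosen only so that the run is reproducible.

```python
import random

def heads_fraction(tosses: int, seed: int = 0) -> float:
    """Toss a simulated fair coin `tosses` times; return the share of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(tosses))
    return heads / tosses

# As the number of tosses grows, the share of heads drifts toward 0.5,
# even though any short early run can lean heavily one way --
# the law of large numbers at work.
for n in (10, 100, 10_000, 1_000_000):
    print(n, round(heads_fraction(n), 4))
```

Run it and the printed fractions wobble at small n but hug 0.5 ever more tightly as n climbs, just as the quarter did.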


The law of large numbers offered the solution to Laplace’s predicament. It also accounts for the mysterious picture at the head of this essay. That image is a working replica of a Galton box (also known as a quincunx). (If you’re ever in Boston, go to the Museum of Science and you can see a very large working replica of a Galton box in action.) Sir Francis Galton needed a very visual method of showing off the central limit theorem. So he designed a box, not unlike a pachinko machine, in which beans are dropped from the top and work their way down through a series of wooden pins, which cause them to fall along a random path. Most of the beans land in the center. Drop more beans and you will see a natural bell curve form, illustrating the law of large numbers and the central limit theorem.
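Galton’s box is also easy to simulate: each pin is a fair left-or-right choice, and a bean’s final bin is simply its count of rightward bounces. The following sketch (function name and parameters are my own invention) prints a crude text histogram in which the familiar bell shape emerges.

```python
import random
from collections import Counter

def galton(beans: int, rows: int = 10, seed: int = 1) -> Counter:
    """Drop `beans` through `rows` of pins. Each pin knocks a bean left or
    right with equal probability, so a bean's final bin is its number of
    rightward bounces, and the bin counts trace a binomial bell curve."""
    rng = random.Random(seed)
    bins = Counter()
    for _ in range(beans):
        bins[sum(rng.random() < 0.5 for _ in range(rows))] += 1
    return bins

# Crude histogram: the middle bins collect most of the beans.
bins = galton(5000)
for k in range(11):
    print(f"{k:2d} {'#' * (bins[k] // 25)}")
```

With ten rows of pins, the center bin should collect roughly a quarter of the beans while the extreme bins stay nearly empty, which is the central limit theorem made visible in hash marks.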

Despite all this, there was still the matter of statistical fatalism to iron out, along with an understandable distrust of statistics among artists and the general population, which went well beyond Disraeli’s infamous “There are three kinds of lies: lies, damned lies, and statistics” quote. Hacking is a rigorous enough scholar to reveal how Dickens, Dostoevsky, and Balzac were skeptical of utilitarian statistics, and they had every reason to be, given how heavily philosophers leaned on determinism. (See also William James’s “The Dilemma of Determinism.”) Balzac, in particular, delved into “conjugal statistics” in his Physiology of Marriage to deduce the number of virtuous women. A German philosopher named Ernst Cassirer was a big determinism booster, pinpointing its beginnings in 1872. Hacking challenges Cassirer by pointing out that determinism incorporated the doctrine of necessity earlier, in the 1850s, an important distinction in returning to Peirce’s idea of absolute chance.

I’ve been forced to elide a number of vital contributors to probability, as well as some French investigations into suicide, in an attempt to convey Hacking’s intricate narrative. But the one word that made Peirce’s contributions so necessary was “normality.” This was the true danger of statistical ideas being applied to the moral sciences. When “normality” became the ideal, it was greatly desirable to extirpate anything “abnormal” or “aberrant” from the grand human garden, even though certain crime rates were indeed quite normal. We see similar zero tolerance measures practiced today by certain regressive members of law enforcement and, more recently, in New York Mayor Bill de Blasio’s impossible pledge to rid New York City of all traffic deaths by 2024. As the law of large numbers and Galton’s box demonstrate, some statistics are inevitable. Yet it was also important for Peirce to deny the doctrine of necessity. Again, Peirce pointed out that, without chance, we could not have had all these laws in the first place.

It was strangely comforting to learn that, despite all the nineteenth century innovations in mathematics and probability, chance remains very much a part of life. Yet when one begins to consider stock market algorithms (and the concomitant flash crashes), as well as our collective willingness to impart voluminous personal data to social media companies who are sharing these numbers with other data brokers, I cannot help but ponder whether we are willfully submitting to another “law of large numbers.” Chance may favor the prepared mind, as Pasteur once said. So why court predictability?

* Peirce’s attempts to secure academic employment and financial succor were thwarted by a Canadian scientist named Simon Newcomb. (A good overview of the correspondence between the two men can be found at the immensely helpful “Peirce Gateway” website.)

Next Up: Janet Malcolm’s The Journalist and the Murderer!


Operating Instructions (Modern Library Nonfiction #99)

(This is the second entry in The Modern Library Nonfiction Challenge, an ambitious project to read and write about the Modern Library Nonfiction books from #100 to #1. There is also The Modern Library Reading Challenge, a fiction-based counterpart to this list. Previous entry: Melbourne.)

It is easy to forget, as brave women document their battles with cancer and callous columnists bully them for their candor, that our online confessional age didn’t exist twenty years ago. I suspect this collective amnesia is one of the reasons why Anne Lamott’s Operating Instructions — almost an urtext for mommy blogs and much of the chick lit that followed — has been needlessly neglected by snobbish highbrow types, even when hungry young writers rushed to claim transgressive land in the Oklahoma LiveJournal Run of 2006.

Lamott’s book, which is a series of honed journal entries penned from the birth of her son Sam to his first birthday, was ignored by the New York Times Book Review upon its release in 1993 (although Ruth Reichl interviewed her for the Home & Garden section after the book, labeled “an eccentric baby manual” by Reichl, became a bestseller). Since then, aside from its distinguished inclusion on the Modern Library list, it has not registered a blip among those who profess to reach upward. Yet if we can accept Karl Ove Knausgaard’s honesty about fatherhood in the second volume of his extraordinary autobiographical novel, My Struggle, why then do we not honor Anne Lamott? It is true that, like Woody Allen in late career, Lamott has put out a few too many titles. It is also true that she attracts a large reading audience, a sin as unpardonable to hoity-toity gasbags as a man of the hoi polloi leaving the toilet seat up. Much as the strengths of Jennifer Weiner’s fiction are often dwarfed by her quest for superfluous respect, Anne Lamott’s acumen for sculpting the familiar through smart and lively prose doesn’t always get the credit it deserves.

Operating Instructions — with its breezy pace, its populist humor, and its naked sincerity — feels at first to be a well-honed machine guaranteed to attract a crowd of likeminded readers. But once you start looking under the hood, you begin to understand how careful Lamott is with what she doesn’t reveal. It begins with the new baby’s name. We are informed that Samuel John Stephen Lamott’s name has been forged from Lamott’s brothers, John and Steve. But where does the name Samuel come from? And why is Lamott determined to see Sams everywhere? (A one-armed Sam, the son of a friend named Sam, et al.) There are murky details about Sam’s father, who flits in and out of the narrative like some sinister figure with a twirling moustache. He is six foot four and two hundred pounds. He is in his mid-fifties, an older man who Lamott had a fling with. We learn later in the book that he “filed court papers today saying that we never fucked and that he therefore cannot be the father.” Even so, what’s his side of the story?

This leaves Lamott, struggling for cash and succor, raising Sam on her own with a dependable “pit crew” of friends. Yet one is fascinated not only by Lamott’s unshakable belief that she will remain a single parent for the rest of her natural life (“there is nothing I can do or say that will change the fact that his father chooses not to be his father. I can’t give him a dad, I can’t give him a nuclear family”), but by how the absence of this unnamed father causes her to dwell on her own father’s final days.

Lamott’s father was a writer who “died right as I crossed the threshold into publication.” His brain cancer was so bad that he could barely function in his final days. Lamott describes leaving her father in the car with a candy bar as she hits the bank. Her father escapes the car, becoming a “crazy old man pass[ing] by, his face smeared with chocolate, his blue jeans hanging down in back so you could see at least two inches of his butt, like a little boy’s.” It is a horrifying image of a man Lamott looked up to regressing into childhood before the grave, leaving one to wonder if this has ravaged Lamott’s view of men — especially since she repeatedly chides the apparent male relish of peeing standing up — and what idea of manhood she will pass along to her growing boy.

Part of me loves and respects men so desperately, and part of me thinks they are so embarrassingly incompetent at life and in love. You have to teach them the very basics of emotional literacy. You have to teach them how to be there for you, and part of me feels tender toward them and gentle, and part of me is so afraid of them, afraid of any more violation. I want to clean out some of these wounds, though, with my therapist, so Sam doesn’t get poisoned by all my fear and anger.

This is an astonishing confession for a book that also has Lamott tending to Sam’s colic, describing the wonders of Sam’s first sounds and movement, and basking in the joys of a human soul emerging in rapid increments. Motherhood has long been compared to war, to the point where vital discussions about work-family balance have inspired their own “mommy wars.” Operating Instructions invokes heroes, Nagasaki, Vietnam, and other language typically muttered by priapic military historians. Yet Lamott’s take also reveals a feeling that has become somewhat dangerous to express in an era of mansplaining, Lulu hashtags, and vapid declarations of “the end of men.” Men are indeed embarrassing, but are they an ineluctable part of motherhood? It is interesting that Lamott broaches this question long after Sam’s father has become a forgettable presence in the book. And yet months later, Lamott is more grateful for the inherited attributes of the “better donor” in “the police lineup of my ex-boyfriends”:

He’s definitely got his daddy’s thick, straight hair, and, God, am I grateful for that. It means he won’t have to deal with hat hair as he goes through life.

Throughout all this, Lamott continues to take in Sam. He is at first “just a baby,” some human vehicle that has just left the garage of Lamott’s belly:

The doctor looked at the baby’s heartbeat on the monitor and said dully, “The baby’s flat,” and I immediately assumed it meant he was dead or at least retarded from lack of oxygen. I don’t think a woman would say anything like that to a mother. “Flat?” I said incredulously. “Flat?” Then he explained that this meant the baby was in a sleep cycle.

But as Sam occupies a larger space in Lamott’s life, there is an innate ecstasy in the way she describes his presence. Sam is “a breathtaking collection of arms and knees,” “unbelievably pretty, with long, thin, Christlike feet,” and “an angel today…all eyes and thick dark hair.” We’re all familiar with the way that new parents gush about their babies, yet Lamott is surprisingly judicious in tightening the pipe valve. Even as she declares the inevitable epithets of frustration (“Go back to sleep, you little shit”) and trivializes Sam (“I thought it would be more like getting a cat”), Sam’s beauty is formidable enough to spill elsewhere, such as this description of a mountain near Bolinas:

So we were driving over the mountain, and on our side it was blue and sunny, but as soon as we crested, I could see the thickest blanket of fog I’ve ever seen, so thick it was quilted with the setting sun shining upward from underneath it, and it shimmered with reds and roses, and above were radiant golden peach colors. I am not exaggerating this. I haven’t seen a sky so stunning and bejeweled and shimmering with sunset colors and white lights since the last time I took LSD, ten years ago.

Being a mother may be akin to a heightened narcotic experience, but that doesn’t have to stop you from feeling.

* * *

I suggested earlier that Operating Instructions serves as a precursor to the mommyblog, but this doesn’t just extend to the time-stamp. There is something about setting down crisp observations while the baby is napping that inspires an especially talented writer to find imaginative similes, especially in commonplace activities which those who are not mothers willfully ignore or take for granted. Compare Lamott and Dooce’s Heather Armstrong (perhaps the best-known of the mommy bloggers) as they describe contending with a breast pump:

“You feel plugged into a medieval milking machine that turns your poor little gumdrop nipples into purple slugs with the texture of rhinoceros hide.” — Anne Lamott, 1993

“I end up lying on my back completely awake as my breasts harden like freshly poured cement baking in the afternoon sun.” — Dooce, February 23, 2004

Armstrong has sedulously avoided invoking Lamott’s name in more than a decade of blogging, but, in both cases, we see just enough imagery squirted into the experience for the reader to feel the struggle. Both Lamott and Armstrong have rightly earned a large readership for describing ordinary situations in slightly surreal (and often profane) terms. (Both writers are also marked in their own ways by religion. Lamott came to Christianity after a wild life that involved alcoholism. Armstrong escaped Mormonism, alluding to a period as an “unemployed drunk” before meeting her husband, whom she subsequently divorced.)

Next Up: Ian Hacking’s The Taming of Chance!


The Modern Library Nonfiction Challenge

Just under three years ago, I began the Modern Library Reading Challenge. It was an ambitious alternative to a spate of eccentric reading challenges then making the rounds. These included such gallant reading missions as the Chunkster, the Three Card Monte/Three Sisters Bronte, the Read All of Shakespeare While Blindfolded Challenge, and the Solzhenitsyn Russian Roulette Challenge. It took a fairly eccentric person to place the literary embouchure ever so nobly to one’s lips and fire off a fusillade of euphonic Prince Pless bliss into the trenchant air. But I was game.

In my case, the idea was to write at least 1,000 words on each title after reading it. The hope was to fill in significant reading gaps while also cutting an idiosyncratic course across the great works of 20th century literature, with other intrepid readers walking along with me.

Over the next twenty-three months, I steadily worked my way through twenty-three works of fiction. Some of the books were easy to find. Some required elaborate trips to exotic bookstores in far-off states. When I checked out related critical texts and biographies from the New York Public Library, I was often informed by the good librarians at the Mid-Manhattan branch that I was the first soul to take these tomes home in sixteen years. This surprised me. New York was a city with eight million people. Surely there had to be more curiosity seekers investigating these authors. But I discovered that some of these prominent authors had been severely neglected. When I got to The Old Wives’ Tale, Arnold Bennett was so overlooked that I appeared to be the first person to upload a photo of reasonable resolution (which I had taken from a public domain image published in a biography) onto the Internet.

There were other surprises. I became an Iris Murdoch obsessive. I was finally able to overcome my youthful indiscretions and appreciate The Adventures of Augie March as the masterpiece that it was. My mixed feelings on Brideshead Revisited proved controversial in some circles and caused at least one academic to condemn me. On the other hand, I also sparked an online friendship with Stephen Wood, who was also working his way through the Mighty 100, and was put into contact with an extremely small yet determined group of enthusiasts making similar reading attempts in various corners of the world.

Yet when I told some people about my project, it was considered strange or sinister. When I mentioned the Modern Library Reading Challenge to a much older writer, she was stunned that anyone my age would go to the trouble of Lawrence Durrell. (And she liked Durrell!) Her quizzical look led me to wonder if she was going to send me to some shady authority to administer a second opinion.

One of the project’s appeals was its methodical approach: I was determined to read all the books from #100 to #1. This not only provided a healthy discipline, but it ensured that I wouldn’t push the least desired books to the end. Much as life provides you with mostly happy and sometimes unpleasant tasks to fulfill as they arrive, I felt that my reading needed to maintain a similar commitment. This strategy also created a vicarious trajectory for others to follow.

Everything was going well. Very well indeed. Henry Green. Jean Rhys. The pleasant surprise of The Ginger Man. With these authors, how could it not go well? I was poised to read to the finish line. I was zooming my Triumph TR6 down a hilly two-lane highway with a full tank of gas. Cranking loud music. Not a care in the world.

And then I hit Finnegans Wake.

To call Finnegans Wake “difficult” is a woefully insufficient description. This is a book that requires developing an ineluctably Talmudic approach. But I am not easily fazed. Finnegans Wake is truly a book of grand lexical riches, even if I remain permanently stalled within the voluble tendrils of its first 50 pages. I have every intention of making my way through Finnegans Wake. I have reread Dubliners, A Portrait of the Artist as a Young Man, and Ulysses. I have consulted numerous reference texts. I have even listened to all 176 episodes of Frank Delaney’s excellent podcast, Re: Joyce. These have all been pleasant experiences, but I am still not sure if any of this significantly contributes to my understanding of Finnegans Wake. However, when you do something difficult, it is often best to remain somewhat clueless. If you become more aware of how “impossible” something may be, then you may not see it through to the end. Joyce remains the Everest of English literature. I am determined to scale the peak, even if I’m not entirely sure how reliable my gear is.

The regrettable Finnegans Wake business has also meant that the Modern Library Reading Challenge has been stuck on pause. It has been eleven months since I published a new Modern Library installment on these pages. And while I have certainly stayed busy during this time (I have made a documentary about Gary Shteyngart’s blurbs, attempted to fund a national walk that I intend to fulfill one day, canceled and brought back The Bat Segundo Show, and created a new thematic radio program, among other industries), I have long felt that persistent progress — that is, an efflorescent commitment to a regular fount of new material — is the best way to stay in shape and keep any project alive.

I have also had a growing desire to read more nonfiction, especially as the world revealed itself to be a truly maddening and perilous place as the reading challenge unfolded. Some have sought to keep their heads planted beneath the ground like quavering ostriches. There are far too many adults I know, now well into their thirties, who remain distressingly committed to the “La la la I can’t hear you!” school of taking in bad news. But I feel that understanding how historical and social cycles (Vico, natch) cause human souls to saunter down dark and treacherous roads also allows us to comprehend certain truths about our present age. To carry on in the world without a sense of history, without even a cursory understanding of ideas and theories that have been attempted or considered before, is to remain a rotten vegetable during the more important moments of life.

It turns out that the Modern Library has another list of one hundred titles devoted to nonfiction. And the nonfiction list is, to my great surprise, more closely aligned to the fiction list than I anticipated.

In 1998, the Washington Post’s David Streitfeld revealed that the Modern Library fiction list was plagued by modest scandal. The ten august Modern Library board members behind the fiction list had no knowledge of who had voted for what, why the books were ranked the way they were, or how the list had been composed, with many of the rankings more perfunctory than anyone knew. Brave New World, for example, had muscled its way up to #5, but only because many of the judges believed that it needed to be somewhere on the list.

So when the Modern Library gang devoted its attention to a nonfiction list, it was, as Salon’s Craig Offman reported, determined not to repeat many of the same mistakes. University of Chicago statistics professor Albert Madansky was signed on to ensure a more dutiful ranking process. Younger authors and more women were included among the board. Madansky went to the trouble of creating a computer algorithm so that there could be no ties. But the new iron-fist approach had some drawbacks. There was a new rule that an author could only have one title on the list, which meant that Edmund Wilson’s To the Finland Station didn’t make the cut. And when the top choice was announced — The Education of Henry Adams — the crowd stayed silent. It was rumored that one board member scandalously played with his dentures as the titles were called.

Perhaps the Modern Library’s second great experiment reveals the unavoidable pointlessness behind these lists. As novelist Richard Powers recently observed in a National Book Critics Circle post, “The reading experiences I value most are the ones that shake me out of my easy aesthetic preferences and make the favorites game feel like a talent show in the Iroquois Theater just before the fire. Give me the not-yet likable, the unhousebroken, something that is going to throw my tastes in a tizzy and make my self-protecting Tops of the Pops slink away in shame.”

On the other hand, if it takes anywhere from five to ten years to get through a hundred titles, then the reader is immune to this problem. Today’s favorites may be tomorrow’s dogs, and yesterday’s lackluster choices may be tomorrow’s crown jewels. As the Modern Library reader grows older, there’s nearly a built-in guarantee that these preordained tastes will become passé at some point. (To wit: Lord David Cecil’s Melbourne, the first book I will be reading for this new challenge, is now out of print.)

So I have decided to take up the second challenge, reading the nonfiction list from #100 to #1. Modern Library Nonfiction Challenge titles shall flow from these pages as I slowly make my way through Finnegans Wake during the first challenge. Hopefully, once the disparity between the two challenges has been worked out, I will eventually make steady progress on the fiction and nonfiction fronts. But the nonfiction challenge won’t be a walk in the park either. It has its own Finnegans Wake at #23. I am certain that Principia Mathematica will come close to destroying my brain. But as I said three years ago, I plan to read forever or die trying.

To prevent confusion for longtime readers, the fiction challenge will be separated from the nonfiction challenge by color. Fiction titles shall carry on in red. Nonfiction titles will be in gold.

I’ve started to read Melbourne and I’m hoping to have the first Modern Library Nonfiction Challenge essay up before the end of the month. This page, much like the fiction list, will serve as an index. I will add the links and the dates as I read the books. I hope that these efforts will inspire more readers to take up the challenge. (And if you do end up reading along, don’t be a stranger!)

Now let’s get this party started. Here are the titles:

100. Melbourne, Lord David Cecil (December 27, 2013)
99. Operating Instructions, Anne Lamott (January 14, 2014)
98. The Taming of Chance, Ian Hacking (March 23, 2014)
97. The Journalist and the Murderer, Janet Malcolm (July 17, 2014)
96. In Cold Blood, Truman Capote (November 11, 2015)
95. The Promise of American Life, Herbert Croly (January 21, 2016)
94. The Contours of American History, William Appleman Williams (February 7, 2016)
93. The American Political Tradition, Richard Hofstadter (February 18, 2016)
92. The Power Broker, Robert A. Caro (May 11, 2016)
91. Shadow and Act, Ralph Ellison (November 3, 2016)
90. The Golden Bough, James George Frazer (13 volumes, Third Edition) (November 14, 2016)
89. Pilgrim at Tinker Creek, Annie Dillard (November 23, 2016)
88. Six Easy Pieces, Richard P. Feynman (November 30, 2016)
87. A Mathematician’s Apology, G.H. Hardy (December 3, 2016)
86. This Boy’s Life, Tobias Wolff
85. West with the Night, Beryl Markham
84. A Bright Shining Lie, Neil Sheehan
83. Vermeer, Lawrence Gowing
82. The Strange Death of Liberal England, George Dangerfield
81. The Face of Battle, John Keegan
80. Studies in Iconology, Erwin Panofsky
79. The Rise of Theodore Roosevelt, Edmund Morris
78. Why We Can’t Wait, Martin Luther King Jr.
77. Battle Cry of Freedom, James M. McPherson
76. The City in History, Lewis Mumford
75. The Great War and Modern Memory, Paul Fussell
74. Florence Nightingale, Cecil Woodham-Smith
73. James Joyce, Richard Ellmann
72. The Gnostic Gospels, Elaine Pagels
71. The Rise of the West, William H. McNeill
70. The Strange Career of Jim Crow, C. Vann Woodward
69. The Structure of Scientific Revolutions, Thomas S. Kuhn
68. The Gate of Heavenly Peace, Jonathan D. Spence
67. A Preface to Morals, Walter Lippmann
66. Religion and the Rise of Capitalism, R.H. Tawney
65. The Art of Memory, Frances A. Yates
64. The Open Society and Its Enemies, Karl Popper
63. The Sweet Science, A.J. Liebling
62. The House of Morgan, Ron Chernow
61. Cadillac Desert, Marc Reisner
60. In the American Grain, William Carlos Williams
59. Jefferson and His Time, Dumas Malone (6 volumes)
58. Out of Africa, Isak Dinesen
57. The Second World War, Winston Churchill (6 volumes)
56. The Liberal Imagination, Lionel Trilling
55. Darkness Visible, William Styron
54. Working, Studs Terkel
53. Eminent Victorians, Lytton Strachey
52. The Right Stuff, Tom Wolfe
51. The Autobiography of Malcolm X, Alex Haley and Malcolm X
50. Samuel Johnson, Walter Jackson Bate
49. Patriotic Gore, Edmund Wilson
48. The Great Bridge, David McCullough
47. Present at the Creation, Dean Acheson
46. The Affluent Society, John Kenneth Galbraith
45. A Study of History, Arnold J. Toynbee (12 volumes)
44. Children of Crisis, Robert Coles (5 volumes)
43. The Autobiography of Mark Twain, Mark Twain (3 volumes)
42. Homage to Catalonia, George Orwell
41. Goodbye to All That, Robert Graves
40. Science and Civilization in China, Joseph Needham (7 volumes, 27 books)
39. Autobiographies, W.B. Yeats
38. Black Lamb and Grey Falcon, Rebecca West
37. The Making of the Atomic Bomb, Richard Rhodes
36. The Age of Jackson, Arthur Schlesinger, Jr.
35. Ideas and Opinions, Albert Einstein
34. On Growth and Form, D’Arcy Thompson
33. Philosophy and Civilization, John Dewey
32. Principia Ethica, G.E. Moore
31. The Souls of Black Folk, W.E.B. Du Bois
30. The Making of the English Working Class, E.P. Thompson
29. Art and Illusion, Ernest H. Gombrich
28. A Theory of Justice, John Rawls
27. The Ants, Bert Hoelldobler and Edward O. Wilson
26. The Art of the Soluble, Peter B. Medawar
25. The Mirror and the Lamp, Meyer Howard Abrams
24. The Mismeasure of Man, Stephen Jay Gould
23. Principia Mathematica, Alfred North Whitehead and Bertrand Russell (3 volumes)
22. An American Dilemma, Gunnar Myrdal
21. The Elements of Style, William Strunk and E.B. White
20. The Autobiography of Alice B. Toklas, Gertrude Stein
19. Notes of a Native Son, James Baldwin
18. The Nature and Destiny of Man, Reinhold Niebuhr
17. The Proper Study of Mankind, Isaiah Berlin
16. The Guns of August, Barbara Tuchman
15. The Civil War, Shelby Foote (Three volumes: Fort Sumter to Perryville, Fredericksburg to Meridian, Red River to Appomattox)
14. Aspects of the Novel, E.M. Forster
13. Black Boy, Richard Wright
12. The Frontier in American History, Frederick Jackson Turner
11. The Lives of a Cell, Lewis Thomas
10. The General Theory of Employment, Interest, and Money, John Maynard Keynes
9. The American Language, H.L. Mencken
8. Speak, Memory, Vladimir Nabokov
7. The Double Helix, James D. Watson
6. Selected Essays, 1917-1932, T.S. Eliot
5. Silent Spring, Rachel Carson
4. A Room of One’s Own, Virginia Woolf
3. Up from Slavery, Booker T. Washington
2. The Varieties of Religious Experience, William James
1. The Education of Henry Adams, Henry Adams