Why Trigger Warnings Threaten Free Speech, Original Voices, and Thoughtful Discourse

“Don’t be so gloomy. After all, it’s not that awful. You know what the fellow said. In Italy, for thirty years under the Borgias, they had warfare, terror, murder and bloodshed. But they produced Michelangelo, Leonardo da Vinci and the Renaissance. In Switzerland, they had brotherly love. They had five hundred years of democracy and peace. And what did that produce? The cuckoo clock.” — The Third Man

On February 26, 2014, a UC Santa Barbara student named Bailey Loverin pushed “A Resolution to Mandate Warnings for Triggering Content in Academic Settings” through the Associated Students Senate. The resolution called on professors to issue “trigger warnings” for students on “materials with mature content” taught in the classroom — whether it be a difficult film shown for context, material assigned for reading, or even a casual conversation on a tough issue. Loverin cited her own discomfort sitting through a film depicting sexual assault, one she has refused to identify. The class’s unnamed professor, according to Loverin, provided no warning before the film. Loverin claimed that it was difficult for her to walk out of the screening because doing so in the dark would have drawn attention. (I have consulted other interviews with Loverin to establish the facts. Loverin did not return my emails for comment.)

Loverin has stated that she’s a survivor of sexual abuse, but she has not suffered from PTSD — the chief reason proffered for the “trigger warning” resolution. In an interview with Reason TV, Loverin said, “We were watching a film. And there were several scenes of sexual assault and, finally, a very drawn out rape scene. It did not trigger me. I recognized the potential for it to be very triggering.”

Loverin is not a psychologist, a sociologist, or a medical authority of any kind. She is a second-year literature major who has become an unlikely figure in a debate that threatens to diminish the future of free speech. Yet Loverin isn’t nearly as extreme in her views as the trigger warning acolytes at other universities.

Rutgers’s Philip Wythe has claimed that trigger warnings are needed for Virginia Woolf’s Mrs. Dalloway (for students with a history of self-harm) and for F. Scott Fitzgerald’s The Great Gatsby (apparently a literary gorefest on the level of an episode of The Walking Dead). Oberlin threatened to issue trigger warnings over such traumatic issues as “heterosexism, cissexism, ableism, and other issues of oppression” before the policy was sensibly pulled back for further consideration.

Loverin claimed in an opinion roundtable for the New York Times that the UCSB resolution “only applies to in-class content like screenings or planned lectures and doesn’t ban the content or excuse students from learning it.” The resolution offers a suggested list of “rape, sexual assault, abuse, self-injurious behavior, suicide, graphic violence, pornography, kidnapping, and graphic depictions of gore” as trigger warning options. Students who feel that they “have a negative emotional response” (note that it is any “negative emotional response,” not necessarily PTSD) “to such content, including distressing flashbacks or memories, should be allowed to leave the classroom or skip class altogether without being penalized.” And while there certainly isn’t any direct prohibition, there is still the unsettling possibility of professors being forced to soften their materials, which leads one to wonder how they can adequately teach war, genocide, slavery, or imperial conquest in the classroom. Another question, one that has remained unconsidered by trigger warning boosters, is whether skipping class over material that easily offends will become a catch-all excuse for students to shirk their scholarly duties.

In the Reason TV interview, Loverin said, “Being uncomfortable, being upset, being even a little offended is different than having a panic attack, blacking out, hyperventilating, screaming in a classroom, feeling like you’re under such physical threat, whether it’s real or perceived, that you act out violently in front of other people.” It certainly is. Yet there is no evidence to support Loverin’s claim that there is a widespread epidemic of students acting out violently in class over a movie. The PTSD Foundation of America observes that 7.8% of Americans will experience PTSD at some point in their lives. But most people who experience PTSD are children under the age of ten or war veterans. Furthermore, a publication indexed by NCBI reveals that critical incident stress debriefing (CISD), which offers intimate group support among fellow trauma victims, and critical incident stress management (CISM), which includes rigorous pre-trauma training, are effective methods for helping a PTSD victim move forward with her life.

There are some connections between media exposure and PTSD, such as a study published last December which observed that some Americans who watched more than six hours of media coverage about the Boston Marathon bombings experienced stronger stress reactions than those who refrained from watching the news, and even than some who were directly exposed to the attack. Another UC Irvine study found that 38% of Boston-area veterans who suffer from PTSD and other mental disorders experienced some emotional distress one week after the bombing. But these studies point to (a) people who already suffer from PTSD, (b) PTSD victims being exposed to media for a lengthy duration, and (c) PTSD victims in close proximity to a recent attack.


It is certainly reasonable for a professor to ask her class if any student suffers from PTSD, but the trigger warning approach is uncomfortably similar to the Comics Magazine Association of America’s crackdown on comic books over gory content in 1954, which led Charles F. Murphy to wrongly conclude that reading violent comics leads to juvenile delinquency. As Saladin Ahmed recently pointed out at BuzzFeed, Murphy’s puritanical Comics Code — a kind of self-regulated and equally self-righteous “trigger warning” system of the time — demanded that a black astronaut be made white in order for Al Feldstein and Joe Orlando’s “Judgment Day” to run. (Before the Comics Code, the story had appeared without a problem.) The pro-trigger warning crowd also refuses to consider the benefits of confronting trauma. If Steve Kandell hadn’t possessed the courage to visit the 9/11 Memorial Museum 13 years after his sister was killed in the attacks, then we would not have learned more about that tourist attraction’s crass spectacle and its deeply visceral effect on victims. There was no trigger warning at the head of his essay.

Discouraging students from confronting challenging topics because of “a negative emotional response” may also result in missed opportunities for humanism. In her thoughtful volume A Paradise Built in Hell, Rebecca Solnit examines numerous instances of people reacting to disasters. Contrary to the reports of doom and gloom presumed to follow a disaster, people more often react with joy and a desire to reforge social community. Those distanced from disaster tend to become more paralyzed with fright. No less a literary personage than Henry James, who read sensationalistic accounts of the 1906 San Francisco earthquake, imagined that his brother William had undergone some perdition: “I feel that I have collapsed, simply, with the tension of all these dismal days,” he wrote in a letter. “I should have told you that I have shared every pulse of your nightmare with you if I didn’t hold you quite capable of telling me that it hasn’t been a nightmare.” But William, who was impressed with the rapid manner in which San Francisco came together, replied, “We never reckoned on this extremity of anxiety on your part.”

Only a week after Loverin’s resolution was passed, a group of anti-abortion activists had one of its banners, which featured a bloody picture of an aborted fetus, snatched away by professor Mireille Miller-Young. A YouTube video capturing the incident shows the professor pushing Joan Short, one of the Christian activists and the person operating the camera, after she attempts to retrieve her sign from an elevator. At the 2:46 mark, Miller-Young can be seen kicking Short’s shoe out of the elevator bank.


The police report, which I obtained from the Santa Barbara Independent‘s Tyler Hayden (the PDF can be viewed here), cites “triggering” as one of the motivations for Miller-Young’s actions. Miller-Young asked the Christian activists to remove the sign. They refused. Miller-Young grabbed the sign and destroyed it in her “safe space” with scissors. “I’m stronger,” said Miller-Young, “so I was able to take the poster.”

As a hard progressive and free speech advocate who is strongly pro-choice, and as someone who finds gory pictures of aborted fetuses to be a repugnant affront to civility, I am nevertheless appalled that a supposedly enlightened figure of authority like Miller-Young would use “trigger warnings” as an excuse not only to shut down another person’s perspective, but to completely destroy the sign used to present it. These activists were not harassing young women outside an abortion clinic, yet Miller-Young claims in the police report that “she felt that the activists did not have the right to be there.”

Many ostensible liberals have attempted to paint “trigger warnings” as something harmless, yet they refuse to see how appending a precautionary warning can lead to a chilling curb on free speech. Like Miller-Young, in their rush to condemn, they reveal themselves to be less interested in comforting those who are sensitive and more concerned with painting anyone who disagrees with them as either “a jerk” or someone who delights in the suffering of others. On Twitter, two privileged white male writers with high follower counts revealed their commitment to petty despotism when opining on the trigger warning issue.

We have seen recent literary debates about unlikable characters, an essential part of truthfully depicting an experience. But if an artist or a professor has to consider the way her audience feels at all times, how can she be expected to pursue the truths of being alive? How can a student understand World War I without feeling the then unprecedented horror of trench warfare, poison gas, and burying bodies (a daily existence that caused some of the bravest soldiers to crack, and to be shot for cowardice if they displayed anything resembling the PTSD they felt every minute)? How can one understand rape’s full hurt and humiliation if one does not wish to become familiar with its baleful emotions? How can any student comprehend climate change if the default response is to ignore the news and play a distracting cat video that will amuse her for two minutes?

I realize that the trigger warning police mean well, but human beings are made of more resilience and intelligence than these unlived undergraduates understand. Hashtag activism may work in a virtual world of impulsive 140-character dispatches, but it can never convey the imbricated complexities of the human spirit, which are too important to be stifled and diminished by a censorious menagerie of self-righteous kids, including middle-aged genre writers who can’t push their worldview past adolescent posturing that’s as preposterous as Jerry Falwell claiming emotional distress over a parody.

The Onion Narrative

On the morning of Saturday, March 21, 2009, I left the house to purchase an onion. This action, in and of itself, might be considered meaningless. Most would consider it a perfunctory deed or an insignificant errand. There isn’t a foolproof way to capture all comparable actions occurring at the same moment (9:30 AM EDT), but why should any of us ignore the potential pleasures contained within such a routine act? Are we taking this modern convenience for granted? Is a trip to the store to be sneered at? If we view a produce run with contempt, do we therefore view a previous age of humanity with contempt? Why should it have to be about us? A 10th-century Viking berserking his way across North America certainly didn’t have the option of a neighborhood market. The Viking’s diet consisted of what he was able to hunt and gather. I am certain that if the Viking learned of developments eleven centuries later, common to every civilized being, and further ascertained that we were complaining about what a pain in the ass it was to get an onion so early in the morning (for the Viking surely had to spend half his day plunging into the river after fish), he’d put a battleaxe through our skulls. And we might very well deserve it. (At the risk of self-aggrandizement, let the record show that I did not complain.)

What I hope to document here is one such act, which I style “The Onion Narrative.” My effort to obtain this onion exists in the past, fixed and unmalleable, and further complicated by the mind tinkering with the record even as it accounts for what happened. I have provided diagrams (certainly not to scale) that divide my Onion Narrative into three tidy stages. And to demonstrate how imperfect this process is, I attempted, as an experiment, to recreate my Onion Narrative on video, following the exact same walking route and purchasing yet another onion. My recollection of the details isn’t nearly as precise as I’d like to think. The video confirms that I overstated the width of the food aisles in my diagrams, for which I used only memory as my guide. The video camera also imposed additional limitations, forcing me to adjust the narrative circumstances contained within my memory. Because I was holding a camera, admittedly a small one, I had to take a dollar out of my wallet in advance so that I could hand it to the register clerk; this way, I wouldn’t have to set down the camera and extract the dinero from my leather pouch. In addition, the price of the onion was different from the initial price. The onion itself was different. Moreover, the social conditions surrounding my journey had drastically changed. (For one, there were more dog walkers.) I’ll explore the implications of these details very soon. But for the moment, let’s concentrate on the original narrative contained within my memory.

The Original Narrative

We needed the onion to make breakfast. We had run out of onions the night before because we had used the last one in our kitchen to make some homemade soup. The market was only blocks away from where we lived. The task offered an opportunity for me to walk. And I also found it somewhat comical that I would be going to the market for only one item. Just some commonplace produce that cost under a dollar. Was this bad time management on my part? After all, if you’re going to go to the store, shouldn’t you go there for multiple items so that you might save yourself some trips?

I did not see the situation that way. Here were my priorities for this five-minute excursion:

Priority One: Obtain onion.
Priority Two: Go for a walk, commune with the world, get away from the damn computer.
Priority Three: Find random opportunities for recurring curiosity about others to take root.

I willfully revolted against my first priority by purchasing not one onion, but three. It might be argued that by purchasing one onion, I had fulfilled my priority, and that the two additional onions represented a new priority that I had whipped up on the spot: a spontaneity that I had not anticipated until I arrived at the onion bin. A more austere type might wish to punish me for my failure to obey the set dicta, for not following the subconscious directions to the letter, or for exceeding my budget, as ridiculously minuscule as it was. (For what it’s worth, I only purchased one onion when I recreated the incident.) I suppose it all depends on how much value you place on the onion or whether you feel comfortable having an extra onion around the house. Nearly anybody can afford an onion. Or at least nearly anybody lucky enough to have a roof over his head. And perhaps purchasing two extra onions doesn’t really matter if you have, as I did, even four dollars in your wallet. But if you only have a dollar and you purchase two more, then you are forced into a position of potential embarrassment when the clerk has to put the additional onions back. The second onion, beyond your means in this hypothetical case, determines your social position, which is very low indeed. But maybe you have no shame and you wish to max out your meager budget. Or maybe you want to see how the clerk will react to such a dilemma.

As I progressed to the store (see the accompanying diagram labeled Stage One), the first priority became less significant. I found myself considering the number of cigarette butts scattered on the sidewalk, which had proliferated considerably since the previous evening. This then led me to wonder if there was a higher percentage of smokers in my neighborhood than I originally estimated, or whether there were some people who only liked to smoke on Friday night. But could I really make such a judgment when I wasn’t devoting my complete attention to how frequently the streets were cleaned or to the people cleaning them? I then began to observe people as I walked. Over the course of my journey, I counted 31 people who were out and about.

31 people! And this was just over the course of five minutes. That’s 372 people in one hour, assuming that the rate of wanderers remains constant and that you don’t run into the same person twice. Given these numbers, small wonder then that we still obsess over the phenomenon of “running into someone.” And yet none of us, I think, are quite aware of just how many people we see or how many social or conversational possibilities we are presented with at any given time. We are often so fixated on our solitary task (in this case, the purchase of an onion) that we fail to consider our true insignificance.
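For the curious, here is a minimal sketch of that back-of-the-envelope extrapolation in Python. The variable names are mine, and the constant-rate and no-repeat assumptions are the same ones stated above:

```python
# Extrapolating an hourly rate of passersby from a five-minute walk,
# assuming the rate stays constant and nobody is counted twice.
people_seen = 31   # people counted during the walk
walk_minutes = 5   # duration of the walk

rate_per_minute = people_seen / walk_minutes   # 6.2 people per minute
people_per_hour = rate_per_minute * 60         # 372 people per hour

print(f"{people_per_hour:.0f} people per hour")  # -> 372
```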


Since I was in a jocular mood, I’d like to think that I plentifully partook of these social opportunities. But there were only two people with whom I had direct contact. And this was in the store. (I do not count the sad-faced dog just outside the store who angled his head through two metal railings for attention and whom I proceeded to pet and speak to in a soothing voice.) At Point A: a boy sidling along with his mother and wearing a glum expression. Recognizing the boy’s need to feel happier, I stuck out my tongue at him, and he smiled. At Point B: some banter with the register clerk. A “Good morning” and “How are you doing?” and a “Thanks.” A smile. But nothing beyond that. Indeed, at Point B in the video, I did even worse. My recreated journey on video sees me communicating with nobody save the clerk, and my socialization was limited to “Thanks.”

I felt like a terribly selfish person when these details were revealed. Had the video made me more self-conscious? Was I less jocular? Or were the circumstances simply ingrained? Can we be exonerated if we aren’t really aware of how few people we talk with? Or is it incumbent on us to be more socially responsible? If it is socially acceptable not to talk with even half of the 31 random people we regularly run into on any given day, then are we any less culpable in failing to live up to the possibilities before us? In 2007, a University of Melbourne researcher concluded that political candidates positioned on the left-hand side of a stage are more likely to draw attention, because the brain, when tracking a tableau, has a tendency to drift to the left. I must note that at Points A and B the individuals were to my left. I also see that my own journey to and from the store had me situated mostly on the left side of the street. Therefore, if my brain was going out of its way to exclude people, it was possibly because my visual cortex was occupied with the buildings and edifices. Was I subconsciously going out of my way to avoid people? (Additional factors to consider: I learned later in the afternoon that I was in need of social engagement. Several opportunities presented themselves and were taken.)

What is also troubling in my video reenactment is that the only time I comment on anything is when I see a bus parked at a stoplight. This bus is in almost the exact same position as another bus was during my original journey. And seeing the similarity, I am forced to violate the conditions of my recreation by commenting upon the action. This would further support the “running into someone” theory. Consider what cognitive scientist Colin Cherry identified as the cocktail party effect, whereby a person has the ability to focus on one talker amid a steady chatter of conversation. This was supported last year by a study that revealed the auditory cortex does a good deal of work in filtering conversation. And if your brain has robust basal ganglia, well, then you likewise may have a robust “irrelevance filter.”

I do not know how tough my basal ganglia are. But I am troubled by how smoothly my brain deems certain details irrelevant. How little it notices. How needlessly egocentric it is on a subconscious level. After the fact, I am going out of my way to locate the new, the unseen, the underdogs, the moments I didn’t seize, the people I could have talked to, the emphases I now find phony or false.

Stage Three of my journey seems less significant than the first two stages. By my own judgment, it is also the least interesting part of the video recreation. The onion has been obtained. I recall that during the original narrative, I found myself observing more people. I was not in a rush to get back. But on the video, I am circling around people rather than approaching them. I do not know if this is because of the camera or because I felt uneasy reproducing the narrative. And why should Stage Three be the least of the three? The primary goal has been obtained. The mind is at ease and can be more spontaneous with the rigid order out of the way. Or so one would think.

This exercise originated, in part, from ruminating over Roger Ebert’s recent post about the determinism of the universe, although the subject has long been on my mind. I am a secular type who does not believe in a deity, and yet, on some primordial level, my mind seeks to find connections and patterns. Even in thinking about why my mind reacted the way that it did, I am still trying to pinpoint a framework. How is this any different from Intelligent Design?

Among George Santayana’s great arsenal of pithy maxims is this one, written in response to William James’s The Varieties of Religious Experience: “Experience seems to most of us to lead to conclusions, but empiricism has sworn never to draw them.” I hope that I have been more explicit about my free will than James was, and yet I share with James a strange pleasure in vivisecting my experience, in coming to terms with my subconscious limitations as a human being through diagrams and video reenactments. One should probably not approach life this way, because on a certain level, one must live with blind and uncomfortable truths. But is the truth really unraveled when we consider the structure beneath it? Or is the mind so hopelessly fallible, so determined in its determinist filtering, that human beings are doomed to repeat the same mistakes even when the horse has been led to water? This may sound like cynical rhetoric, but it’s quite liberating to know that, no matter how much you slice and dice a moment, the mind remains a dutiful deflector.