Thoughts on the Mime

1. The difference between the theatrical and the theoretical mime. — In the one the performance is palpable, but removed from pragmatic use, so that the mime is widely reviled out of habit, even when his actions beckon a half-hearted attention. Some wish to beat the mime to a pulp. More uncivilized spectators, containing their feral thoughts within the imagination, ruminate over whether or not the mime's hypothetical gush of blood will be as invisible as the box that he is "trapping" himself in. One sees the mime's principles within his performance, but the mime represents both theater and theory in practice. This causes hostility. This causes ulcers. This causes many to complain to their spouses and, in the most extreme cases, a temporary shift in slumbering receptacle from bed to couch.

2. But in the theoretical mime, the principles are fully separate from the theatrical. The mime neither exists nor is permitted to exist. It maintains its imaginative perch within an active noggin and proves so stubborn a resident that hostility is eked out at the theatrical mime, who shares nothing more than this subjective projection and is thereby innocent. The spectator only has to look at a real mime to be reminded of these theoretical speculations, and no real effort is necessary; for the theatrical mime’s performance is far from subtle and mimes themselves are numerous within our society.

3. All mimes would then be theoretical if they had access to the spectator’s theoretical viewpoint, or if they could indeed speak. But mimes are only permitted to convey their thoughts and feelings through silent action. And the mime rules dictate that props and gait must be invented. Since the mime is so occupied with these inherent duties, the communication between the spectator who contains the theoretical projection and the mime is one way. A mime is a terrible thing to waste, both in its theoretical and theatrical forms.

4. The reason, therefore, that the spectator remains so hostile to the theoretical mime is that he is not dressed up in a striped shirt and his face is not attired in white paint. If the spectator is asphyxiated by a necktie as he watches the mime and his mind is occupied by negative thoughts pertaining to his work, then the spectator is likely to project additional theoretical mimes upon the theatrical mime.

5. But dull mimes are never either theatrical or theoretical.

6. The mime, if he is lively, is drawing from his own inner theoretical mime, shifting his arms and legs and chest by subconscious instinct. He therefore contains more of the theatrical mime than the theoretical mime as he carries out his performance. But it is just the reverse with the spectator. And where the spectator feels hostility towards the mime, the mime, by way of inhabiting more of the theatrical mime, feels ebullience, which he then applies to the performance.

7. Just as we harm the mime by projecting our theoretical mime upon him, so too does the theatrical mime harm us by failing to project the theoretical upon us. That the theoretical forms the emotional bridge between mime and spectator, rather than the theatrical, is the chief cause of the many negative feelings directed towards the mime.

8. There are many people who witness a mime in the same way that they crave Ian Fleming's Vespers.

9. There remains the possibility of rectifying the theoretical/theatrical balance, but this will involve a good deal of mime outreach to beleaguered sectors of humanity. And since outreach is associated with many of the regrettable sensitivity and self-help movements of the 1970s, and since mimes themselves have already garnered a hostile position within civilization, the only practical solution to destroying this dichotomy is for the mimes to become spectators and the spectators to become mimes. The difficulties with establishing a World Mime Day come with the necessary autocratic enforcement. For in order for mimes to be understood as theatrical beings, it will be necessary for 90% of the spectators to become mimes. This is a difficult ratio, one that will certainly cause numerous spectators to resist and one that will cause further anti-mime propaganda to be disseminated through various circulars, several social networks, and numerous snarky websites.

10. But let us momentarily adopt an optimistic position and assume that such a possibility becomes plausible. Many of the new mimes (formerly spectators) will have difficulties adjusting to the role, and may come to resent the theatrical mime further, retreating again to the theoretical. Some may indeed decide that their roles as spectators have been balderdash all along and may become permanent mimes. But would such born-again mimes be finding the right role in relation to society? It might be plausibly argued that being a mime for a day is much better than toiling in a maquiladora. Then again, if being a mime is largely voluntary and without compensation, one might also argue the reverse.

11. Eloquence. — It requires the theatrical and the theoretical, but the theatrical must itself be drawn from the true. Eloquence is a bit like a high school blood drive, but the stakes are higher and the ambitions are akin to climbing Everest.

12. Eloquent responses to the mime problem therefore require one entire year, whereby the shift from spectator to mime is staggered over a 365-day period, so that the many impromptu mimes scattered into everyday society are not so shocking. Governments must institute special tax incentives, encouraging spectators to become mimes and letting the natural eloquence of the theatrical noodle its way into the theoretical. We must believe that mimes are more than two conditions. In this way, the spectators might overcome their internal skepticism by momentarily embracing the obverse.

The Follies of Emotional Expression


ITEM: September 1, 2009. A YouTube video surfaces. In the video, Van Jones calls Congressional Republicans "assholes." The video is from an event on February 11, 2009. Jones was appointed by President Obama in March 2009. After considerable outcry from conservatives, Jones resigns from his White House position as Special Advisor for Green Jobs.

ITEM: September 8, 2009. President Obama delivers a speech before Congress. Rep. Joe Wilson (R – SC) shouts "You lie!" in the middle of the speech. Wilson apologizes, but the matter isn't dropped. There are countless efforts to find ways to respond to Wilson's words (is it racism, as Jimmy Carter suggests a week later?). There is endless chatter by liberals and conservatives alike. More than a week later, Joe Wilson remains in the news.

ITEM: September 13, 2009. Serena Williams goes ballistic at the US Open. She is fined $10,000 for delivering a tirade at a judge. (She is also docked $500 for racket abuse. It was a tough racket.)

ITEM: September 13, 2009. Taylor Swift wins a Video Music Award. In the middle of her acceptance, Kanye West grabs the microphone out of Swift’s hands and shouts, “I’m sorry, but Beyoncé had one of the best videos of all time.”

ITEM: September 13, 2009. President Obama is asked about Kanye West and Obama calls West a “jackass.” Efforts to prevent the tweet, the audio clip, and the video clip from disseminating around the Internet fail. Most side with the President.

One could probably include many other visceral explosions in recent history. Sherman Alexie, Alain de Botton, Alice Hoffman, Michael Richards, Christian Bale, and Don Imus all come to mind. But the above items all went down this month. We still have about two weeks to go before September's over. It appears very likely that more public figures will erupt (or interrupt) with a subtlety worthy of Vesuvius.

But what do these reactions mean? And what is the appeal? It would be superficial to blame it all on the media, although the media is going out of its way to perpetuate these stories. (Arguably, as a questionable media source, I am going out of my way to perpetuate these stories, although I am trying to ruminate on it all instead of getting away from it.) Could it be that the tendency to fixate on these incidents involves some desire to make sense of these reactions? Maybe. I doubt that any of us could have predicted that POTUS would have managed to mix himself up in a Kanye West tirade, particularly when more pressing concerns like unemployment and health care are burning up national peat. But politics is now just as vital to the celebrity-industrial complex as sports, movies, and music. (It could hardly have been an accident that the FOX Network timed its announcement of Ellen DeGeneres as new American Idol judge to coincide with the President’s speech.)

Instead of trying to understand these visceral impulses, every cultural observer now seems duty-bound to perpetuate the shallow headlines rather than plunge deeper. Are two words or two sentences really enough to denounce someone? Is this not continuing the soundbite culture? (No accident that Twitter, itself a bedrock of textual soundbites, was one of the major conduits through which these stories spread.) Should we not judge these people on a more complete impression? What resides beneath the comments?

Van Jones’s “assholes” admonishment came when the assembled group was trying to understand how bipartisanship could be an option when the Republicans remained obdurate. That’s a fairly interesting question, but it’s too bad that sensitive ears and Penn Ave propriety weeded Jones out.

Joe Wilson, as inappropriate as his actions were, was trying to express his passion. And isn’t understanding that passion, as unsettling as the motivations may be, the more important concern here? If we calmly listened to people, as Al Franken patiently did, wouldn’t this cut down on conservatives showing up at town meetings packing heat? Why not ask questions? Or see where people are coming from? Why did Wilson think that Obama was lying? And why aren’t we discussing the more interesting facts?

The Nation's Dave Zirin observed that Roger Federer had a tantrum two days after Serena Williams, but Federer wasn't upbraided in the press as severely as Williams. Is there a double standard? Does Federer get a free pass because he isn't African-American and he isn't a woman? Maybe it has more to do with celebrity figures fulfilling our expectations. After all, Federer is known more for his calm demeanor on the court. Williams, on the other hand, is known for her temper. Shouldn't Federer's incongruous reaction ("I don't give a shit what he said" uttered on national TV) be rejoined with greater severity? And shouldn't we praise Serena Williams for handling a subsequent match with calm professionalism? Are we not just as guilty with our predictable responses? Are we true to our nature?

Kanye West acted like a jackass (a subjective view), but he never called Taylor Swift a jackass (the objective quote). He told Swift that he was “very happy” for her before turning his back and denying her moment. And yet President Obama, who used an ad hominem remark to respond to the whole mess, has neither given an apology nor been asked for an apology. (Contrast this with the Cambridge Police Department demanding an apology from Obama in late July, after Obama declared that the police had “acted stupidly” in the Henry Louis Gates arrest. Obama didn’t apologize, but there was a beer summit.)

Since the President has become involved in these public disputes with greater frequency, and since he reserves the right to tell people that they "acted stupidly" or to call someone a "jackass," perhaps he should start setting a better example for rational bipartisan discourse. Or perhaps he should abandon his "civilized" remarks and call people "jackass" from time to time. (Nixon was hardly a President to be proud of, but it's worth noting that he had no problem using the word "cocksucker.")

Maybe there's something else at work here pertaining to executive privilege. The New York Times reported that New York City's unemployment rate hit 10.3% in August, a 16-year high. The national unemployment rate still holds at just under 10% — the highest unemployment rate since 1983. As of April, two million jobs had been lost in 2009. In tough times, when those who are fortunate enough to remain employed have a strong desire to stay mum and keep their jobs, and when millions of unemployed people can't take any chances, it makes intuitive sense to look vicariously towards those who have this executive privilege of emotional expression.

But if emotional expression is so atavistic, shouldn't it be predicated on egalitarianism? Is it not a double standard for Van Jones to be dismissed while Obama keeps his job? Subjectively, I happen to think that Obama was correct in both instances. But why can't somebody who isn't the President make such statements and not have to go through the endless rigmarole of apologizing over the course of multiple interviews? Why can't we just accept someone's apology and move on? If we don't, then the apology serves no purpose, or the punishment doesn't fit the apparent crime. And if we don't accept other people, which includes listening to their heightened emotional expression, then this runs counter to eclectic discourse.

If emotional expression is reserved only for those at the top, then should we really be surprised by the people who show up at tea parties? Should we really be surprised that Glenn Beck’s popularity has risen dramatically during the Obama Administration?

Perhaps these people are expressing extraordinary emotions like this because society has established unspoken prohibitions in the manner by which they communicate. As I type this sentence, I happen to believe that Salman Rushdie is a cunt. I could tell you why if you asked me. And if Rushdie were to explain himself, I would be happy to listen. If he had a reasonable explanation for his cunt-like behavior, I might change my mind. But because I have stated that "Salman Rushdie is a cunt," people will see this and possibly believe me to be an asshole. But should such a sentence discount all the thoughtful and positive sentences I have ever uttered? And is my opinion of Rushdie so inflexible? By our present standards of emotional expression, this would certainly be the case if I had, by some lark, achieved the fame of Serena Williams.

But let’s approach this issue from another sideways shuffle. It is very possible that you, dear reader, harbor a feeling, however permanent or temporary, that someone that you know is a cunt. If that sentiment is permanent, and if it is not subject to change, then you may not be a civilized person. (Or, in Joe Wilson’s words, you lie!) But if you accept the follies of your emotional expression and you remain flexible enough to change it or to embrace it, then it is very probable that you are a civilized person, assuming that you aren’t a sociopath.

And now that I’ve thought about it, I don’t think that I believe that Salman Rushdie is a cunt. I believed it just now, but after thinking about it, it seems ridiculous to place a writer who has written a novel as great as Midnight’s Children into the same milieu as Hitler, Nixon, and Genghis Khan (to name only a few rotten apples, but, to give Hitler that cliched benefit of the doubt, he treated his dogs well). I have not thought to strike the sentence from this essay. But if this were published somewhere, I’m certain that very few editors would print the phrase “Salman Rushdie is a cunt.”

Is it reasonable to prohibit ad hominem or emotional expression? Or to dwell on it, as it crops up from time to time, as if it were something to be replayed over and over like a four-second tape loop? Only if you believe that humans — or, with the second rhetorical question, a select civilized elite — are capable of nothing more than profound enlightenment. Humans certainly do great things, don't they? But if you're naive enough to believe that they offer nothing but thoughtful contributions, then I urge you to acquaint yourself with the many psychopaths who have chewed up the scenery over the course of human history.

But let’s say that we accept emotional expression and slow down with these knee-jerk responses. We therefore give those who practice this perfectly normal tendency an opportunity to explain or atone. The eccentric contributors come out of the closet. Innovators who have held their tongues are permitted to communicate wild ideas and become part of the process. And we expand the repertoire of human behavior. There will probably be ugliness, but ugliness can be rectified without forcing horses to drink the water. Asking people to constantly apologize — often before a camera — is the action of an autocratic enforcer who has no faith in humankind. But when two people listen to each other without instantaneous judgment, you can plant seeds instead of chopping down trees.

The Onion Narrative

On the morning of Saturday, March 21, 2009, I left the house to purchase an onion. This action, in and of itself, might be considered meaningless. Most would consider this a perfunctory deed or an insignificant errand. There isn’t a foolproof way to capture all comparable actions occurring at the same moment (9:30 AM EDT), but why should any of us ignore the potential pleasures contained within such a routine act? Are we taking this modern convenience for granted? Is a trip to the store to be sneered at? If we view a produce run with contempt, do we therefore view a previous age of humanity with contempt? Why should it have to be about us? A 10th century Viking berserking his way across North America certainly didn’t have this option of a neighborhood market. The Viking’s diet consisted of what he was able to hunt and gather. I am certain that if the Viking learned of developments eleven centuries later, common to every civilized being, and further ascertained that we were complaining about what a pain in the ass it was to get an onion so early in the morning (for the Viking surely had to spend half of his day plunging through the river for some fish), he’d put a battleaxe through our skulls. And we might very well deserve it. (At the risk of self-aggrandizement, let the record show that I did not complain.)

What I hope to document here is one such act, which I style "The Onion Narrative." My effort to obtain this onion exists in the past, fixed and immalleable, and is further complicated by the mind tinkering even as it accounts for what happened. I have provided diagrams (certainly not to scale) that divide my Onion Narrative into three tidy stages. And to demonstrate how imperfect this process is, as an experiment, I attempted to recreate my Onion Narrative on video, following the exact same walking route and purchasing yet another onion. My recollection of the details isn't nearly as precise as I'd like to think. The video confirms that I overstated the width of the food aisles in my diagrams. For the diagrams, I used only memory as my guide. And the video camera imposed additional limitations. I was forced to adjust the narrative circumstances contained within my memory. Because I was holding a camera, admittedly a small one, I was required to take a dollar out of my wallet in advance so that I could hand it to the register clerk. This way, I wouldn't have to set down the camera and extract the dinero from my leather pouch. In addition, the price of the onion was different from the initial price. The onion itself was different. Moreover, the social conditions surrounding my journey had drastically changed. (For one, there were more dog walkers.) I'll explore the implications of these details very soon. But for the moment, let's concentrate on the original narrative contained within my memory.

The Original Narrative

We needed the onion to make breakfast. We had run out of onions the night before because we had used the last one in our kitchen to make some homemade soup. The market was only blocks away from where we lived. The task offered an opportunity for me to walk. And I also found it somewhat comical that I would be going to the market for only one item. Just some commonplace produce that cost under a dollar. Was this bad time management on my part? After all, if you’re going to go to the store, shouldn’t you go there for multiple items so that you might save yourself some trips?

I did not see the situation that way. Here were my priorities for this five-minute excursion:

Priority One: Obtain onion.
Priority Two: Go for a walk, commune with the world, get away from the damn computer.
Priority Three: Find random opportunities for recurring curiosity about others to take root.

I willfully revolted against my first priority by purchasing not one onion, but three. It might be argued that by purchasing one onion, I had fulfilled my priority and that the two additional onions represented a new priority that I had whipped up on the spot; a spontaneity that I had not anticipated until I arrived at the onion bin. A more austere type might wish to punish me for my failure to obey the set dicta, or for not following the subconscious directions to the letter or for exceeding my budget, as ridiculously minuscule as it was. (For what it’s worth, I only purchased one onion when I recreated the incident.) I suppose it all depends on how much value you place on the onion or whether you feel comfortable having an extra onion around the house. Nearly anybody can afford an onion. Or at least nearly anybody lucky enough to have a roof over his head. And perhaps purchasing two extra onions doesn’t really matter if you have, as I did, even four dollars in your wallet. But if you only have a dollar and you purchase two more, then you are forced into a position of potential embarrassment when the clerk is forced to put the additional onions back. The second onion, beyond your means in this hypothetical case, determines your social position, which is very low indeed. But maybe you have no shame and you wish to max out your meager budget. Or maybe you want to see how the clerk will react to such a dilemma.

As I progressed to the store (see the accompanying diagram labeled Stage One), the first priority became less significant. I found myself considering the number of cigarette butts scattered on the sidewalk, which had proliferated considerably since the previous evening. This then led me to wonder if there was a higher percentage of smokers in my neighborhood than I originally estimated, or whether there were some people who only liked to smoke on Friday night. But could I really make such a judgment when I wasn't devoting my complete attention to how frequently the streets were cleaned or the people cleaning them? I then began to observe people as I walked. Over the course of my journey, I counted 31 people who were out and about.

31 people! And this was just over the course of five minutes. That’s 372 people in one hour, assuming that the rate of wanderers remains constant and that you don’t run into the same person twice. Given these numbers, small wonder then that we still obsess over the phenomenon of “running into someone.” And yet none of us, I think, are quite aware of just how many people we see or how many social or conversational possibilities we are presented with at any given time. We are often so fixated on our solitary task (in this case, the purchase of an onion) that we fail to consider our true insignificance.


Since I was in a jocular mood, I'd like to think that I plentifully partook of these social opportunities. But there were only two people with whom I had direct contact. And this was in the store. (I do not count the dog with a sad-looking face just outside the store, who angled his head through two metal railings for attention and whom I proceeded to pet and speak to in a soothing voice.) At Point A: a boy sidling along with his mother and wearing a glum expression. Recognizing the boy's need to feel happier, I stuck out my tongue at him, and he smiled. At Point B: some banter with the register clerk. A "Good morning" and "How are you doing?" and a "Thanks." A smile. But nothing beyond that. Indeed, at Point B in the video, I did even worse. My recreated journey on video sees me communicating with nobody save the clerk, and my socialization was limited to "Thanks."

I felt like a terribly selfish person when these details were revealed. Had the video made me more self-conscious? Was I less jocular? Or were the circumstances inveterate? Can we be exonerated if we aren't really aware of how few people we talk with? Or is it incumbent on us to be more socially responsible? If it is socially acceptable not to talk with even half of the 31 random people we regularly run into on any given day, then are we any less culpable in failing to live up to the possibilities before us? In 2007, a University of Melbourne researcher concluded that political candidates occupying the left-hand position on a stage are more likely to draw attention, because the brain, when tracking a tableau, has a tendency to drift to the left. I must note that at Points A and B the individuals were to my left. I also see that my own journey to and from the store had me situated mostly on the left side of the street. Therefore, if my brain was going out of its way to exclude people, it was possibly because my visual cortex was occupied with the buildings and edifices. Was I subconsciously going out of my way to avoid people? (Additional factors to consider: I learned later in the afternoon that I was in need of social engagement. Several opportunities presented themselves and were taken.)

What is also troubling in my video reenactment is that the only time I comment on anything is when I see a bus parked at a stoplight. This bus is in almost the exact same position as another bus was during my original journey. And seeing the similarity, I am forced to violate the conditions of my recreation: commenting upon the action. This would further support the "running into someone" theory. Consider what cognitive scientist Colin Cherry identified as the cocktail party effect, whereby a person has the ability to focus on one talker while a steady chatter of conversation is going on. This was supported last year by a study that revealed that the auditory cortex does a good deal of work in filtering conversation. And if your brain has robust basal ganglia, well, then you likewise may have a robust "irrelevance filter."

I do not know how tough my basal ganglia are. But I am troubled by how smoothly my brain deems certain details irrelevant. How little it notices. How needlessly egocentric it is on a subconscious level. After the fact, I am going out of my way to locate the new, the unseen, the underdogs, the moments I didn’t seize, the people I could have talked to, the emphases I now find phony or false.

Stage Three of my journey seems less significant than the first two stages. By my own judgment, it is also the least interesting part of the video recreation. The onion has been obtained. I recall that during the original narrative, I found myself observing more people. I was not in a rush to get back. But on the video, I am circling around people rather than approaching them. I do not know if this is because of the camera or because I felt uneasy reproducing the narrative. And why should Stage Three be the least of the three? The primary goal has been obtained. The mind is at ease and can be more spontaneous with the rigid order out of the way. Or so one would think.

This exercise originated, in part, from ruminating over Roger Ebert’s recent post about the determinism of the universe, although the subject has long been on my mind. I am a secular type who does not believe in a deity, and yet, on some primordial level, my mind seeks to find connections and patterns. Even in thinking about why my mind reacted the way that it did, I am still trying to pinpoint a framework. How is this any different from Intelligent Design?

Among George Santayana's great arsenal of pithy maxims is this one, written in response to William James's The Varieties of Religious Experience: "Experience seems to most of us to lead to conclusions, but empiricism has sworn never to draw them." I hope that I have been more explicit about my free will than James was, and yet I share with James a strange pleasure in vivisecting my experience. Of coming to terms with my subconscious limitations as a human being through diagrams and video reenactments. One should probably not approach life this way, because on a certain level, one must live with blind and uncomfortable truths. But is the truth really unraveled when we consider the structure beneath it? Or is the mind so hopelessly fallible, so determined in its determinist filtering, that human beings are doomed to repeat the same mistakes even when the horse has been led to water? This seems cynical rhetoric, but it's quite liberating to know that, no matter how much you slice and dice a moment, the mind remains a dutiful deflector.

A Call For Plenitude

It is a happy necessity which obliges wisdom to do good, whereas indifference with regard to good and evil would indicate a lack of goodness or of wisdom. And besides, the indifference which would keep the will in a perfect equipoise would itself be a chimera, as has been already shown: it would offend against the great principle of the determinant reason. — Leibniz, Theodicy

In recent weeks, I have observed undeserved burdens heaped on too many good souls. The Duane Reade clerk (one of two jobs she holds) too exhausted to lumber any faster beneath the register. The woman in her early forties juggling compact and BlackBerry, as if both were stray capsules from the same big bottle of panacea, while her heels clack with staccato desperation against the sidewalk. The sour middle-aged man sitting alone on the subway staring at a flimsy severance check and wondering what the hell he's going to tell his family.

When I ask the kind Best Buy employee why she's carefully examining the twenty dollar bill I hand over, she apologizes and tells me that there's been a steady uptick in counterfeit bills. I'm genuinely surprised and I assure her that I'm no crook. "More shoplifting too?" I ask. She whispers yes. Her eyes dart nervously to her slim manager, whose eyes survey the floor like two surplus security cameras. I wave hello, but it's no good. Every customer's a suspect. It's a good thing I'm just buying something expendable.

I’ve seen my trusty neighborhood bodega go under. The guy running the place couldn’t make his rent. But he understood that others were in similar straits and he cut his customer base some slack. “You pay me next time,” he said to a mother who couldn’t scrounge up the change for a dozen eggs. He paid all right. Never mind that she kept coming back.

There have been jittery emails from friends. Crushed voices over the phone carrying strains of reluctant endurance. Confessions of fatigue. They wonder if now is the time to take chances. We're all getting by and we can't imagine taking vacations. Instead of hanging out the whole day, how about a few hours after that job fair? Not that there's a chance in hell of getting anything, but the savings won't last forever. Yes, she lost her job too and I'm trying to cheer her up. You're not expected to work overtime, but if you don't, they'll bring it up in your next performance review. Yeah, they're having more performance reviews. You should see the applications fluttering in at Starbucks. Got any leads? Do you know anyone who has work?

Jobs being cut. Pay being slashed. Benefits lost. And everyone must work harder. Without rest. Until the unseen hunters stop shooting at the ailing beast that all of us have to bear on our backs.

But we're not allowed to talk about any of this. To bring up the fortitude it takes to carry on is an indicator of weakness or defeat. A blot on our record. An arrogant man by the name of Rick Santelli blames it all on bad behavior. Even the latest chapters in the history books must be written by the winners. And those with the bulging wallets, those callous solipsists who kvetch about the difficulties of living in New York on less than $500K, hope that their spastic hand-waving ensures that the ink stays permanent.

So our faces become grimmer. Our hair grays faster. And we begin turning on each other like savage animals corralled in a cage. We search for any insouciance to lift our souring dispositions.

None of this is acceptable.

If fingers can cling resolutely to a cliff, the soul can easily extend beyond a mere Babbitt. We’ve reached a point in which we must take chances and throw ourselves into the wild briny patches of innovation. But why accept a world in which free thinking is replaced by a sad search for cues from someone who people think has a clue? Why believe that any one person is right all the time? Why celebrate a culture of entitlement and honor those who feel obliged to their spoils?

Tangible happiness expressed and received. A smile to a stranger. Five minutes to listen. Efforts to establish common ground. The burying of hatchets. A fey risk.

Are the most qualified people necessarily the best? And are the apparent dunces truly the worst? Must we cling to our groups and our clubs and our coteries? Or is there an epiphany to be found in the new and unpredictable?

The present conditions demand a blend of perfection and incuriosity that is incompatible with the human condition. To be human is to screw up and to seek out, to dust oneself off and try again and encourage others to do the same. Are we to get out of our present mess by playing it safe? Why limit the possibilities?

NYFF: In Girum Imus Nocte Et Consumimur Igni (1978)

[This is the fifth part in an open series of reports from the New York Film Festival.]

Several people who are much smarter than I am have written plenty of words about Guy Debord’s 1978 film, In Girum Imus Nocte Et Consumimur Igni, a title which I must confess is rather difficult for me to type without looking at another browser window (currently open, right next to a minimized Explorer window urging me to search through “My Computer” and presumably my soul) and ensuring that I am not making a spelling mistake. (Yes, I could cut and paste the title, like your typical hack journalist would. And I suspect, given the thin crowd I observed, very few will actually write about it, claiming the film to be too difficult or too sophisticated. But I wish to respond to the more troubling typing gaffe at hand. Speaking only for myself, I retain some small hope that I might actually type in this Latin palindrome correctly with repeated effort. Incidentally, in case you might be wondering, this phrase stands for “We Turn in the Night, Consumed by Fire.”) And several people who are much smarter than I am will indeed be discussing this film on Friday, October 3. But I don’t believe Debord, the man best known for his Situationist activities (and if we mere consumer slaves take this film at face value, Debord was one of the finest egotists that mid-20th century French philosophy had to offer) would have approved of this staged commentary. The film, after all, ends with the subtitle: TO BE GONE THROUGH AGAIN FROM THE BEGINNING. Which I think is a pretty clear instruction. So would it not have made more sense for the good people at the Lincoln Center to simply play the film twice for the benefit of any party who may not have parsed Debord’s words correctly the first time around, rather than to have talking heads attempt to explain the film for the audience? The assembled parties, last I heard, had no intention of excusing themselves.

And I certainly don't believe that Debord would have approved of some balding blogger, who is, at present, clutching a copy of the film's script that was kindly handed out by the good people at the Lincoln Center, who is now again getting lost in Debord's considerable thoughts and their relationship to the images, and who is wondering why very few people make films quite like this anymore and what possible "take" I might offer.

For that matter, I don’t think that any reader (and especially French philosophers) would approve of these last two paragraphs, which are mired in needless clauses and parenthetical asides, and don’t really get anywhere close to conveying my rather amazing cinematic experience on Saturday morning, in which I was barely awake, peering around desperately for a caffeine drip, greeted in the dark by a dry and bitter French voice unloading a steady stream of anti-consumerist language, followed by personal adventures in the Left Bank that were truly not that different from the usual n+1 ravings (i.e., douchebag entitlement), but becoming, nevertheless, quite fascinated by the images of pilfered trailers from mediocre films, endless tracking shots across water, and assorted stills of overhead shots of Paris, various grids, and crummy-looking buildings.

This was indeed Paris, ten years after the riots and the failed experiment. And Debord, in 1978, did not like it one bit:

It was in Paris, a city that was then so beautiful that many people preferred to be poor there rather than rich anywhere else.

Who, now that nothing of it remains, will be able to understand this, apart from those who remember its glory? Who else could know the pleasures and exhaustions we experienced in these neighborhoods where everything has now become so abysmal?

Now the degree to which you can accept the veracity of this statement will probably inform the degree to which you enjoy Debord's film. This is a man, knowing very well that he has enslaved his audience for 100 minutes, who proceeds to kvetch even more grandly than Jean-Paul Sartre. He often removes the images entirely, giving us either all black or all white for several minutes, so that the audience will be reminded of who is in command. (The film was made in an epoch before the remote control offered us the mute button.) He prides himself in his voiceover on being an intolerable gadfly. He regrets nothing, saying to us, "I remain completely incapable of imagining how I could have done anything any differently." He suggests that he and his fellow gadflies are somehow superior because they did not apply for grants and did not go on television. Those who have not begun to live in some individual (and presumably Debord-like) fashion "are waiting for nothing less than a permanent paradise," which might be identified as a job promotion or a total revolution. But Debord's purpose was actually quite simple: "For our aim had been none other than to provoke a practical and public division between those who still want the existing world and those who will decide to reject it."

I'm making Debord come across like an insufferable asshole. And while this may be somewhat true, the salient point I took away from this film was that there is now nobody like Debord who is telling the truth like this — even if Debord himself only half-believed it. It seems that something terrible has been lost in the last twenty years. A desiccation of identity. A capitulation. I am not aware of anybody using the great possibilities provided to us by YouTube to make a film like this, anybody who doesn't care about the audience and who doesn't care about how their offerings are perceived. It's all about giving in to the slim possibilities of fifteen minutes of fame, rather than living a lifetime of unapologetic infamy.

So Debord's film comes at us thirty years later, reminding us that there was an altogether different type of provocateur who held various mediums hostage and used this to extort an audience into challenging their assumptions. Tout à fait brillant! If anyone will now listen to the man, his words are perhaps more important than any of us anticipated.