
The Center for Literate Values ~ Defending the Western tradition of responsible individualism, disciplined freedom, tasteful creativity, common sense, and faith in a supreme moral being.

P R A E S I D I U M

A Common-Sense Journal of Literary and Cultural Analysis

15.2 (Spring 2015)

 

Academe in Decay


Angels Unchained: Toward the Liberation of Literature from the Professorial Plantation (Part One)
John R. Harris

  1. Beautiful Stories and Their Academic Assassins

Not that I fail to understand how very different the literary game has grown: I see, for instance, that contemporary short stories have nothing but demoralization to purvey and are hence both depressing and boring to read. They were already so when my youthful endeavors were vying unsuccessfully for a microscopic share of the “literary and little magazine” market. My stories were always stories, you see: they had a beginning, a middle, and an end. As such, they projected a meaningful succession of human events in time; and by “meaningful”, I intend something like “implying generalized truths about the right and wrong or good and bad of human existence; tending toward a moral message”. In the Seventies and Eighties, “serious” works accepted by “recognized” publishers told a story only if the protagonist were a whale, a seagull, or some non-hominid species. Homo sapiens was no longer involved (so the intelligentsia had decided) in any purposeful kind of existence. Exploding a few remnant presumptions here and there that a person’s conduct could somehow matter in this world was, indeed, the mission of the new literati (e.g., contributors to the “new new novel”).

I hasten to add that I navigated my Joseph Conrad period while yet in high school. The notion that things might be summed up in words like “inscrutable” or phrases like “the horror, the horror” was therefore certainly not alien to me. Yet in Conrad I always detected—and continue to detect—the desperate search for a meaning. His frustration at having failed to find one exists only because meaning, for Conrad, must be sought; and his protagonists, in a proto-existentialist kind of fashion, live their lives by strict codes of personal honor even while aware that society’s broader framework of cynicism and hypocrisy makes them dupes. In contrast, my generation’s self-ordained arbiters of good reading and creators of good writing seemed capable of nothing more than loudly advertising their liberation from any sort of hang-up. They had triumphantly jettisoned every vestige of personal honor along with the lies of the military-industrial complex and the corporate loyalty of fathers who kept signing checks for their tuition or bequeathing them small fortunes in bond funds.

I wondered—and I continue to wonder—how so many of such a brood could ever have supposed themselves interested in the art of the story. Of course, as I have already conceded, they were and are story-assassins. They had an axe to grind against meaningful time, and they ground it razor-sharp. But why didn’t they just grow weed or content themselves with sabotaging artistic taste through the LP album? Why did they flood academe and take over our literary culture? How did they do so, considering their complete ineptitude for literary study and creation?

My intent is to begin chiseling away at two books in one through a series of essays that I hope to run in these pages throughout the coming year. People of my age who have any “sense of an ending” (in Frank Kermode’s excellent phrase) cannot ignore that more sand sits in the bottom of their hour-glass than in the top. Since I have been and am still so very devoted to literature, the notion has steadily rooted and sprouted in my conscience that I should write a) an exposé for the general public describing just what has happened to literary pedagogy, and b) another opus explaining in common-sense terms what the study of literature ought to be (as in “should have been”). Both projects nettled me as they grew, however; for I knew that if I were actually to complete these two works, a) the former would make me unemployable in academe, assuming that the general public cared enough to buy it, read it, and render me notorious; and b) the latter would never find a publisher, since only academics read books about literary theory and never guys on the street. Yet time has a way of removing our greatest obstacles—usually, I think, by allowing us to wake up to the non-existence of real obstacles. In my present level of enlightenment, I see that a) I’m older than dirt and without much of a career left to lose, and b) I’m wasting my time as an educator if I cannot convince some of the laity that literature matters. The proverbial men and women in the streets are precisely those whom I should address in both instances, for it is their children who are being shortchanged by the Ivory Tower and their culture and humanity that are draining away from whatever legacy they hope to leave along with the bond funds.

Let me conclude this inaugural section with an essential definition that, I have no doubt, must animate whatever book emerges from my subsequent testimony. You must understand what a story is—why it matters; if you are a civilized human being, you must understand this much. The professionals entrusted with passing the knowledge along to your children have not done so and are not doing so, because they are not really civilized and want very much to destroy civilization. They suffer from an egoistic pathology that we will diagnose later.

Stories compress, crystallize, condense, summarize, or otherwise give perceptible form to life’s vast shape; for life is indeed vast, but it is not infinite. It begins and will end, for each of us as individuals and for all of us as a species. Birth, a time of extreme vulnerability, growth into greater independence and aptitude, full maturity with its burdens of responsibility, physical decline often accompanied by heightened understanding, then full decline in every regard, and finally death: these are the chapters of a terrestrial life. The poets sometimes add or subtract one here or there, but no intelligent adult of sound mind can deny that a clock ticks as each year crowds into the next. We don’t enjoy a spot in the sun forever. What to do with the time we have? Maximize pleasure? But what is pleasure, that we should make it greater? Is it that which gives us satisfaction? And wherein lies satisfaction? Is it to be found in the indulgence of carnal appetites or in the uplifting thrill of service to others? Our appetites are soon glutted and, into the bargain, can draw us into unhealthy, even deadly habits. Yet serving others can lead to stunning self-delusion as one constantly scans horizons for “underprivileged” upon whom to lavish that same cake and candy which one personally refuses as coarse and paltry. One can simply grind away at making one’s daily bread, to be sure, and sedate one’s moments of leisure with dull pastimes so as to still any brain activity. The very need of sedation, however, proves that we were not made to live like dumb beasts. Our time requires meaning.

Stories propose meanings. They allow a certain set of values to be applied to a certain set of existential challenges, and a certain set of results becomes the basis of judging the applied values. How many characters lived or died as a result of choices made? How much better or worse was the fictional world left? Many steps in the creative process may appear false ones to those of us who view the completed work. This or that reader may cry foul at almost any moment: hence the element of subjectivity in literary judgments. Could a real person be so virtuous as the story represents its hero, or so bad as it portrays its villain? Could the situation that poses the story’s central challenge really occur, or would it really occur more than once in a blue moon? Would the end really happen as reported: would the couple live happily thereafter, or would the antihero not be driven insane by the memory of his misdeeds?

Reality is the yardstick of stories as it is of no other art form except, perhaps, painting and sculpture. Music isn’t condemned for being illusory. Architecture must satisfy the harsh exigencies of gravity and shifting pressure, but only by producing a sound skeleton: its visible surfaces can go anywhere if enough cleverness is put into them. Marginal designs in dress and other decoration can be highly abstract, probably the more so the better. Even painting and sculpture pledge allegiance only to a static object, a snapshot of a thing frozen in time; and one might say that they evoke a real subject in the viewer’s mind only so that they may proceed to exaggerate that subject “artfully” and thereby emphasize some characteristic not normally noticed.

The story, on the contrary, draws keen criticism for practicing any kind of subreption—for trying to sneak in an improbable motive or reaction or stroke of luck, that is, in its flow of events. The protagonist can’t be utterly absolved of the murder in his heart because he suddenly reads a Bible verse or his gun mysteriously misfires. Though such contingencies may in fact link events in “real life” occasionally, the story’s reality demands that we at least be prepared for a big surprise. A sudden conversion to faith in a god of peace must be preceded by certain rumblings in the character’s soul—doubts and regrets and fidgets; and if the bullets won’t detonate, they had better have been loaded by an accomplice who secretly sabotaged them in a moment of guilt. Then that accomplice himself becomes of interest. What caused such eventful complexity in his personality? What is his background story?

The created narrative, in one word, must be aesthetic. Its overall motion must be explained by intricate component motions that have contributed to it as dozens of different physical forces—wind, tide, the coastline, etc.—contribute to forming a great wave. Explanations must at least be implied for readers shrewd enough to examine the details closely. The colors must match. The underpinning must support.

But now we seem to be veering back into the more material and static arts; and the material, static arts, as we saw, are not held strictly accountable to reality. The story claims to have a uniquely strong measure of the real… but the claim is fraudulent. The fictional narrative’s need for coherence, for minute and progressive order, is precisely what our generation of postmodern critics has most mercilessly mocked, mimicked, and undermined (or “deconstructed”) as phony about the story. For if the component parts of stories must make sense because our aesthetic sense—our rational delight in pattern—forces us to demand that sense, then the reality of stories must be a fake reality. In the same way, most of us realize that we would not be selected to represent an ordinary Jane or Joe in a Hollywood film about an ordinary Jane or Joe… because we look too ordinary. Some statistically improbable person with smoldering good looks will get the role, since only such a face will persuade viewers to part with their money. Why, then, should we take the story seriously as anything more than propaganda? Isn’t it just a confidence game striving to convince us that the order we want to see in things is an order that truly characterizes things? Even if no evil despot is engineering this propaganda, aren’t we feeding it to ourselves thanks to some tragic need for deception in our nature?

I have heard such sophistical folderol, and been more or less forced to study it, throughout my professional life. Its fundamental fallacy is just this: the presumption that the essential and necessary structure of thought is merely presumption. If that sounds like a paradox, it is, and I should perhaps be more charitable to the critical community for being hoodwinked by it. (But members of that community, you know, so pride themselves upon being more clever than the rest of us!) Let me state the paradoxical truth of stories from a more assertive direction: the human perception of reality is invincibly aesthetic. The “dressing up” cannot possibly be eliminated. Driving out the aesthetic arrangement of time would result in nothing less than nullifying the ability to perceive. Since we cannot help but impose order upon lived moments, jeering at intrusions of order is about as edifying as suspending all practical applications of geometry because points have an actual girth of zero. An intrusion of this sort indeed always exists—but to remove it would be to remove the human mind from the scene. We presume similarly even when perceiving static objects. When does a forest become a stand of trees? When do the trees become pine and spruce and oak? When does the mistletoe become an organism separate from its host? There is an urban legend that Eskimos have ten different words for “snow”; maybe Inuit children think that American city-slickers have a hundred different words for “igloo”.

Knowing that our division of material reality into discrete units is largely arbitrary, could we get by without so dividing what we see? Warned against the cultural conditioning that makes us view a tire as distinct from a car, could we somehow walk through a day absorbing nothing but the great oneness of things? Could we negotiate one hour in such a state—or one minute, if we were truly in motion among moving objects? Would we not then qualify for the protective custody of those suffering from cognitive impairment?

Far from being a propagandistic opiate that dulls us to life’s complexity, stories bring us alive to possibilities. They give us an equation in the form of, “If thus and so, then that,” and invite us to determine personally if the posited values balance. They allow us to map out our own life by indulging us in various trial runs. Yes, we may choose to wrap our minds in tales of the same sort that repeat the same cherished ending, and a kind of voluntary brainwash can result. Yet the antidote to such self-mesmerism is not to strip bare the shakiest assumptions of a certain author or genre and then strut away in victorious arrogance; for nakedness must be clothed—psychic nakedness most of all, perhaps—and illusions left cruelly exposed only cover themselves again in the nearest rag. The more responsible and enduring solution is to induce the threadbare wrap’s removal by offering a coat with sleeves and pockets. Replace weak stories, that is, with strong ones.

Dante introduces us to Paolo and Francesca in Hell after their being caught in flagrante delicto and slain on the spot by the lady’s jealous husband. They had been reading an Arthurian romance about Lancelot and Guinevere when they decided to give adultery a whirl themselves. It was a bad decision based upon too little evidence. For that matter, the Round Table romance doesn’t end happily, either: too bad that the lovers couldn’t have turned the page. What, then, is the nature of their own story’s “failure”, if it is such? Was Dante wrong to offer it up as somehow instructive? Does the blunt fact of “storiness”—the naïve creative act of having chained one event to another—deprive the poet’s comments of any possibility of deeper truth? If all narratives may be and should be deconstructed (which is just what literary critics of our time demand), then the murdering husband Gianciotto would never have presumed that he had exclusive rights to Francesca’s body, in the first place; for marriage is also a narrative, and the consequences of adultery also a narrative. Every cultural syllogism—every “if… then” formula—is a story, a narrative. Of course, the idyll of the passionate love affair is a narrative, too, with all the liabilities thereof. Dante’s sympathetic representation of the adulterers circling Hell’s outer circle forever in tragic devotion is whimsical blather, unless we suppose it intended as a hellacious punishment: that is, to be eternally trapped in a lovers’ narrative, long after the thrill of the forbidden is gone. Our poet was born almost a millennium too soon, though, to imagine tortures of such an order. Inferno’s torments impose agony because of the narrative they offer in each specific case, not simply because they are narratives. For Dante, to be in Hell is not to spend time, but to misspend it—to be forever engaging in pointless exchanges.

If all the many stories of our lives were deconstructed like a flimsy house of cards, then we would revert to naked brutes moved by impulse… but we can’t really get there from here. Our exposure to the reflective, personally responsible life of the human has proceeded too far. We would only be play-acting as apes (and that act, too, would have a script).

The solution to the lovers’ story above would involve a much longer story. Perhaps the two should have aspired to maintain a chaste friendship throughout their lives in consolation for the vanished opportunity of more terrestrial joys. Or if that sounds altogether too saintly, perhaps Francesca should have sought an annulment of her abusive marriage; or at the very least (given the historical infeasibility of that option), perhaps she and Paolo might have run off to an unexplored corner of the world, as the Chevalier des Grieux and Manon Lescaut would do in a more “romantic” era. Stories where the unhappy wife keeps a lover behind her husband’s back usually don’t turn out well: humans are jealous, possessive creatures when they deeply care for something or someone, and sustaining a densely woven fabric of lies also places heavy strains on our nature. My two foregoing generalizations, of course, express a personal judgment about how people tick. Another arbiter might reach a different verdict. Yet the verdict would preserve a consequential form: “if this… not that, but that over there.” It would posit an alternate story, but still a story.

To summarize, the art of story-telling is the uniquely blended art of building temporal patterns out of human actions. Since human beings are conceded some measure of free will by all members of their species (even strict determinists make their case vehemently, as if addressing auditors capable of being swayed), the actions corresponding to bars of music or to posed figures in a painting fall under moral law. They are choices. A given reader enters a given story already possessing certain moral beliefs or convictions, and may therefore find affirmation or challenge in the plot’s implicit claim about where particular choices lead. This creates a charged atmosphere—a setting where friction may occur. The story may be judged less for the artistry of its arrangement than for the previously conceived notions of right and wrong imported by its reader. An unfair judgment may follow. A Victorian-minded audience may reject out of hand a novel that simply dares to address the subject of infidelity, even though its conclusion may reinforce the general view.

The reader cannot be expected to check his or her moral values at the narrative door, however; for the trial run staged by the story—its advertised rehearsal of “real life”—is precisely an interlacing of personal decisions with material circumstances and material consequences. Story-reading is an inescapably moral exercise. The “fair” literary judgment, as a result, is not one completely purged of moral bias (as if that were possible), but one which suspends such bias sufficiently to let the tale make its case. One may believe that adultery is wrong yet also be brought to see, through a sensitively detailed story, that unholy marriages can occur. A truly great story wins the person of strong conviction over into conceding, not that his view of the good must yield, but that sometimes life gives us a choice only of greater and lesser evils. Artistry does this. The skill with which the drama’s players are moved from Act One to Act Five does this.

  2. Literature’s Slaughterhouse: The Collegiate Course of Study
  a) freshman composition classes

What exactly, then, is the nature of our children’s introduction to this essential element of our culture and our humanity once they enter the ivy-covered towers of academe?

Freshman English is almost universally devoted to six hours (that is, two semester-long courses) of composition. Students who score well on the “verbal” portion of their entrance exams are often granted the hours and excused from attendance. In my case, since I had skipped the senior year of high school, I elected to take the classes even though I enjoyed “immunity by SAT”. (This was among the first of several very bad judgments I made in my academic career, by the way.) Over the decades, the fundamental approach to teaching Comp has not changed much. Commentators who stress that Sixties revolutionaries are just now starting to transform our educational establishment often over-dramatize the cause-and-effect of the past century’s signal Stupid Years, it seems to me. Somehow or other, the Ivory Tower was already thoroughly pink when I entered its corridors.

My first assignment ever in Composition class was simply to write a description of something—anything. Since I had traveled a thousand miles to be in the land of my forefathers and had scarcely ever seen two trees side by side before, I naturally chose to write a very moody paragraph—entirely worthy of Chateaubriand—about the primal forest bristling in the summer night (I had rushed off to summer school right after Grade Eleven) just beyond the steps of my dormitory. Mr. Grinchuk selected my piece—reproducing it in full, as I recall, on his state-of-the-art Xerox—to demonstrate how not to describe. My lack of clarity in every word was exemplary (I may even have tossed in a Conradian “inscrutable”). For the rest of the short term, I rather felt as though I were playing a hopeless game of catch-up. Vague instructions had induced me to write a vague piece which had instantly stereotyped me as a vague thinker. The pill was the more bitter to swallow in that Mr. Grinchuk (who never wore socks in his shoes or allowed his hair to float above his collar) obviously considered himself a wonderfully broad-minded kind of guy—so much so that he dedicated many of our meandering class discussions to chiding a couple of coeds for being “good girls” and observing the evening curfew. Only an elite few could rise to his expectations of broad-mindedness, in the end. The one grade of A awarded, according to the scuttlebutt, went to the “spaced-out chick” who had transited Canada on a motorcycle, and who was rumored often to be seen accompanying Mr. Grinchuk at social gatherings. Apparently her expository talents were fully to his taste.

I had just completed my maiden voyage into the hyper-hypocritical, “read my mind, kiss my ass, I am God” world of the educated progressive. The class ought to have been subtitled, “Introduction to Subversive Indoctrination”.

I cannot recall all of the readings on which I was force-fed that summer, naturally. A Kurt Vonnegut novel was among them, and also a tome by Norman Mailer. The pedagogy, in other words, was at least somewhat based in something that someone might call literary. (Though I shame to say it, I have hardly ever touched a scrap written by Vonnegut since then. He deserves better—but the well has been poisoned: I couldn’t possibly render a fair verdict on one of his works after my prison-camp experience. Ironic, isn’t it, that progressives like Vonnegut make such liberal use of tropes like the prison camp in their portrayal of bullet-headed right-wing fanaticism?) By the time I myself was trying to scramble up the lowest rung of the professorial totem pole, a veritable mushroom cloud of essay anthologies—known in the biz as “readers”—spread its umbrella vastly over the Composition-textbook market. All of these collections seemed to include Orwell’s “Shooting an Elephant”, Dr. King’s open letter from the Birmingham jail, Chief Joseph’s surrender speech (“I will fight no more forever”), and other pieces that possessed healthy doses of literary or philosophical value; but, inevitably, their intellectual challenge was diluted by the bludgeoning political sarcasm of Judy Brady’s “I Want a Wife”, the theatrical lunacy of Gloria Anzaldúa’s genre-defying victim-everywhere-I-turn screeds, and other gestures of politically correct tokenism.

For my money, though, the true and abiding subversion that characterizes Comp I and II isn’t the content of the reading or even the “force it down your throat” politics of the instructor, though those are certainly as visible as ever. (A twenty-first century version of Mr. Grinchuk was filling his blackboards all semester with Occupy Wall Street inania just lately in the Comp class right before mine.) No; the biggest threat to the morale of our young people, I believe, is the bedrock pedagogy of the “discipline”. It’s called Rhetoric. The handle seems somewhat grandiose to me, but perhaps it is less so if one reflects that ancient rhetoricians—and most notably the sophists of Athens—were teaching their charges the same lesson: namely, that forensic delivery is all a matter of gauging your audience and tailoring your pitch to suit the intelligence and prejudices of likely receivers. Give them what they want: the creed of the shrewd merchandiser. Rhetoric handbooks, to be fair, may also have sections on how to avoid logical fallacies and how to marshal evidence coherently. The best of them certainly do. Why, though, is this not the exclusive concern of the Composition teacher? Why are calculations about voice and audience indeed at the forefront of most classroom instruction? Should not these be very minor matters? Persuading students not to thrust “okay” and “messed up” and “kinda” into their writing should be as straightforward as launching a five-minute exhortation to drop slang and colloquialism from formal settings. Instead, students now typically work in groups for most of each week answering a checklist like this one:

  1. What audience does the author anticipate? What is the likely level of education and awareness in this audience?
  2. What is the setting in which the piece is delivered? Is it more formal or informal?
  3. What specific evidence can you find to support your answers to 1 and 2 above? What words, phrases, and rhetorical strategies does the author use to reach the anticipated audience in a particular setting?
  4. What unstated assumptions [enthymemes, if the teacher wishes to borrow from Aristotle] does the author make, and why are they made? Is the author relying on the audience’s sharing a certain prejudice, or is the assumption an unconscious presumption?
  5. What are the author’s credentials? How does he or she convince the audience that the piece speaks with authority on the chosen subject?
  6. Are there any special strategies that the author uses to enhance the desired effect (e.g., a joke inserted in a speech, a personal anecdote, or an illustration in a published piece)? Why are these effective, or why might the author have supposed that they would be so?

Now, I wouldn’t consider such matters to be irrelevant in analyzing a composition: I merely consider them to be more on the level of, “Don’t say ‘okay’.” Devoting most of an entire course to them would tell me, as a student, that the way you say something is more important than what you’re saying. I should like to see questions such as these addressed more deliberately:

  1. Is the work’s central claim true? Isolate it and restate it in your own words.
  2. Is there a broad gap between your restatement and the language used by your author at any given point?
  3. If you answered “yes” to 2 above, then why does the gap exist? Might you not fully understand the author’s intent? Why might a misunderstanding have happened? Contrarily, do you think the author has an insufficient understanding of the issues or does not want to declare a position openly? What evidence would lead you to either of these conclusions?
  4. Is the “truth value” of the piece indeterminable within its own context? In other words, is more information or more research needed to achieve certainty? Is the author aware of this?
  5. Are value judgments an essential part of any major claims made? Identify these claims. What precisely are the value judgments? Taken together, do they define a specific tendency? Are you able to reconcile their tendency with your own values, or do you find that tendency misguided?
  6. If value judgments play any role in the author’s writing, do you find that role justified? Might the piece have been written more objectively, or was some degree of personal assumption necessary in handling the subject?

Both of these lists could of course be extended. I should imagine, though, that the diverging vectors of the two are clear enough after six items. The former focuses (almost obsessively, it seems to me) upon persuasion. You are an author, and you stand upon X. Your audience occupies Y. You wish to leverage the mass of readers over to Z. How best to do that? Make them feel comfortable. Appeal to their prejudices. Sell yourself as an authority. The selling of credentials, I confess, is particularly obnoxious to me. Truth should be truth, no matter how lowly the tongue that declares it. On the other hand, laurels and titles should bestow upon no one the right to be heard unquestioned. If the Irish proverb is correct that many a good man wears ragged pants, then the reverse is also correct: many a slithy tove wears a three-piece suit.

One might imagine that a progressive academic would deeply sympathize with the proverb and its flip-side: Occupy Wall Street, ninety-nine percenters, whistleblower-heroes, etc., etc. Yet when it comes to apotheosizing tweed-clad worthies with publications in PMLA and a book accepted by U Cal Press, all bets on the little guy are off. The esteemed professor, lofted Olympian-high on the shoulders of peer reviewers, is the Dalai Lama, the Second Coming, the unearthly messenger speaking golden words from a stargate. Few prospects can be more dispiriting to the naïve among us than that of the brash academic truth-seeker who suddenly turns sycophant and idolater when one of his own elite offers him a blessing.

My alternative list, notice, does not discount or shortchange the notion that things aren’t always as they seem. In the search for truth, skepticism always throws the clearest light. Don’t trust—always verify, to whatever degree possible. Skepticism, however, is not nihilism. To believe that truth requires exhaustive search is very different from believing that no truth exists. I think the former list makes a rather cynical game of the search, as if the rules of seeking mattered more than the possible discovery. Getting caught in the revolving door of speaker/audience interactions (or sender/receiver, as one of postmodernism’s patron saints, Ferdinand de Saussure, called them) can make one lose one’s way. The truth-and-seeker polarization generates productive magnetism: the audience-and-speaker variety only keeps the door spinning. We must have a sense of destination, even though we may merely approach our endpoint, as a hyperbola approaches its asymptote, and never reach it in earthly dimensions. Readers who discern a parallel here between my insistence on truth—on ultimate meaning—and my arguing earlier for the necessity of our viewing time as a story are not, in my opinion, imagining things. The academic derision of meaningful time is a directly projected shadow of its contempt for ultimate truth.

Granted, some audiences would never grasp certain kinds of evidence and would have to be “finessed”. Most of us don’t understand the recondita of physics; and too many of us, perhaps, think that we can divine deep truths in a stack of bare statistics. Material should probably be somewhat “pre-digested” in such cases. Yet the line between making complexities comprehensible and massaging facts can be paper-thin, and should be approached with extreme care. Say that I wanted to sell the general public on the advisability of our developing home-based rain-processing technology—the sort of thing that could give each household its independent supply of potable water. I could point immediately to several transparent advantages. The dwindling supply of groundwater would no longer be a worry; doubts about an invisible central authority’s competently filtering carcinogens could be allayed; the ever-growing possibility of terrorist sabotage poisoning vast urban populations would be removed. What I don’t know is how, exactly, the technology might be best developed. I am not an engineer. Most of my audience’s members, as well, would not have an Engineering degree. Should I therefore spare them the needless pain of worrying over details by downplaying the issue’s technical aspects? Would I serve my cause well by stressing as credentials my having published two pages on this subject in a political journal whose bombastic name sounds scientific to the untrained ear? Would I not do far better in identifying, and even foregrounding, particular questions to be answered by those whose technical expertise far exceeds mine? I could make the “credentials” gambit work, but the list-of-questions “strategy” (and I honestly hate the word in this context: it makes me sound like I’m running a con) would draw both my audience and me in the direction of enlightenment. I would not be flipping my listeners into the position where I wanted them like some kind of rhetorical karate king: I would be saying to them, “We all need to know more about this,” and leading them in the proper direction.

Do two semesters of Composition in the contemporary university inspire such respect for the truth? I hope so—but only in the way that we all hope a car wreck hasn’t killed anyone. It would be better not to have the wreck.

  b) literary surveys and broad period courses

The few relict curmudgeons of the Old Guard like me who are yet to be found outside the walls of nursing homes will probably excoriate the textbook industry for watering down anthologies shamelessly. Allan Bloom, Alvin Kernan, Gertrude Himmelfarb, Jeffrey Hart, and several other notable academicians often accused of “conservatism” penned works in or around the nineties that bewailed the declining awareness of our cultural past among “educated” young people. The survey course and other mid-level “period” courses were once supposed to represent a gateway to that past. The intent was not (and this one condition surely remains unchanged) to provide depth of understanding so much as to define the general lay of the land. The succession of centuries and trends, the major figures who would influence generations to come, the significant historical events that impacted tastes and schools of thought… such were the broad strokes with which the survey and period courses were meant to paint.

And then came the Philistines of the Seventies and Eighties. At first they were merely uninterested in the “greats” of the past, whom they charged with having been elevated to near-immortality thanks to their effectiveness in purveying the propaganda of Western ruling elites (known technically as “dead white guys”). Later one had to suspect that the Young Turks couldn’t have taught Milton if a ghoulish gang of dead white churchmen had revived to torment them—because they had never read him. Various offshoots of Marxism and feminism were viewed as particularly implicated in the “dumbing down” conspiracy. The scribbles of ordinary blokes—the more ordinary, the better (although for much of history, the mere ability to write marked one, quite inconveniently, as extraordinary)—were to be preferred to purple pages composed for the haughty few. A slave narrative was more than a match for a Hawthorne short story. I, Rigoberta Menchú originated in the experiences of a female Amerindian who was “down for the struggle” (though the book’s true point of origin turned out to be her imagination), and hence had three points in its favor beside, say, Grazia Deledda’s puny one. For that matter, Deledda probably failed to make the cut for any sort of anthology just because she fell among those unhappy many whom the New Guard had never heard of. Grazia Deledda, no less—the first Italian woman ever to win a Nobel Prize! Yet such was precisely the essence of the traditionalist indictment: the classroom revolutionaries were annihilating the canon and transplanting their favorite victims into the vacuum without even knowing that canon sufficiently to stay the execution of a sympathetic member here and there. They made a desert and called it “reform”.

This pedagogical tsunami, far from having immersed Mount Parnassus only to drain away with the ebbing tide, has indeed so engulfed literary studies that today’s remaining terra firma is mere tumbled rocks and cryptic ruins. Young teachers of survey and period courses tend to select the persecuted minority of special interest to them—rarely an undifferentiated proletariat now, but more often women or homosexuals or Native Americans or People of Color or some à la carte combination of the foregoing—and to spend fifteen weeks tracking the covert, constantly harried progress of said group. To be sure, the latest anthologies make the “narrative of victimization” annoyingly easy to teach; yet to fault publishers for chasing after the whimsy of the professionals who place book orders is to accuse the silk worm of making the parachute. I have voiced my share of fault-finding in moments of pique, for it is immensely frustrating to plan a survey course such as one has taught for years only to find that Edition 8 is no longer in print and that Edition 12 doesn’t even mention Ariosto or Joachim du Bellay in order to clear sufficient space for a tentatively decoded Incan record deploring the conduct of the conquistadors. I would rather that Christopher Marlowe had not been nudged out by Aphra Behn just because the latter was female and subversive; for the former, by any aesthetic measure, was a far better writer.

In my better moments, however, I recognize that the real problem lies not with which worthies are selected to spin the narrative thread: it lies with the mid-level literature course’s long-standing acceptance of historical narrative as a principle of organization. Narrative, granted, is an experience that responsible teachers should promote. Let us grant further that isolated narratives, however “classic”, often require integration into a broader story, especially if they dwell far from us in time and space. If certain cultures in certain eras judge certain narratives to capture the essence of meaningful time on earth, the criteria for “meaningful” shift at a crawl, like glaciers—or with a sudden rumble, like tectonic plates. For instance, if a shaman is believed by tribal cultures to hold the key to a higher reality hidden on “the other side”, literate cultures with a far more advanced standard of objective proof and a far more developed method of rigorous analysis will not likely be impressed by a mask and a dance. Taking stock of such shifts belongs to the many rewards of literary study. It helps us to filter out circumstantial influences upon those programs of action that seem to keep rising to the surface whenever human beings recount something memorably “done” by one of their number. I can’t imagine that any sane, decent person wants to accept the Aztec and Mayan habit of massive human sacrifice on a most sanguinary order as “just their way of doing things”: the same reasoning would have to display the same insouciance before the Holocaust. Yet to understand the naïve and intricate manner in which the tribalist visualizes supernatural forces preempting his “merely human” revulsion is, in some measure, to pardon his atrocities. (The SS officer, by the same token, receives no dispensation because he is culturally “one of us” and should know better.)

So grant, by all means, that placing the past’s great narratives in historical and cultural context is a worthy endeavor—something that should properly be attempted in a college classroom. Nevertheless, the professor ought always to remember (as he or she has almost never done in my lifetime) that history and culture are indeed mere filters, and that any story’s soul (its mythos, if you like) is the fundamental act-by-act sequence of time being lived meaningfully. This is why we study—or are supposed to be studying—the story in an English class rather than an anthropology or a history class. The story has “literariness”: it is more than a cultural commentary or an artifact. If it is Beowulf or the Iliad, something in it after all these years may still speak to us (to males particularly, I think) in our thoroughly post-heroic servitude to clock and keyboard. If it is the Odyssey or perhaps a creation of Ben Okri or Amos Tutuola, we may recall its images vaguely from a recurrent dream—a passage to our own subconscious version of limbo where we strangely lose our name, status, and insignias in a desperate search for home. History cannot explain these connections. Anthropology might commandeer psychology to assail a few of them, but the resulting diagnoses about genetic programming have betrayed the story’s magic and are themselves mere alternative stories.

The alternative story: therein lies the bad faith, the irony, the hypocrisy. The obtuseness. One cannot argue that mankind’s story-making must be deconstructed at every pass, as “always already” (in Derrida’s phrase) misguided simply for patching things together… and then patch together a string of such deconstructions to illustrate the liberation or ascent of the Select Sufferers. And indeed, the sword cuts both ways: one also cannot construct an ascending staircase of French or Russian or English literature that inexorably leads to Where We Are Now in some crypto-Darwinist climb to superior sophistication, as the old anthologies of my adolescence did. This puts the story-writing fully under the spell of textbook and teacher, who now stand before the class like Nostradamus with his darkly ciphered code that marshals the parade of history. Yet the sword’s direction has not really been reversed when we move from the neo-Marxist or feminist back to the old-school historicist. It hasn’t changed its trajectory by one degree. Yesteryear’s anthology stirred in philosophers and legislators and essayists to make its Geistesgeschichte luxuriantly ample and minutely plausible. The ideological crusader of our own time is just as wedded to an über-program that ignores the individual work of art. He or she invites non-sequiturs like the Rigoberta Menchú flap, where a supposed and proclaimed literary text was branded fraudulent because none of it had historically occurred—to which defenders countered that everything did indeed occur, only at various times and places rather than in a strict, locally riveted sequence. The notion that a literary work may perhaps not require an apology for not being an historical record apparently crossed the minds of neither side; and, of course, the “artist” herself had sold her “testimony” to the academic elite from the start on the basis of its being a rigidly factual account.

Oddly enough, I have found that intermediate upper-division courses—most of them “period” courses—tend to have a less overt ideological tilt than surveys. A layman might reasonably suppose that the closer students progress to graduation in an English program, the higher the ultrasound frequency to which their poor brains are exposed in the “conditioning” room. Yet this isn’t so—and the cause is equally reasonable, though less obviously so. When the curriculum is carved up into Middle Ages and Renaissance and Neoclassicism and Romanticism and all the rest at its higher levels, the essential nature of the design is historical. Courses with titles like “Female Orgasm in French Symbolist Poetry” and “Gay Pride on the Elizabethan Stage” are actually few and far between, though they always make good headlines when the editor of a conservative newsletter discovers them. Because, as was just noted, all varieties of progressive ideology, with their common theme of underclass liberation, are sub-species of historicism, the fire-breathing feminist fresh from grad school really doesn’t need to make a major psychic adjustment in order to teach The English Romantics. Progress is still progress. The narrative of the world’s becoming cleaner and more honest as the many infections of the past produce boils and at last burst can be told with different casts of characters. And the Romantics, by the way—as well as the Victorians and all subsequent generations of artists in the West—were themselves overwhelmingly progressive; so the ideologue-professor can remain quite true to the convictions and visions of the course’s subjects in emphasizing the forward struggle. More essays, more letters, and more journals find their way onto the syllabus—all serenely or deliriously preaching transformation, but also all appropriate to the Geistesgeschichte model. Hegel, Wordsworth, Shelley (Mr. and Mrs.), Rilke, Whitman, Wilde, Shaw, Wells… one could readily season such ingredients with any woman or minority—anyone from Virginia Woolf to Ngũgĩ wa Thiong’o—without altering its taste (though one might have to go light on those old cynics, Leopardi, Baudelaire, and Eliot). The artistry of these extraordinary creators, to be sure, may suffer an unfortunate de-emphasis as the student learns exhaustively about their biography and their creed; but that much was true, alas, fifty years ago.

Meanwhile, upper-level courses in the Middle Ages and Neoclassicism are hard to come by, true enough; and the ancient classics in translation, if they appear at all, rush by in a sophomore survey. It is in such a survey, once again, that the student is more likely to encounter a baldly ideological narrative.

I believe there is no more sinister reason for this than that the survey’s material has no patently valid and sensible historical axis along which to organize itself. Anthologies have always been arranged by date, nevertheless; yet the World Lit texts, especially, seem caught with their editorial pants down as they shift from Saint Augustine to the T’ang Dynasty or from the Faerie Queene to the Epic of Sonjara. Naturally, they do not make a strictly linear leap: the Chinese works have their own section for a few centuries, and the African works likewise. The awkward fact remains, however, that Sonjara is closer in both style and ethos to Gilgamesh even than to the Iliad. Stages of cultural development (and I write that word without any progressive undertone) do not overlap chronologically, for most of human history, across the planet’s various continents and in its many geographical pockets.

As a result—and also because no single course or pair of courses could ever come close to sampling the literature of the world—the instructor must fall back upon some principle of orderly presentation: some narrative. A survey of English or American Lit can (and usually does) avoid radical arbitrariness by clinging somewhat to mainstream historical convention; but any historical narrative’s coherence is already challenged in a European Lit survey (of which there are practically none any longer), for what hath Paris to do with Saint Petersburg in the Middle Ages? To be sure, if we worry over the space allotted to pre- or proto-literate narrative, substantial Irish and Welsh oral traditions are usually elbowed from British Literature, as well—though the Irish come roaring back in every nineteenth-century anthology, when they produce swarms of antinomian and demoralized texts conveniently written in English rather than Gaelic. American literature, of course, has offered rich possibilities in slave narratives and Native American testimonies from the start, while “respectable” American literati were often none too polished, anyway, when standing on their own in your forefathers’ 1950s anthology. The long and the short of it: no typical survey course is likely to offer a view of history unblended with a story of epic, epochal struggle.

As I close this brief section, I would stress the point implicit above: i.e., that English teachers are seldom simply the brainwashed automaton-servants of progressive ideology that radio talk-show hosts make them out to be. I am by no means calling into question their progressive bias—but I am suggesting that all of our culture generally, on the Left and on the Right, buys into the narrative of progress, the historical staircase. Ronald Reagan’s “shining city on a hill” does not sink its roots in the World Beyond, as it would have done in the Middle Ages or for Homer; it exists here and now and may be reached through growing jobs and expanding consumer spending. These are progressive formulations, though they do not jibe with a specifically progressivist agenda (since they leave too many decisions in the hands of individuals). The “better tomorrow” is an index whose high numbers self-styled conservatives covet on opinion polls, and it has indeed long been the soul of our national narrative. A more profound literary experience would expose several toxic “staircase fallacies” for our children, I believe: it would graphically demonstrate, through the juxtaposition of rival narratives, that happiness comes from introspection and mastery of the passions rather than from material conquests and self-indulgence. Our culture, however, does not favor such enlightening competition. The stories of rising power and wealth acquired through the marketplace chafe against the stories of persecuted groups denied power and wealth: the blue-collar worker’s myth of advancement clashes with the ivory-tower elitist’s myth of oppression, both clamoring for their kind of progress—and each view demands a terrestrial staircase, either one that already exists or one that must be built anew.

The young scholar who ends up with a Ph.D. in English has merely observed our collective worship of science and history, wherein everything moves upward. Courses are organized to show or imply advance, the job market is rigged to reward those who thrive in a Darwinian Serengeti of publishing, and the very word “education” is synonymous with “having a better chance” in our culture. When, I ask, has this young scholar ever been introduced to the pleasures of reading literature as art? When has he or she ever been freed to evaluate, through imaginative reflection, various programs for maximizing the meaning of time in a human lifespan? When has our culture-wide narrative ever permitted any of its members an abundance of such freedom—a purposeful liberation from the staircase’s tyranny?

(N.B.: A simple answer to those rhetorically intended questions might appear to be, “TV and the movies.” But the answer would be altogether too simple. Film cannot readily represent inner turmoil; as Walter Ong wrote of oral narrative, its tales are “agonistic and extroverted” because of their servitude to the fully visible. Hence the television serial or movie favors “action”—which is naturally a major component of commercially ambitious productions, as well, due to the “thriller’s” power to amuse and divert while demanding little intellectual engagement. So, no, “Bond” movies and CSI are really not an adequate cultural substitute for Heart of Darkness or even War and Peace.)

  c) the senior seminar and other terminal indoctrinations

I may surprise many readers by declining to dissect the “literary” or “critical” theory offered in senior seminars. It’s there, for sure, and it’s immensely demoralizing. The ideological message is invariably one of propaganda and victimization from an historical angle, of sophistry and manipulation from a rhetorical one, and of transference and postponement from a psychological one. All literature is a shell game in the theoretical shell game. Meaning is arbitrarily declared by those with the power to blow trumpets and send around town criers—or else it is the pitiful retreat of the faint-hearted self-deceiver. (Or it may be both concurrently, in this design: dominating patriarchs excel at identifying the comfortable illusions to which their “sheeple” timidly cling.) At the bottom of everything is nothing… or maybe something like the Principle of Uncertainty. You can impose meaning upon an object or event only by wrenching it from the vital cosmic flux and paralyzing it; or if you release it to breathe and move again, then you can no longer say just what you were holding.

There can be a poignant anguish to certain passages in Roland Barthes or Michel Foucault; the fading sunset, the dying echo, and the retreating unicorn lie near the very heart of a certain artistic experience. Contemporary theory, however, strikes me as the false echo of an echo. On the whole, it leaves no room for supposing that ever-evasive meaning may be captured in the formal evasiveness of art—that the art work, in other words, can express and embody truths unadaptable to logic. The classical Kantian “purposiveness without a purpose” has now become, rather, an insurmountable failure in the human mind’s attempt to touch reality. These theorists, ironically, are narrow-minded logicians, in the sense that they need logic to be the sole medium of truth in order to pronounce the truth unknowable. If they are to be the shamans of irrationality who float above us all and smile, then they need the rest of us to understand “rationality” in a reduced and sophomoric manner.

Yet such swirling abysms, I must repeat, are not really the dark spaces at issue here. Few students have any idea of what lurks behind the handful of buzzwords drilled into their heads during the senior seminar. (I recently chanced upon a set of flashcards that one student had left behind in the seminar room. Term on one side, dozen-word definition on the other: the same approach that my son used to learn “baffle” and “quagmire” in sixth grade.) A doctoral candidate in Comparative Literature, if such wretches still exist, might suffer a profound existential vertigo after being force-fed on theory for years. The undergraduate wrapping up his or her degree in English, in contrast, draws but one essential lesson from a very limited exposure. I would call this lesson hermeticism: the sense of insularity, of specialness—of being sealed off from the uncomprehending rabble, of being chosen for elevation to an elite circle. The jargon of critical and theoretical “discourse” is ostentatiously arcane, to begin with; and not only is it never used in casual exchanges—its meaning, if explained with flashcard precision, only leaves the layman (and probably the apprentice-practitioner, if all parties would answer honestly) in a deeper befuddlement.

It is intimidating stuff, this theory-speak—and it is intended to intimidate. Scientific terms at least exist for a reason: they succinctly name an object, period, condition, procedure, or phenomenon that could not otherwise be so indicated. “Analgesic” and “myoplasty” are constructed of Greek roots and have clear semantic destinations. The jargon of literary theory, on the other hand, is a “members only” security code—a set of passwords and handshakes that the Gnostic cult’s brethren employ constantly to distinguish their initiates from the profane. Narratology, grammatology, essentialism, heteronormativism, materialist feminism, cultural imperialism, imaginative geography, abjection, center/margin dichotomy, the constitutive Other… such bombastic nomenclature is hardly telling a surgeon where to aim the laser. In a significant sense, it is en masse the granddaddy of the present noxious tendency—especially on college campuses—to fling warmongering polysyllabic concoctions at any opinion contrary to one’s own (e.g., “homophobe” and “speciesist”). That the meaning of the words often hangs in limbo is irrelevant. The words don’t really have to mean anything; and indeed, limbo is not accidentally the signifié of most of them, for the land of shadows is the best place to hide a recurrent lack of substance.

In short, literary theory is a kind of psychic indoctrination into the “discipline’s” most defining characteristic: its aggressive self-alienation from the speech, practice, and understanding of ordinary people. Other things being equal, one might well imagine this a very odd manner of conditioning; for literature, even now, enjoys a reputation among the general public of being theirs—of possessing timeless, beloved classics and of reaching out to humanity above temporal and cultural barriers. We still tend to think that Dante and Shakespeare should drift across a young person’s bow at some point in high school or college. Furthermore, the professoriate collectively—but especially in the humanities—advertises itself as being a great friend and ally of the little guy: hence the continued role of neo-Marxism in so much critical “scholarship” and in so many historicized literature classes. How could a group of people so dedicated to defending Average Joe and Josephine harbor such contempt for the masses and hatch such heinous plots to hijack our common heritage from us?

The humanities have been jealous of the attention and the money lavished upon the sciences at least since World War II. It may be that, at some early stage, the retreat into hermeticism was a bid for a pseudo-scientific sort of respectability, with “cutting-edge scholarship” being venerated more than the art works it purported to dissect and new systems of arcane terminology playing at the discovery of new phenomena. Again, though, I believe those who start and finish with this explanation to be too fine by half. The lure of hermeticism doesn’t require any historical conspiracy to make sense: we can find adequate cause for it in human nature. English majors are intense readers and writers, usually, and have been so since childhood. As such, they are introverts. They may have encountered much difficulty “fitting in” at a time when adolescents want desperately to belong. Hermeticism offers such personalities a kind of pis aller—a next-best option: the “sour grapes” resource of saying, “I didn’t really want to belong to a society of ignorant louts, anyway!” The formal opportunity, therefore, of immersing themselves in layers of obscurantism that make them sound as though they had each come up with E=mc² is instantly embraced by most. The implied message, “We fly far above vulgar pedestrians!” is mere preaching to the choir.

As for the sanctimoniously professed sympathy with the common man, that is almost always yet another jab at the unhappy adolescence that spawned our bookworm. Since this character type is usually born into a bourgeois household where the value of reading is promoted and where scarce resources are eagerly invested in private school, both the “cool crowd” which authored so many teenaged rejections and the parents who unwittingly constructed the martyrdom’s scaffolding fuse into a hated white-collar, social-climbing mass of snobs. A predictable knee-jerk response to their collective influence is, of course, to spout revolutionary pabulum and adopt a bumper-sticker level of political devotion to Robespierre radicals. Any real sympathy for ordinary people—indeed, any ever-so-faint awareness of how ordinary people think and live—is scarcely to be seen in this passive-aggressive project of revenge.

I propose to examine in detail later the motives driving the professional assassins of literature. For now, I would note further that the entire notion of research hinted at several times above imprints a hermetic culture of elite expertise upon young minds in almost every upper-division class, and typically long before the senior seminar. A research paper is required even in some survey courses. What is that paper, exactly—or what does it look like in its most highly evolved form, as generated by the junior or senior whose breast swells upon reading the professorial comment scrawled in red, “This is almost publishable”? It isn’t in any sense a creative or innovative paper. It doesn’t show any significant degree of the critical thinking so extolled by composition professors as the millennium turned over. Fine analysis of a truly critical sort would require a close, sensitive reading of textual passages and an enlightening exposition of how the scrutinized pieces collaborate in the artistic whole. English scholars seldom waste their valuable time on such intratextual investigation today. That went out more than half a century ago, with what was then called the New Criticism. (“Critical thinking”, by the way, though just lately fallen out of style, was always a freshman-level objective: its reach never extended to “scholarship”.)

No, the standards of excellence for English essays are distinctly different now. The word to bear in mind is “research”, and it yields two absolutely indispensable criteria for the would-be Black Belt in literary studies. First, the paper must be heavily documented. It must parade so many citations past the reader that the final proportion of actual student writing to cited matter and footnotes/references is on the order of one to one, or even one to two. Second, references must be current, for the most part. They must not draw significantly on books and articles published in the gray old twentieth century—or in the decades preceding the Eighties and Nineties, at any rate. Naturally, seminar students are still hearing about Paul de Man and Jacques Lacan (not to mention Karl Marx). The patriarchs of shape-shifting must always be honored; or, as a cynic might observe, you have to deconstruct the literary text in order to ignore its literariness and direct all your scholarship to extrinsic factors caught up in politicized trend.

The paradoxes don’t stop here—and I would prefer to call them, in the interest of honesty, acts of hypocrisy. The plainest non sequitur of the whole business is the shameless emulation of the scientific model, probably in a surge of sibling rivalry (for the humanities, as noted above, are the contemporary academy’s detested step-child). We understand Jupiter’s moons better as we bring satellites closer to them, devise more sensitive instruments of measurement, discover terrestrial analogues to their surface structure, and so forth. There is no reason to pretend, however, that our understanding of Edgar Allan Poe should proceed in response to the same kinds of advance. We don’t grind ever keener lenses for the appreciation of art works. The one scenario in which the opposite might be true (i.e., in which we could expect to understand Poe’s writing better with each new year) would be if historical research held the key to understanding Poe or any other artist; and such confidence would assume, of course, that history itself can achieve scientific method’s level of objectivity. The contemporary “research community” in all literary studies does indeed, by no accident, bet heavily on history. The myth of the “cutting edge” must be kept sharp somehow.

Once again, recall from the previous section that even critical approaches built around gender theory and ethnic victimization pick their cotton on history’s plantation. New “scholarship” targeting the “silent spaces” in texts, for instance, as places where the strangled female voice leaves a sign of its suppression can initiate a complete overhaul of our literary heritage. Those cleverly quiet lacunae turn out to be everywhere: who would have thought it! Likewise, a male character, a Caucasian character, a European character, a heterosexual character… any sort of character represented from a new perspective advanced by “cutting-edge scholarship” can now be reexamined with a view to exposing how he promotes the oppression of the mainstream power structure—and quite without the inconvenience of assessing the subtleties of his role in the story. The historical context of “the movement” (any movement will do) supplies a fresh hall of critical mirrors within which one more scholar may always erect one more mirror. The important thing is to mirror the rest of the hall, down to the very latest addition—not to show any awareness of the world beyond the carnival.

Such riveted self-absorption seems to me to induce a two-tiered hypocrisy of a most devastating kind. Budding scholars are nourished in the belief that they make “substantial contribution” to their field when they publish—yet the mechanism of the publication-engine requires that scholarship grow obsolete almost overnight, and certainly by the end of a forty-year career. The past is forever slipping into irrelevance, or even embarrassing naiveté. Ten people in the world may read what you publish next year in The Journal of George Eliot Studies. Not one of them will remember the piece three years from now.

So much for the promise issued to young coeds, especially, that an academic career will make their lives “mean something” (how’s that for a phony narrative?) if they stay the course rather than succumb to the marriage-and-family rut. Under the circumstances—and here comes the second hypocritical tier—why should anybody be wasting time upon the yet more distant texts, some of them several centuries distant, which scholars are always viewing with greater powers of magnification through their supposedly finer lenses? What have Donne and Herbert to say to us now? In other words, the authors whose study was to bestow immemorial glory upon one’s mortal days are themselves but passing fads. At most, such dimming asteroids merely provide an occasion for the literary “analyst” to show off his or her state-of-the-art gear. If this tear-down-and-rebuild exercise in futility has some sort of constructive purpose, it can only lie in the opportunities manufactured to erect academic careers, with the latest research constantly being trumped by the latest latest research. Certainly the whole process sheds no aura of reverence upon the original texts as products of genius and transcendent expressions of humanity. The one substantial reality to emerge is a longer curriculum vitae with a better shot at winning promotion and tenure. One can see how the deconstructionist creed that all destinations are mirages might appeal to a troupe of such pilgrims without a holy land.

The currency of the research project’s documentation is a tip-off to the hermetic, “insider” character of the game. In performing research, I have always emphasized what I consider to be outstanding works of criticism. Walter Ong’s Orality and Literacy, Peter DeSa Wiggins’s Figures in Ariosto’s Tapestry, Alain Renoir’s A Key to Old Poems… such books, in my estimation, will never be replaced or outdated. It does not particularly bother me, therefore, that I cannot easily access the very latest research. As a full-time teacher whose job description thrusts him into the classroom, I have very little time for attending conferences and conducting library searches; and as a teacher at a relatively small university, I do not enjoy the resources of my Ivy League counterparts. No matter. I feel that presenting works over and over to undergraduates who often enter the semester with no inclination to study literary classics gives me an advantage. It forces me to express as objectively as I can the aesthetic quality—the beauty and power—of what I teach. This, in turn, directs my attention explicitly to qualities within a given work that I might never have noticed in a rarefied scholarly atmosphere. It provides me a kind of compass in my personal and lifelong study of the literary art. When I find critical commentaries along the way that are traveling my road, I link arms with them. When I discover during a quick review of recent research, on the other hand, a facile plaint against an ancient text’s sexism which displays no cultural awareness and no inkling of any mythic/traditional background, I quickly pass on. Thanks to many encounters of the latter sort, I have concluded that, in our time, a scholar’s depth of assessment is inversely proportional to his or her volume of publication.

But then, I have no professional “voice” in the matter, for I do not get published in “scholarly” circles any more. I have essentially given up trying. The system rewards those who have the time and resources to enter the hall of mirrors, as I have called it, and take their own small place in the refractive orgy. The rich get richer. The elite few grind out conference papers and five-page articles echoing the “research” of “scholars” with whom they attended grad school or under whom they studied; and this, in turn, renders them more employable and promotable—which, in turn, eventually positions them to reward the “right” sort of contributions themselves (as they are chosen to review submissions to prestigious journals) and to nix the attempted intrusions of non-members. On Wall Street, the academic elite would condemn such behavior as insider trading; on the exclusive greens of The Masters Tournament, they would roundly revile it as snobby, sexist, racist, and classist. Within their own ivy-covered ivory halls, it’s business as usual.

Every time an undergraduate English major is instructed to produce a paper thickly documented with recent research, he or she is initiated a little further into this cultic mentality. Analytic recognition is no longer awarded to the authors that Milton, for instance, would have read and mentioned allusively; other authors admired by a particular author under study are just a charnel house of “dead guys”, rank and unwholesome. A senior might compose a seminar paper—the “culminating experience” of her degree work (yes, she’s probably female)—on Paradise Lost’s sexist representation of Eve, larding her discussion with last year’s jargon-encrypted research; yet the same student will never have read a line of Spenser, let alone Homer or Virgil, and will not so much as have suspected that Tasso and Ariosto exist in literary history (an ignorance likely shared by her instructor). The game’s the thing. Play the game right, and you will be admitted to mysteries withheld from the dull eyes and ears of bourgeois vulgarian hordes.

Once again, and for the last time: young people with a Bachelor’s in English do not lean heavily leftist/progressive because their professors are haranguing them about Das Kapital and forcing them to attend performances of The Vagina Monologues. Their major, rather, conditions them to think in terms of progress and elitist superiority. They do not endorse redistribution of income (others’ income) and fully centralized regulation of personal habits (others’ habits) because they have absorbed the ideology of the Hive, or can even enunciate it. Images of a disrupted status quo spiraling into golden ascent have simply seeped into their daydreams, from one direction—and inner-circle, top-down patterns of authority have calloused their hearts, from another. Would that their mystification could be dispelled by something so simple as a fresh breeze of common sense and everyday life!

John Harris is founder and President of The Center for Literate Values.  He teaches writing, literature, and the Classics at the University of Texas at Tyler.