The Center for Literate Values ~ Defending the Western tradition of responsible individualism, disciplined freedom, tasteful creativity, common sense, and faith in a supreme moral being.
P R A E S I D I U M
A Common-Sense Journal of Literary and Cultural Analysis
7.3 (Summer 2007)
Orality and Literacy Revisited: Beleaguered Allies Against the Technical Onslaught of the Visual
John R. Harris
It is not unusual for the readers of Praesidium to swap stories about the horrid grammatical gaffes and boners committed by young writers under their professional care. Occasionally such grim tales of linguistic mayhem find their way into these very pages. Thomas Bertonneau reviewed several patterns of abuse five years ago and came to the conclusion that “we live in the aftermath of a cultural calamity—the disappearance of mass literacy, as lamented by [Cynthia] Ozick.”1 To be sure, instructors have long traded yarns about discipular incompetence in the lounge over a cup of strong java. One of my all-time favorites was the Spanish teacher’s tale of an ingenuous pupil who, confusing the amicable hola! of the plaza with the exultant olé! of the bull ring, wanted to know why matadors sidle up to their horned adversary and say “hi”. That exchange, if memory serves, took place when Jimmy Carter was president.
Professor Bertonneau concurs with many of us who have served hard time in the nation’s classrooms, however, that something new is afoot. A misinterpreted sound is one thing. I could tell a few stories about myself as a student on that score (such as my supposing, throughout most of an undergraduate course on Indian history, that the abbreviation for “United Provinces”, “U. P.”, was some ancient Hindi place name anglicized as “Upee”). Of quite a different order, surely, is the error which involves no faulty processing of aural data, but rather a perplexity over how to put familiar sounds into conventional writing. During the spring semester of 2007, while teaching three classes of Freshman Composition (closed at 22 students each), I was able to compile the following list of remarkable misspellings more or less common among approximately 10-25% of my sample (depending on the specific blunder):
• “American’s” as the plural of “American”
• “amongst” for “among”
• “can not” for “cannot”
• “deep-seeded” for “deep-seated”
• “everyday” for “every day”
• “four-fathers” for “forefathers”
• “oftentimes” for “often”
• “pet stool” or “petty stool” for “pedestal”
• singular and plural indistinguishable in words like “realist” and “idealist”
• “wonder” for “wander”
Now, to be fair, some of these students in certain cases are just saying “hi” to the bull. The middle and final “d” and “t” are not well distinguished throughout much of the United States. Especially in the West, one always hears “seated” pronounced homophonously with “seeded”, and the metaphor of the deeply planted seed makes at least as much sense as that of the deeply cushioned (or short-legged?) chair. I have also seen (though not this semester) the expression “taken for granted” rendered “taken for granite”. In Black English of the South, a final “d” usually emerges as a “t” (even Condoleezza Rice does it: the same thing happens, by the way, in Gaelic), while the palatalization of a “t” after an “n” is, once again, especially common in the West. As before, also, the invented metaphor makes sense—is, indeed, far more palpable than the original version.
Professor Bertonneau would probably point out that the shift in such cases from more abstract metaphors (a seating to a seeding, a concession to a stone) is in itself symptomatic of an unsteady literacy: the world of invisible generalizations has clearly lost ground to the world of immediate perceptions—Walter Ong’s sensorium.2 I believe this distinction to be entirely valid and, in the case of my students, very probably justified. One could say the same thing of “four-fathers” and “pet stool”. That is, the student struggles to spell a word which he or she has occasionally heard pronounced, and ends up creating some extravagant but colorful image from immediate, familiar circumstances. Yet here, I would argue, more has happened than a turning away from the mind to the senses. For the student-writer remains, after all, in the mind—the image which cues the misspelling is a flight of fancy, based on no already circulated picture. The creating mind has failed to review its product and to recognize that the proposed image is simply too cramped, too subjectively spun, to have achieved the broad coinage necessary to account for the word. A new line of defense has been glibly trampled, a kind of communal awareness (if not basic sanity) which Ong’s oral-traditional tribesmen would have possessed in abundance. In moving from seeds and stones to the fathers four and pets who poise on stools, we have advanced from the plausible to the patently absurd. “Pet stool” irritates me in a way that “deep-seeded” does not. Why would any reasonable adult, faced either with having to ask a teacher how to spell “pedestal” (since his or her existing knowledge would not suffice to make browsing the dictionary profitable) or with simply selecting another phrase, not opt for the mere verb “worship”? Can students really be so… as they would say, clueless?
Of course, the answer is “yes”. The contemporary student does not flinch to type out concoctions like “pet stool”, because he or she lacks that degree of familiarity with formal, printed English which might send up a warning signal. Our students can read, but choose not to. Virtually all of the sources from which they draw their knowledge of the language are electronic: television, movies, radio (as represented by “downloads” on their “iPods”), the Internet, cell phones, “text messaging”, even books on disc which often allow keyword searches and are saturated in visual aids. We will return to this emphasis on the visual shortly. For now, I stress that we are witnessing the effective demise of book-reading. The word “pedestal” would be infinitely more likely to appear on a printed page than in a cell-phone conversation. In most electronic media, neither the proper spelling of words nor the fully conventional, non-dramatic context of their usage would come clear. In fact, a person who had seen “pedestal” in print even once could surely be expected to erase all notion of cuddly animals in collars, even if he retained no exact copy of the orthographic fingerprint. Similarly, any thoughtful person whose eyes had passed a single time over the word “forefather” might be supposed to remember that it does not imply a numerical reference. I conclude, then, that my students must only have heard these words. In the case of “pedestal”, I would think it entirely possible that several of them could later see the word in print and not even connect it to the one their imaginations had outlandishly fabricated. Perhaps they would mentally insert a blank and pass on to the next word.
As for the confusion of “wonder” with “wander” or the deletion of the pluralizing “s” from words like “realist”, all that has just been observed applies a fortiori. I will stipulate on my students’ behalf that the word-pairs at issue are homophones in the Deep South—but this should not be of great relevance, for they are also quite common. Most students will avoid spelling the word “knot” as n-o-t or the word “sea” as s-e-e. Distinguishing “wonder” from “wander” or adding an “s” to “realist” should not be much more difficult; yet, apparently, it poses a mighty stumbling block for my classes. I can think of no possible explanation for such ineptitude with words employed universally (at least once a week, I should guess) other than that they are used only in speech, never read on a page.
At this point, I must adduce in evidence a couple of incidents the like of which I have never experienced as a teacher, and which establish, to my mind, incontrovertibly that many students today have about the same level of familiarity with a book as I have with an iPod. I wanted my students to be exposed to several works which did not appear in their Norton reader, so I dutifully prepared a packet of photocopies for distribution on the first day of class. Many of the works were excerpted. Naturally, I included the title page of every book from which I borrowed: in the case of excerpts, I followed this immediately with the assigned material. Straightforward enough, one would think—yet some of my students insisted otherwise. About a dozen of my 60 or so claimed that I had condemned them to a fruitless hour of searching from end to end of the packet for the author du jour named in the syllabus—that they could find this name only on a rather bare page with the book’s title. My assumption that they would recognize the next page, beginning with “Chapter Two”, as belonging to the book the way a file belongs to a website was sadly mistaken. I encountered a similarly sobering response when we came to the pages copied from the Loeb Classical Library’s edition of Boethius. The Loeb texts (published by Harvard UP) always match the Greek or Latin original with English translation, page for page. By convention, the English invariably appears on the right—i.e., on odd-numbered pages. Now, I know better than to copy anything “foreign-looking” with an assignment. Two decades ago, students might have been excited to find a strange script like ancient Greek gracing the margins of their text: today some of them are apt to grow highly indignant. 
(A comment on an evaluation once fumed, “Why is he stupid enough to think that we can read Greek?”) So the problem in Spring 2007 was not that certain members of my classes were hurt or confused by my perplexing them with alphas and omegas—no; it was that all pages in the assignment were odd-numbered. Approximately the same percentage as before—indeed, very nearly the same group of individuals—claimed that they had been unable to complete their reading because they had not been given even-numbered pages. It did not occur to them to transport the sense of a phrase interrupted at the bottom of one page to another page and see if the result proved coherent.
These students know that English is read from left to right (they do not know, I daresay, that a language can possibly be read in any other direction). They know that one opens a book’s cover from left to right if one wishes to find the first page. A significant minority of them, however, is unfamiliar with the convention of placing a title page immediately inside the cover or of dividing the contents into chapters. If the arrangement of pages is at all idiosyncratic (I also squeezed four pages of Boethius onto one of my photocopy by moving left-to-right, then bottom-left to bottom-right), they are taxed to ferret out the flow of meaning. The instant panic they register when faced with the mildest oddity of this nature reminds me of my own when I have to navigate a large website in search of just the right “easy access” form. They are becoming—some of them—as distant from our once-common Western literate inheritance as I am from the latest electronic toys and games.
In an upside-down manner, this alienation from the written explains the prominence in student compositions of archaisms like “amongst” and “oftentimes”. I can say honestly that I have never in my life heard a student use either of these long-outdated words in private conversation or in class discussion. How do they blunder their way into essays? Precisely by dint of their otherness: they are “literary gingerbread”, like the “Ye Olde Candy Shoppe” folderol that one sees occasionally at the shopping mall—unconscious caricatures of a defunct body of manners and customs. To be sure, the gap between literary and spoken language has seldom withered to such negligible proportions as one finds in modern American usage, a fact which is abundantly advertised in student writing, as well. Colloquialisms like “okay” and “no way”—and even scurrilous slang on the order of “crap” and “suck”—proliferate in ostensibly formal papers. Yet the student is also aware of making a sacrifice to an unknown god, and he dutifully throws upon the altar from time to time a morsel which he thinks will surely find favor. “Amongst” certainly sounds formal, doesn’t it? Nobody ever uses it in common parlance: ergo, it must be prim and proper. Though a smattering of exposure to good writing for the last century would have left the student in no doubt that such words are in fact too antiquated to pass muster, such exposure is just what his preparation lacks. For the same reason, by the way, the solecism “between he and I” proves to be almost incurable. The correct “between him and me” sounds altogether too much like what an ordinary person might say in a real conversation, so the other form must be the right one. Inversion of the familiar makes the god smile.
Naturally, the root cause of dividing “cannot” into “can not” and of forging “every day” (as an adjective/noun pair) into “everyday” is also deficient exposure to printed texts. How often does the word “cannot” crop up in the typical book—perhaps every other page, at a modest estimate? How few books must a student have read by the ripe age of eighteen not to be aware of this convention? Fewer than ten? Fewer than two? The case of “everyday” is at least vexed by the logic of grammar—a logic apparently too intricate even for the proofreaders of major publishing houses.3 It requires that one be able to distinguish between a noun and an adjective. In these times, that’s expecting rather a lot.
And what shall we say of “American’s” as a plural of “American”? Readers may be skeptical when I assert that this kind of error was among the most common I noted: I would guess that fully a quarter of all my freshmen designated a normal English plural form at least once in the semester by placing an apostrophe before the “s”. Is this more “gingerbread”—more salting of the text with mysterious extravagance to appease the inscrutable spirit of grammar? But apostrophes do exist, and are still used—quite frequently before the “s”, as it happens. To muddy the waters further, that canny reader who would guess that students often drop the apostrophe before the possessive “s” would be entirely correct. Indeed, such lapses tend to occur in the very papers which have misused the apostrophe in forming the plural. My own assessment is that we see here the same phenomenon of alienation from books. How could anyone possibly have read half a dozen books in his or her life and not know how to form standard English plurals? At the same time, no student could possibly graduate from high school—even today’s high school—without having had the existence of apostrophes pounded into his or her callow skull. The student is aware, then, that some situation or other where you add an “s” calls for an apostrophe. The formation of plurals seems to fill the bill.
Here I wish to turn an important corner. The last couple of paragraphs—involving misused apostrophes and the wrongful splitting or merging of words—do not address cases which proper speech might have ameliorated. You cannot hear how “cannot” is spelled. For the most part, however, everything I have offered by way of testifying to the contemporary student’s alienation from books also bears witness to an impoverished oral/aural environment. Absurd misspellings like “pet stool” would be opposed from two directions in a community of competent speakers. First, the word would be more correctly elocuted—at least the student could approximate it to the point of making a visit to the dictionary fruitful. Secondly, an environment of living, breathing, verbally fluent people would greet a monstrosity like “pet stool” with laughing disbelief. “What did you say? A stool for a pet—is that what you think the word means?” The youthful transgressor would be mortified, yes—but such mortification is a healthy part of growing up. In the long run, it teaches common sense. It makes one reluctant to float publicly the first silly association that leaps to mind; one must review the hypothesis, rather, and ask, “Is it reasonable that other people might really have intended such an image?”
The student’s community, alas, has little connection to the vibrant, functional tribes which Ong and others contrast with literate culture. We are not either oral or literate: our children prove that we may turn out to be neither. Thanks to electronic speech—the conduit through which most of our young hear the English language modeled—not only has interest in books been siphoned away, but the truly participatory give-and-take of verbal exchange is nullified. This is the real tragedy, and not simply that professional jabberers on TV and radio use bad English. Some of them do not. My students, to tell the truth, probably first encountered the word “pedestal” in the lines of some talk-radio Cicero or perorating TV attorney—they certainly didn’t make its acquaintance on their cell phones or in chit-chat with roommates or classmates! The cell phone, indeed, is the closest of all these gadgets and gismos to an active exchange a quattro occhi: one may even bring up the image of one’s interlocutor on a small screen, as I understand. Yet at the same time as one blabbers away into one’s palm, one shows complete indifference—rude indifference—to all proximate living bodies forced to overhear a private conversation. These unwilling participants are forbidden by one of our few remaining social taboos to break in and observe, “The phrase is deep-seated”—let alone object, “You shouldn’t have another date with someone who acts like that.” Virtually every level of enrichment which comes of spoken interaction is compromised. My own suspicion is that even the reluctant bystanders in cell-phone society carry away a reprehensible lesson: that they should tune other people out and bury their chin in their collar.
The great orator or the brilliant conversationalist well knows, on the contrary, that he or she must cultivate the active awareness of people within hearing distance—that the bounds of common sense are demarcated by jabbing elbows and shaking heads. This virtuoso would notice if a brow furrowed or an auditor started to titter. If his southern drawl were too deep, he would be exposed to the ridicule of being questioned, “Do you like your final year of schooling, or do you lack it?” For an aspiring political candidate to proclaim in general company, “He was just wondering about as if he was lost,” or, “It’s just a bunch of communist,” would be to invite instant and not very flattering judgments. Even the rank and file prefer to be represented by Counselor Daniel O’Connell rather than by Patch Noreen of Black Rock.
José Ortega y Gasset reflects in the “preliminary note” of The Revolt of the Masses upon vulgar Latin’s degeneration. The parallel which this epochal decline poses to our own culture’s case is not perfect, but more apt, perhaps, than most such gestures at Rome’s collapse. For in ancient Rome we find the curious phenomenon of a society apparently losing its oral eloquence after centuries of growing slowly more literate, and very likely because of that growth: the condition of the West, and especially of the United States, in rough facsimile. Plautus and Terence left us a fairly reliable record of how people must have chattered in the streets, and it shows us that Romans of the republic observed such grammatical minutiae as proper noun inflections and strict conditions for employing the subjunctive mood, even though they contracted words liberally and entertained each other with lively slang. That demotic speech in the Christian era had become a comparative broth without seasoning is revealed most simply by the emergence of modern Europe’s progenitors in places like Gaul, Spain, and Italy itself—dialects from which perhaps 90% of the classical inflections had disappeared. Of course, those who most abundantly used the degenerative forms would have possessed little or no literacy themselves. Reading and writing do not “dumb down” their practitioners: the chain of events has more than two links. Probably the proliferation of literacy, rather, led to a general neglect of memorizing inherited rules and forms, since these were now “extroverted” in the form of literary texts. Ong reminds us that Plato’s Socrates had protested in the Phaedrus that “writing destroys memory. 
Those who use writing will become forgetful, relying on an external resource for what they lack in internal resources.”4 Even those who did not use writing—the very class of person, obviously, most dependent for all forms of cultural instruction upon the more privileged—would suffer as traditions were set down on paper rather than passed along orally: perhaps they most of all. So the literate Roman proceeded to become, in a sense, too educated—too insulated from his broader community by the more private habits of the literate life. The bread, without its leavening, baked flat.
This is my brief summary of an extremely complicated historical transformation. Ortega y Gasset offers not even so much of a causality as I have: he is more interested in its aftermath.
Among the un-Hellenized portion of the Roman people, the language that thrived was that styled as “vulgar Latin”, the matrix of our romance tongues. This variety of Latin remains largely in the shadows and, for the most part, can be approached only through reconstruction. Yet what we do know more than suffices to shock us with two of the language’s qualities. One is the incredible simplification of its grammatical mechanism in comparison with classical Latin. The rich complexity preserved in the language of the upper classes was supplanted by a plebeian speech, its mechanism quite plain and yet, at the same time and for the same reason, heavily mechanical, like a solid material. Grammatical forms were clumsy and overdone, full of the abortive and the roundabout like a child’s parlance. In fact, it was a childish or “baby” language that did not permit fine reasoning or lyrical flights to flourish. It was a language without light or temperature, without expressiveness or fervor of spirit—a sad language that advanced by blind probing….
The second humbling quality of vulgar Latin is precisely its homogeneity. Linguists, who are perhaps (after aviators) the brand of person least disposed to take fright at anything, appear not to be staggered before the fact that the same tongue was being spoken in countries as disparate as Carthage and Gaul, Tingitana and Dalmatia, Spain and Romania. I, on the other hand, who am so timid that I tremble when I see a breeze rustle in the reeds, cannot repress a profound mental terror before this condition. It strikes me as simply atrocious. The truth is that I try to represent to myself how things must have been within this phenomenon which appears to us from without as homogeneity, and I have managed to uncover the living reality of which this fact is the silent footprint. Let us grant, of course, that there were Africanisms, Hispanisms, Gallicisms. Yet to bring this much to light is to be left with the observation that the language’s trunk was common, identical—in spite of the distances involved, the rarity of commerce, the difficulty of communication, and the absence of any literature to forge connections. How could the Celtiberian and the Belgian, the resident of Hippo and that of Lutetia, the Mauretanian and the Dacian, come to such an accord except by virtue of a general leveling of outlook, reducing their existence to its foundation and nullifying their lives?5
Of course, one is tempted in the wake of scholars like Ong, Eric Havelock, and Albert Lord to recast Ortega y Gasset’s contrast of “baby” Latin with the Latin of the educated as a highly literate man’s embedded prejudice against the oral. I think that would be a mistake. The Gauls and the Dacians already possessed their own ancestral tongues when the Romans arrived: the contrast Ortega y Gasset gropes after must not, therefore, be that between a tribal atavism and an individual-based progressivism. No doubt, the literate onlooker does indeed tend to find the tribesman’s formulas and rituals oppressive when contemplating them from the outside. The suffocation of ideas occurring here, however, is happening at least in part because literate, sophisticated Rome had severed native cultures from their traditions and grafted in their place a sameness at once too narrow to be literate and too shallow to be traditional. These lands had been “colonized”, to use one of today’s hot-button political terms. The imported and transplanted Latin language both limited their horizons and bent all roads to a single city beyond the horizon.
One might say that the victims of such a cultural regimen tend to turn stupid: this is, indeed, precisely what Ortega y Gasset writes just before the section cited above (“los hombres se han vuelto estúpidos”). Colonization is stupefying. The oral-traditional mind knows how to think quite well, though it admittedly thinks circuitously by association and with frequent reference to the tangible rather than, like the literate mind, with linear logic serving abstract principle. In a significant sense, these people—the homogenized, the colonized—do not think well at any level or by any measure.
It is my thesis here that we are the colony of our technology in much the same way as if a foreign power had forced all our communications into a grammar and diction largely new to us and not handled well. We still have minds—some of us have exceptional minds; but how are even the brightest of us to think with such a poor vocabulary at our disposal and a mere wreckage of rules governing intricate logical relationships? Marshall McLuhan celebrated the effects of television upon our psyche (in his youthful days, at any rate) as a liberation from literate habits of thought to something more vibrantly primitive: of him, more later. Professor Bertonneau, frankly, is closer to my own view in regarding this shift as impoverishment—“leveling” (achatamiento), in Ortega y Gasset’s very apt word. He sees the decline of our students’ expressive skills as a mere relapse into orality, wherein I would part company with him; yet elements of the case he builds are quite compelling:
Student “orality” betrays itself in a number of guises: a childish prose, full of technical defects, depending heavily on the transcription of oral formulas based on the first and second persons, as though what one writes were merely a graphic version of what one says; a reliance on paratactic utterances pointing to a concomitant unfamiliarity with hypotactic procedures; a nearly perfect lack of analysis, either grammatical or logical; only the dimmest notion of causality; a lack of even the narrowest repertory of allusions and references, such as to an historical chronology or to a scientific or belletristic knowledge; a crude rhetoric of ego-assertion and resentment; a subjectivity that exploits a ready-made vocabulary of simplistic relativism, often expressing itself in a sweepingly judgmental condemnation of judgments; and a penchant for emotional posturing and what might be called affective argument (except that it is not really argument). Contemporary student writing is, as Ong might put it, close to the human lifeworld, agonistic, egotistical, aggregative, and formulaic.6
To be sure, some of the tendencies I noted in my students seem perfect illustrations of the oral mindset. For instance, many have the greatest difficulty phrasing any kind of subordinate proposition—they cannot handle logic, as Bertonneau asserts. A phrase like “in which” will be tossed in the direction of a hypotactic link, producing some cousin of the following freak: “A central government, in which we all must elect and respect its decisions, is necessary for a country’s survival.” A more plain-spun, and probably more frequent, example is the “that if” style of butchery: “He’s somebody that, if you make him mad, heads are sure to roll.” The pleonasm in “that… him” smacks of the guileless oral afterthought—the scrambling around in mid-sentence to locate one’s intent precisely. Ancient Greek scholars sometimes speak of the “lilies of the valley” construction, where “how they grow” comes trundling up from behind like a caboose without brakes to pose an alternative object for “consider”. The Greek Gospels teem with such examples: “I know thee, who thou art,” and so on. In the Hellenic tradition, pre-Socratic literature abounds in proleptic creations like, “I saw him, how he ran.”
The contemporary student’s general ineptitude with prepositions is probably also best understood as a retreat from established convention—of which he or she knows virtually nothing—to that overt, tangible world of the oral mentality. For instance, criticism of a person or idea may now emerge more graphically as toward or around that person or idea. Prepositions in Latin were followed by either the accusative or the ablative case, depending on whether their objects had originally been viewed as receiving action or, for the ablative, transmitting it or simply being its indifferent locus; so we know that at some early stage in the language’s development, prepositions were usually conceived of as designating external, observable relationships in space. In Homer’s Greek, this is even more obvious—for prepositions remain free-floating adverbs in the two great epics of Greece, and they may even be repeated in the prefix of a verb. One can imagine the bard’s hands gesturing, just as a student may gesture when reading aloud a line such as, “Their sympathy toward her was more because of animosity from the community.”
What earnest composition instructor has not plucked out a few hairs over commas used to mimic oral emphasis? “I refused to go, and, I told him the reason why.” As much as these young people love to fiddle with fonts, they can only be slighting the “italic” function in favor of the comma because their voice pauses after “and” in speaking the sentence. My particular bête noire in this category is the use of “so” as an intensifier—i.e., synonymously with “very” or “quite” (“Capital punishment is so unfair!”). I have given up trying to impress upon my captive audiences that this tiny word must be followed closely by a result clause. In the spoken language, however, such abuse of the word has a long history. Furthermore, in other cultures with an active oral component, similar words are similarly abused. I think of how the Scots sometimes intensify adjectives with, of all things, “that”: “I was that surprised, so I was!”
Most readers will not be surprised at all, however, to hear that “so” is now intensifying entire clauses in student parlance: “I was so ‘this can’t be happening’!” What rough beast is slouching into mainstream English behind this monstrosity? With apologies to Dr. Bertonneau, it doesn’t resemble any creature from orally transmitted myth or saga. In fact, when I first began to notice the usage, it was always very plainly humorous—almost ostentatiously located in certain television advertisements, as if to say, “See how attuned to teens we are? See how outrageously we have twisted the language?” No doubt, adolescents mercifully far from my hearing had been intensifying their clauses long before TV marketers grabbed the trend by the coattails; but even today—this very day—as I overhear students jabbering behind laptops and into cell phones, I fancy that I detect an ambition to be somewhat “cute” in concocting the most unwieldy, unlikely phrase possible to follow the racked and wretched “so”. Nothing remotely analogous from any oral tradition occurs to me. I believe, rather, that the clause following the “so” is felt and delivered as a dramatic line, with the intensifier’s functioning as opening and closing quotation marks—or, better yet, as a screen. I believe that young people more and more view themselves from the outside as players or actors being perceived in a certain part, a certain mode. For that matter, the “so” phenomenon just described is of exactly the same order, in my opinion, as the yet more widely observed “like” phenomenon: “I’m like, ‘This can’t be happening,’ and he’s like, ‘Don’t blow it all now,’ and I’m like, ‘What’ll I do if they ask, like, where my mother lives?’” How on earth is one to understand this push-button evocation of another register with the “like” key except as—once again—a series of quotation marks, of lines delivered out of an impromptu drama?
Let me be clear. Oral communities, I concede, are shame- rather than guilt-driven. That is, they do indeed have a very strong sense of reputation—they are intensely preoccupied by what the neighbors may think. Yet the “act”, if such it be, that they put on for others and themselves is very narrowly indexed to precedent, and to very ancient precedent, in most cases. The behavior they accept and strive to imitate has satisfied a test of time before the review of several generations. We know that abominations like human sacrifice can occasionally pass such muster, so the review is not morally flawless. I say only that it exists. Our young people, on the other hand, increasingly have the mentality of actors in search of a script. Their lines might express just about any view at all: like true thespians, they throw themselves with equal dedication into the part of hero or of villain. They possess the “being watched” mentality of pre-literate tribesmen without being bound by the tribe’s anchoring taboos. And perhaps most consequentially of all, they have, instead of the tribe’s rich legacy of proverbs and fables, a superabundance of gestures—not Homeric accompaniments to a verse sung emphatically, but gestures panicking in a verbal near-void.
For it strikes me that one of the critical characteristics of the “framing” or “screening” use of “so” and “like” is that it elicits a kind of miming. The hands flap, the eyes roll, the head tilts… the verbal message’s sender, as he or she grows ever less verbal, supplements lost meaning with visual display. In the last decade or so, the wearing of rings and tattoos has come to be described as “self-expression”. Students will tease and torture their hair into the most unlikely positions in order to “make a statement”, often adding two or three highly artificial colors by way of exclamation point. Clothes threaten to fall away from forbidden regions—and clearly not (one often realizes upon getting to know the person in question) as a sign of eagerness to have sex. The visual display, rather, is an announcement that one is a sender—that one has something to say, probably something quite urgent which the miserable adolescent couldn’t begin to put into words.
Tribes, to be sure, have their paints, feathers, and tattoos. Yet to think of a Mohawk warrior’s hair as the cultural equivalent of a student’s selectively sheared mop is indeed rather too blunt. The true tribesman’s adornments have profound meaning for him of a highly referential sort. They declare his place in an immense line stretching back among ancestors, time out of mind. The student’s baubles and plumage, despite identifying him or her sometimes with a certain set about campus, have a very delicate currency. Much of their message’s mood is subjunctive—“if you know this and if you know that”; a broad publication of the nose-ring’s or tattoo’s social connotations would neutralize their delightful aura of secrecy. Hence the message’s Gnostic terms must constantly be renewed. The Mohawk, on the other hand, fully expects everyone in his territory to understand the meaning of his feathers. He might well consider himself mortally insulted if he found out otherwise.
So the real meaning of the student’s visual displays, whatever specific allusions they may carry, is in the direction of the unique. Our young people hungrily crave individualism, even in their cliques and gangs. I believe that this craving has mushroomed precisely as—and precisely because—their linguistic skills have dwindled to the point of confining them within Ortega y Gasset’s suffocating sameness. Because they cannot speak their way out of the anonymity to which they have been reduced, they seek to act, gesture, and flash their way out. The irony—and it is a very bitter irony, for it seals these young lives within a closed labyrinth—is that visual images are far less capable of poignant, particularized expression than well-made sentences. One picture is not worth a thousand words; or if it is so, this can only be because the picture is displayed in a community where people talk constantly, fluidly, and intelligently. A large-eyed child brooding on a doorstep could mean any one of a thousand things—but only to people who talk, read, and write about innocence, sin, despair, alienation, growing up, and the meaning of life generally. To the neighborhood drug-dealer, the child is just a potential customer or a potential witness: nothing more.
It seems to me that the intellectual content of student essays—that facet of them which genuinely concerns Dr. Bertonneau, me, and any other responsible teacher far more than grammar and orthography—is best understood in the light of this excessively visual mindset, and not as an expression of “oral thinking”. Exactly what, I would ask, is oral thinking? Is it synonymous with fuzzy thinking? It certainly prefers the specific to the general, and the general to the abstract; but is abstract thinking always less blurred by detail, more categorically valid? Does a politician hundreds of miles behind the front line assess a war’s risks better because he sees no mangled bodies being carried away? Is the pre-literate tribesman more egotistical? In some sense, no doubt. But isn’t it the literate, with his enhanced grasp of the self/other distinction, who is alone capable of true self-absorption? Achilles’ “selfishness” consists of being painfully aware of his diminished status within the community. He grows more self-aware as he ponders his situation in the privacy of his tent—an unintended meditative retreat which very nearly leaves him ready to accept worldly obscurity in return for a long life of internal rewards before Patroclus’s death draws him impulsively back into battle. Was Achilles, then, behaving more “orally” when he sprang up and denounced Agamemnon, more “literately” when he informed the king’s delegation that mere life outweighs wealth and fame, and then “orally” once again when he charged off to avenge his friend? But he certainly hadn’t been learning to read in his tent!
The useful and much-needed work which Milman Parry did in comparing Homer to Slavic performing bards founded a new scholarly field, and the ensuing labors of Lord, Ong, and others have greatly heightened our awareness of literacy’s full meaning. Yet this scholarly trend has also implied the existence of a dynamic strain between the spoken and the written which I now believe to have been often overstated. The truth is that the oral and the literate, in the normal course of events, support one another as much as they undermine one another. Achilles needed time, not literacy, to think more clearly; but one of the new habits which literacy has typically forced upon thought is a more reflective consumption of time. Ong’s ground-breaking book leaves the impression that literacy somehow refined the visual dimension of thought: its later chapters document gradual changes in the first printed books. A progression is observable, for instance, in how brief words were soon no longer hyphenated and how the most important words in titles were soon awarded the largest characters. Ong claims that such alterations appeared in tandem with the tendency to read silently rather than aloud.7 I find this association of ideas tenuous. Why strain after so involved an explanation when one may simply say that, as printers advanced in their craft, they reflected more on how to reinforce the text’s logical transitions with visual cues? The later printer had given the matter more consideration because he was later—he had benefited, like Achilles, from an extended period of reflection. Whether he and his loyal readership were now keeping their lips sealed as they gobbled up printed pages is surely quite irrelevant.
In fact, ancient authors wrote abundantly about the proper way to lay out an oration: literacy had evidently sharpened their sense of how to speak more coherently, more reasonably. Yet they were speakers, as well: Aristotle and Quintilian were teachers, Cicero and Tacitus lawyers and statesmen. Because they spoke, they were aware of argumentation as a means of influencing sensible people. Indeed, although Achilles does not learn to write in his tent, he delivers to Agamemnon’s embassy a much longer and more persuasive exposition of his view than he had managed in the heat of the epic’s opening book. Yes, the division of both Homeric tales into books was the work of Alexandrian librarians; but the divisions generally make sense, and we can hardly imagine that the same librarians added dozens or hundreds of lines to raise the seams dramatically. Literacy tends to draw out the logic latent in any concatenation of words; but the words, in their original form, are not pre- or anti-logical.
To find the contemporary student, then, unable to avoid tendentious, cliché-ridden palaver and surface-level self-contradiction in a 500-word essay is less a sign that he is devolving from an Aristotle into an Achilles than that he is an offended Achilles rather than a meditative one. Knee-jerk judgments determined entirely by emotion are characteristic, not of the oral tribesman, but of the roused oral tribesman (and, to a lesser extent, of the roused literate man of the world). The question, then, is not so much why our students don’t read more or write better, but why they seem to be so aroused all the time—so whimsical, so petulant, so sullen, so frivolous, so governed by superficial emotion. Consider some of the irreconcilable responses expressed in my classes by the same student as he or she hashed through questions of culture and value:
• The subject of one series of essays was whether a government should be expected at all times to abide by the same moral principles as the citizens under its control. (Viz., if it is wrong and legally punishable to break a contract, should it not also be wrong for national leaders to make covert deals in conflict with public policy?) A thunderous majority—perhaps 90%—of all my students insisted that government should enjoy discretionary powers never granted to private individuals (this is, after all, the first generation to have passed its entire adolescence in our post-9/11 phase). Many expressed such tolerance as a holy obligation: the language was especially striking in that I teach in the heart of the Bible Belt, and I should indeed conjecture that at least 60-70% of my students were reared in the framework of Protestant fundamentalism. “If you can’t trust your government, who[m] can you trust?” wrote one young woman; and another opined similarly, “People must have faith in their government.” Though the immediate object of this abject devotion would traditionally, for most of these young people, be God, the other “g” word had been penciled in without hesitation. The contradiction in making such obeisance to a worldly human power should perhaps be even more doctrinal than logical in their case—for the same two writers, and many others like them, would in a later essay rate devotion to God as their highest value.
• The Bible Belt is also home to much rugged individualism and “don’t tread on me” libertarianism (a fact which some would find paradoxical in itself). I should guess that, once again, approximately 90% of my students would agree that the government dishes out too many welfare checks, has too large a budget, and advances a social agenda too actively. Yet in the context of the essay about ceding government the power to ignore commonly accepted moral principles, such apologetics as the following were abundant: “The government was put into place to do for the people what the people can not [sic] do for themselves.” Note that this curiously un-Madisonian formulation vaguely implies that governments fall out of the sky like Rome’s Twelve Tables—another suggestion that, if not the incarnation of God himself, they are at least God’s right hand. My independent young people also manifested, in a great many cases, a fervent conviction that “ignorance is bliss” (to invoke what one called a “wise old saying”). “The public does not need to know the truth as long as their [the government’s] deception profits them [sic],” wrote one student (displaying the almost universal tendency to refer to abstract collective nouns as “they”—probably another of Professor Bertonneau’s indicators that this generation has to concretize the abstract in oral fashion). Another student explained the position with a Machiavellian twist: “Sometimes it seems that if the government held back some information about their [sic] more risky and complicated actions then it would eliminate some of the complaints and protest [sic] from citizens who do not understand the full extent situations.” Of course, the identity of the singular and plural forms of “protest” belongs to the “-ist” category of “ill-heard, ergo ill-spelled” blunder. But the point of greatest interest here is that the writer has coolly stated the rationale of public deception from the government’s perspective. 
We did, in fact, read excerpts from The Prince in preparing for the essay. Need I remark that outrage at Machiavelli’s thesis was on the paltry order of one student per class?
• In a closely related matter, we wrote upon the subject of torture—specifically, whether or not it might be justified in certain moments of extreme crisis. Again, students were almost nonplussed at my posing such a “no-brainer”. The consensus was that thumbscrews should be standard issue for every federal agent. I must add that about half of my sample grasped and accepted the concerns I tried to stress about the practical fallibility of torture (people often lie under duress) and the practical complexity of establishing guilt (terrorists frequently seek to take credit for whatever bomb or kidnapping is in the news). One writer was in such an emotional lather over the issue that she could not restrain herself within the bounds of, say, a suitcase bomb threatening Manhattan: she wanted the gate thrown wide open. “The families of murderers should be tortured if they know where they are and won’t say, for then they are just as bad as them [sic].” I could not shake this person from her extravagant thesis by observing that the law, in fact, does not require close family members to testify against each other, and that natural law is generally understood as approving a parent’s defense of his or her offspring to the bitter end under any circumstances. Yet this same young woman vocally shared her joy in her own toddler on many occasions—and, I might add, once triumphantly informed a general group of bystanders that her fiancé had beaten the rap for possession of illegal drugs and weapons on a technicality. Perhaps this level of self-contradiction was sui generis: that it occurred at all in a college classroom was nonetheless shocking to me.
• I offer two final passages which are also unique, yet which, I fear, typify the kind of incoherence apt to undermine contemporary student writing. The subject was again torture. A particular student thought to plead its case by offering a very odd pedigree: “This concept [torture to extract vital information] is due to the way Christ was crucified on the cross for the good of the people, which explains why the people that believe in torture believe it is morally right in any circumstance.” The reader must accept my assurance that this writer considered himself a devout believer—the comment was most definitely not intended as an ironic indictment of Christian hypocrisy. I cannot explain exactly what was meant by the remark’s second half, but the gist appears to be that those who embrace Christ’s example endorse torture in principle since God incarnate glorified its practice! The second passage is only slightly less deranged, and continues to bother me to this moment because its author was such an exceptionally warm and jovial person. At issue here is the question, not of torture per se, but of the “trade-off” between a desperate deed and the anticipated profit to be achieved through it. “The terrorists weren’t thinking about all the innocents when they attacked the World Trade Center that killed nearly 3,000 people. So why should America consider any of their innocent people? I think that if America has access to nuclear missiles, then we should use some.” Of course, this comment fails even to address the “trade-off” issue, for it urges reprisal rather than preemption. The student, then, has neither answered the question under discussion nor, more broadly and disturbingly, recognized the distinction between a state-sponsored aggression and a terrorist cell’s machinations. 
Perhaps she felt that the Palestinians dancing in the streets after the World Trade Towers fell deserved punishment for their display—though I doubt that she could have identified just which nation she wanted bombed.
Fuzzy thinking? Incontestably. Literate thinking, informed by careful perusal of relevant texts and quiet, protracted reflection over quill and parchment? Not even close. Yet ought we therefore to call such muddled, impulsive archery at the moon the oral style by default? I don’t see why. One particularly salient aspect of my students’ clumsiness and uncertainty with the issues was their poor verbal performance during class discussions. Those few who could reason well in fluid speech wrote superior papers; those from whose papers I gleaned the examples offered above either had nothing to say in class or rambled on with maddening discontinuity (a single case: the student who wanted to torture the mothers of fugitive murderers). In short, poor argumentation is the natural accomplice, not of a tendency to chattiness, but of muteness. We can all readily cite examples, to be sure, of excellent writers whose voice we never heard throughout their semester-length presence in our lives. I was such a one myself as a youth: perhaps many of us professional educators were—perhaps this is why we find the writing/speaking opposition so credible. I do not contend, however, that silence is hostile to intelligent composition. On the contrary, I contend that the genuine loss for words—not shyness before a crowd, but a verbal vacuity which can no more animate pen than tongue—has sabotaged productive thinking in our time. The quiet student incubating brilliant ideas gazes from the back of the room with bursting eyes: the clown who doesn’t want his car searched without a warrant but is ready to give carte blanche to official eavesdropping fidgets, sighs, flips open the laptop, and text-messages. If you have taught, you know how to tell one from the other.
Text-messaging… we’re back to the cell phone. No new technology better illustrates the degenerate state of our language in the terms used by Ortega y Gasset of vulgar Latin. The owner of this modern marvel both speaks and writes—but in what mutilated phrases, in what inane and impersonal clichés! Various studies have apparently concluded that significant numbers of cell-phone users are in fact talking to themselves when their prattle intrudes upon our peace. The cell phone, then—especially among young people, where the measure of phony phoning can far exceed one-half—is yet another stage prop. It announces to the world that one is popular, important, sought-after. It turns out in such cases really to have nothing to do either with speech or with writing. Like the nose-ring and the tattoo, its primary mode of expression is visual. It exists to be seen.
I might credit Marshall McLuhan (another member of that distinguished University of Toronto cohort including Eric Havelock and Harold Innis) with recognizing that television’s emphasis on the visual would alter Western consciousness. Certainly the insight expressed in the comments below is both profound and inarguable:
If technology is introduced either from within or from without a culture, and if it gives new stress or ascendancy to one or another of our senses, the ratio among all of our senses is altered. We no longer feel the same, nor do our eyes and ears and other senses remain the same. The interplay among our senses is perpetual save in conditions of anesthesia. But any sense when stepped up to high intensity can act as an anesthetic for other senses.8
Nevertheless, I must question to what degree McLuhan understood that literacy occupied the invidious end of the contrast rather than radio; and, in any case, his description—shortly preceding the above citation—of the renaissance to be expected as mankind was sensually drawn back into the immediate world comes across nowadays as poignantly, even tragically naïve. We are assured that “our electric technology has consequences for our most ordinary perceptions and habits of action which are quickly recreating in us the mental processes of the most primitive men.”9 If this sounds like a warning, it isn’t. On the contrary, McLuhan romantically foresees a mind-opening return to orality as electronic media bring Western culture full circle from the radically unstable “literate man… a split man, a schizophrenic”.10
The medium most certainly proved to be the message, though—and the late twentieth century’s dominant media most certainly turned out to emphasize the visual. What we must now soberly accept (on the assumption, of course, that knowing the truth may yet help us) is that sight, far more than hearing, antagonizes literate thought. As McLuhan observed with great genius, the way we think is deeply influenced by the media of communication favored at our point in history. What Ong would call the “psychodynamics” of sight are radically different from those involved in hearing. The following distinctions are among the most obvious:
• Vision enjoys a seamless continuity: there is always something to see as long as the eyes are open. The same is not true of hearing. Though we forget it sometimes in our highly artificial world’s cacophonous racket, such a thing as silence does exist. From time to time, one can hear nothing at all. The blinking of the eyes, while reflexive and steady, is not perceived as a lacuna in the same way as a 3 a.m. silence. The correct analogy would be between blinking and stopping up the ears; but even in this case, the blink would have to be unnaturally long (i.e., deliberately sustained)—and why, in the normal course of events, would anyone go about stopping up his ears at intervals? Neither interruption belongs to ordinary experience.
• Because of their “blank” or neutral background of silence, sounds are discrete. They have number and duration. Visual images, in contrast, may be conceived of as a single image, or else divided arbitrarily to the perceiver’s taste. Is a horizon just a horizon, or should rooftops be distinguished from the sky? Should chimneys and eaves be separated from roofs? Is the sky more blue in one quadrant than another? The answers to such questions do not impose themselves with the same objectivity as they would if one asked, “Did the glass shatter before the dog barked? Was there a woman’s scream as well as a bark?” They say Eskimos have several different words for “snow”, each of which introduces a minor distinction. Has any culture ever lumped into one word for “storm sounds” the rattle of raindrops and the roar of thunder?
• Sound involves the dimension of time more than sight in consequence of its being, in reality, a series of discrete sounds. Sight is far less linked to time: its sequences are often as patently arbitrary as the choice to survey a scene from left to right instead of right to left. Even when objects are seen to move, they often acquire a kind of peaceful fixity if perceived without sound (e.g., a distant train’s progress across a plain) or if the attendant sounds reinforce a visual repetition (e.g., waves rolling onto the shore or a field of wheat rippling in the breeze). One can actually lose track of time in such settings. A single discrete sound, however, may bring one sharply out of one’s reverie: a shout, an explosion, etc. A sight may do this, too: a running figure, a falling object, a sudden change of lighting. Yet when sights change, one examines the external world in order to read the meaning of the change; when sounds change, one tends to turn within for explanations. (E.g., the running figure is finally seen to be pursuing a dog, whereas an abrupt screech of tires is imagined—in the absence of visual data—to be a car avoiding a collision.) Noises, then, create an internal awareness of time in a way that sights do not. Paradoxically, the repetitive noise of seashore waves or of rippling wheat is sound’s closest approach to the mesmeric “timeless” effect precisely because the succession proves to be “false”: i.e., it introduces no distinctly new aural event or inner suspicion of new causes at work.
• Because of their firm implication in time, the recollection of sounds is far more complicated than that of sights. Sounds have an order, and the true recollection of them often depends upon restoring that order: the laughter at a party must almost necessarily be merged with the hum of voices and the clink of glasses when recalled. Such memories have a certain cadence, a certain pattern of preceding sounds and overlapping sounds. A particular image, however, can be summoned to mind both in isolation from its surroundings and with considerable vagueness in its component parts. One can somewhat remember a face or a house without visualizing very many details. Further thought may flesh out the memory: for a sound, the experience either returns largely intact at once, or not at all with any degree of effort. Of course, the art of sounds—music—requires a specific amount of time in which to represent itself and gives little pleasure in fragmentary or out-of-sequence form. A painting requires no certain amount of time to be perceived, and one may find joy in its memory without being able to recall significant parts of it.
Now let us sit in Professor McLuhan’s chair and speculate about a culture which privileges visual over aural communication. How would the seer’s worldview tend to differ from the hearer’s? Homo spectans would be more inclined to accept perceived divisions in things as subjectively imposed rather than objectively grounded. He would be more apt to attribute disagreements to differing “points of view”. He would be less sensitive to the passage of time, less convinced that choices produce consequences, less nervous about the “ripple” effects of a given event; or, to express the equation in his favor, he would be more intent upon the present, more confident that the social whole would absorb superficial oddities in its members’ conduct, more willing to take a bold step himself without brooding over its direction. He would want his days, hours, and minutes filled with amusement and pageantry: silence and stillness would strike him as intensely uninteresting. Lest we imagine that he would sit rapt before the wash of rollers on the seashore, recall that this experience owes its hypnotism as much to sound as to sight (the same scene might well hold a diminished degree of fascination for a deaf person). Precisely because visual time has a special elasticity, a forgivingness, those who dwell in it don’t care to see time stand still. Homo audiens—the introvert, the reader—is he who would be most soothed and enthralled by hypnotic prospects of repetition, for his heightened awareness of time, cause, and personal responsibility would make a temporary lifting of that burden delightful. Homo spectans needs to be swimming in the waves, rolling in the wheat. He would be on the go, mingling hungrily with his environment. 
His sense of self amid all this florid activity, almost certainly, would lag far behind that of the listener; for a man can hear his own voice at any time, but he cannot truly see himself even in a mirror—and he can form complete words and sentences in his mind, yet only grasp at metamorphic visual recollections as Menelaus did at the ever-changing Proteus.
That many of our young people belong to this new human species should scarcely need further demonstration for anyone who has moved daily among them. They struggle in their judgments to reify parts of the world and to apply abstract distinctions. Things blur together. A sense of outrage at the police for arresting one’s friend readily morphs into a willingness to have federal agents reading all our e-mail. Capital punishment is barbarous, but torturing a terrorist is a nice little bit of payback, with or without divulged secrets. No one but a racist could possibly approve the enforcing of our national borders, but we ought to “nuke” those people who wear “head rags”. The same young people cannot seem to have assignments read before class or to submit papers on time: they cannot even manage to drop a class which they have never attended before the publicly announced and clarioned deadline. Social commentators contradictorily assert of them that they are highly cliquish and fiercely individualistic—and the contradiction is true. They are forever fusing with their surroundings, doing as Romans do when in Rome, primping themselves to look “cool” or “fly”… yet, in their own minds, these efforts are as much a declaration of independence as a recitation of the fraternity’s sacred oath. For they do not really know where the group ends and they begin: their mimicry of behaviors and repetition of clichés constantly leaves them frustrated, since they had thought at some critical point that the group was their elusive self.
That the habits of “seeing man” are notably less compatible with literacy than those of “listening man” should also need little demonstration. The writer refines the speaker, providing a preview of his arguments, slowing down their construction, shoring up deficient logical connections, highlighting the major issues, balancing the time spent on each stage of the case, and minimizing ad hominem attacks that undermine the guiding principle’s authority. Writers ponder rather than respond to spontaneous cues: they grope after a transcending good rather than work a particular crowd. To this extent, they are indeed unlike speakers. Yet the relationship has historically been more symbiotic than adversarial. The speaker reminds the writer that prose should be vivid, that cadenced phrasing should accompany balanced argument, that a sharp example drawn from familiar circumstances need not strain against universal truth—that, in fact, human beings are universally awash in the particular. While the writer may better succeed at grasping and defining the confraternity which mystically binds us through and above our circumstances, the speaker shakes more hands. Speaking is that public projection of philosophizing which neutralizes the profound moralist’s tendency to misanthropy. Is it any wonder that Confucius, Socrates, and Epictetus (not to mention Jesus) let someone else do their scribbling?
If the oral and the literate are complementary opposites, the visual is the Mr. Hyde incredibly, perversely sired by Dr. Jekyll’s brilliance. The irony is starkly material: the electronic technology which has saturated us in images could not possibly exist without that jewel in literacy’s crown, science. Literacy (greatly helped by the printing press) allowed investigators to record observations in vast number and with precision, then to circulate them among the curious. The collaboration of an international community of scholars pledged to dissecting the physical environment with utter objectivity gained momentum. External reality could soon be manipulated as no ancient philosopher would ever have dreamed possible. The creation of highly artificial habitats where the flow of vital necessities was perfectly monitored and measured, the conquest of terrifying diseases and of seemingly irresistible degenerative processes, the removal of human bodies from the front lines of wars (and the unintended transformation of civilian centers into targets)… everything changed in a few short centuries. If these radical changes could be viewed as moving along a certain spiritual vector rather than simply turning existence inside-out, the steady breeze inclining all of them where it listeth was remove. Distance: the physical sprawl of settlements, the remote operation of war engines, insulation from biological contagion, isolation from thronging masses, liberation from subjective prejudice… the clinician’s white lab coat, the new subdivision’s automatic security gates, the PC-user’s “firewalls” and “virus shields”, the busman-on-holiday’s remote-control stick.
I suppose it is natural enough to misread this vector as paralleling that of increasing literacy. The reader has always retreated to a secret inner place since the day when he learned how to keep his lips still: the writer has done so since the day when he ceased being a glorified copyist. Literate introversion is also a kind of remove. Yet it is an oral remove—a withdrawal from gushing words to a place where they trickle more leisurely, more inventively. The scientist has long suffered cruel stereotyping as a verbally challenged automaton—a robot with an ego. The caricature is not entirely undeserved. Scientists, alas, are wedded to images. They are not free to write fantasies, utopias, or fugues: every word they pen must describe perceptible phenomena—visible phenomena; for even seismic tremors and black pulsars in far galaxies are represented visually by needles bouncing along tapes. Words are an inconvenient middle stage to scientific truth. The ultimate triumph rests always in perfectly describing perceived reality (pure science) or in altering that reality in perfectly predictable fashion (applied science). Of course, the middle stage is indispensable: the scientist must be able to explain or describe, which requires words. His “skill set” is consequently very rare, and probably—like Dr. Jekyll’s—somewhat schizophrenic (as McLuhan blandly alleged of all literates); for, while the orator’s logic moves from the seen to the unseen, the scientist’s must always take a third step back to the seen. The inner life is never his destination.
It is very likely needless—maybe even needlessly provocative—for me to take so much space in extending the pedigree of modern counter-literacy (for the post-literate is aggressively counter-literate) all the way back to the scientific revolution. All I am compelled to show here is how our youth came to have their mental energies shanghaied by the visual. The connection, as I have said, should be almost self-evident: first movies, then television, then the exponential multiplication of televised fare and TV sets per household, then the VCR, then video arcades, then the Internet, then video games on DVD, now cell phones with cameras. Except for the motion picture, all of these eye-catching amusements have blossomed during my lifetime, most of them within the past twenty years. I do not understand how any reasonable person could still doubt that the consciousness of our young people is being drawn outward in an alarming manner. Silence and solitude are fearful prospects to most of them. The word “boring” has been on the tip of student tongues for at least three decades, though perhaps less often now than ten years ago, because so many students bring electronic diversions to class with them. Something must be “happening” all the time in their seamless, extroverted, visual approach to life. I have already assessed the cell phone as a visual signifier—a stage prop; but on those occasions when a real call actually goes through, it is also—naturally—providing instant diversion. Indeed, the aural experience of our young people has generally assumed the same fluid, available-on-demand character as is implicit in the visual world. Many of them are rigged with headphones when not chattering on a cell phone, and more than a few students tell me that they never compose papers on the computer without piping in music from its speakers. Such “music” is unlikely to be Gershwin, let alone Ravel or Brahms. The most popular genres truly have very little musical quality at all.
Rap and hip-hop are ostentatiously monotonous: the latter may feature a few faintly melodious bars played over and over in the background. Why the relentless repetition? I suspect the “waves along the shoreline” effect. Sound is least indexed to linear temporal change—i.e., most like sight—in these circumstances. The score isn’t really going anywhere: time isn’t really advancing. Voices and instruments do not pause: instants of silence do not punctuate. Everything is here and now. Don’t forage through your memory, and don’t feel your way into an imagined future. Just open your eyes.
The scientific mentality strikes me as worth mentioning in this context only because it provides such a clear explanation, not just for how our electronic amusements were born, but also—and even more helpfully—for why they were born. Progressive remove from the vital circumstances of one’s environment is hard to bear. It is nothing less than solitary confinement. The literate author retreated within himself to find humanity’s soul: the literate scientist retreated from specific empirical cases to find universal laws for those cases. The empty inner space of his reflections (empty, that is, of interfering empirical stimuli) supplied the neutral zone, the sterilized incubator, in which he could best make the transition from the particular to the general level of what he had seen. Such formidable minds are not excused from a horror of the void; the technicians who developed radio, then television, computers, and the rest, were serious men and women addressing the chasm in the human psyche, not giddy adolescents. Yet they were not philosophers, either. If few of them were suffering through a teenager’s identity crisis, they nevertheless—as a group—did not know who they were. Even Einstein (who passed much of his youth as an exile from scientific respectability) grossly overestimated human nature in predicting the future of nuclear power.
So the human condition, as an acute pathology, was treated with the “plug-in drug” (to borrow Marie Winn’s felicitous phrase). The scientific worldview, one may say without much fear of contradiction, embraces the artificial. Progress is always possible, and its mechanism lies not in better understanding the necessary limits of our mortal existence, but in exploiting nature’s laws to create unnatural solutions. That human alienation and emptiness should be addressed by creating alternate realities—visual realities—which anesthetize the soul to its pain would be fully in keeping with the scientific program. It would never occur to the committed scientist that the very techniques which allowed him to cure physical ailments would conspire to produce an anguishing psychological malaise, much less that the cure for this malaise du siècle would sabotage the future of science.
Yet so it has come to pass. The children and grandchildren of scientific innovators are so effectively turned outward into the visible environment that the mediation of words now sits beyond their reach. The present generation is arguably less scientific than Homer’s: deprive them of their inheritance of gadgetry, and they would be less capable of inventing the saw or the pulley than Odysseus. Ortega y Gasset insisted on the point. “The common man,” he writes, “upon encountering this technical and socially fine-crafted world, believes that Nature has produced it, and never spares a thought for the benign efforts of excellent individuals implicated in its creation. Even less would he acknowledge the idea that all of these conveniences flow in serial fashion from certain hard-won human virtues, the least failure of which would rapidly bring down the whole magnificent construction.”11 Our students can read, but not very well: information is as close at hand as Wikipedia, and vibrant entertainment as easy as inserting a plastic disc. They occasionally pass math courses, yet think more like marketers than mathematicians: a spreadsheet of generalized responses interests them infinitely more than the study’s definition of a discrete unit. When forced to enunciate a value judgment, they patch together some such circular folderol as, “You have to be happy with your self [sic] and be confident to get anywhere in this world”—as if happiness and confidence instrumentally preceded “getting somewhere”, which in turn would pay off the previous bluff by generating real wealth and pride. They neither write enough nor speak enough; by comparison, they watch far too much. One series of studies lately discovered that the student’s imaginative awareness of his visible appearance affects how well he thinks in the most quantifiable manner we know of: the IQ test. 
Students who took half the test, exited for a break, and then returned wearing a baseball cap performed distinctly less well on the second half. If the cap were rotated to the side or the back, performance was yet worse, by a considerable margin.12
Toward the end of her enchanting Se il sole muore (If the Sun Dies), the late Oriana Fallaci describes an unforgettable scene with the pioneer astronauts who “adopted” her as a sister (probably because her ingenuous interviewing succeeded in unlocking the humanity beneath their military/technician exterior). The group is sitting around a motel pool near Cape Kennedy. Pete Conrad solicits suggestions from his mates about how to begin a speech which he must soon present in Philadelphia. The response is a general turning of backs, until Frank Borman inexplicably rises with the utmost gravity and begins to deliver Marc Antony’s funeral oration from Julius Caesar. After two dozen lines, Borman volleys the recitation to Theodore Freeman (destined to die within twenty-four hours), who instantly adds the next few lines before the baton is passed to Dick Gordon, then Tom Stafford… and thus through the whole leisurely assembly until Shakespeare’s last line is honored. One must wonder how many doctoral candidates in Elizabethan literature today could recite six lines from the same speech with an hour’s warning to rack their brains.
Yet Fallaci’s book is nothing so much as a nervous meditation on the American faith in progress. In an important sense, the Moon loomed large to us in mid-century—to astronomers, engineers, and pilots no less than to film-producers and poets—because we all had some Shakespeare by heart. We applied logical analysis, but also intellectual curiosity and spiritual hunger, to what we saw about us. As far as I know, no public figure has spoken of any new missions to the Moon for years. Our electorate would rather spend tax dollars on sports arenas, and our private sector would rather dig deeper into the gold mine of electronic entertainment. It isn’t that we have gone our separate ways as a culture: we have all traveled the same road, instead—away from Shakespeare, away from the past, away from silence and solitude, and into “a general leveling of outlook”. On our present course—which is precisely no course at all, a renunciation of all possible culture by seeking to dwell in an eternal present—our grandchildren will be reduced to the state of a Cro-Magnon on a savanna. They will see the sunlight, see a friend, make a hand signal, see the friend’s bandaged arm, pat the friend’s shoulder… what other expressions of sentiment, what deeper levels of analysis, will be possible in a being that only sees? If the Moon should rise before this pair of friends and one of them should point at it, will either of them be able to vociferate the vaguest of longings? Will the sound resemble a man’s thinking more than a dog’s baying?
1 See p. 22 of Thomas F. Bertonneau, “Thinking Is Hard: How a Damaged Literacy Hinders Students From Coming to Grips With Ideas,” Praesidium 2.3 (Summer 2002): 5-22.
2 Cf. Ong’s classification of the oral mentality as “situational rather than abstract” in Orality and Literacy: The Technologizing of the Word (New York: Routledge, 1982), 49-57.
3 I frequently console my classes with the howling gaffe in the very title of a book published in 1993 by Eerdmans: Whatever [sic] Happened to Evangelical Theology? We can hardly expect our disciples to clear hurdles which we ourselves leave overturned.
4 Op. cit., 79.
5 My translation from La Rebelión de las Masas (Madrid: Alianza Editorial, 1990), 30-31.
6 Op. cit., 8. Bertonneau adds in a note appended to this passage that he is not unsympathetic with students as they pace their inexpressive prison: he recognizes them, rather, as the victims of an incompetent or insouciant education system which has allowed this unwholesome condition to worsen for decades.
7 Op. cit., 119-121.
8 H. Marshall McLuhan, The Gutenberg Galaxy (Toronto: U of Toronto P, 1967), 25.
9 Ibid., 30.
10 Ibid., 22.
11 Op. cit., 85. My translation.
12 See Michael Ackley, “IQ Study Caps Off Theory,” World Net Daily (www.wnd.com), April 2, 2007. The editorialist warns that his work has satiric ambitions, but the raw facts of this case appear to be reported without embellishment.