
The Center for Literate Values ~ Defending the Western tradition of responsible individualism, disciplined freedom, tasteful creativity, common sense, and faith in a supreme moral being.

P R A E S I D I U M

A Common-Sense Journal of Literary and Cultural Analysis

15.1 (Winter 2015)

 

Academe in Decay


 

The Great Lie: High Tech and Superior Education
John R. Harris

Though they style themselves daring questioners of the status quo, our teachers produce as much dreary, propagandistic pabulum as any other engineer of the contemporary West’s decadence. One sanctimonious piety, in particular—milled by a grinding Education Establishment that tirelessly reduces talent and creativity to dust—has now taken its place among our age’s most mind-numbing background noises: “Young students require early immersion in technology so that they may be prepared for the world of tomorrow.”  The modal “may”, granted, is probably not discernible in the wheel’s usual whines and rumbles: the subjunctive has suffered the postmodern academy’s death sentence for the unpardonable sin of irrelevance. This stray detail turns out to have the utmost significance for my discussion, and I shall return to the brutal verdict against vague modal registers in closing.  For now, I would observe that the cult of high tech in the classroom, which licenses so very much of what is being systematically done to our children by “professionals”, truly partakes of fanatical nonsense.  It rests upon several false premises, this quasi-religion; and each premise, furthermore, is terminal to the high claim of salvation. Each suffices, that is, to prove that our children desperately do not need immersion in high tech.  Yet even the most idiotic of remarks, echoed in bureaucratic corridors for years, seems able to acquire the force of an axiom. Cynical autocrats knew long before Joseph Goebbels that a lie repeated often enough becomes the truth.

I count at least five kinds of ruinous mendacity.

1) No variety of technology taught to a first grader will have preserved its essential elements by the time the child graduates from high school.  If a single major new technology has navigated the first decade and a half of the twenty-first century without radical transformation, I should like to know what it is.  From the “user” standpoint, then, the first-grader who is introduced to the mysteries of the current iPad is learning absolutely nothing with any meaningful application to the century’s looming second quarter.

2) Of course, from the perspective of research and development, a head-start of some kind could scarcely help but prove beneficial to a child’s future—but of what would this head-start consist?  Writing computer programs presumably demands sophisticated math skills, and punching buttons playfully hasn’t been shown to nourish these better than scrawling on paper with a pencil.  In the matter of innovation—of “thinking outside the box”—years of trotting down prearranged corridors like a cow navigating the slaughter-house’s chutes is bound to be downright suffocating to creativity.  A mind whose entire life is spent in an aircraft might imagine a new kind of seat or storage bin, but never a new kind of aircraft.

3) To return to the user’s perspective, the destiny of any mass-produced technology (such as those varieties marketed so lucratively to schools) is to grow more user-friendly.  Why would any child need to learn at age six an operation which, by his or her eighteenth year, is bound to have become as easy as batting an eyelash?  Thirty years ago, a child would need to know a series of commands to type in when booting up a computer.  What use is that knowledge today? I’m sure those entry codes were rinsed from my own memory at least two decades ago. I can’t begin to recall a single one—and I have yet to lament their complete loss.

4) And speaking of lucre, arming the next generation with a panoply of gadgets is prohibitively expensive, precisely because technology is advancing so rapidly that perpetual upgrades are required and the latest innovation is instantly obsolete.  This is growing more rather than less true, and at an exponential rate.  Though the concern here is utterly practical while those above are more conceptual, the hard fact of prohibitive expense is equally disastrous to the starry-eyed idealism of the high-tech progressive.

5) Finally, education-by-gadget doesn’t “prepare” our children for tomorrow: it programs them for a very specific kind of tomorrow.  It condemns them to envision only that tomorrow and no other; it so degrades their intellectual and spiritual eyesight that they can view only certain horizons.  A man who justified teaching his toddler to use every known variety of deadly weapon on the ground that he was preparing the child for tomorrow would likely be raising a dangerous paranoid.  Why would we suppose that toddlers forced to learn only from buttons and screens would be preparing for a humane, creative future?

Contemporary education is proving a massive failure, and it’s high time that we a) recognize the evidence of this fact and b) admit the key role of technology in generating the failure. Not only do more screens and more software programs not compose the thread that will lead us out of the fatal labyrinth: they are the labyrinth’s very brick and mortar. There is no doubt something quite natural about supposing that sophisticated innovations must improve our chances of survival. Omne ignotum pro magnifico est, observed Tacitus (through the persona of a Briton explaining why the Romans couldn’t keep their feet off other people’s turf): “everything unknown seems to promise something grand.” The mystique of the gadget is very nearly irresistible. In our heart of hearts, most of us realize that we don’t understand calculus or chemistry or anatomy all that well. The software program allows us to claim with a straight face before skeptics that we have remedied our information deficit—and our egotism persuades us that merely having discovered the program bestows some of its mastery upon us. Though we still don’t know squat about the subject, we are superior to other ignoramuses (whom we may even style ignorami) because we hold a key in our pocket and they do not. Like Ortega y Gasset’s hombre masa, we imagine ourselves to stand at the apex of human evolution simply because we possess technology. Understanding how it actually works isn’t required to establish our superior pedigree.

As the new wears off the twenty-first century, however, older educators like me entertain no doubt about the declining abilities of our students. Allow me to share in evidence a couple of paper assignments made to college undergraduates—and the results they elicited—just this past fall. One topic was created for the first half of an English literature survey (Beowulf through Goldsmith). We had “read in the margins” of the English tradition throughout the semester, perusing excerpts from Celtic sources, Boccaccio, Petrarch, Du Bellay, Voltaire, and so forth in an attempt better to understand how peripheral societies influenced the island and how English authors eventually chose to distinguish themselves from the continent. I asked that students take a stab at defining the peculiar qualities of English texts at the end of the term. They were given several weeks to produce an essay of about five pages. I sought to prime their minds by offering eight different—but, I think, more or less coherent—avenues of approach. Here’s the precise wording of the assignment:

1) English literature was a fusion of more numerous and disparate local elements than what one would find on the Continent. Germanic, Scandinavian, Celtic (Welsh, Scots, Irish), and Franco-Norman components all figured not only in literary works as influences, but also in the actual language which came to be what we call English.

2) Throughout our period of study, the awareness of differences in genre—often based upon differences in social class—appears less in English literature than in its continental counterparts. Little effort was made even after our survey period to view folklore as a distinct literary creation: it readily bled into the “higher” genres or else was largely ignored.

3) The imitation of continental “masters”, particularly the French and Italians, would characterize English literature from the days of the first romances through the heyday of the eighteenth century’s neoclassical and philosophical fiction.

4) In coming rather later to the recognized genres (romance both bourgeois and courtly, parody and satire, the sonnet) than the French and Italians, the English often copied these continental models with a certain self-consciousness and willingness to try something new.

5) Perhaps because of this identity crisis, English writers sometimes produced works of stunning novelty, especially in the Renaissance and thereafter.

6) Such novelty may also have been influenced by the English taste for the pragmatic—for that which makes plain common sense, which applies to everyday living. This taste was no doubt fed by England’s successes in developing and applying the new insights of science.

7) Much English literature, going back to the early Middle Ages, also broadcasts a strict adherence to the religious faith of the day. Such didacticism (or dedication to teaching lessons) might also be seen as a projection of the pragmatic spirit mentioned above, as if the ultimate purpose of faith were to teach one the “do’s” and “don’ts” of daily life.

8) Defining a sense of humor involves subjectivity—yet it certainly seems fair to say that English literature has generally not partaken of the risqué or wildly burlesque species of humor observable in Italian and French classics.

Above, I have purposely abstained from mentioning specific works and authors.  I would like you to fill in those blanks! But I should also be very happy to see you add further observations to mine. You are not to regard yourselves as in any way bound by them. My list is intended to give you some ideas—not to dictate the structure of your final paper.

This project proved a disaster. I bear much of the responsibility, in retrospect; for the teacher should know the abilities of his students, and I had obviously overestimated those of mine to a lamentable degree. Yet my point, in the present context, is none other than that I could once pose very broad parameters such as the above and reasonably expect to draw some thoughtful responses. No more. Almost half of the essays harvested a few stale historical observations from the Internet and added not a trace of speculative reflection. Probably a quarter picked out a couple of texts (say, Everyman and Doctor Faustus) and summarized their plots. Perhaps a fifth answered essay prompts given earlier in the semester for other assignments, entirely ignoring the stated question. Whatever percentage remained came somewhere close to producing the kind of discussion I wanted and had often modeled throughout the term: maybe one student in ten.

A possible commentary on these sad numbers would be to recommend that I rejoice in the ten percent—that teachers should never expect more than a happy few to manifest a gift for intelligent, open-ended, plausible speculation about the motives of human behavior. Fair enough. But the manner in which other shots flew wide of the mark must remain a grave concern. A huge majority of our students today, to put it bluntly, is either not listening to instructions or lacks whatever it takes to obey them. In times gone by (“back in the day”, as the current flock of parrots squawks—an expression perhaps five years old whose creator should burn in one of Inferno’s lower circles), students would moan, “I don’t know what you want!” Then I would explain that I just wanted them to think for themselves, and their iced-over imagination would begin to thaw. At least they had the presence of mind to register some cognitive dissonance—to realize that they should be insecure and seek further guidance. These students ask no questions, even when urged to do so. They do not submit rough drafts for a “no-cost estimate”, even though I make the offer. I don’t understand them. I can no longer reach them.

Am I really taking too long a stride to hypothesize that their playing on the iPads which we educators now require them, in many cases, to carry into class may have something to do with their inattention? That seems a thoroughly tame assumption; and, of course, the smart-phone which can also access PDF’s of assigned reading is even better suited to sustain “texting” exchanges and to conceal screens of Facebook posts. One student emailed me urgently on the formal exam day for our time slot that our classroom was empty, even though I had announced on Day One in August—and repeated often thereafter—that the final paper was a take-home exercise. Her weekly attendance had been perfect, by the way, when measured in bodily occupation of space; and at her age, I should have to suppose that her hearing was healthily keen. The disconnect was between ear and brain rather than between speech and ear. So too for the others. During the term, I had sometimes deliberately “given away” answers to questions before the daily quiz, only to find that perhaps two or three in a class of thirty-four had taken my hint.

These faces, I must add, were not always engrossed in tiny screens, by any means. At a random moment, they would appear visibly involved in my antics at the head of the room; so the “browsing/texting” theory cannot fully account for their out-of-body experiences in class. I have to conclude that something in the very nature of receiving information via “device” has compromised the quality of their reflection upon received items. I suspect that they may view the external world as a succession of distant images intended to amuse lightly, just as (or more or less as) dialing through a television’s channels transports one from game highlights to the forensic lab to a Geico commercial to Alaskan survivalists. It’s all a great blur. Hierarchies of importance disappear as “surfing” moves laterally—associationally—from one whisper or flicker to another. “The point of it all” is forever conditional and temporary: nothing stands out except in its very fragile, ephemeral context. No set of references exists outside the search-box.

And a profound, significant paradox presents itself here, I believe: that this brainwashed, programmed generation of constantly surfing relativists is also intensely, even pathologically “goal-driven”. If one can somehow keep them focused on a single screen for a brief time, they hurl themselves zealously into the task represented by that screen. If playing video games, they shoot down a maximum of bad guys or bore toward the finish line in their virtual Ferrari. If lifting barbells, they fill their ears with hip-hop “swag” that stimulates adrenaline. (Bagpipes used to do the same thing: the Duke of Wellington had them silenced at Waterloo lest the Highlanders charge the enemy too soon.) Naturally, if asked to write a paper, these same young people will cut right to the chase. King Lear, a prophet—a shaman—in his grand madness? An essay question discussing that possibility? Look up “shaman” on Wikipedia; mention Beowulf—for the professor claimed that he, too, was of shamanic provenance; note that the two display arrogance; and finally do a victory-lap declaring the mission accomplished. Not a word about the “other world” that Lear sees in his raving, not a word about the content of his raves that so explodes the hypocrisy upon which human communities delicately balance, not a word of fine analysis or value judgment. Two arrogant rulers that go too far and end badly. Surface to counter-surface: A to B. Job done. Go to next task.

This, an it please thee, is how we are preparing our children for the world of tomorrow. No wonder Ray Kurzweil is confident that robots will utterly absorb or replace us in that world!

One example further: here we have a final paper in a grammar class whose enrollees were mostly junior and senior English majors, several of them about to graduate. My desire, once again, was that a take-home essay should respond conjecturally to the evidence of an internal war over things grammatical in the academy—evidence accumulated over a semester’s worth of mining such texts as The Oxford Modern English Grammar (by Bas Aarts). Once again, as well, I published some of our more provocative findings on the Internet many weeks before the paper’s due date and invited students to map out a traditionalist or progressive position based upon their reaction to such specific cases. Below is the actual posting:

Guidelines for Position Paper on Formal vs. Popular Usage

The contrast at work here might be described as the formal or classical or academic understanding of grammar vs. the popular or progressive or demotic (from demos, “the masses”) understanding. The former approach would emphasize the history of the language and the original logic underlying its structures—factors obviously not very visible to the contemporary rank-and-file and hence requiring some degree of training to be fully appreciated. The latter approach would find a latent snobbery and a patent irrelevance in such an Ivory Tower position; it would argue, instead, that language is a living, evolving phenomenon rather than a skeleton in a museum, and that the grammar of language must therefore reflect whatever logic the masses who actively speak it have imposed—rightly or wrongly—upon it.

I would like to see you develop a somewhat abstract fluency in your grasp of these issues. By that I mean that you should be able to form intelligent generalities and combine them in ways that lead to a compelling conclusion. The combination itself, furthermore, is bound to reflect a range of value judgments. For instance, is it elitist to deny vast numbers of people a final authority over the rules of their own tongue—or, on the other hand, is it foolish and destructive to chase after ignorant trends while throwing overboard fine shades of meaning embedded in a language’s history?

To arbitrate such questions, of course, you need to begin with a review of specific evidence. In many ways, this paper is indeed very like the “analysis of dialect” paper: it starts with evidence-gathering and triage, and only later advances to forming general propositions and a synthetic conclusion.

In the spirit of setting the wheel in motion, then, I propose a few cases drawn from Aarts’ Oxford Modern English Grammar. You are encouraged and expected to find additional examples, whether from the Oxford, other texts, or your personal experience. In fact, you may choose to use none of the examples below. The immediate objective is just to get you thinking about circumstances where a certain grammatical phenomenon may be analyzed differently by the classicist and the progressive. As with a dialectical analysis, the more such examples you can find, the more material you will have from which to generate meaningful conclusions.

The Oxford’s rationale for abandoning the conventional modal term “subjunctive” is essentially that English retains almost no modal inflections (i.e., verbs are almost never spelled differently to indicate mood). The argument for preferring “if it were to rain” over “if it was to rain” seems to evaporate, from this perspective. Is that a good thing or a bad thing?

Why is “many” a “degree determinative” rather than simply an adjective in the positive degree? (3.3.2)

Why are the “determinatives” in general distinguished from adjectives?  (3.3.2-5)

Should “get” be considered passive in sentences like, “I got sent home”?  (3.6.3.4, ex. 137)

Is the most sensible way to analyze “out” in the sentence, “We might go out for a meal,” as an “intransitive preposition”?  (3.7.2, ex. 146)

Is the best way to analyze “aside” in the phrase, “college work aside”, as a “postposition”?  (3.7.4, ex. 151)

In a related matter, pairing of prepositions is often viewed by the classicist as poor usage. In cases like, “My comments are based off of personal experience,” and, “We loaded up with [or ‘on’] groceries before the storm hit,” the traditionalist editor’s pen would likely spring into action. Is there any good reason not to allow such pairs to multiply?

“There is nothing I want for you to say anyway”: is “for” a coordinating conjunction here, or is “for you” an indirect object specifying the sense of “there is nothing to say”?  (3.9, ex. 171)

Is it helpful to speak of verbs that typically take both a direct and an indirect object as “ditransitive” (e.g., “Give the letter to me”; see gray box on p. 97)? The classical approach is to reserve the word “transitive” only for a verb that takes a direct object; the indirect object would therefore not constitute a second transitive event.

The Oxford treats words such as “after, before, since, where, when, and while” as prepositions (5.5.1.3, exx. 225 ff.) rather than subordinating conjunctions. What do you think of this classification?

Should “whom” continue to be favored as the objective form of the relative pronoun, or avoided even when logically correct because it sounds “stuffy”?

Consider, “I’m going to go straight back to London” (5.5.2.1, ex. 253). A classicist would view “straight” as a “cognate accusative”, or the object of a normally intransitive verb (i.e., “go a straight way”)—which is really a fancy way of calling it an adverb. The Oxford shifts this word’s allegiance to the following preposition, claiming that we now have two prepositions of which the former is intransitive. Which explanation seems more justified to you?

The Oxford does acknowledge the distinctive formal character of the English imperative, though not as a mood (6.3). “Let’s” is offered as a special case that allows the speaker to include him- or herself in the command (e.g., “Let’s have a look at the list,” ex. 30). The conventional approach would be to treat “let” as a regular imperative that can implement first- and third-person commands (e.g., “Let them never forget this day!”).   How would you explain the rather convoluted English formulas for commands?

In the same general section, Aarts perceives the presence of a third-person imperative in constructions like, “Somebody find a doctor!” The more traditional approach would be to view “somebody” as a vocative (“Hey, somebody!”) and the verb as a second-person command. What do you think is happening in such expressions?

If the “that” clause in, “That it is late need not concern us,” is taken as subordinate, then what is the main clause’s subject? In classical grammar, the initial structure is called a noun clause and treated as a single noun. What are the advantages of treating it as subordination? (7.3.1.1)

Consider this sentence: “We forced them to ensure long-term stability.” The Oxford claims that verbs such as “force” (see Table 8.7) cannot be viewed as taking a direct object in the form of an infinitive. The classical approach is to treat the entire phrase following “forced” as that verb’s object UNLESS the infinitive shows purpose; the Oxford explanation seems to provide no logical ground for distinguishing between the purpose infinitive and the direct-object infinitive. Which approach is more useful?

Mood poses a challenge to almost everyone. Even among the classical languages, Latin had already collapsed the optative mood of Greek (used for wishes, prayers, etc.) into its subjunctive. How useful are the deontic, dynamic, and epistemic divisions of the Oxford (ch. 10)? For instance, in the sentences, “I may see her tomorrow,” and, “I will see her tomorrow,” is the contrast better expressed as between a present subjunctive and a future indicative, or as between the epistemic mood and the dynamic mood?

How should we analyze expressions like, “I used to kind of say you know please, please God get me out of this”? (10.3.11.4, ex. 220)

Professor Aarts has nothing explicit to say in defense of the hyphen and sometimes, in his explanations, avoids using one in circumstances where the classicist would insist on his doing so. Has the time come to heave the hyphen onto the bone pile?

Likewise, Aarts plays fast and loose in his commentaries with the standard rules for using commas around non-restrictive matter. Has the time come to cease insisting that restrictive and non-restrictive clauses be punctuated differently?

Is the age-old prohibition against beginning sentences with conjunctions at last ready to be put out to pasture?

I apologize for the length of this assignment’s description—and I readily forgive any reader who chose to bypass it after the first few hundred words. The extreme detail of the instructions, however, must be emphasized, and also the extravagant generosity of the examples provided. I was essentially doing most of the students’ digging for them. The result? Well over half of the submissions included no more than two or three instances of controversial grammatical rulings. A good half—generally the same papers—also lapsed into a strangely autobiographical mode, often telling me minute details about the author’s childhood and the cultural circumstances of his or her language acquisition. I did not particularly object to this tone in and of itself, since anecdotal evidence could certainly be relevant to the discussion; but the personal references tended far more often to justify a decision based on nostalgia or a blunt “you are what you are” kind of fatalism rather than to expose the strengths or weaknesses of legitimizing regional dialects. The best papers surpassed my generalized portrait above only by extending their review of forms to seven or eight anomalies. Not a single submission really analyzed the issue as I had wished. There was no categorizing of examples, say, into harmless idiomatic tendencies (“It is me,” instead of, “It is I”), anemic surrenders of logic to abuse (the abandonment of “whom”), and ignorant impatience with subtlety (the eradication of the subjunctive—of which I have promised to speak in my closing). The method was drawn from American Idol and Dancing with the Stars: “Now that number there… I didn’t like it so much.”

I was disappointed by the students in my English Lit Survey; by this group, I was nearly infuriated. Though older and also, presumably, wiser in the ways of the English language, the upper-classmen manifested the same inattention to directions. Though they had helped me to identify various instances of controversy throughout much of the semester, they reacted to my posting of the results as if it were a pop-up ad. Indeed, I think the speculation may be licensed that today’s young people automatically view an info-fragment on the Net as a kind of utility or convenience: something to be appropriated for a specific task if it supports an argument or otherwise “does the job”, but—that failing—a factoid as irrelevant and uninteresting as a plumber’s wrench to a roofer. Information is called up by a search for specific answers to specific questions. A citation becomes something you copy and paste once the high-tech machinery loads its bullet into your authorial gun: it is instantly and objectively chosen to drill the defined target’s center. You do not skeptically review, broadly categorize, sensibly contextualize, or in any other way analyze the data that appear on your screen. You use them. You fit them into slots. Where no ready fit exists, you let the useless datum lie in the bottom of the toolbox.

If my assessment is correct, today’s student views this inestimable treasure-trove of e-processed information, extolled to us now for at least three decades, with something approaching contempt—with a very casual indifference, perhaps. Electronic it surely is, but electric it is not: it fails to animate or energize. Its quiddity belongs to the nail and the paperclip. Software programs abound now which actually, and actively, select supporting matter for student essays from an array of databases. The human role grows increasingly passive. Precisely because the data-nuggets have been carefully sifted, cleansed, and refined by technology, the living mind at the end of the photonic conveyor belt grows ever less accustomed to the notion that it should have to do anything at all. Machines analyze: humans use the products of analysis.

This is the world of tomorrow for which we are preparing our children. This is the intellectual and spiritual mausoleum for whose dusty chambers we are embalming them.

The self-indulgent subjectivity I noted above suggests a final avenue of exploration. I have never opposed the use of the first person in scholarly writing. Sir Maurice Bowra knew how to employ it; so did Eric Havelock. Sir Kenneth Clark would occasionally allow an “I” into his incomparable documentary series about Western art and ideas, Civilisation. In such hands, the first person made the speaker more objective and credible. It did so by identifying points where he was straying from the realm of clear demonstration and preponderant testimony into the gray area of inkling and hunch. It showed him to be well aware of the boundary’s existence, and to be sensibly receptive to the humbling notion that not everything in this life is known, or even knowable.

My grammar students, as I have said, aimed their “I” in a very different direction. It was a permission-slip to gush with subjective reaction and to debouch upon personal reminiscence. Their other teachers, apparently, in raising a categorical prohibition against the first person’s use, had denied their charges a chance to understand its proper use.

Or was it as simple as that? Could there lurk here, as well, a subtle but toxic influence of the information-laden screen? Since the first days of cable television, at least, advanced technology has nourished a contradictory relationship between itself and its users. It has promised personalization, and later interactivity. The user would be able to choose just what he or she wanted, like ordering à la carte from a menu. The experience would confirm, and perhaps further define, the receiver’s individuality. The message was, “You are at the center of the universe. You command, and others obey. We, your abject automated servants, will even allow you to create alternate universes.”

At the same time, delivery was made upon this promise in a manner that inverted the promised results. Only by playing automation’s game could the user master the surrounding automatons. The hundreds of channels available through a satellite dish actually offered a shocking paucity of choices—or certainly did so in terms of format, taste, and projected values. The reality show, the game show, the sporting event, the newscast, the sitcom, the melodrama, the action series… one could virtually exhaust the available fare with a dozen rubrics. Likewise, the Internet, with its exponentially more generous promises, turns out to be straitjacketed into a very few formats (website, YouTube, Facebook), all of them trawled for keyword phrases of just a few characters. Entries must submit themselves to instant and rigorous pigeon-holing in order to have any chance of discovery. The viewer or user, in “commanding” all of these airy vapors to materialize or dissolve at the press of a button, scarcely enjoys the power of Archimago over an army of demons. On the contrary, he or she is constantly joining the Breaking Bad queue or the Grey’s Anatomy queue or the Amazon or Blogspot or YouTube queue or the friending and tweeting and posting queue. The throng is not visible—but it is palpable. The “choice” or “vote” that establishes one’s individuality is forever one of six boxes checked or one of two dozen options clicked. Television shows of rare originality died faster than ever after the dissolution of the Big Three network monopoly; and today, in the same spirit, Netflix unceremoniously tosses overboard any menu-item that fails to draw a sufficient mass of “viewer hits”. Websites such as literatefreedom.org—your current locus—are summarily passed over (so a seasoned technical professional has assured me) because they look tired and bland. Though no normal human can reflectively read tiny white lettering on a black background framed by pulsing graphics, everyone wants to stop at such a destination. It’s the latest thing.

My point? That our technology may have created a horrendously self-destructive beast—a creature entirely preoccupied with his personal whimsy, yet whose whims reflect the dull motions of a vast tribe rather than anything personal. We often accuse our young people of thinking only of themselves, yet the most alarming thing about them is their severely reduced sense of self. They deface their epidermis with tattoos and rings in a bid to forge superficial, quickly apprehended ties to some sullen or rowdy or alienated sub-mass. They consume intoxicants and hallucinogens in full knowledge that doing so reduces their ability to perceive, to reason, to create the very expressions that most plainly declare their personhood—and perhaps in high hopes that they will indeed annihilate that personhood. Their jealously prized freedom to be individual seems most often invoked in the service of neutralizing their individuality, as if its burden were a supreme torture.

Electronic technology cannot be solely responsible for this malheur du siècle, of course; but neither, I think, is its influence negligible. I shared with my grammar class a story about being thoroughly mortified before my dissertation advisor when, thanks to my Texas upbringing, I could not spell the word “helmet”. The moral of the story was supposed to be that, though my handicap was no fault of my own and in that sense unfair, it was nevertheless a flaw that I could and did correct—quickly and permanently. Customs and manners are always unfair. Some are always closer to the mainstream than others; yet a river’s waters must all move one way to reach the sea. End of anecdote. What I received in a majority of essays was (to borrow a tongue-in-cheek phrase from Flann O’Brien) a sad tale of the hard life. There was no substantial testimony in the typical ramble; most were immaterial identifications of the speaker with a certain victimized group. I hasten to add that virtually none of the essayists wanted to use a rooting in “sub-standard” cultural soil as an excuse for messy grammar. On the contrary, almost all were quite harsh on mild abuses (even though their own grammar was often unconsciously bruised and battered)… but that’s a curious psychological phenomenon for another day’s analysis.

They did not know what to do with the first person: they did not know how to use it so as to bring themselves before the judge’s bench and establish thereby a tone of impartiality, of self-critical skepticism. Once they allowed themselves into the discussion, they immediately became lost in some ethnos or other which partook of their “I”. “I” at once became “we”, “po’ redneck folk” who had somehow ended up in college and now wanted desperately not to be what they were. The battle could turn quite poignant… but it produced nothing approaching a logical dissection.

If I have still failed to convince you that high-tech education is implicated in such drifting identity, consider this further. The same students who are warned against first-person pronouns in other writing classes grind out—under instruction, and perhaps duress—no end of plangent personal narratives in Creative Writing class. As they sit through literature courses, furthermore, they are gorged on diaries and confessions and prison journals of oppressed wives and bartered daughters and colonized people of color, practically all composed in the first person and every single one representing (or so the professor insists) a major category—a tribe—of similar sufferers behind the “I” persona. In other words, the situation of the falsely empowered technology-user, adrift and bound hand-and-foot in a sea of Hobson’s Choices, hoping only to encounter and merge with another floating island of sufferers, is their steady diet. Everything connected with literature in the academy now is either about the sender’s correctly gauging his audience of receivers and transmitting seductively to them… or else (from the “literary” rather than “technical” direction) about individuals discovering that the transmission process stereotypes them and then, in passive aggression, casting about for kindred spirits tuned into their favorite channel. All gender criticism reduces to this model. All neo-Marxist criticism reduces to it; all literary criticism along ethnic and cultural lines, likewise. Sender with special interests, manipulated receiver; receiver nursing the wound of a stunted identity, posting rogue transmissions to gather a following… how very, very, very electronic!

I wrote several pieces on this subject when the Internet was in its infancy, my primary reference back then being the television. My work, not surprisingly, did not find a warm reception in the critical community, and I allowed my monitory efforts to lapse. I rather believe that my heavily documented research now languishes on some floppy disk that no accessible computer will any longer open.

I will conclude this far more tendentious commentary, then, with an overarching observation about most of what I have just written. Even in one of life’s vaguest possible moments—the individual’s frustrating search to answer the question, “Who am I?”—our young people and the immediately preceding generation emerge with tastelessly clear, hard solutions. They are Women. They are Possessors of Negritude. They are Transgendered. They are Native People. Gender and race: that pretty much covers the range of options. Nothing about your musical talent, nothing about your devotion to serving others, nothing about your love of arched bridges… it’s all hard and clear, either in your DNA or inculcated by cultural conditioning (where a high imperative, however, often allows massive excavation to reach the DNA stratum). Music is a cultural category: you were raised to have that talent in its present form. Altruism is also mere conditioning unless you redirect it toward your group of fellow sufferers. Arched bridges… why the hell would you love arched bridges?

The generations raised since the Seventies are brutally purposive—narrowly “goal-driven”, as I called them earlier. I renew the charge now with greater insistence. They appear content with answers to life’s great questions whose parameters are those of the completed search or the changed channel. Type, hit return; enter, execute. Identity becomes a matter of clothing, coiffure, and “bling”. The classics of art become a matter of “opiate in story form massively propagated” or “seditious resistance-narrative covertly diffused”. Go and get, gone and gotten; seek and destroy, sought and destroyed.

I have a feeling that robots think this way. As we “prepare” (or “prep”) our children for the future, they are beginning to think more this way all the time.

What need would so robotic a culture have of a subjunctive mood? Why would you ever say “would” or “might” to a robot, unless you wanted it to overheat? “Go to the garage. Start my car. If the engine fails to crank, consult your automotive manual and execute corrections. If problem persists, contact my office.” Alternate realities in the future are merely rival indicatives with “if” affixed to them. Greater or lesser degrees of mist have no meaning, no value. Kant’s definition of art as Zweckmäßigkeit ohne Zweck (“purposiveness without a purpose”) will likely suffice to have its naïve user packed off to Re-education Camp… or will the word “programming” unapologetically replace all instances of “education” by that point, as euphemisms at last yield to honesty in our dumbed-down Brave New World?

John Harris is the founder and president of The Center for Literate Values.  He teaches at the University of Texas at Tyler.
