
The Center for Literate Values ~ Defending the Western tradition of responsible individualism, disciplined freedom, tasteful creativity, common sense, and faith in a supreme moral being.


A Common-Sense Journal of Literary and Cultural Analysis

14.2 (Spring 2014)


literary analysis




Post-Literate Reading and Writing: A Reconsideration After Another Decade of “Progress”

John R. Harris

Incredibly, no contributor to this journal has examined the relationship between our culture’s rising dependency on advanced communications technology and its declining levels of literacy for about ten years.  The topic was once a staple in our unusual literary diet.  Yet I find direct discussions of pedagogy and literacy appearing in our annals no later than 2005, when Thomas F. Bertonneau published “Three Essays for Students: On Topics Various and Sundry and Illustrative of Problems Faced by Beginning Writers” (Spring 5.2); whereupon I myself followed with “The Post-Literate Student and the Anti-Literate Academy: A Bad Match at a Crucial Moment” (Fall 5.4).  As all of this went on, the general public was still purchasing video cassettes and patronizing Blockbuster every weekend.  The standard unit for acquiring music commercially was the compact disc.  Texting was unheard-of.

If, over the intervening decade, the exponential growth of speed and “convenience” in our communications and entertainments has somehow escaped the accompanying growth of shallow content that our contributors dreaded, the suspension of the trend would be truly shocking.  It would also be a great relief; but I must confess at once that the present reconsideration does not revise any old conclusions.  I believe, rather, that our cultural meltdown is more or less following the predicted track.  I regret that I cannot find grounds for optimism.

Several specific events have lately forced me into this reprise of an old subject.  Faithful readers of our journal know that I have acknowledged for years the need to “upgrade” our format (I shall explain later the irony implied by the quotation marks) from Microsoft FrontPage to a twenty-first century Web style.  Last fall I could no longer delay what for me was a very daunting labor.  When my Web host of years past insisted on locking out our international readers due to what he called security concerns, the need for a new format was compounded by an absolutely pressing need to find another host.  Happily, I was able to transport The Center’s galley overland to another river and launch her again, thanks largely to assistance from much younger and more tech-savvy volunteers.  As of this writing, the work of transition is still far from finished; in fact, the essays cited above have not yet been re-loaded and indexed in our archive.  I am confident, though, that we can proceed steadily from here with little obstruction.

I should be elated, I suppose.  The reasons why I am not have nothing to do, naturally, with the quality and willingness of the help I received or with the technical success of the result.  They arise purely from what one might call the abstract cost of the operation—a cost that shows up neither on screen nor on balance sheet.  In another sense, they are by no means abstracted from the labor of site-building.  If I may list them, I will indeed start with the physical toll of the labor itself and the physical risks of soaking up information through electronic transmissions: an item that none of us even thought to mention a decade ago.

I. Numb thumbs and killer rays: physical health risks and their psychic implications

Another dramatic revelation: your author has his left wrist wrapped in an Ace bandage as he painfully types these words.  I cannot attribute my bouts of Carpal Tunnel Syndrome entirely to all the dragging-and-pasting I was forced to do in the website’s reconstruction.  My first encounter with CTS was in my right wrist, and hindsight clearly reveals that the iPad was the culprit.  Once my academic institution had paternally equipped all of its minions with armor from the desk and mind of Saint Steve of Jobs, it was only a matter of time before I naively took to composing practically everything on the portable keyboard and sending it to myself for further formatting (if needed).  I gave inadequate consideration to my unusual body type: shoulders broad beyond average and wide hands with stubby fingers.  Whomever such tiny keyboards are designed for, it isn’t me.  The right wrist eventually had to be excused fully from keyboard duty: for months I became exclusively a lefty typist.  Only after buggering up the works in this wrist, too (during which time the right one has largely recovered, I’m happy to report), did I finally grasp that squeezing my limbs around a minute spread of letters was the central—and ineradicable—problem.

Was the iPad’s keyboard in fact designed with any model of the standard human physique in mind?  I doubt it.  The bathtub wasn’t, in its first exposure to mass-production; neither were most classroom desks—and the typical auto seat to this day seems to me as well crafted to induce lower back pain as if its creators had intended to do just that.  My point is indeed a generalization about all such mass-produced technology: technical innovations advertised as making life easier for us have a long history of making us, instead, more pliant to the machine’s whimsy, often at great sacrifice of long-term physical comfort.  Nor does this invidious process of adapting man to machine stop at warping our bodies: I introduce my own nagging aches into the discussion mostly as a vibrant metaphor for an abuse that spreads much deeper.  As the body goes in these “convenient” adjustments, so goes the soul: both wear themselves to a frazzle on technology intended only for instant access, not for healthy use.

Look at it this way.  With an iPad I could whittle away on an article or essay at almost any time and in almost any place.  Why?  Because the gizmo is small and light.  (Newer versions, of course, can even be transported in a pocket.)  This handy size—and I believe the Germans actually call their smart-phones “handies”—means that the fingers must work over a very constricted space in typing, which has the potential of damaging nerves in the lower arm.  Was such a possibility ever considered in the production of the miracle-keyboard?  I feel certain that it was not.  Why would it have been?  The objective was handiness, not hand-health.

In the same way, the new generation of communication-devices serves to ensure that thoughts may be instantly transmitted—but not to ensure that these transmissions will be thoughtful.  On the contrary, the very instantaneity of “messaging” almost guarantees that the message will be brief and flow off the proverbial top of the head or tip of the tongue.  In the course of a week, a typical citizen of the twenty-first century (which does not include me, by any means) may read a conventional book’s worth of words in emailed or texted or—most lately—tweeted format, all of them contributing but a light touch to a shallow expression.  We have immersed ourselves in mindless chatter.  If our souls do not ache with the spiritual equivalent of CTS, it isn’t because the constriction lacks morbidity: it is only because Mother Nature gave us nerves to warn of bodily abuse but none to warn us of psychic strangulation.

I will adduce evidence of “Cognitive Tunnel Syndrome” under another rubric below.  First allow me to develop my analogy between physical and psychic harm in one more direction.  I believe that I must have suffered from over-exposure to radiation in the early years of the new century, while a massive old PC regularly perched a couple of feet from me as I pecked away, quite often, for several hours of the day.  My eyebrows virtually fell out at one point, moles formed on my upper body, insomnia tormented me, I suffered from digestive problems, and I grew far more short-tempered than I have ever been before or since.  These symptoms abated whenever I took about a week off from my routine, and I eventually managed to reverse the effects by backing far away from the monitor and (most successfully of all) working from within an improvised Faraday Cage.  To this day I must observe a few such precautions to stay symptom-free, even though the new technology no longer contains cathode ray tubes (CRTs).  I couldn’t begin to say why the unpleasant effects slowly revive if I work unguarded before a Mac laptop: science appears to offer no reason other than “psychosomatosis” (i.e., I’m just imagining it all).

My condition elicits little sympathy from most people, because most people live in much more intimate proximity to electronic technology than I do, yet suffer no ill effects (as far as they know).  This highlights two further phenomena of the computer age: a) the “one size fits all” mentality of producers that makes no provision for an eccentric response, and b) the “trouble-maker” brand that users reflexively slap upon anyone whose experience is not as happy as theirs.  The first item of this pair I shall also award a special section below, for I find it to have major psychic consequences and to be ubiquitous in modern technology.  The second is much less easy to document; in fact, it’s little more than a “feeling”.  Yet I am confident that anyone who has ever declined to embrace the latest “must have” gadget knows this feeling.  I encounter it in my resistance to the cell phone (I own one but refuse to carry it everywhere and to harass my acquaintances constantly with its aid).  I also have no Facebook page, and students and teachers alike seem darkly puzzled that I do not post grades on or allow essay submissions through Blackboard.  I am treated sometimes rather like the beggar who is caught rubbing grime into his open sores that they may continue to fester and draw attention.  I am not instantly accessible: why am I not instantly accessible?  Everybody can be instantly accessible today: those who are not have chosen to hold aloof.  Why are they holding aloof?  Do they wish to draw pity, or attention… or are they trying to shirk work, or do they consider themselves better than everyone else?

On my trips to rural Ireland several decades ago, I often noticed that, whenever I rode a bus from one village to another, hands would go up all along the aisle in a salute from time to time.  I soon figured out that these simple people were crossing themselves as we passed a church.  None of them ever gave me any grief for not participating in the ritual; but the e-world has at least as many e-rituals, and those who abstain from them are likely to be regarded as enemies of progress.  A new tribalism has emerged.  Pleading health concerns for one’s abstinence elicits more than skepticism: it draws accusations of malicious purpose.  How could any of these artificial wonders possibly be unhealthy?  Was the light bulb unhealthy?  There must be another reason—some kind of antisocial subversion must be at work.  What’s your real game?  Come clean!

As I say, there is no earthly way to prove that our subservience to these tribal fetishes commonly styled “necessities” has a substantial connection to our newfound insistence on tribal identity in other respects.  Yet the “us/them” thinking bred of assessing quickly whether or not a stranger speaks e-idiom might certainly lend itself to other kinds of hipshot classification.  Why is it so important to know a person’s race in the twenty-first century?  Why is a person of a certain race who doesn’t endorse a certain political candidate considered a “race traitor”?  Why can there not be any rational, morally respectable counter-argument to the proposition that global warming is occurring and is manmade, or that homosexuality is entirely genetic?  Those who resist the new orthodoxy, rather, are enemies of progress—“fascists” and “haters”.  The detestation of them is as passionate as would be a South Sea islander’s for some poor castaway who had unwittingly desecrated his totem.  An hysterical “stasiphobia”—a terror of staying put, of not going along with the Movement—has become a prominent fixture of contemporary living, and its origins are as mysterious to those of us who remember earlier days as are the causes driving the proliferation of nose-rings and tattoos among our youth.

II. “Cognitive Tunnel Syndrome”: narrow-mindedness goes tribal

Among literacy’s most transparent benefits is its antidotal relationship to tribalism.  The literate person absorbs text rather than spoken words.  Hence he is able to create an intellectual life independent of flighty jabber and collective prejudice.  He can let in or keep out ideas as one would turn a spigot on or off, and he has spigots tapping kegs of widely differing contents from which to serve his taste.  Throats and tongues cannot be so controlled.  Except for Algernon Moncrieff’s butler, people cannot deny access to spoken words because they “didn’t think it polite to listen.”  Spoken words are instantly public domain, even when whispered: as Irish peasants used to say, “No story’s a secret if three people hear it.”  The literate mind, even when absorbing published texts, does so in a combination, at a rate, and with reactions registering its own ends and its own resources.  Its sense of public and private spaces—of self and other—becomes keenly honed.

As a result of this hard-won independence, the literate human being is able to develop in directions that interest him or her rather than in those that the tribe has preordained to be “proper”.  Such a person grows into an individual, increasingly relying on the inner light of conscience to arbitrate moral matters, the inner genius of inspiration to animate creative moments, and the inner rule of universal reason to overcome dull habit.  Literate people tend to make up their own minds because literacy has made them aware that they have minds to make up.

When I initially inquired of a local Web-design wizard what kind of charge I was looking at for his overhauling The Center’s old FrontPage site, he told me frankly that the labor needed was massive, far above my organization’s declared means.  He was complimentary of our pages as to their content—but their layout, in his expert opinion, was so dismal that it would effectively prevent the site from ever being taken seriously by a broad Internet audience.

While I was grateful for my highly trained professional’s candor, he actually told me nothing that I hadn’t heard before.  The verdict bothered me, now as always, not because I felt that I had published an inferior product, but because I felt that we were all slaving away for an inferior society.  One does not refuse to read a rare book because its pages are moldy and apt to crumble; I do not, anyway.  The contemporary audience, though, appears to demand that its reading matter have a certain look—not a certain depth or objectivity or sense of taste (all qualities that are defined by the conceptual content of words read and processed), but rather a certain wrapper or delivery system.  All that glisters is gold: all that’s dim is dross.  If a site is vibrant with multiple columns and links and images and audio extensions, then it is a “good” site.  Anything else is unworthy of attention.

I don’t find this to be the response of a stupid mind—or not, at any rate, when it is much more widespread than the number of stupid people around us.  Some who share it are quite bright: that, in a nutshell, is what worries me.  Bright people are being conditioned to adopt the reflective habits of stupid people by electronic delivery systems.  The message must look “cool” before its content may even be attended to.  I ceased being surprised a long time ago when my students would criticize as “bad writing” a selection whose vocabulary exceeded them.  They now expect to have ideas delivered to their cognitive doorstep like a pizza.  An author who fails to do this is incompetent and contemptible.  (I’ll never forget my first run-in with such thinking, about twenty years ago: a student wrote of Sven Birkerts, after struggling with one of his essays, “This guy is a idiot.  I couldn’t hardly understand a word he said.”  And no, I would not have described this coed as bright—not twenty years ago.)

As for vocabulary, so for the pure “visuals”: everything must come down to the “user’s” level (a term that now translates reading into a consumer activity, like grocery-shopping or bowling).  I anticipate a day when students will no longer even check their campus email, but instead be responsive to “updates” or “feedback” about assignments only through text-messaging.  When I was young, we indeed had nothing quite like texting.  We had to satisfy ourselves with conventional phone conversations, letter-writing, and live exchanges at meetings arranged or accidental.  Now, in the “progressive” world, communication has been poured into a funnel, as it were.  People can neither write letters nor converse: they text instead of both.  Naturally enough, therefore, when they are forced to practice reading for academic purposes, they expect digestible little bites of information highly seasoned with sarcasm or boorishness and complete with boxes at the end where they may commemorate their visit with something like textese graffiti.  For good measure, a green “thumbs up” or a red “thumbs down” will tweet their royal pleasure to thousands of “friends”.  How fitting, that these images should hearken back to the barbarized masses for whose amusement gladiators once hacked each other!

There are ophthalmic pathologies where the sufferer can see the world only through a narrow tunnel.  Technology is reducing the imagination of our youth to an analogous near-blindness.  Approach them through the proper channels, and you may just win a thumbs-up and an exuberant tweet.  Come at them from outside the tunnel’s narrow confines, and you fail to achieve so much as an acknowledgment of your existence.  Again, such narrow-mindedness typifies the pre-literate tribal mind: the ancient Greek word barbaros was indeed intended as a mockery of the meaningless jabber pouring from the mouths of inferior races too crude to know Greek.  It is the mark of a barbarian to consider everyone else a barbarian who does not wear his feathers and know his dances.

III. “One size fits all” thinking: the technologically reinforced mainstream

Of what, exactly, is this tunnel constructed?  Of various “better” ways, all of them demanding a tighter and tighter collaboration of “subscribers”.  In my lifetime, new communications technology has always been unveiled, with great braggadocio, as the state-of-the-art means to liberate individuals.  The shift from the Big Three television stations to cable TV and, soon after, to satellite networks was supposed to release viewers into a paradise of selections from which they could choose in à la carte fashion.  What in fact happened was that offerings became more mainstreamed than ever.  Since competition for viewership suddenly grew ferociously intense, the “niche market” outcome of attracting a small elite to specialized programming was a quick ticket to bankruptcy.  Instead, networks stumbled over themselves in quest of a plurality of viewers.  Sameness of programming was multiplied by ten or twenty in most households.  At least in the old days, a prime-time serial might risk difference because it was assured, in so narrow a field, that a significant minority would give it a brief look… and maybe the producers would catch lightning in their odd bottle.  Now such experimentation was too risky for shoestring budgets.  Cooking channels, religious and political panel-discussion channels, marketing channels, and other such ventures survive because fixing one camera on one stage for an hour requires little outlay (and also because dish and cable companies insist on serving “combo dinners” from their menu rather than accepting à la carte orders).  The promised revolution in more creative and ambitious fare never happened.

Luminaries like George Gilder assured us in the early nineties that the Internet would make of every humble plebeian at his keyboard a Cicero whose voice could now be heard around the world.  The common man would at last find a rostrum from which to declare his views above the outstretched arms of would-be censors.  Of course, this promise could not be kept, either.  The very superabundance of transmitters, as with cable television, made significant audiences almost impossible to attract.  A tiny few Web-savvy voices manipulated the Internet’s arcane technology to become distinctly audible; the rest fell on unheeding ears, their words muted under hundreds of higher rankings on the search engines.

Advanced communications technology, to repeat, centralizes and mainstreams.  For the very reason that it can bring potentially everyone into the amphitheater, it requires that all but a very few merely whisper while facing one stage.  Each of us, perhaps, may utter a peep if he will wait his turn or be content only to peep (I should be writing “tweet”, of course) as millions of others do so.  In the dark days of yore, when mortals were stifled by the restricted options of ink and print, they were limited to expressing themselves in poems, songs, ballads, short stories, essays, articles, editorials, pamphlets, novels, satires, dramas written or performed, sermons bound or delivered, speeches published or declaimed… and their audience, to be sure, was only that faithful few hundred who had come to identify a certain publisher or author with material to their taste.  Now those shackles have been struck from our ankles.  Now all of us may address millions at once… in a blog that ten may find and two may read.

The one truly successful communication of our time—the signature genre—is the YouTube video, which may go “viral” overnight if it shows an ambitious youth vomiting or defecating from a dangerous position or a celebrity swearing a blue streak at a paparazzo.  We can all see it at once—on our iPhone.  We all have an iPhone (or at least everyone with a pulse does), and we can all send the link to our friends.  Instant universal communication: what more could anyone ask?  What else were we ever promised by progress?

Not a certain quality of communication, apparently: not any breadth of vision or depth of conception.  When every cow in the field has a moo in the new delivery system, the only possible production is a bunch of mooing cattle—or maybe the bellowing of a monstrous bull: the cacophony of the demos or the incessant drone of the tyrant.  One size fits all, and the omni-inclusive mass merely needs to determine if it wishes to shout all at once or listen all at once.

IV. “Hits” and ratings: the triumph of utter vulgarity

Now that mass communication has been reduced to traveling one fast-paced but narrow highway up to the peak of Mount Parnassus, how do aspiring courtiers of the Muse avoid the traffic jam?  Small-market organs cannot stay in business: even if they attract a few visitors, their resources are insufficient to keep them abreast of constantly changing, state-of-the-art software (e.g., my own debacle with FrontPage).  New voices cannot be heard unless their broadcasters have the technical sophistication to lift them to the top of search results.  And what kind of subject, then, do hordes of tech-users “search”?  Women without clothes, major celebrities (often celebrated for playing women without clothes), idiot-daredevils engaged in “you won’t believe this” acts, lurid and bizarre violence, shocking disruptions of the norm in eating- or toilet-habits, exposés of outlandish conspiracies said to have “altered the course of history”, encounters with aliens, sightings of Bigfoot or the chupacabra… anything, it seems, that is antithetical to taste or sense.  Taste is the set of expectations, of stable background colors, within whose framework real genius shines brightest.  A person of discriminating judgment can scarcely rate the insight or creativity of a commentator or artist whose expressions intersect humdrum reality almost nowhere.  One must produce a surrounding of intelligent agreement in order to enunciate intelligent disagreement.

All of which, once again, is the diametric opposite of our electronic reality.  Mass communication has successfully grown so massive that only the radically eccentric can break through the deafening hum of conformity.  This much has long been apparent to film-producers, but they were held in check until the sixties by explicit codes of decency.  Once the dogs slipped their leashes decisively in the seventies, the war for viewership became a race to greater and ever greater vulgarity.  Pornography was perhaps the tamest of strategies used.  Even graphic violence grew tiresome if it portrayed mere cops and robbers or cowboys and Indians.  Torture and cannibalism sold more tickets: rape was better than consensual sex, and evisceration better than a slow-motion bullet impact.

Classic movies once drove their point home by nestling a unique situation in a bed of generic clichés or sleepy bourgeois standards.  Their remakes strive to explode the predictable from one scene to the next.  Compare The Day the Earth Stood Still old and new, Cape Fear old and new, or 3:10 to Yuma old and new.

I need hardly repeat my thumb-nail assessment of YouTube’s “biggest hits”: most readers will be more aware of the paradigm than I.  Even cultural commentators who style themselves conservative, or are so styled by others, appear bitten by the bug.  Glenn Beck’s emailed teases for The Blaze often feature some “man bites dog” video to awaken the yawning browser; and Sean Hannity brought up for panel discussion in mid-January (on his nightly television hour) a YouTube video wherein a mechanized baby in a perambulator, painted with a diabolical expression and geared to leap out at passers-by, elicited various responses of terror and horror on a New York street corner.  The clip had received two and a half million “views” in its first day.  Hannity praised its “candid camera” vintage of humor.


If my purpose were to deplore such people for displaying such tastelessness, I would be arguing against my own cause.  On the contrary, my point is precisely that practical, run-of-the-mill people—the sort from whom we would normally expect “salt of the earth” simplicity rather than Ivory Tower iconoclasm—are no more resistant to the lure of the shocking than anyone else nowadays.  The electronic media have taught us, both as broadcasters and receivers, to value fireworks.  Nothing else gets through.  Our children send crude tweets and jeopardize their professional future by posting debauches or brawls on YouTube, all because (I am convinced) they want desperately to demarcate some sort of individuality, and the only means of distinction available through Internet and iPhone is strident vulgarity.  I believe, indeed, that this search for difference at all costs accounts for excessive and obscene tattooing, wearing of rings in noses and lips and foreheads, spiking and bizarre coloring of hair, shaving of heads, and other kinds of repellent self-defacement so common in young people now.  That is, e-culture has left them so alone and insignificant at their various “world at your doorstep” terminals that they settle for drawing any kind of notice.  To be remarked as a “freak” is at least to be remarked.

V. Composition by “stickies”: the grammar of the picture

Naturally, the search for “gotcha” communication styles favors the graphic over the verbal.  Reading takes time.  Reading even so short a bit of rubbish as a tweet takes more time than gazing at a photo.  And while the tweet probably beats the YouTube video in terms of seconds not “wasted” in consumption, consuming the video is both easier and more viscerally stirring.  A line of doggerel still requires a scintilla of intelligence to be read and processed.  A picture… the old saw rates one picture at a thousand words, but it neglects to add that pictures fix the volume of translation from concept to image at a stable zero.  Any imbecile can get something out of a picture.

Electronic communication is making us ever more pictorial as thinkers, even in our language.  By this I certainly do not mean that we now paint verbal scenes more eloquently than our forefathers—would that it were so!  I mean, rather, that we approach a sentence as a series of individual snapshots that we juxtapose.  We are losing our sense of syntax.  Young writers, I find, are ever more inclined to generate “noun+noun+noun”, with the connections between or within the nouns suggested ever more vaguely by ill-chosen prepositions.  As for verbs, they have become mere adjectives—participles (often of muddled tense and voice)—hooked to nouns by the copula.  “The bird fluttered for a moment and then fell to the ground,” may emerge as something like, “The bird was flapping on its wings in a moment and then the next moment it was laying there along the ground.”  Expressions of time and space loosely drift about the noun/adjective nucleus like the cell padding of a JPEG image.  Flash… flash… flash… and the perceiving mind is to gauge perspective, distance, and other relationships as it does in a flat landscape painting.  (In that sense, of course, mental processing does indeed enter into visual perception—but it is non-deliberate, instinctive processing.)

I recently grew so vexed at the sloppy language advertising the latest “bestsellers” on my Kindle that I started writing down the contents of these unsolicited “billboards”.  Below is the perfectly random harvest of one week’s round.  (By “random” I mean that I recorded the four titles and their blurbs that recycled on my screen for several days: I have deleted nothing from that time frame.)


The Nero Decree: A thrilling story of two brothers in the backdrop of Nazi Germany and World War II.

Gray Justice: After ex-soldier Tom Gray loses his wife and child, he goes on a roller coaster ride of vengeance.

The Atopia Chronicles [both “a’s” formed like lambdas, without the cross-stroke]: In the future, the price of paradise may be the survival of the human race.

Alone: For Dr. Victoria Peres, the bones she finds unearth chilling crimes… and this time, her own secrets.


Something’s amiss, I would say, with the construction of every “tease” of the four.  In the first case, is not the focus of attention set against a backdrop rather than in it?  If the latter… well, at what point in this “thriller” does any action occupy front stage?  Do we have nothing but backdrop all the way through?

In the second e-adventure, the adjective “roller-coaster” appears to fancy itself a noun and to refuse a hyphen’s intrusion.  Poor hyphens—nobody wants them in our new century!  More critically, one must wonder why Tom seeks vengeance just because his wife and child were “lost”.  Were they murdered?  Then why not confide that they were murdered?  And if Tom pursues revenge, isn’t he engaged in a rather active undertaking—and isn’t the patron of roller coasters (noun) involved in a definitively passive enterprise?  How can Tom exact vengeance when he’s just holding on for dear life?

Case Three: does the author (and I have no interest, notice, in discrediting authors by naming them) realize that atopia literally means “un-place”, since the original outopia and the late dystopia were both coined in full awareness of their Greek meaning?  What in hell is an “un-place”?  The lambda-like “a’s” seem to beg, “Go ahead—turn this into Greek!”  Latopil?  Probably not; probably just window-dressing: i.e., looks cool (in somebody’s estimation).  More significantly, what exactly is the cost of Unplace, again?  That the human race must survive?  Could the author be trying to say that its cost is the human race’s extermination (i.e., its un-survival)?

Then, at last, we have the sexy doctor’s name (with a sexy pair of eyes in the cover graphic).  She finds bones that unearth chilling crimes for her.  Too bad she’s surrounded by duller people who have gone blasé about buried bones.  These particular bones, this time, also implicate her.  Usually she finds bones that simply suggest chilling crimes of an impersonal nature—and then only suggest them to her.  Maybe the good doctor needs to prescribe herself a sedative (or to extinguish whatever she’s smoking).

And to think that in every one of these cases, we see the chosen line—the single finely crafted much-pondered banner—that a contemporary professional novelist has decided to elevate before the whole world in an effort to draw gawkers into the shop!  One almost wants to buy and download just to see how much further the writing can deteriorate at less “polished” points.  Almost… but not for four bucks.

I cannot resist the image of the “sticky”—the handy (there’s that word again) little note with glue on the back that readily adheres to refrigerators, doors, and other eye-high non-electronic obstructions.  Victoria Peres (stick)… bones (stick)… crimes (stick)… secrets (stick).  Try “vengeance”; beside that, scribble-and-stick “ex-soldier” and “wife and child”.  Try “brothers”: now smack “Nazi Germany” and “World War II” (oh, yeah… those German Nazis) on the refrigerator.  Need some glue?  Linguistically, we use prepositions: any will do.  Just slip a “for” or a “with” in your mouth, chew on it till it’s moist, and then apply it between noun and desired surface.

Honestly, the plots described in these would-be sentences are so instantly clichéd that no accuracy is really required of any non-nominal elements.  “Aliens”… “crash”… “cover-up”… “Chiefs of Staff”… “Slade Gilroy”… “risk”… I feel something like a comic book beginning to emerge: lots and lots of pictures and a few balloons with words.

To be sure, tales for the masses have wallowed in cliché since Daphnis and Chloe.  If one were to shift suddenly from the e-thrillers above to Zane Grey or Dashiell Hammett, however, one would likely imagine that Addison and Steele had been resuscitated.  The writing of the Internet age is not merely lame in the manner of most popular romances: it has a grammatical degeneracy distinct to our time.  Jack London and O. Henry could spin a yarn.  Here we find shreds of plot only, where motive becomes impenetrable because the connection of events in time and space becomes indecipherable.  The medium is indeed the message—and what medium is represented by wavelengths decoded as pixels, if not one of instant images, free-floating in time and space?


Any kind of writing, essay as well as fable, is supposed to reach a conclusion.  I really don’t know how this story ends.  Even a tragic final scene seems out of the question, since someone capable of appreciating the tragedy would have to survive—and that, of course, would postpone the ultimate collapse.  Hope lingers.  The rare young person persists in saying that he (or, usually, she) prefers a bound book to a pulsing screen.  We may always sigh, “O tempora, o mores!” and wait for something better—or project it into our grandchildren’s lifetime, after we ourselves lie cold.  Yet I shall never fully understand why it appears so invincibly difficult to accept that machines draw us toward the mechanistic as we draw them toward the human.  We need more than ever to comprehend our essential humanity, that we may command our machines never to diminish it… and yet, this folly of dismantling our higher gifts in “innovation” continues.


Dr. John Harris is the founder and acting president of The Center for Literate Values.  He is also a Senior Lecturer specializing in World Literature and the classical languages at The University of Texas at Tyler.

