
The Center for Literate Values ~ Defending the Western tradition of responsible individualism, disciplined freedom, tasteful creativity, common sense, and faith in a supreme moral being.

P R A E S I D I U M

A Common-Sense Journal of Literary and Cultural Analysis

14.3 (Summer 2014)

 

academe in decay


 

Erchomenology: The Study of Things to Come

John R. Harris

a) the uniqueness of Now, and of Time to Come

The following four facts about tendencies in contemporary society are beyond argument.

1)     Citizens of the “postmodern” world exist in a much higher density than any generation of human beings before them.  This is the only one of my four propositions that might be initially questioned; for ancient Rome or Alexandria (not to mention the largest settlements of India, China, or the Aztec and Mayan empires) must have seen warm bodies jammed together in hive-like proportions that our automobile-based lifestyle has effectively thinned out.  At the same time, however, a sizable percentage of these earlier civilizations clung to life on the farm.  Mechanization was insufficient to allow things to be otherwise.  Only since the Industrial Revolution have masses of farm folk become city folk; and this trend has accelerated exponentially rather than arithmetically, until today the green spaces between urban centers in our own nation’s southeast are almost as deserted as the desert spaces between Las Vegas and Reno and Phoenix.

Furthermore, one supposes that people on the streets of the ancient metropolis would not have chafed upon and against each other as we do.  Fast-moving machinery did not menace; turbines of raucous noise, sometimes deliberately amplified, did not surround them; rigidly separated functions within the city did not require that they rush ever and anon to reach job or market or temple or home in time to escape censure; clocks did not warn them ubiquitously, either, of the impending instant when they would be counted absent or branded tardy.

Both in terms of the actual percentage of the populace wrested from the hinterland and in terms of the somewhat subjective feeling of “crush”, ancient people were no match for us.

2)            Homo postmodernis (henceforth HP) possesses artificial means of communication far in excess, not just of what his ancestors knew, but of what they could have imagined.  Direct speech was already being rivaled by exchanges over the telephone half a century ago.  Now the descendant gadgets of that wired and immobile convenience scarcely even transmit a spoken word, but rather are largely devoted to brief, formulaic messages encoded in an evolving shorthand.  The very act of reading has gone high-tech.  The words I write at this instant will be processed for a screen that constantly dangles links to alternative pages before the reader and will also flash or ping notification of new email or awaiting updates.

3)            Distinct from the previous proposition is the immensely multiplied intrusive power of “data collection agencies and services”.  Perhaps HP should have seen this one coming; but he didn’t, as a man or woman on the proverbial street (truly proverbial—for what ordinary, innocent citizen walks our streets any longer?).  The “smartphone” (one word), for instance, continues to be peddled as a means of instant access to far-flung friends and relatives.  No marketing campaign would ever pitch it as a more efficient means, besides, of supplying the central government and the private-sector elite with detailed personal information.  Intimate privacy, rather, is emphasized.  The sharing of such intimacy with unknown eavesdroppers, insofar as it is even suggested, becomes the source of “negative” marketing strategy: i.e., this or that service provider of email vaunts ironclad security from the invasions of snoopers.

Nevertheless, the eyes over our shoulder are here to stay, and they can only get keener and more numerous.  (Indeed, service providers are themselves motivated to facilitate snooping, since enhanced paranoia must feed the public’s demand for greater security; in the same way, the manufacturers of shields from computer viruses have always had a hidden motive to keep those viruses evolving.)  HP will be watched closely from now on.  Whether his spectators will make sinister use of their disembodied powers or, instead, simply drown in a tsunami of information is another matter.  Eventually, more and more of the information-processing—and perhaps even the associated soliciting or censuring of individuals—will have to be taken over by machines.

4)            The classical Chinese poet Wang Wei writes of crossing the broad Yangtze: at one indefinable point, he grows so intent on the newly emerging shore that he fails to notice the slipping from sight of his home behind him.  Even so, the many transformations of HP’s brave new world may blind us to the most obvious effect of the change: deracination from the past.  We see our flashing, exotic future rising before us; we do not see the beloved haunts and habits of our past disappearing behind us.  People move to cities, and they proceed to move often within cities.  Their contacts are instantly reached by a handy device (perhaps, in the near future, something like a wristwatch or a collar pin); but the “contact list” itself is in constant flux as jobs and neighborhoods succeed each other dizzyingly and ambitions reach toward far horizons.  It probably does not even occur to HP to revisit the streets and playgrounds of his youth until he has consumed about half of his biological span.  Then, if he should go seeking them, he will almost surely be frustrated.  Should they not be physically plowed under or degraded to slums too dangerous for him to prowl alone, he will still find the human landscape unrecognizable.  People in his society no longer bequeath residences to their children.  The residences themselves turn as insipid as a thirty-year-old hair-do after one generation has been reared in them, and are thereby doomed to the auction block and the bids of less affluent wanderers.

If it is important for the human being to preserve some kind of connection with his or her past, then HP loses something vital in this transaction.  Perhaps he supplements the loss in other ways; yet he is unique in human history, once again, in having to make such a shift.  The question certainly deserves to be asked: how well has he succeeded at finding happiness?

b) the necessity of a moral component in social analysis

I have believed for several years now that an academic field should be dedicated to such queries.  We currently have nothing in the curriculum of which they are the nucleus.  History studies human events of the past.  Political Science studies the theory and practice, past and present, of systems and institutions that direct human activity through laws (i.e., through force).  Economics studies the behavior of markets—a project which may certainly predict human events on a grand scale with some accuracy, but which, in studying only markets, fails to take (or even consider) the full measure of “humanity” or “happiness”, either one.  Psychology translates satisfaction into endorphins, and does so, of course, on an individual level.  Philosophy allows the inquirer to probe the meaning of abstractions without being chained to empirical terms; yet besides the troubling fact that current philosophy seldom does anything of the sort (and hence is moribund as a study independent of the sciences), the classical philosopher’s tendency is to see things in universals, and so to overlook—quite deliberately—the critically peculiar elements of times and customs.  The same objection might be raised against Theology; and, of course, that field throws an even smaller shadow over the contemporary campus than Philosophy.

It has been suggested to me that what I seek is Sociology, or some branch thereof.  I think not.   The methodology of the sociologist seems to hinge upon establishing a statistical baseline for whatever anomalous behavior or condition is being analyzed—median income for poverty, average marriage and divorce rates for familial dysfunction, average years of education for the population of incarcerated youth, and so forth.  This approach necessarily divides the social house against itself.  On the one hand, it invites ideologues to draw conclusions that represent those in the margins as victims; and on the other, it implies that those within the standard deviation do not themselves illustrate any sort of anomaly.  The given society’s numerical averages are those stable references against which we recognize instability.
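To make the procedure just described concrete, here is a minimal sketch (my own illustration, with invented income figures rather than real survey data) of the baseline-and-deviation habit in question: compute a society’s average, treat one standard deviation as the band of normality, and flag only the margins as anomalous.

```python
# A minimal illustrative sketch of the statistical-baseline method described
# above; the income figures are invented, not drawn from any real survey.
from statistics import mean, stdev

incomes = [31_000, 42_000, 38_000, 55_000, 29_000, 61_000, 12_000, 47_000]

baseline = mean(incomes)   # the "stable reference" against which instability is recognized
spread = stdev(incomes)    # one standard deviation defines the band of "normality"

for income in incomes:
    # Observations inside the band are treated as unremarkable; only the
    # margins register as anomalies, which is precisely the complaint raised above.
    status = "anomalous" if abs(income - baseline) > spread else "within the norm"
    print(f"{income:>7}: {status}")
```

Note that nothing in such a calculation can tell us whether the whole distribution, margins and middle alike, belongs to a society that has lost its health.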

What if entire societies, however, are suffering from a kind of collective neurosis compared to their own behavior of a generation earlier, or to a contemporary society’s in a very different environment?  Perhaps no society is ever perfectly stable and happy, just as no single person is ever so.  Have we, then, no benchmarks from which to reckon relative health?  The contemporary sociologist, it appears to me, is so deeply imbued with the notion that norms are practically arbitrary—virtual products of chance, ungrounded in any abiding principle—that he or she would scarcely even register human sacrifice as pathological.  This analyst, while forever “passing judgment” (in the popular sense of being his society’s conscience), does so only behind the screen of statistics.

In contrast, the investigator I have in mind would openly “moralize” in some measure, and would do so with rigorous discipline.  Mere rationality, after all, should be able to specify that the healthy society does not consume its members as a fire consumes wood.  Without an enhancement of safety, individuals would have little motivation to form societies, in the first place.  Should the group begin at some stage, therefore, to believe collectively that ritual slaughter of certain members increases the survival rate of the many, we ought to be able to understand the collective mind as suffering from a variety of paranoia or hysteria.  Something is wrong here.  It isn’t wrong with this sub-class or that alienated minority, but with the entire social organism; and it isn’t wrong because the victims appear shortchanged, for they may even compete to go under the high priest’s knife.  If a very high percentage of the surviving group remains fully convinced that life is now better—if the sacrificial victims themselves are content to die for the multitude, or if they bask in the honor of doing so—the illness is no less.  On the contrary, such “stabilizing attitudes” make the problem more palpable than ever: for what attitude could be more “sick”?

Collective illness of this kind must be defined as objectively as possible, naturally, if we are ever to gauge the spirit of the times and predict with any accuracy what times await around the corner.  What drives a society to embrace, say, a murderous irrationalism?  What societies have done so in the past?  What future trajectory do such societies tend to follow?  How are changes in their environment (e.g., in kind and degree of technology or in pressure from rival societies) likely to affect that future?  We cannot ask meaningful questions such as these if the most we can say of a society that leaves human hecatombs in its wake is that it is “interesting”.

Other value judgments would similarly arise from rational assumptions.  Truth allows understanding and collaboration; falsehood creates confusion and disorder.  Even at the simplest levels, people are uncomfortable with patent deception.  To be told that X isn’t home and then glimpse his face in the window, to hear one’s own witticism or insight claimed by another, or to be promised support on Friday only to find empty seats on Saturday can destroy friendships and feed enmities.  When a society invests heavily in formal and persistent fraud, we should likewise detect a symptom of illness.

Again, I am by no means calling for a rain and hail of moral condemnation upon societies around the world that we, thanks to our own cultural conditioning, decide to target.  I am observing, instead, the vital importance of objective moral judgment to predicting humanity’s future on a vast scale.  Certain patterns of behavior wear a society’s members down over time, regardless of economic, technological, military, or other successes.  The decline of societies suffering crises of mass hysteria or mass deception or mass despair is inevitable.  Yet the social sciences that I see currently on the books will not allow us to make this calculation, since it requires assuming that not all values are relative to time and place.  Hence these same sciences, as they stand, must prove inadequately predictive whenever they turn to the future.

I hasten to observe, as well, that the mainstream social sciences have failed miserably to hold aloof from sweeping value judgments, though their practitioners are quick to denounce the moral prejudice of methodology not their own.  In several particular cases, these “professionals” have indeed shamefully discredited themselves.  I think, for instance, of the Stanford Prison Experiment of 1971.  Researchers at Stanford University, led by Professor Philip Zimbardo, designed their own mock-prison to document what sorts of atrocity would rise to the surface once ordinary human beings settled into “guard” and “inmate” roles.  The specific motivation behind this “inquiry”, though crucial to determining Zimbardo’s prejudices, has not been made generally available.  All participants were volunteers from the local campus community; and, while screened for physical and psychological fitness, they too were never questioned about any specifically political interest in the undertaking.  Those chosen as inmates were “arrested” without explanation before friends or family and carted off to the artificial prison by a highly cooperative local police force.  Though physical torture was not permitted, verbal abuse, sexual humiliation, sleep deprivation, disruption of toilet routine, and similar types of harassment were dished out so zealously that the intended two-week experiment was shut down after running less than half of its projected course.

In the first place, giving so little attention (exactly none, in fact) to the ideological predispositions of both experimenters and subjects hopelessly compromised the “findings” before they were ever “found”, since the self-selected participants would quite likely have perceived their adventure as a chance to make a political statement.  It was 1971.  Formal authority was held in very low regard, especially around elite campuses.  Students “role-playing” as guards could be expected to dose their performance amply with sadism, and students posing as prisoners to invite and mime agony with the same enthusiasm.  Zimbardo himself claimed that the control he enjoyed over events corrupted his objectivity; might his purpose all along not have been to watch things get out of hand and then blame the situation?  Arrest before one’s peers by local police, furthermore, scarcely equates to being rounded up after an armed robbery (except, of course, in the mind of someone who has deemed “suspects” to be randomly selected bystanders: QED).  There was, besides, no attempt made to approximate in the laboratory setting the kind, degree, and frequency of abuses alleged to have occurred at any particular prison.  The “experiment”, in short, was farcical, its self-fulfilling prophecy a fit monument to the subjectivity of all science that scorns moral truth as subjective.

Most human beings cannot resist admiration in certain circumstances and condemnation in others.  To study the human animal closely yet forbid, as a measure of one’s rigor, any admission of moral value judgments into the project is to invite blindsiding by one’s own moral prejudices.  The analyst is also a human.  The very contempt for the notion of moral axioms would strongly indicate that this analyst is importing far too much personal baggage into the laboratory setting.

c) the failure of previous academic predictions

Francis Fukuyama speculated that we had reached “the end of history” in 1989 (with an essay of that title), confident that the global triumph of Western political and economic institutions had accomplished all the major objectives of social evolution.  Fukuyama’s thesis might be glossed “the end of ideology”, for he viewed great historical struggles precisely as the competition of alternative systems.  Yet by the early years of the twenty-first century, the world had dissolved into ideological warfare.  Religious fundamentalism vs. secular liberalism, theocratic socialism vs. democratic capitalism, the traditionalism of Arab Islamic states vs. the progressivism of Western technocracies… these and other formulas were proposed as defining the central conflict.  One might well challenge all of them.  In fact, the specific ideologies at issue may not be possible to define succinctly—not the least reason for which is that they were (and still are) constantly misidentified, whether for propagandistic ends or out of sheer ignorance.  For instance, Islam in abstracto is supposed to overcome nationalist tendencies in favor of a global caliphate.  The so-called “Arab Spring” that has set North Africa and the Levant afire, however, displays no impetus more powerfully than that of nationalism.  While the politics of punishing the 9/11 attacks, furthermore, had a distinctly pro-democratic aroma at the time, they have produced in the United States a series of statist initiatives so aggressive that a radical realignment of the old Democratic/Republican divide may be taking shape around basic constitutional issues.  The animus beneath this upheaval is entirely ideological, though its critics attempt to defame it as primitive tribalism of one kind or another.

Professor Fukuyama, in short, was so spectacularly wrong that his essay very nearly heralded the dawn of a new ultra-ideological era.  A political scientist cum economist, he had projected that the new and free circulation of wealth in a global marketplace would replace Cold War rivalries with universal prosperity.  Money speaks the same language to all cultures and political powers (we were to realize).  Everyone would want a piece of the savory pie.  Fukuyama had failed to foresee (among other things) what enormous strains on cultural values—on linguistic customs, religious practices, health habits, conduct between the sexes, and so forth—would be generated as companies roamed the planet in search of the cheapest labor force, and also mobilized labor forces from all over the planet to come and fill domestic niches cheaply.  Economic competition, of course, feeds cultural tension; yet an assault upon one’s culture, much more than upon one’s wallet, leads one to grab a pitchfork or make Molotov cocktails.

Globalism had played havoc with the stability and conformity that human societies need to survive.  Far from resolving the differences that have divided people for time out of mind, the movement identified by Fukuyama loaded the powder kegs and then struck the match.  A “prophet” who ignores the importance of their past to human beings—who assumes that money can make everything right—is destined to end up looking a fool: that prophecy will not disappoint.

Albert Einstein once remarked that nuclear power could be an incredible blessing or an unimaginable curse upon the human race.  As a prediction, this hedging of bets may yet come true by way of the latter alternative; but the former prediction, though often made of technological innovation, is always doomed to contradiction.  Technology will never bring paradise to earth.  The reasons are embedded in basic human nature—which Mr. Einstein, like so many people of staggering intelligence, was poorly positioned to assess in all its disappointing folly.

Say that a novel form of energy is developed in such a fashion as to be readily renewable, cheaply produced, and environmentally harmless.  Humans would be spared an incalculable amount of arduous labor.  At first, they would rejoice to be thus liberated.  Then the luxury of physical ease would become a necessity.  Everybody would expect to have a bottle containing a genie—would demand to have one.  Unlimited free energy would soon be guaranteed by a central authority, like clean water: it would be a right, not a privilege.  As government monitors set about monopolizing the energy market, of course, their venture would be financed by tax money, not by private-sector purchasing (for nothing is truly free in this life).  From one direction, then, “quality control” would be difficult to sustain, since a massive bureaucracy would respond to setbacks by increasing its own mass rather than promoting, firing, reorganizing, etc.  This is how bureaucracies behave: it is a hard fact of human society.  From another direction, the disappearance of the profit motive from the equation—of the chance to grow rich by improving the product—would cause the once-new technology to stagnate in the face of accident and abrupt challenge, even if it were basically well maintained.  Free energy would become a dependency—a kind of drug—and the inbred certainty of its perpetual and unrestricted presence would have two inevitable results: 1) it would be used more and ever more, creating discontent over such trivialities as delayed access (e.g., standing in line or waiting for “juice” to transfer); and 2) it would represent an ever-growing vulnerability (e.g., through terrorist attack or natural catastrophe) within the society that gorged upon its fuel.

Entrepreneurs would not likely vanish, even after the magical energy source was nationalized.  They would apply their genius, rather, to introducing more and more toys to consume the boundless supply of animation.  Things would be designed to draw upon “free motion”, yet their function would by no stretch of the imagination assume that of once arduous but necessary physical effort.  On the contrary, the very essence of these playthings would be otiose and frivolous.  Yet as society in general grew enamored of them, they would be viewed as “must haves”.  The paradise of affluence would be forever teetering, in individual cases, on the verge of an inferno of privation.  A citizen who hadn’t the latest robotic dog or cat paraded by his proud neighbors would not know an instant of peace until he could bridge the gap.  This, too, is immutable human nature.

“Real life” examples of the moral degeneracy just described will leap to the mind of any thoughtful person over the age of forty (old enough, that is, to have already lived through several such transits).  Ortega y Gasset attributed many of the qualities mentioned above to “mass man”, while Aldous Huxley illustrated in more than one novel how material progress can proportionally drive spiritual regression.  The author of the present essay poignantly remembers a first-grade assignment in My Weekly Reader that portrayed the late twentieth-century city as humming with monorails.  Cars and their nightmarish chaos of traffic were to be gone in a few short decades.  The change was a lead-pipe cinch.  If we hadn’t all the necessary technology already, our rate of advance was yet such that we had every right to picture clean, fast, efficient, enjoyable travel throughout the metropolis of the future.  Only a fool would think otherwise.

The fools, however, turned out to be wise in this instance (as in so many others), while wise men were made fools.  Innovation in transport began to atrophy about the time that our central government invested heavily in car-bearing highways and attendant infrastructure.  The metropolis of 1990 came and went without those glistening monorails shooting hither and yon: the cities of that era had been radically rebuilt to suit the automobile, and now one could not move through them except by automobile.  Many citizens of the twenty-first century hate their commutes, and new technology obligingly promises cars that drive themselves—more of the same, that is, but tweaked to mitigate the misery of an existential trap without exit.  Suburban residents also hate the impersonality of their neighborhoods, where widening roads have thrust houses farther and farther from each other and where each garage door automatically seals like a portcullis as soon as milord returns from fighting his daily dragons.  Satellite dishes and the Internet are the proposed answer: a superabundance of artificial socializing, of voyeuristic neighborliness.  The result isn’t really happiness at all, let alone paradise—but its deception suffices to hold misery at arm’s length on an average day.

The technologist does not understand the fragility of artifice.  He or she fails to divine the true social needs of human beings.  The Japanese, one hears, are even developing robotic sexual partners for lonely adults in the high-tech urban jungle.  Such science-fictional scenarios fascinate, but they seem only half-prophetic.  They open before us a great glowing abyss, but they do not intimate what kind of landing we should expect at the bottom.

As a third and final example of failed contemporary prophecy, the “progress in race relations” vein of visionarism reveals yet further blunders.  HP is surrounded by progressive rhetoric, and his metaphors of history are steeped in progressive imagery.  We could never fly before, but now we fly; we could never cure an ailing heart before, but now we do open-heart surgery.  Why, therefore, should we not expect the persistent dark presence of racism in human history at last to dissolve before our time’s spreading enlightenment?  If it fails to do so, the cause can only be that certain segments of society linger in their Neolithic stupor.  The Scopes trial publicized how fiercely such minds could resist the evidence of science, just as the Catholic Church’s resistance to the Copernican Revolution had done.  The battle over race was earnestly joined in the fifties and sixties of the previous century, and it has since been hard fought in various pockets of barbarism; but the march proceeds, and the legions will both swell in number and reach their destination, as surely as tomorrow’s sun will rise.

This kind of forecast, with which Americans of my generation have grown up and grown old, has not necessarily proved false—yet if true, then its fulfillment looks nothing like what we expected.  Citizens of African descent have won election to our highest offices, they attend our children’s schools and come to weekend sleep-overs, they buy houses where they wish, they eat at restaurants of their choosing… the days of Jim Crow appear to be mercifully distant.  Why, then, are we now told so often that this is not the case?  Within the same week in April 2014, US Attorney General Eric Holder fumed that critics of his unorthodox tenure in that office have attacked him because of his African genes, while high-profile baseball hero Henry Aaron charged that those who oppose President Obama’s initiatives are merely “Klansmen in neckties and starched collars”.  To hear such figures speak, one would conclude that most progress in black/white relations has been illusory.

From an angle whence these denunciators would perhaps neither expect nor want confirmation, it comes aplenty.  This writer might mention the case of a certain doctor who now avoids hiring black nurses or technicians.  The reason?  Because he has found repeatedly that they can become impossible either to correct or release: if disciplined or fired, they instantly file a racial discrimination grievance.  A professional attempting to keep a small practice afloat cannot afford such risks for the sake of doing the right thing.  Hence real discrimination occurs under the table, precisely because the menace of being bankrupted for manufactured instances of discrimination has grown fearful.

What has happened?  Where did our progress go?

The rope that is throttling progress in this instance, of course, has many strands.  I will trace but one.  Racism, to invoke the platitude, has indeed always existed—and it has also, just as fluidly, morphed from one era to another.  A devout New England Protestant would once have threatened to disinherit his son for asking to marry an Irish Catholic; now the pater familias is probably happy if the boy has found a Christian woman of any description.  Anglo settlers in the South or Southwest would have killed a man a hundred years ago for suggesting that a drop of Cherokee or Choctaw blood ran in their veins; their descendants are now more likely to boast of those drops.  These prickly situations did not relax because the people involved were “sensitized to otherness”—just the opposite.  The prejudice evaporated because everyone concerned began no longer to notice the difference of anyone else.  Such is the natural course of things.  Races in conflict hate one another bitterly, generations pass, Romeo marries Juliet and Hatfield marries McCoy… and the new tribe which emerges from these unions turns its collective distaste to a new rival.

One might speculate that the African/Caucasian situation is more complicated because physical differences are more pronounced.  A Montague looks much the same as a Capulet—but the epidermis is a giveaway in America’s most troubled racial divide.  Yet history offers little support for this thesis.  The original Celts were distinctly dark compared to the Scandinavians who invaded their shores; but now red hair is considered a signature of Irishness, and the stereotypical Scot has blond ringlets and fierce blue eyes.

What we failed to see (or one important thing we failed to see, out of several) as we collectively prophesied progress in America’s black/white relations was the political utility of lingering division.  Racial difference is now emphasized, in an official and codified manner, as it has been at no other point since the Jim Crow laws.  To that extent, we are certainly backsliding.  The formal, even academic justification of this “yellow star” approach is that it forces potential bigots to reflect explicitly and consciously upon the otherness of the person they are about to slight.  That such “sensitization” is an open invitation to an obnoxious paternalism cannot be overlooked by any genuinely sensitive observer; but its advocates in the broader power structure would not likely be shaken if confronted with their hypocrisy, since (like all Machiavellians) they are profound cynics.  The goal is to divide and rule.  Man as a political animal quickly finds this out, if he does not know it instinctively: i.e., that masses of people are more easily handled if they view themselves primarily as members of a group rather than as individuals.

The prophecy of racial harmony has proved overly optimistic, in short, because it viewed the problem merely as an educational challenge—a material volatility needing a material additive to become stable, like a bunch of chemicals in a beaker.  The malice and sordid advantage of political calculation were never weighed.  The assumption was simply that those who hate a certain race must listen to reason; the fatal omission was that the dispensers of reason might—for their own selfish interests—poison their therapeutic doses of enlightenment to be more provocative than informative.

d) the content of “erchomenology” as a discipline

The examples just offered have a lamentable disorder and are certainly not exhaustive of the possibilities.  Yet they suffice to highlight a few of the more obvious reasons why contemporary scholarship cannot reliably predict the next turn of society.  To recapitulate:

  • Human beings are not motivated simply by greed and self-interest, or at least not when “self” is narrowly defined.  Many models in the social sciences appear to assume that cynicism of this kind is empirically sound.  Economists, for instance, often seem to treat cultural and religious factors that stand in the way of profit as “playing hard to get” tactics, to be overcome by raising the amount of the bribe.  Naturally, history offers many instances of individuals and entire peoples who “sold out” their cherished traditions for material profit.  Even when auri sacra fames wins in such cases as these, however, it loses; for volatile emotions like a loathing of oneself, a resentment of the gift-bearing intruder, and a longing to purge guilt in some bloody act of penance may lurk dangerously just beneath the surface.  The failure of our national policy-makers (let alone our rank-and-file electorate) to comprehend how emotions of the sort influence the Third World continues to lead the United States into situations of high risk.
  • Our own cultural faith in technology has also created an almost invincible blind spot.  Visionaries like Ray Kurzweil base all of their forecasts upon the exponential growth of our technological capacity.  Yesterday, a man on the Moon: tomorrow, a colony settling a yet-undiscovered planet.  Yesterday, a heart replacement: tomorrow, a life expectancy of 10,000 years.  Such sanguine outlooks isolate a single characteristic of technological development without giving any thought to how changes will affect human beings.  Will we want to visit a new planet, even if we can?  Will we want to live longer once a virtual immortality lies at our feet… or will we, perhaps, long to die?  Technology alters the values and attitudes of those beings whose lives it was made to improve.  We cannot say what thoughts and feelings will stir within our great-grandchildren: Kurzweil, for one, believes that this unborn generation will be more artificial than biological (i.e., more robot than human).  Any prophecy concerning how human life will be bettered by technology, therefore, must begin in an assessment of what is good.  Insofar as mounting evidence suggests that we rapidly shift to accommodating our machines after those machines have initially accommodated us, is our glorious, golden future really more liberation than servitude?  Might we be approaching the state of the Aztec maiden who is told that she will be infinitely happier once her heart is cut out?
  • When we attempt to study our human world directly (as in the issue of race relations), our disciplines often lurch from cynicism to utopianism.  Racism, poverty, alcoholism, delinquency… all such ills are statistically objectified with implicit outrage, leading to the obvious conclusion that society must care more about its own.  The objectivity here is quite slippery, after all.  What is racism?  Is it “hate speech”, or perhaps a lower employment rate?  Can a Caucasian ever be a victim of racists?  Are homosexuals such victims, as is increasingly claimed?  And what is alcoholism, other than a chemical dependency?  Is dependency on caffeine-laced drinks of the same order?  Or since the brain can produce its own chemicals in response to proper stimuli, is pornography a dependency?  Are Star Trek and James Bond movies?  The game here seems to consist of taking the fundamentally unquantifiable—the human—and whimsically defining it into something susceptible to measure.  What does this accomplish other than foregrounding issues and problems that the researcher, for purely subjective reasons, finds interesting?  Is the result ever a workable solution to a problem; is not the whole process, more accurately, a symptom of a society that feels its humanity slipping away?

One would of course like to say that the disciplined forecaster of human events should know something about everything, since the recurrent failure in all of my examples is over-specialization.  Yet any career academic will recognize in the word “interdisciplinary” a red flag warning of potential slovenliness.  He who knows a little about much knows not much about anything.  This commentator’s Ph.D. was earned in a field called Comparative Literature, which has now (mercifully) vanished from the academic map.  Though there ought to have been a wealth of material for the literary comparatist to study—though there awaited nothing less than a science of literary aesthetics, such as Northrop Frye hinted at in his wonderfully synthetic books—the very term “aesthetic” (along with its pompously ivory recasting, “universalist”) was anathematic in comparative circles.  Comp Lit programs ground out feminist and neo-Marxist “theory” at break-neck speed, while annihilating the very possibility of a basic human attraction to a good yarn.

Yet another stab at an interdisciplinary discipline would therefore be dangerously exposed to yet another ideological hijacking.  An ever-guiding principle in the field of “Erchomenology” (the study of things to come) would hence have to be a deliberate and systematic abstinence from ideology of any sort.  This is easier said than done, to be sure.  I have used the phrase “human nature” repeatedly in this essay, yet many a scholar would bluntly insist that human beings have the nature of a highly developed primate: no altruism, no disinterested admiration of beauty, no intimations of immortality.  It is precisely in acknowledging such fundamental differences, though, that the erchomenologist would demonstrate rigor.  A model of urban collapse after a natural catastrophe—say, of Southern California’s behavior after a 9.5 earthquake—might be developed using a Zola-like estimate of human nature, and another that assumes in people a degree of redemptive “common humanity”.  A fruitful project would then be to research how human beings have behaved under similar conditions in the past.  In the likely event that responses differed, the researcher would question why they differed.  Was culture the decisive factor?  Was technology?
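By way of illustration only (the parameters below are invented, not the product of any research), a toy version of the two rival models might look like this: each assumes a different rate of spontaneous cooperation among survivors, and the resulting projections are then laid side by side for comparison against whatever historical record the researcher has assembled.

```python
# A toy illustrative sketch of two rival models of post-catastrophe behavior,
# differing only in the rate of spontaneous cooperation they assume.
# Every number here is invented; a real study would calibrate these
# parameters against documented historical cases.
import random

def simulate(cooperation_rate: float, population: int = 1000,
             days: int = 30, seed: int = 0) -> float:
    """Return the fraction of the population still provisioned after `days`."""
    rng = random.Random(seed)
    provisioned = population                      # households with adequate supplies
    for _ in range(days):
        needy = population - provisioned
        # Cooperative neighbors resupply some of the needy...
        helped = sum(1 for _ in range(needy) if rng.random() < cooperation_rate)
        # ...while some provisioned households exhaust their own stores.
        depleted = sum(1 for _ in range(provisioned) if rng.random() < 0.05)
        provisioned = provisioned + helped - depleted
    return provisioned / population

zola_model = simulate(cooperation_rate=0.02)        # bleak estimate of human nature
common_humanity = simulate(cooperation_rate=0.30)   # redemptive estimate

print(f"Zola-like model:       {zola_model:.0%} provisioned after 30 days")
print(f"Common-humanity model: {common_humanity:.0%} provisioned after 30 days")
```

Which projection the historical record vindicates, and under which cultural or technological conditions, is exactly the sort of question the erchomenologist would pursue.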

Possessed of such information, a public policy-maker, a clergyman, and even an entrepreneur would be able to take productive steps (though “productive” might be understood very differently by all three).  The erchomenologist would leave to each the interpretation of data based upon ideology: his or her function would merely be to furnish the most likely projection.  While supposing that this projection itself might entirely avoid ideological bias would be naive, time would eventually vindicate the better models.  False prophets are eventually stoned and double-talking oracles eventually ignored.  Only the truth lives to see the sun set.

Now, exactly what proportion of an “erchomenological” program of study should be history, what sociology, what economics, what psychology, and so forth does not really concern me here.  It ought to, admittedly, if I were indeed proposing an academic discipline.  My present purpose is not to attempt a wade through that morass, however, but simply to observe that predicting future events with some degree of accuracy should be possible if one had proper preparation.  Today it is done extremely poorly by the people in academe who seem to be best prepared… so the magical formula, whatever it is, doesn’t lie concealed in the vaults of any particular department.  I may say now that the deep incursions of ideology into many departments are as much a source of failure as the over-specialization mentioned above.  Scholars know too little beyond their area of expertise, and they assume far too much, at the same time, about how their special area connects to the broader world (the dilemma of the specialist described so well by Ortega y Gasset almost a century ago).  For the academy, then, a more important requisite than merely dispensing interdisciplinary information might be securing a generous flow of information in any favored discipline, such that the warping effects of ideology cancel out.  Where knowledge is concerned, gluttony is healthier than selective spoon-feeding.

e) toward some initial predictions

I conclude with a demonstration: I shall place myself on prophecy’s “hot seat”.

Recall the conditions enumerated at this essay’s outset that make of postmodern society a unique venue.  Several factors related to those conditions have created a ticking time bomb.

1) The most technologically advanced societies are being flooded with Third World peoples largely ignorant of technology, and even more of the scientific principles that undergird it.  These newly arrived residents learn to drive a car quickly enough and show up at the emergency room quite often, yet few of them understand anything about chemical reactions or bacteria.  One vector, then, shows our high-tech lifestyle being financed ever more by uninformed consumers and managed ever more by elite technicians.  The gap between the two was already widening as the internal combustion engine, for instance, became so computerized that a mechanically inclined teenager could no longer overhaul it in his father’s garage.  Now that the general populace throughout the West is absorbing hordes of immigrants whose parents once traveled by donkey-cart, the gap must grow chasmic.

2) The previous tendency is magnified and accelerated by birth rates.  For whatever reason (and there are indeed many), members of the technocratic elite seldom reproduce at replacement levels throughout the Western world, while the immigrant blue-collar class produces very large families—too large, often, for the manual labor of a head-of-household to support in so sophisticated an economy.  The one rapidly growing segment of society, then, not only brings little knowledge to bear upon its consumption of the high-tech lifestyle: it also has insufficient resources to partake of that lifestyle unassisted.

3) Education, the benign goddess who presides over the future of progressive societies and is expected to relieve the pressures identified above, is but a cold stone idol; for no amount of education can transform unskilled laborers of peasant, traditionalist stock into an army of prosperous technicians.  This is far less due to the limits of classroom instruction or the number and ignorance of the students than to the nature of technology itself.  Developing and maintaining machines is exorbitantly expensive.  The financial benefit of mechanization to businesses rests solely upon the ability of said machines to eliminate human labor.  That millions and millions of well-tutored young people might somehow find a gilded future in the world of robots is patently absurd.  In other words, the very successes of technology would drive the creation of a highly exclusive elite even if the working class were not reproducing at four or five times the rate of the technocrats, and even if the working class were not being supplemented by vast infusions of undereducated Third Worlders.

4) Medical technology, specifically, is allowing people of all classes to live longer.  The emerging portrait of the last few paragraphs is thus still further shaded about its most worrisome features by our twenty-first century ability to keep our huge populations from stabilizing through the natural intervention of illness and aging.

5) The high-tech Western society with its ever more non-Western force of manual labor is a democracy: this generality is true to some degree in every specific case.  Thus we have people in families too large to reach an average standard of living, and whose job skills are unlikely to provide them a ticket into the technocratic elite, representing an ever larger percentage of the electorate.

6) At the same time, as observed earlier, technology has a way of turning luxury into necessity.  Everyone in the United States has a “right” to indoor plumbing, to heating and cooling, to dish- and clothes-washers and driers, and now apparently to a cell phone and to health care.  Even as a lower percentage of society is able to afford the newest gadgetry, the gadgetry is coming out faster and thicker.  The transit from cordless phones to cell phones took years; now cells themselves are passé, a stigmatizing dinosaur in a generation of “smart” devices.  Resentment therefore mounts as the person on the streets can afford less and less of the accelerated “more” appearing thanks to the exponential increment of high-tech skill.

7) To other strains in this pressure-cooker might be added the cost of that education which the child of the paisano, eager to break into the fast lane, has acquired through enormous loans.  When the coveted degree leads back to sacking groceries, this hapless dupe is not only poorer than ever, but also the more resentful for now having a modicum of the savvy and competence demanded by his brave new world.

8) Up to a point, Western governments have been releasing little jets of pressure by giving our young grocery clerk a job in the public sector.  The impoverished need their heating in winter, the underprivileged need their cell phones for job-hunting—and all of this distribution and quality-control needs loyal footsoldiers.  Resentment is a potent element.  Employing people loitering at dead ends to service homes and families all around those dead ends may keep the city from catching fire for a while… but it is not, of course, a long-term strategy.  Even heaping astronomical taxes upon the technocracy (while, at the back end, giving its members huge contracts to equip schools with “educational” playthings) cannot keep the ship of state afloat indefinitely.

What kind of forecast could flow from such observations that would not involve gloom and doom?  None that I can imagine… yet neither do I see one predestined, ineluctable outcome.  I would stress at this point, then, that the erchomenologist’s utility lies precisely in his being able to see likely outcomes, so that others might steer for the lesser disaster (in this case).  After all, as Cicero gently rebukes his brother Quintus in the treatise, On Divination, what would be the point of knowing the future if you could do nothing about it—yet if you could do something about a dire prognostication, in what sense would it be prophecy?  We human beings cannot really foresee much of anything; yet we can reckon probabilities fairly accurately, and, with a little will power, we can cause the future to lean in the direction of the better option.

One of the worst options for us postmodern Westerners would surely be to keep on as we’re going.  Eventually—and sooner rather than later—our economy will collapse; and I think this is as much because of the technological marketplace as because of the public sector’s takeover of that marketplace.  Most of us grasp the threat of a seventeen-trillion-dollar debt, and more than a few of us the danger of paying down that debt’s interest merely by printing money.  Few, though, seem to rate the “hyper-technologized” economy as one of the largest storm clouds.  On the contrary, a further advance of technology is generally regarded by economic disciplinarians as the only way back to prosperity.  I might put it this way: the distaste for freeloading and for financing freeloaders has never adequately been connected to agricultural values in the minds of policy-makers.  People who grow their own food, or at least provide services to independent small farmers, understand the “work product” in an intimate way, learn neighborliness and charity in the routine struggle to survive, and can quickly assess a demagogue by the height of his soapbox.  People who slave for a wage producing an item that they never fully see must always feel vaguely cheated and naggingly paranoid; for the company’s wealth often seems inversely proportional to theirs, and the operators of the urban labyrinth that has left them absolutely wage-dependent fear only one thing from them—their sheer mass.

We must find a way to make and keep our work local—to index it narrowly to our community rather than to transform it into a faceless abstraction adrift like a “bitcoin” in planetary tides.  The path from sacking groceries to running one’s own grocery store hasn’t traditionally been an unsatisfying one—and the Internet seems unlikely to deliver tomatoes in a download any time soon.  Today the obstacles to such an ambition (and they are formidable) stem primarily from oppressive codes and regulations that our great benign employer of the jobless—government bureaucracy—has generated by the ton (cheered on, of course, by mega-businesses with deep pockets).

A lot of wild cards find their way into the deck if social unrest reaches unruly levels.  Riots are a distinct possibility.  In an equation that would strike many academics as inside-out, I believe violence to be more likely, not less, in urban areas with stricter laws concerning private ownership of guns.  Here local police and the National Guard would be more apt to confront massive uprisings directly; and once shots were fired and fatalities incurred, the situation might well spiral out of control in a fashion that would tantalize any Hollywood director.  Other regions of a nation as vast as the US would remain relatively inert, both because of a better-armed citizenry and because of a more agriculturally based local economy amenable to weathering short-term storms.  Beyond these quiet backwaters, one can well imagine racial/ethnic antagonisms being used by the politically savvy to stoke the fire artificially and create a popular demand for the central authority’s intervention.  For this sad truth must be acknowledged: if political forces existed that wished for some excuse to suspend elections and seize power, violent riots in the streets would come as a very welcome opportunity.

Another very unsavory, but all too likely, scenario is a pandemic.  The influx into industrialized societies throughout the world of millions upon millions of people who have no science-based education primes a kind of Petri dish for a new contagion that resists all known treatments.  Such ill-informed masses often misuse antibiotics, taking their pills infrequently due to cost or desisting from the treatment as soon as they feel better.  The eventual result is a much stronger strain of the “bug”.  Mere travel is also a source of menace; for even the humblest workers like to visit the old folks back home once in a while, and air travel has become one of our new collective “rights”, one might almost say.  The indiscriminate mingling of people from all over the world within time spans of just a few hours is an excellent recipe for a pandemic.  Yet another factor could be the virtually promiscuous sexual habits to which young immigrants are introduced in the West, as well as the abundant use of recreational drugs that sometimes involve needles.

Again, one must wonder if the democratic government of the near future would necessarily be terribly distraught to see its underemployed, resentful masses decimated or halved by a new plague.  Might such a government, indeed, having gotten what mileage it needed out of “democracy” (i.e., election of a like-minded elite), actively elicit the scourge through required public inoculations, and then suspend elections indefinitely during the “crisis”?

I shall stray no farther down that dark corridor; for a word to the wise is sufficient, and I cannot stress enough the importance of preserving Erchomenology from the taint of rigid ideology.  What people need to hear is not that their elected representatives are calculating Cesare Borgias capable of murdering them in their sleep.  They need to know, rather, that a highly manipulable situation is evolving.  They need to bring to their consideration of public affairs the understanding of a sober adult, not a gullible child; and they need to realize, thanks to that level of maturity, that the intent of our republic’s founders was to give no representative or body of representatives the chance to go bad.  Those apostles of progress who will instantly protest that several of the founders were slave owners, and hence morally discredited, only prove my point: i.e., that generally or apparently good men and women are yet capable of vile acts due to lapses in judgment.  This must be a constant in our calculations.

If we desire not to be slaves again, then we must keep those who lead us in carefully legislated chains.  That, I know, is far more of a homily than a prophecy.  What will ultimately determine events to come, however, is exactly the moral character of their human participants.  I would go so far as to say that Erchomenology must achieve three ends, of which we have already mentioned a training broader than the specialist’s and an abstaining from ideological intransigence.  The third must be this: an understanding acceptance that all human beings are an immensely complex mixture of good and evil spirits.

 

Dr. John Harris is founder and president of The Center for Literate Values.  He has taught English and the Classics throughout the southeastern United States.