
The Center for Literate Values ~ Defending the Western tradition of responsible individualism, disciplined freedom, tasteful creativity, common sense, and faith in a supreme moral being.

P R A E S I D I U M

A Common-Sense Journal of Literary and Cultural Analysis

16.2 (Spring 2016)

 

Faith vs. Cultural Meltdown

[Photo: the fallen hickory tree in the author’s back yard]

 

The Assault on Truth: A Critical Theater in Postmodernity’s War on the Spirit
(a collection of personal notes)
John R. Harris

The human being’s spiritual destiny is to struggle toward higher levels of awareness, yet the postmodern world—through PC orthodoxy, advanced technology, and subtle propaganda—is waging an ongoing assault on truth.

The photos above and below capture a section of my back yard on the second Sunday morning of December last year. High winds had passed through the neighborhood, possibly a funnel cloud. It was pitch-dark when our neighbor’s enormous hickory tree snapped its trunk about ten feet from the ground and fell across half our property, so nobody could have seen a funnel even if one were lurking and even if a few early-risers had been peering out their kitchen windows.

One of many thoughts that flashed through my mind over the next few days was the probability that a goodly number of Americans would ascribe an event like this to “climate change”. The default response any time the weather behaves a little oddly, in the context of our few personal decades on earth—snowing heavily in the “wrong” month, say, or raining not at all during the rainiest month—is to see the outline of the Apocalypse. Ironically, in our high-tech, capitalist/consumerist society, lives are woven of novelty. We suffer intolerable boredom if something new isn’t on TV or at the movies. Yet if nature does not cling rigidly to the cycle that we have known for two or three score years, we panic. Who knows? Perhaps we have willingly, even furiously stuffed and crammed our existence with so much unpredictability that this little bit more is too much to take; or perhaps what we cannot endure is the suggestion that we ourselves may not have authored the unpredictable. Perhaps “global warming” and “climate change”, with their always-implicit indictment of human activity as cause, are our way of asserting control over an aspect of life that significantly affects our routine and also has no demonstrable connection to our activity.

[Photos: further views of the fallen hickory tree]

In any case, “climate change”—as a human phenomenon, a craze or phobia—seems to me a very modern form of some primitive mass-psychosis. (I write this as a long-time enemy of “car culture” who has walked to and from his job for more days than many activists have been on Planet Earth.) Like all mass-manias, this one rather scares me. I’m particularly concerned about the fanatical persecution of dissidents that it is nourishing. Proposals have been put forth earnestly that “climate-change deniers” should be legally prosecuted. (It would indeed be absurd to deny that climates change, because they always have and always will until the sun blows up. The New Inquisition has not even enough wit about it to specify what is being denied that its zealots find so criminal.) Truth is not characterized by an ardor to silence, instantly and permanently, anyone who may wish to argue a point: truth, on the contrary, eagerly seeks to bring everything into the light of day. Human beings are never more repulsive than when they swarm about in murderous packs demanding that every individual in their path make a certain salute or else die on the spot. Pogroms are always ugly. That’s about where we are now, apparently, with climate change.

A major political party exhorted adherents on its website just before the last holiday season to confront obstinate family members with the “fact” that ninety-nine percent of “scientists believe in” climate change. There were similarly preposterous claims on the same webpage, but this one especially grabbed me. How could an adult of average intelligence swallow such folderol uncritically? I would point out, in no particular order, that a) “scientists” are no better qualified than engineers, grocers, or auto repairmen to pass a verdict on the climate unless their field is related to meteorology; b) the academic community of climatologists has a painfully clear interest vested in advancing the apocalyptic narrative, since doing so is far more likely to snare a federal research grant; c) the notion of a global survey assessing the views of “scientists” on this or any other matter would imply almost insurmountable complexities, and no such survey has been attempted in any form that doesn’t involve self-selection and self-fulfillment; d) the wording of the question to which the “scientists” responded “yea” is absolutely crucial—no claim lacking this information deserves an instant’s serious entertainment; and e) if the words “believe in” appeared in the phantom survey, then a response based in faith rather than evidence and deduction was being elicited, since we “believe in” things for which the evidence is insufficient to justify belief.

Regarding climate change, then, we cannot discuss the viewpoints in contention lest we be stoned to death thanks to some dark New Age (New Dark Age?) ritual; or if we do discuss them, we are permitted to use only “facts” whose mendacity outrages thoughtful minds at first glance. A full-scale assault upon truth-finding has been launched.

I have long been noticing as much from other directions, as well (for this is a multi-pronged assault). The lightning-quick disappearance of the Operation Fast and Furious scandal from the mainstream broadcast media (where it received brief, dismissive exposure only because of Internet bloggers) is simply stupefying; and the suppression in these sources of the politically motivated lies surrounding the murder of a U.S. ambassador in Benghazi has been scarcely less so. (At least Mr. Stevens and his three fellow-victims knew the risks of their profession; about 300 Mexican civilians were murdered as a result of our government’s gun-running, including a dozen children at a birthday party in Juarez.) When our mainstream media are not advancing a political agenda by suffocating stories, they appear to do so by creating them ex nihilo. The recent San Bernardino shootings were immediately and explicitly linked to the Planned Parenthood shootings in Colorado Springs without a trace of justifying evidence: I personally remember the nature of the reporting as it reached my ears over the radio. Of course, the San Bernardino incident turned out to be fueled, not by pro-life sentiment, but by radical Islamic ideology; and that fact, too, was and continues to be downplayed or ignored in major news outlets.

[Photos: two images of Syrian refugees arriving in Germany]

Photographs from the War on Truth’s front line: two “takes” on Germany’s Syrian refugees. The European press wants us to see grateful faces with children crowded into the frame. The reality more often shows hordes of comfortably dressed young men on an adventure. Not a single female face is discernible in the second photo; and, indeed, none seems to have been available for the first one. Precisely what, then, is the status of these motherless toddlers? And why are the thankful émigrés arriving in Deutschland with signs written in English, the language demonstrators in Europe usually choose when they want to be understood worldwide?

And the war on truth is carried even to places where no bridge or port of any strategic value stands to be won by Falsehood’s tireless invasion. The so-called “mockumentary” is making a bid to become a new art form on television channels whose reputation for accuracy and authority has been laboriously earned—before the current outbreak of tasteless insanity—over a period of decades. The History Channel, the Discovery Channel, Animal Planet, and perhaps other tentacles of the giant Discovery Company whose programming has escaped my notice are now dabbling in far-fetched “exposés” whose formatting is expertly and intricately engineered to look real. Deliberately grainy footage is produced, often from hand-held cameras wielded by amateur investigators running after a bizarre creature or running for their lives from one. Actors equipped with phony academic credentials sit for “interviews”, mimicking the tics and stammers of ordinary people when put before a lens. (Curiously enough, staged awkwardness often exceeds the acting abilities of these professional frauds: in some cases, the “fake jitters” of the on-camera expert were my personal tip-off.) Government entities at home and abroad are made to be routine villains. They are named as responsible for hushing up the “cryptozoological” evidence or circulating disinformation about wacko researchers.

The first of these productions dealt with mermaids. Another about the extinct dino-shark Megalodon followed. Then the mermaid hoax was deemed worthy of a sequel. Something called Cannibals in the Jungle, which dithered about with an undiscovered hominid species in Indonesia, aired on Animal Planet. Most recently, an expedition in search of the elusive Sasquatch was hatched. None of these “documentaries” carried any significant disclaimer alerting the audience to the counterfeit operation; a brief statement about the use of “dramatization” might be flashed across the screen in small font as the final credits ran. All five shams purported to give positive, eye-witness testimony and recorded (if blurry) film footage of their subject’s reality. Our children and teenagers who were once urged to improve their minds by watching these channels rather than MTV or ESPN would have been completely unprepared to distinguish between “art” and truth if they were dutifully soaking up such imposture.

I myself, as I implied above, was briefly taken in by some of the staging. The mastery of “amateurism” in filming technique was truly astonishing: the industry has clearly learned how to capitalize on its own stylistic clichés. The notion of governmental powers distorting or suppressing evidence also struck me prima facie as very plausible, since our elected leaders daily offer so many instances of cynical “spin” and high-handed contempt for transparency. Is my very mistrust, perhaps, the preconceived target of some hidden agenda? Are powerful forces rewarding these “educational” channels for creating an electorate whose mounting suspicions will be disarmed by the “fool’s wisdom” of seeing a hoaxer in every whistle-blower? Now that a new series on the History Channel is trying to convince us that Hitler lived comfortably for years in Argentina after the war, what reaction, pray, is desired of us? Full credulity? That is scarcely possible any longer. A sort of “pretend” fascination in a “what if” historical proposition? A total disbelief in this and any other historical claim—or total resignation to some sketchy, Pyrrhonist idea that nothing can be established as more or less historically accurate? And if such resignation is the objective, is a further objective a consequent indifference to any outrageous charge made against any of our revered national leaders in the future? Is this all a conspiracy to discredit, now and forever more, everyone who smells out a conspiracy?

The Discovery Company has made this bed for us: what kind of sleep was the mattress designed to provide? Was the whole rig micro-designed with the most sinister goals imaginable in mind… or was any designer at all on the payroll? Is the New History of infotainment brilliant in a wicked way, or is it just coarse and stupid?

The Sasquatch charade was particularly distressing to me, in that the producers showed no regard whatever for the professional reputation of Dr. Jeffrey Meldrum. An expert on primate anatomy, Meldrum teaches at Idaho State University in Pocatello: in an area, that is, where Bigfoot sightings have been exceptionally common. He has put his academic standing at risk for years simply by daring to inquire into the possibility that certain evidence (such as footprints) may be credible on occasion. The mockumentary’s producers failed (with premeditation, of course) to inform him that the interviews and in-the-field investigations he permitted them to film would be salted in between the staged exploits of a hair-raising—but utterly apocryphal—expedition. Meldrum’s many years of trying to win a serious review of the Sasquatch controversy’s best evidence may well have been nullified by a venture whose justification was explained as increased ratings. Whether those ratings rose as expected or not, the production’s effects also included—and far more predictably than a larger viewership—the sabotage of one man’s career and, in some measure, of that cultural regard for genuine scientific objectivity which we constantly struggle to plant in the next generation. One would hope that casualties of such magnitude might have moved the Discovery Company’s executives to consider rechristening their channels (e.g., Fantasy Channel, Unicorn Planet), if only to avoid some of the most flagrant hypocrisy since Judas kissed Jesus.

In the rest of this essay, I would like to examine why such sabotage of truth as I have been describing portends nothing less than the utter corruption of the human spirit. We cannot grow as moral beings except in cultural surroundings that set a supreme value upon truth: doing so in a relativist miasma such as is descending upon us now is an a priori impossibility. I will show why below.

***

When Immanuel Kant published his treatise, Muthmasslicher Anfang der Menschengeschichte (“Speculative Origin of Human History”), in 1786, he was offering an anthropological explanation of events in Genesis. Specifically, the “forbidden fruit” of Eden was perhaps a variety of flora whose consumption the pristine human tribe had declared tabu… and then one audacious individual tried it. The pernicious snack may tastily have nourished the malefactor, or it may have given him a royal pain in the gut. Either way, the rebel had acquired a knowledge that replaced his earlier acquiescence. He had been won to the idea that nature, for him, was to behave unnaturally—to choose—rather than to live out life within the narrow parameters of instinctive conditioning and its social projections.

Whether one is receptive to reading the Old Testament as allegorical in this manner or not, few of us doubt that our distant ancestors were not quite the same as we are. One need not be a Darwinist in any narrow sense to accept that human culture has evolved: that much is historically transparent. To this day, a few tribal communities in the Brazilian rainforest and Papua New Guinea observe customs and exhibit practices that make them appear almost a distinct species from our introverted computer analyst or multi-tasking legal aide. The distinction, I would stress, is not always pejorative to the past or the “primitive”. We know that preliterate Greeks, for instance, spoke a language with dual verbal inflections as well as singular and plural forms, a vibrant middle voice along with active and passive registers, an optative mood entirely separate from the subjunctive, probably two ablative cases (analogous to those of Sanskrit) to go with the four surviving Homeric ones and a stand-alone vocative, and so forth. In short, these “primitive” peoples could remember volumes of material that would stagger us, and they could (we must presume) manipulate their knowledge with intricate fluidity in ordinary speech.

It is indeed in language, and not in dietary experiments, that I wish to make my own speculative anthropological beginning. John’s gospel traces humanity’s Big Bang back to the Word. I propose to do exactly the same thing here, if somewhat less figuratively; for I believe that in the very genesis of words lay our ability to express truth, and hence to distort truth into lies, and hence to sin and to know the anguish of free will.

I imagine the evolution of verbal expression as far more important to the emergence of human moral understanding than any role that food selection might have played—more important because more directly implicated. Picture to yourself the early stages of human communication. It could not have differed much from the noises made by other primates and, for that matter, other species of vertebrate: grunting, howling, lip-smacking, etc. Crows appear to vocalize information about a possible meal or the demise of one of their own, and jays and other birds certainly have calls related to mating, to the approach of predators, and so forth. Several marine mammals have likely progressed well beyond such basic information-sharing. Nevertheless, it’s hard to suppose that even the highest mammals would develop a sense of individuality due to any such exchanges. All remain appendages of the group. An early tribal man would not have viewed himself as distinct and apart from the clan when his eyes alone detected a mountain lion or his ears alone the baboons’ raucous warning cry. In those instances, he might have been better positioned than the others of the clan; or he might be communally recognized as having the best eyes and ears. Some very slight degree of separating the individual—of distinguishing him or her from others—must have occurred, but it would have marked the “honoree” only within a group context. Without the orienting presence of others, that is, his or her uniqueness would have no relevance. This female had more living children than any other; this male could reach higher branches than any other. That stalwart would be acknowledged the best defender in combat; while others would be understood as weaker due to age, injury, or size. Possibly certain coveted markings or privileges would fall to the especially strong or the gifted.

None of this, however, should be imagined as isolating the individual from the community. He of the excellent eyes would magnify the visual organ for the entire group. (Perhaps his name would be “Eyes of the Tribe” if speech had arrived at the point of naming.) He of the mighty arms who was fearsome in combat would fight for the entire group. It would not occur to Eyes of the Tribe, at this cultural stage, that he might deliberately misreport the dangers lurking around the camp. Why would he? What sense would it make? Likewise, Arms of Stone would not allow a wolf to drag away the infant of a female who had turned away his advances. Why would he? To take revenge? What was revenge? His identity lay all in protecting the tribe. While he might use calculation in determining the kind of attack necessary to repulse a greater or lesser adversary, his ratiocination would never amputate him from the group of which he was a powerful appendage. Sin, one might say, was yet unknown.

Did the chicken or the egg come first: did more words produce an ampler expression of thoughts that made the individual finally aware of his isolation, or did this deepening isolation motivate the invention of more words? Weaker members of the clan would have been chased away from choice morsels of food for ages and eons, no doubt, and immature males from fertile females. Plenty of incidents could long have made an alienated wretch think to himself, “Wait a minute… why does he get more than I?” But would such a thought have been possible without the words to assemble it? If Rabbit Runner functioned as the legs of the whole tribe in catching food for everyone, then at what point would Lame Boy have conceived of himself as something more or other than the smallest finger on a powerful hand? A larger stock of words would prove essential in objectifying the sense of “victimization”… and yet, should we imagine the clan as stockpiling new sounds and meanings haphazardly in its collective vocabulary and, only then, discovering their utility?

If I suggest that an unpleasant event would have played midwife to the birth of new words, I do so because a sort of natural law appears to govern such matters. We produce something new when we perceive the old to be inadequate; the old does not seem inadequate until existing forms fail to meet our needs. A youngster who has lost her mother might use the word for “fire” to express the invisible pain of grief, but the usage would carry obvious risks of misinterpretation. Another youngster whose sibling had stolen his luscious apple (perhaps Eve’s Apple) could howl the word for “enemy” in protest—but that incautious complaint could draw severe reprimand from elders who had at first misunderstood its message. More thought needed more words: that appears a more likely chain of causality than that a wealth of randomly new words generated more thought.

The case of the misinterpreted cry of “enemy” is especially instructive in another way. Misunderstanding is not falsehood, but those who misunderstand cannot immediately determine whether they have been misled by design. A misinterpreted word might be a lie. At this stage, the notion of distinct, separated intelligences (as opposed to a single communal mind) already begins to ripple a once-placid surface. Whatever stirs the thought, “He may not be thinking what I am thinking,” is revolutionary. New interpersonal walls are being discovered, and new gates within those walls must be constructed to allow the communal spirit to circulate. Michel de Montaigne insisted that the Brazilian cannibals about whom he had received detailed reports possessed no word for “lie”. The great essayist himself must surely have been laboring under a misapprehension (or else finessing the truth for his own ends), for a culture as complex as he describes would already have experienced some of the difficulties involved when one neighbor tries to read another’s mind on the basis, not of a grunt or howl, but of a short sentence.

I will concede this much to Montaigne, however: paradoxically, words pose more of a challenge to trust as they become more numerous and precise. A smile can mean “happy” as long as happiness comfortably embraces every emotion from a full belly to the joy of winning a game. As soon as various kinds of happiness begin to be distinguished, though, even the simple pre-verbal smile may become cryptic. Of course, words will eagerly seek to chase down the distinction between the delight of holding a newborn baby and the thrill of standing over a rival’s battered body. As the individual’s psychological state, especially, attracts descriptors, the clan’s members cannot help but realize that a huge portion of reality is hidden to the eye. No less a scout than Eyes of the Tribe will remain incapable of peering into the heart of his brother and seeing envy there; while the envious brother, in turn—thanks to a new word for “inward fire”—will know that something is smoldering in him which is not grief and which does not cook meat. Self-awareness is stirring in the womb: sin is about to be born.

This development menaces the tribe’s survival. The clansmen must be able to trust each other. Before, vital information might have been misreported due to faulty perception or carelessness—but never with intent to deceive. A lookout would not cry, “Lion!” unless his senses detected evidence of a lion. Now life is more complex. Besides Paleolithic versions of The Boy Who Cried Wolf, we can imagine an unprincipled lookout sounding the alarm in order to nip over to the fireplace and grab an extra morsel of meat as everyone else runs for cover. New World monkeys have been observed practicing this very sort of fraud, though presumably without any more motivation than being hungry. A better example, therefore, might be a lookout who does not call the alarm when he sees the stalking cat closing in upon his arch-rival. At such a juncture, we could all agree that the community has serious “trust issues”.

To an extent, truth can be reinforced by severe punishment in the event of its betrayal. The hungry monkey who counterfeits an alarm will be mercilessly beaten by his elders if and when he is found out. A sentinel’s failure to notice the approach of danger (whether through devious intent or due to mere inattention) has traditionally been treated as a capital offense in a military context. The community, in other words, can try to make its individuals communicate truthfully by threatening misrepresentation with the most extreme penalties. Hence we find oral-traditional societies generally dominated by an ethic of shame rather than of guilt: their members cling to “good behavior” in fear (of losing face, of actual mutilation or execution) rather than in devotion to abstract principle.

Yet the notion of principle is the ultimate destination. That is to say, one lives to draw closer to the truth instead of policing the truth in order to live. If truth-telling is a survival strategy on the primal savanna, it is something much more proximate to the meaning of life once language and communities flourish. Let me suggest this formulation. Our lookout gnawed by a hunger of the heart rather than of the belly has nothing wrong with his eyes. He would not misreport material circumstances fully perceptible to him unless a hidden frustration had moved him to dabble in deception. To be sure, dereliction of duty at this level can jeopardize the community’s physical safety; but even if it does not—even if the lions happen to have left the valley this year—the community’s hidden bonds have been frayed, and must be repaired. Its members are harboring secrets that isolate them from the whole. The invisible new realities must be made as visible as possible. Those individuals with grievances must express their grievances: the predators of the heart must be released from their cages and studied out in the open by all. If envy is one of these new facts, then it must be examined as a fact from every angle. Has it any justification—has the envied person taken unfair advantage? What is fairness? Is it the same as enforced equality? Is he who envies, then, in effect demanding that the world descend to his level of mediocrity? Perhaps both parties should make adjustments, accommodating the new reality of ill feeling to yet deeper realities rooted in the willful human being’s selfish (i.e., sinful) nature.

What was at first an unpleasant sentimental reality thus fragments into perhaps a dozen moral realities in the way that a molecule breaks into atoms. The community is the practical context (and the only humanly possible context, no doubt) in which such realities can be forced from our hearts and into the light of day. How do we understand the hold of egotism upon our behavior if not through a struggle with other egotistical beings? How do we ascend from the rigidly insulated innocence of the toddler to the mature responsibility of the adult except through competitive collisions with others as important to themselves as we are to ourselves? And how will such collisions at last prove fruitful unless we testify to—unless we verbalize (or literalize) for the consideration of others—the facts, not just as we see them, but also as we feel them? For if we bring others to see visible reality just as we do, then we have good reason to suppose that the invisible magnetism holding objects together for us is not a self-indulgent distortion; but if the objects we describe do not impress others with that same arrangement and perspective as is known so well to our mind’s eye, then perhaps we should question the assumptions with which we have bound the material facts. Maybe we need to make the uncomfortable acquaintance of a subtle traitor called Subjectivity.

Say that Firemaker and Swiftfoot both desire Doe Eyes, and that she has settled in with the former. Swiftfoot’s desire still gnaws at him. He might ambush Firemaker when no one is watching and then claim the comely widow: this is a somewhat refined version of how lesser animals solve disputes over females. In our hypothetical clan, though, the members have already developed a degree of proficiency at reading each other’s moods. It is common knowledge that Swiftfoot bears a torch for Doe Eyes and a grudge against her mate. The truth of this potentially ruinous animosity is widely talked about. The clan’s Nestor brings our Achilles and Agamemnon together in council, and the rights of Firemaker to Doe Eyes are reaffirmed. She has chosen him and borne children to him. Yet Swiftfoot is a valuable member of the community, and his “truth” must not be ignored, even though it is dangerously skewed with respect to certain objective facts. With “reasoning”, he must be made to understand that the same desire in his breast is also in Firemaker’s, and that the object of his desire prefers his rival. If he will concede this much observable truth from behind his personal barricade, then the clan will make counter-concessions. Perhaps he will be awarded an honorary position, or perhaps he is promised the first pick among the females captured during the clan’s next raid upon its enemy across the river.

Obviously, my hypothetical presents moral consciousness at a nascent stage. The notion that one individual should project his purely natural feelings onto another—should “share” them in an act of “common humanity”—may require centuries (hopefully not millennia) to sink deep cultural roots, even though it has always been latent in thoughtful persons. The same may be said of the notion that the woman’s will in this particular matter should be considered and respected; and the “enemy across the river”, alas, has not yet been granted a full measure of humanity. Growth takes time. The cultural bequest that we view today as our “basic rights” has been transmitted through and modified by generations of human beings who testified to the truth as they saw it. From that vast cloud of witnesses, through laws created to honor an invisible brotherhood rather than a very visible power structure, a collective environment emerged that nourished a higher spiritual awareness in the average person.

Yet even in the most nurturing of cultural circumstances, we remain—some of us—“cavemen”. This is why societies cannot be said in an absolute sense to progress morally, as if lifting every individual irresistibly in their ascent. It is also why telling the truth is of such critical importance at all times: i.e., because higher truths are never immediately, directly perceptible. The “validating by verbalizing” that I wished to represent in the tribal council above is an ideal, and will always be so in a terrestrial setting. (Unless the various “crime” channels on cable are also engaging in mockumentary, sexual jealousy remains to this day a prime motive for foul play.) The clan’s council-meeting certainly doesn’t define a sharp boundary; it isn’t Cultural Step 12 taken by a hunter-gatherer clan on its way to establishing permanent settlements.

Yet with two steps forward, one step back, Paleolithic man must have begun learning that he and his fellows needed to bear steady witness to realities locked inside them as well as to the movements of predators or of game animals. If I may continue to be a shade mystical, I believe that our destiny as human beings has always been to approach higher degrees of truth. Any animal assesses the immediate facts of its physical environment in order to survive another hour or another day. The human animal, working in communities, grew to learn that inner vistas persistently opened up as broad, varied, and perilous as the Serengeti: truth-telling then became a matter of reporting activity within those personal prospects as well as in shared physical surroundings. Without such honest testimony, individuals would increasingly be imprisoned in desert spaces where no one could offer an encouraging word. The choice would become quite stark: either protest, confess, mourn, regret, aspire, admire, and fear before the ears of others, so that the sentiments behind one’s words might acquire a chastening, a refining, a confirming… or else endure a kind of insanity where one’s inner voices draw no reply at all from the warm bodies surrounding one.

The ultimate liberation from selfishness to an objectified self (a will that lives by the Golden Rule, as one might say) can occur only through an arduous process of social exchanges, all of them aiming at a transcendent validity rather than at carnal power. If the kingdoms and empires of the past seem to have opted for the latter, subsequent steps up the cultural staircase have seldom carried us much higher, and seldom for very long.

***

With that, I now come to the great calamity of our particular era: our postmodern habit of silencing individual protests and enforcing the fantasy that global culture is making or can make a collective ascent to beatitude. The former practice, of course, prevents the verbal exchanges necessary to have any hope of approaching inner enlightenment. The latter practice fills the gap left by the suppression of honest inquiry with strident, headstrong “make-believe”. Whenever a modern Western society has strayed down the path of rigorous censorship over the past couple of centuries (the French Reign of Terror, the counter-revolutionary decades of Central Europe, Nazism, the emerging Rot-Grüne coalition of today’s Europe), human development has been arrested. Other parts of the world have known far worse oppression for much longer; yet precisely because their cultures have not built up traditions of free speech and individual freedom, one may say that their struggling trajectory offers more reason for optimism at this moment. The Soviet gulags produced Solzhenitsyn. In my experience, a Chinese intellectual is rather more likely to appreciate the importance of free speech than his American or European counterpart.

That tendency alarms me. Inasmuch as Western culture currently occupies the highest step of the cultural staircase in many ways (most obviously in technological development), to see its progress increasingly wedded to notions of a hive-like social order must raise doubts about our culture’s ability to continue fostering real human progress (which means, again, moral progress at the individual level). Our technology appears at once to have dulled the initiative of those who consume it addictively, to have insulated such consumers ever more from direct and vibrant social contact, and to have inspired ambitious social engineers to play out their utopian fantasies with a populace of pawns. The indispensable moral exercise of “truth-testing” has been curtailed from every direction. In a sense, we no longer know what a lion looks like. Most of us don’t grow our own food, can’t build a fire, expect a medicine or a surgery to cure the effects of our intemperance, and view the need to live within our financial means as “unfair”. We seek out online “communities” of people who will pamper us in our illusions rather than deal with relatives, neighbors, and co-workers whose corporal presence is accompanied by troublesome opinions or attitudes. Our daily existence brings us into contact with very few physical realities. Death of any kind is always a tragedy or an outrage, never natural and serene; and as far as we are concerned personally, we claim the “constitutional” right to healthy and eternal life on this earth. When the lion of reality at last roars at our window, we take a sedative and turn up the dial on our earphones.

In such a condition, we are primed for the megalomaniac’s exploitative utopian promises. Immune to the pains that have forced earlier generations of human beings into maturing, we become morally stagnant. It is entirely possible that we are even retrogressing—individually and collectively.

For what kind of person, frankly, would refuse another his simple say? What corner of common decency can cloak the shouting of vile names at an adversary who merely wishes to voice a different opinion? What morally responsible adult would consent to expelling from certain public spaces and even confining to prison another human being who will not parrot the chants of the mass? If an “offender” holds a “racist” or “sexist” view, let him express it, respond to him point by point, work with him to identify his wrong turn, and set him free to climb higher on his own. What human objective is satisfied by tarring and feathering him? Is it not a moral axiom that a persecuted minority is always more righteous than its persecutors, even when the beliefs responsible for its targeting are odious? For a morally odious belief is one which denies our fellow beings their humanity—but persecution is a morally odious act which turns ideological rejection into brute force. The man who suspects a certain race of being cowardly is not as bad as the man who beats him to a pulp for his suspicion.

To tell oneself that one advances the cause of truth by preemptively stilling the tongues of those who may speak lies is to lie to oneself. We, collectively, have become that species of liar. Western culture of the twenty-first century is a lie-friendly culture.

Though the West has not responded well to the challenges of freedom lately, its thinkers and authors were the first of our species to appreciate that such challenges would come with greater affluence. Long before Huxley and Orwell, our literature was the scene of an evolving meditation upon what happens when human ingenuity satisfies all basic needs. Thomas More’s Utopia was perhaps penned with a high degree of irony: we cannot possibly measure that degree at this remove. Yet if More’s narrator (or, worse, More through him) sincerely advocates as the end of human existence a well-oiled socio-economic machine whose cogs never question or complain, then the mislocation of ultimate purpose is disastrous. The imaginary island’s uniformly clothed, perfectly disciplined human insects are said to dedicate their ample leisure to reading and the arts (a strain of the Siren Song that Karl Marx would echo); but what need of the arts has a being whose mortal span is complacently passed planting and harvesting and distributing in a routine more like ritual dance than labor? In the art work restlessly prowls the indefinable, the unobtainable… and what do Raphael Hythloday’s Utopians know of such things?

Swift’s Gulliver, at least, is a pitiable dupe of his own naiveté (or “gullibility”). His hosts in Houyhnhnmland are another study in the hypothetical society that has fulfilled all of its essential needs and can proceed to add cultural institutions as it sees fit. Rather less industrious than the Utopians and more Stoical (the Houyhnhnms have never invented writing because they perceive no use in it), Gulliver’s equine citizens, under the pretext of pure reason, practice eugenics, accept a class structure whose bottom rung approximates slavery, and contemplate the genocidal extermination of the Yahoos. A tradition that utterly forestalls any dissent from prevailing opinion (as expressed by elders and by the political assembly) confers upon such behavior the guise of rationality, though it prickles beneath the surface with moral atrocity.

If, in a triumphant abundance of resources and technology, we create a human world where all are fully fed and clothed, then the purpose of life will be food and clothes. If we prefer the Houyhnhnm nudist colony with its Spartan feed but perfect harmony of opinion, then the purpose of life will be to sing every note in harmony. The contemporary Western world of today (which looks rather less Maoist than More’s and less colonial than Swift’s, for the moment) favors a life of pleasure: sex, “virtual” entertainments, feel-good grazing or drinking, and cars. Free speech survives among us to the extent that it lubricates access to more varieties of sex without consequences, more Internet freedom, more endorphins, and more unimpeded travel. Shack up with whomever you like, browse whatever websites you wish, order your weed with a latté at McDonald’s, and drive through (or “thru”) as you buy it: yes, free expression within those parameters is licit, and even celebrated. Should one wish, however, to calculate how deeply this commitment to free speech runs, one may consider what nearly universal ridicule or opprobrium greets advocates of sexual restraint or reduced dependency on electronic gadgetry or criminalization of mood-altering substances or design of pedestrian-centered communities.

Where your treasure is, there rests your heart. Do we value truth as an end in itself—a terminal objective—or do we invoke it only when it furthers the end of a chicken in every pot, or free health care for all, or deregulated drug consumption? Another way of asking the same question is this: would we lie later on to obtain those ends which profit for now from our arguing truthfully?

It doesn’t really matter to most of us, in any practical sense, whether something like a Sasquatch exists or not. Yet getting to the bottom of the more convincing Sasquatch evidence would matter to us, if we revered the truth; and what should further matter to any votary of the truth is that so many Sasquatch-hunters appear less concerned about the phenomenon’s specific reality than about a quasi-mystical cult that they have built around certain suppositions “taken on faith”. Meanwhile, others among us urbanely deride any and every assertive claim on the subject simply because the claimants have so often introduced their most cherished fantasies into it.

I would recommend Canadian Les Stroud’s six “Bigfoot” episodes of Survivorman to anyone who would like to see what intelligent skepticism in the collection and processing of evidence looks like. If we could apply Stroud’s methods to reporting on terrorist activity, or analyzing the anti-terrorist measures of our government, or determining whether the “tech” industries are really making our lives better—or clarifying our understanding of “good” and “better”… but we seem far too lightly invested in truth for any of that to happen in the foreseeable future. We’ll take the truth when it enriches our treasure; and when it doesn’t, we’ll take our treasure.

[Photos: the “mermaid’s” webbed hand on the diving-bell window; Todd Standing’s purported Sasquatch photograph]


A “mermaid’s” webbed hand was represented, very realistically, by one “mockumentary” as appearing on the window of a diving bell. Todd Standing’s purported photograph of a Sasquatch is either a superlative hoax, like the hand, or else genuine. As Les Stroud remarks, “It can’t be both.” Yet is some nonsensical version of that alternative not where many of us live today… or where we are being programmed to live?

***

My own vein of mysticism in this piece, I would argue, is not cultic, in that seeking the truth is our human calling—the nucleus of our complex nature—rather than a mutilation of that nature. I believe, in a metaphysical sense, that we were made to seek truth. The popular answer today is, instead, that we were made by chemical accident, and that our purpose is and has always (since that originary accident) been to survive by whatever means possible. As I have aged, I have grown increasingly impressed by the irreducible incoherence of all such scientific answers. In my little book, Climbing Backward Out of Caves, I tried to expose several of these fundamental contradictions in all material explanations. (They begin at the tiny level of the atom: the basic building-block of matter whose substance is simple yet repels on the “outside” while binding on the “inside”.) We cannot crack the mysteries of our cosmos with a human understanding. By no means does that imply a disparagement of empirical science… but empirical truth is only one variety of truth, and not the highest variety. We are obligated to seek it out and to declare it openly when we find it. The biologists, for instance, who discovered that fish caught in “red” waters carry a toxic chemical secreted by reddish algae would have been morally remiss not to warn the public against eating said fish. The moral obligation precedes the scientific discovery, however, even while being informed by it. Our inner truths are more important than the truths about our outward circumstances, though our circumstances give content to our human duty.

As a plant evolves spines to prevent birds from eating its species into extinction, so, I believe, the instinct-driven animal in us evolved into an autonomous and reflective being through the aid of language. The truth of free will was always latent in our nature: it fell to our collective history—the evolution of our culture—to grasp that truth more and more firmly. The more we talked, the firmer our grasp became. When we began to write, we were able to ponder our words yet more carefully and to utter them without immediate fear of scowls or menaces; and so, perhaps, our discovery of the inner world’s realities accelerated with reading and writing. My own scholarly community has tended to give literacy full credit for this soulful work of excavation, and I have often modeled the same prejudice myself (as in naming Praesidium’s organization The Center for Literate Values); but the spoken word undoubtedly removed the first strata of dull encrustation.

Now we have gone a little too deep. We have dug right through the treasure, the chrysalis—the destiny-in-waiting—and continued downward into layers of suffocating clay. Having recognized our personal will and our creative ability through exchanges with our fellow beings, we have begun to create in a vacuum without our neighbors’ salubrious influence, and even (in the case of the deepest diggers—often the most literate and “intellectual”) to will selfish fantasies. Our advanced technology has insulated us from the immediate material shortcomings of ignoring our brethren, and our highly developed imagination has blunted the restraints of “common decency” that human interaction once imposed. We have, in a sense, over-evolved; but it would be more correct to say that we have strayed down an evolutionary cul-de-sac.

Jesus once rebuked his disciples for trying to chase the little children from him, and he warned, “Unless you become like one of these, you shall not enter the kingdom of heaven.” A small child, of course, is a thoroughly egocentric being in the same way that very rudimentary preliterate communities are egocentric. There is no sense of self because all is self and self is all: to be more exact, there are no boundaries that separate the individual’s inner life from the imperceptible inner lives of countless visible others around him. There is no challenge to being David or Irene: there is no identity in need of assertion and defense, for what one feels is what exists, and it exists for all as for oneself.

Obviously, such is not the child-like state recommended by Christ. Yet what could be less child-like, less Edenic and more painful, than the mounting self-knowledge that forces the adolescent to discipline his urges and bring his whims into conformity with the external? What of the child could be said to linger, either in this developmental stage of the individual human or in the tribal culture’s early struggles with the self-consciousness of its members?

What we would hope lingers (and what we should strive to nourish) is the child’s persistent questioning and the child’s forthright verbal observing. If something seems enigmatic, then ask how it works; and if its operation seems suspect, then declare openly that foul play may be occurring. How soon we stifle those impulses in ourselves: the labor of silencing indeed begins usually in early adolescence! As adults, we call it “being professional”. We do not wish to appear fools, and so we add our particular and detailed praise of the emperor’s new clothes. We must absolutely be perceived as “team players”, and so we actively collaborate in covering up awkward truths that might keep the company from landing the contract or the university from earning accreditation.

This isn’t the way to get to heaven. The truth is the way. And truthfulness is not merely the way, as in a means to an end; it is not some sort of recommended good work, like visiting the sick. Heaven is full reality, and truths are its golden bricks. By the same token, our postmodern manner of life, with its “virtual” marvels and its “spin” in service of the political elite’s “higher causes”, represents not merely an abatement of zeal for some airy, indemonstrable afterlife. Falsehood is the highway to hell… and, also, it is hell. As a perversion of our intended nature, it can only make those who embrace it long, sooner or later, for the nullity beyond this existence. Sadly for them, that, too, is a lie.

John Harris is founder and president of The Center for Literate Values.  His most recent book is Climbing Backward Out of Caves: A Case for Religious Faith Based on Common Sense.