Writing and Realism

Realism is always an afterthought in writing advice: two thousand words on plot development, four thousand more on character development, and then a quick footnote on the importance of “doing your research.”  I guess most writers and writing instructors take realism for granted, as if it came naturally with close attention to characterization and plot.

But the single most common defect in fiction writing nowadays—especially with the rise of self-publishing—is the absence of realism.  In fact, most of the novels that fail to maintain the suspension of disbelief, in my experience, don’t fail because of typos and poor style; they fail because factual errors and inauthentic speech accumulate until they reach a saturation point where you just can’t be bothered to go on.  Let me provide a downright incredible example of what I’m getting at.

Some time ago I was handed a short story to critique as part of an informal writing group. The author was not only a graduate student in the humanities; he’d also attended a summer writers’ workshop at a prestigious European university.  So his story had not only been vetted by that workshop and its blue-chip instructors; it was also the culmination of a whole summer’s effort in the program.

The story was a piece of historical fiction, which saw the eighteenth-century heroine turn radical feminist after a tryst gone wrong. Not my cup of tea, to be sure, but from a purely stylistic point of view, it was well-written (though it should’ve been after passing through so many hands).

Nonetheless, the story was replete—I mean from the first sentence to the last—with the kind of factual historical errors that would embarrass a first-year history student.  I’m not even an expert on this era, but it was immediately apparent to me that the details about sentiments, manners, clothing, pedigree and place were all so horribly anachronistic that even a powerful plot (which it didn’t have either by the way) couldn’t have saved it. The most cringe-worthy error saw the heroine inspired by the works of Mary Wollstonecraft a full half-century before the feminist activist had even been born—as the kids say, “epic fail.”

At any rate, the experience suggested two rules of thumb to me that have proven true time and again since then: (1) one of the essential differences between professionals and amateurs is the level of realism; and (2) having your work vetted by a successful writer or a credentialed expert is no guarantee that you’ve satisfied the basic conditions for a good story.  Since the second point deserves a separate treatment, I won’t say more about it in this post beyond offering a cryptic preview of a future post: a fanatically pedantic editor with a broad knowledge-base who understands grammar and the mechanics of narrative is far more useful to a writer than other writers or literary critics.

As for the first point—the importance of realism in writing—let’s begin with a postulate:

 Not every writer can be a Hemingway when it comes to prose style; but every writer can get his facts right.

Now, if you’re one of those practical people who prefers to focus on what he can do, instead of dreaming of what he can’t, let me offer a few general principles for getting the facts right.  In other words, let me suggest some categories for “doing your research,” which should be added to your checklist for improving character, plot and dialogue.

Wait a minute! You say this sets the bar on style too low. If that’s what you’re thinking, I have a two-word response: Herman Melville.  Melville was an amazing stylist, and part of the reason was his seamless integration of nautical terminology and seafarers’ brogue into his prose.  That one example should suffice to prove the connection between style and factual knowledge, which is really an extension of the principle of realism involved in avoiding cardboard characters, implausible plots and inauthentic dialogue.  The bottom line, then, is that attention to realism—to getting the facts right—is ultimately a way of improving your prose style.

The following are the four most neglected categories in recent fiction, which should serve as touchstones for evaluating your own writing.

1. Nomenclature

Every profession, trade, academic discipline, technical field, hobby and institution has its own nomenclature—its own nouns and verbs.  People who write medical dramas, police procedurals and legal thrillers tend to recognize this, which is why they study medical textbooks and police procedure handbooks, and why most successful writers of legal thrillers are also lawyers.  Most writers outside these genres also recognize this, and they wouldn’t think of trying to compete in this area without rigorous research.  But for some reason the same sagacity doesn’t always carry over to other fields.

Now, it’s obviously impossible to give all the terminology of every field, but I think a few representative examples make the point. I’ll follow up with a few simple ways to overcome your knowledge deficit.

a. Hobbies: art, history, fishing, woodworking, botany, etc.

Giving one’s protagonist a hobby or an area of expertise outside his immediate one is a nice way to fill out his character and provide all sorts of scene-setting fodder. That’s probably why it’s become a staple—dare I say a cliché?—in fiction and television drama over the last few years. Too bad the writer didn’t know the difference between Monet and Manet, Thales and Thucydides, spinnerbait and swimbait, flat-sawn and rift-sawn or the calyx and the corolla.

As someone knowledgeable about woodworking (and other more unusual things), I find it intolerably amateurish when an author draws on this field for effect without even bothering to find out if what he says is accurate. Wood is not “carved” on a lathe, for example; it’s turned. One finishes a piece of furniture when one applies a protective coating of varnish, oil or shellac.  And don’t assume that some fancy finish you’ve heard of, say, “French polish,” comes in one can (e.g., “Chuck opened a can of French polish…”), because, like most high-end finishes, this one involves several products and techniques. Moreover, a hammer and nails are almost never used in fine woodworking, much less a wrench.  And if you say so-and-so ran his hand over the “fine-grained [wood x],” you’d better make sure the wood being stroked actually is fine-grained (maple is, for example; oak is not).

It’s also important to know that different species of wood were more commonly used in different eras and that they’re cut in different ways.  Chippendale furniture, for example, was usually made out of Cuban mahogany, not oak or maple.  Shaker furniture makers preferred the simpler, unassuming grains of maple and black cherry (shunning mahogany), while Mission and especially Arts & Crafts furniture are most commonly identified with quarter-sawn white oak.  Each part of a piece of furniture also has a name (e.g., cornice, pediment, etc.), and different eras used different techniques for construction and finishing.

As I mentioned at the outset, these particular facts are only meant to suggest the breadth and depth of the field—it’s hardly enough to write a scene. But if you make a mental note that your last trip to Ikea and the contents of dad’s rusty toolbox don’t add up to a knowledge-base about furniture and woodworking, then you’ve taken the first step toward avoiding embarrassment and adding richness and depth to your writing.

The upside to all this is how little research you actually have to do. You don’t need to undertake an apprenticeship to learn about woodworking—or gardening, or cooking or anything else.  All you need for brief references or a minor scene is an article from a hobbyist’s magazine. The good ones always use the correct nomenclature.  The same goes for more academic subjects, like art history—pick up a book and read it.

b. Construction

Every part of a house or a building has a proper name, and every construction task has its proper verb.  Carpenters frame houses when they erect the wooden structure out of which most houses are built. Walls are constructed from vertical studs running between horizontal plates (a top plate above, a bottom or sole plate below), with headers spanning door and window openings. The framing material is usually spruce, unless steel studding is used or some other more readily available local wood. Floor beams are called joists and the roof is held up by rafters (ergo, “beams” belong in barns and timber-frame construction, unless it’s a steel I-beam, which supports the floor joists).

The outside is covered with plywood sheathing and then faced or clad with siding.  The inside is faced with drywall or Sheetrock (though there are other regional names). Lath and plaster hasn’t been used for seventy years in the developed world.  And all of the wood parts are cut with miter saws, circular saws and even chainsaws, and they’re attached with pneumatic nail guns, or simply “nailers.”  Hammering a house together with spikes (not nails) is a rare thing nowadays.

Foundations hold up houses and are constructed by pouring concrete into wooden forms that are removed when the concrete cures; or the foundation is made from concrete blocks that are cemented together with mortar (not with cement). “Cement” is a generic word for types of adhesive, or for the process of attaching materials together using those adhesives. In other words, foundations aren’t made of “cement” and masons don’t use “cement”; they use mortar on blocks and bricks, each of which is made from different materials and used for different purposes.

Further, plumbers plumb houses (usually with copper and ABS piping), electricians wire houses, and the “electrical piping” in commercial and industrial facilities is not called “piping” but conduit, which is run when it’s installed.  And, like picture frames, doors are always hung.  Incidentally, house walls, doors and modern bathtubs are not bulletproof.  A brick façade, for example, will provide some protection, but the bricks will crumble like mud pies under repeated gunfire. Unlike the cast iron varieties of yesteryear, newer bathtubs are thin steel or fiberglass.

True, some artistic license must be allowed.  Shooting people with nail guns, for example, was all the rage a few years ago when these tools became commonplace on jobsites. In fact, however, it’s very nearly impossible to do without pressing the tip against someone, because these tools have a safety feature that prevents…you guessed it…accidentally shooting someone! All the same, there are always and without exception proper names for the materials, tools and procedures used by industry.  Realism demands knowledge of these domains whenever the story calls for them.

As before, you can correct this deficit with a few handyman magazines, which are usually written by people who know the business.

c. Architecture

Few things grate more than the careless use of architectural terms like “Doric,” as in “He admired the Doric columns of the…” when the building or monument that completes the sentence belongs to a wholly different style and era.  Doric refers to a very specific kind of column in the traditional Orders of Architecture, which have come down to us through the Roman engineer and architect Vitruvius. Each type of column belongs to a particular style and era.  Similarly, a Georgian mansion adorned with Art Deco furniture is an oddity that should be explained, because the terms refer to two wholly different styles, eras and tastes.  When the author confuses these terms, he betrays his amateurism and alienates the reader who knows better.  In the worst case, he breaks the suspension of disbelief and renders the book unreadable.

2. Professions and professional idioms

On the face of it, this one’s a given because of the close connection between dialogue and the way people in various professions talk.  Yet I frequently read outlandish expressions and sentiments attributed to various professionals.  A college professor, for example, advises someone to “consult his works.” No one but a pompous ass would refer to his “works,” because the term is synonymous with classics or masterpieces.  He will say my research or my work (singular) in a particular field, but never “my works” (plural) when referring to his publications.  It’s a subtle distinction, yes, but it makes all the difference to someone in the know. (Of course, if the professor character is a pompous ass, then in all likelihood he would advise someone “to consult his many works on the subject.”)

The most common transgressions happen, however, when the author depicts soldiers and the military—just ask a soldier. Know the difference between grunts and jarheads and the kinds of things soldiers actually say to one another and how they act.  Here I can only give two general rules for realism: (1) read soldiers’ memoirs and good field reporters, go to mess halls, and talk with actual soldiers; (2) don’t rely on Hollywood movies.

If you thought the recent crop of films about the military was realistic, it’s probably because they reinforced the false perceptions of soldiers and the military that you’ve already picked up from the same source; and if you share this anti-military view out of ideological conviction, then good for you. But if you’re the kind of writer who’s interested in things as they are—in realism rather than perpetuating axe-grinding stereotypes—then you should consider the fact that Hollywood’s view of the military has been hopelessly discombobulated by Vietnam-era anti-war films, which were, at bottom, wholly unrealistic.  If you think otherwise, consult the sources in rule (1) above for an earful.

Much the same applies to any other profession: not only are there specialized terms used by professionals, but professionals also learn to speak in a professional dialect, which includes speech patterns, manners and other idiosyncrasies unique to the field.  I’ll admit that this sort of language and understanding is hard to pick up unless you’ve been there.  But try sneaking into professional conferences and watching documentaries to see how these people actually communicate with one another.

3. Detail as double-edged sword—or, fishing out of season

Every rhetorician and trial lawyer knows that precise detail makes speeches and testimony more persuasive.  “He was wearing a navy blue blazer marked with a St. John’s High School crest” is more persuasive than “It looked to me like a dark jacket, maybe black.”  Writers know too that concrete detail is preferable to abstraction—e.g., “Bill was fishing for largemouth bass in Chesapeake Bay” is better than “Bill was fishing in the bay”—at least when the detail is conducive to plot and character development.

Unlike rhetoricians and lawyers, however, writers have to check foreground details against the details of the background narrative. So if you write that Bill went fishing in Chesapeake Bay (foreground), you’d better make sure you didn’t already establish in the background narrative that it’s mid-July, when the fishing season is closed. Obviously, a character can fish out of season, but it’s remarkable if he does, and so it must be noted.  It should go without saying that depicting your “straight-arrow cop” casually poaching fish is a defect in realistic writing (parody notwithstanding).

There are a thousand ways to fall into this sort of trap. Not all of them will be obvious and many will probably be missed by the reader. But you can’t rely on that, because of the symmetry between your characters and your audience.  It’s not uncommon for readers to follow a series that depicts characters they identify with or aspire to be.  So creating a crime-solving cop who’s also a fishing enthusiast, for example, will likely attract readers who are also fishing enthusiasts.  But if you keep making these sorts of gross errors in front of an expert audience, you’ll lose readership because they’ll feel cheated, as if you’re pandering to them, which is even worse than not having the fishing angle at all—and that’s the sharpest part of the double edge.

4. Pseudo-knowledge

I recently downloaded a sample of a post-apocalyptic novel; it was not badly written and the story was somewhat compelling.  But the writer insisted on inserting his pseudo-knowledge of global finance into the narrative. No doubt he wanted to heighten the dramatic tension by increasing the realism of the situation with historical precedents of financial collapse.  It’s a good strategy; and if his information dumps had been accurate, he might well have strengthened the realism of the novel.

Unfortunately, what wasn’t pseudo-history was pseudo-finance. Hence, his plan backfired in the worst possible way: placed so prominently in the narrative, the pseudo-historical material proved catastrophic to the suspension of disbelief instead of strengthening it. Indeed, he would have been better off not mentioning historical precedents at all.  I might well have kept reading.

There’s a simple and straightforward lesson here that needn’t be belabored:

 Pseudo-knowledge cannot reinforce realism; so, get your history/economics/religion/science/engineering (etc.) right.

Now, there’s one big exception to this rule.  Unfortunately for rule-rejectionists, it’s one of those exceptions that prove the rule. I’m talking about Dan Brown’s Da Vinci Code here, of course, a perfect example of a whole narrative concocted out of pseudo-history.  But Brown’s book reinforces my case for the simple reason that only a minuscule number of people are sufficiently conversant in the historical material upon which the book was based to be completely turned off by it. Like many others in that smallish circle, I couldn’t bear reading it. But the vast majority of people had no problem with it, because the gross errors and exaggerations didn’t kick them in the face.  So, no, Brown’s book isn’t an exception; it’s just that the vast majority of his readers couldn’t tell. The rule stands.

Spoilers Make Books Better?

The National Post carried a report last week on some research into the effects of spoilers on audience enjoyment.  Turns out people enjoy books more when they know how they end.  Seems to me that this might have a lot to do with the type of story, despite the claims made in the Post story.

The Anatomy of Contemporary Atheism

When I tell people I’ve never met an atheist, they tend to pause awkwardly for the punch-line of a bad joke, or perhaps they’re waiting for a confession of a cloistered life. But I’m being serious when I say it (as serious as I can be, anyway), and I’m not such a recluse that I haven’t met people who’ve claimed to be atheists.  What I have not met, I maintain, is the genuine article.  I say this because a little probing of my subject always reveals a different animal hiding beneath the atheist’s cloak—a sheep in wolf’s clothing, so to speak.

I usually find myself conversing with a sentimental progressive who sees disbelief in God as a necessary condition for his faith in Progress and its earthly realization—or, to borrow an older term, its immanentization.  The equality of human beings, human rights, even animal rights are all still in place within my interlocutor’s godless cosmos. Justice still abides without God, it seems—indeed, it’s all the healthier for His demise!

Such nominal atheists have never read Carl Becker’s delightful satire, The Heavenly City of the Eighteenth-Century Philosophers. If they had, the smart ones would recognize that their atheism substitutes Progress for God as Voltaire’s had Reason—everything else remaining conveniently unchanged.  Apparently, theology is the one domain where you can kick off the head of the organism and expect the body to live on.

Reading James Wood’s review of The Joy of Secularism in the New Yorker reminded me of all this and further reinforced my view that the so-called “New Atheists,” whatever else might be said of them, resist the real implications of their creed for themselves and for the world in which they live. From what I gather, few of the contributors to the book Wood reviewed have done so either (that preternaturally cautious philosopher Charles Taylor apparently being the one partial exception).

Only Nietzsche and Plato, it seems to me, really appreciated what it means for a person to reject a transcendent order. Both philosophers realized that ennui and despair are the only psychological responses, and cultural dissolution the only social result. Indeed, even language and the shared experience and understanding embodied by it will crumble into individualized bits of dust in the wake of the departing gods. To be sure, people carry on once disenchanted and order returns as it always must, but minds can be reconstituted in strange forms and order can become a collective abomination—history and anthropology show what awaits the momentarily godless.

At any rate, my present concern is not to defend the psychological and cultural implications of God’s demise, since I take Plato’s and Nietzsche’s understanding of the world without God as the truest one.  Instead, I want to try to understand why the New Atheists have resisted what should be obvious.  What follows, then, are some hypotheses that may explain why the New Atheists have been able to remain atheists.

The psychology of atheism

1. Material comfort. The simplest explanation is also the oldest one: contentment breeds complacency.  Like Hitchens, Dawkins, Harris and Dennett, James Wood enjoys his own and his society’s relative health, wealth and freedom (granted, Hitchens isn’t doing so well on the health front, but until recently he enjoyed a healthy life with all the worldly fortune one could hope for). Under such conditions, talk of God, the afterlife and rationality in the cosmos seems as superfluous as the things themselves. Who needs to wrestle with these otherworldly concerns when the sun shines so bright?

2. Nemesis. As the ancients realized, Nemesis is a wonderful thing. Fighting an enemy gives life meaning. The concentration required for achieving a goal and the thrill and immediacy of battle keep away the bigger and harder questions about the point of it all, the questions that invite reflection and despair.  Indeed, a battle is a journey to a destination, and as every writer since Homer knows, that’s the most powerful and perennial narrative structure.  In simpler terms, atheist militancy is fueled by the same anima that fuels sports and video games—the thrill of ersatz battle.

3. Atheism as de facto religion.  This seems like the most widely accepted explanation among critics of New Atheism, since virtually every critic of Dawkins, Harris and Hitchens points out that these fellows all sound like old-fashioned Bible-thumpers—and they do, no doubt.

But as I mentioned before, my experience reading and talking to New Atheists is that beneath the surface one finds pantheism (e.g., nature as the source and cause of a self-perpetuating wonder), animism (e.g., “we all join in the cycle of life”), and, most often, old-fashioned Enlightenment Progressivism.  It’s not atheism that’s the religion, then, but the belief in Progress, which provides purpose and serves as a bulwark against despair, filling the gap left by God and the Cosmos in the atheist’s psyche. Atheism is just the tip of the mental iceberg, jutting out of the mass of beliefs that make up an individual psyche, because (presumably) it’s the fashionable face of a more complex set of beliefs.

4. Serotonin. My preferred hypothesis (for the time being at least) is genetic. Much like the rest of the animal kingdom, most people are wired to be happy with their circumstances, at least when the bottom of Maslow’s hierarchy of needs has been met (as per 1). Most of us don’t feel the need for deeper explanations—we don’t suffer from existential angst—because we’ve been born with the optimal balance of the “happy-brain chemical,” which prevents us from feeling anything but contentment once our basic material needs have been met. To borrow the more recent framework, James Wood and other atheists are “dandelions” among religiously-minded “orchids,” not so much clear-minded and self-aware rationalists (unless, of course, one takes the latter as a euphemism for the former).

At any rate, your average atheist likely embodies some combination of these things. It’s not easy to say without cracking open skulls and reading the brains within.  All I can say for certain is that it is not Reason that sustains them on her divine account, but a combination of transcendent-mimicking false belief and brain chemistry.

Resisting the cultural implications

The psychological hypotheses can explain how human beings can believe in atheism without suffering from its psychic consequences. Moreover, number two above may explain the stridency of the New Atheists. But the psychological considerations don’t explain the resistance to the dire implications for us all, especially if Christianity should lose its influence entirely on the Western world.  So what beliefs prevent the atheist from seeing these consequences?  I tend to think that there is really only one major answer to this question: the belief in human progress.

A favorite pejorative among conservatives for left-liberals who have adopted environmentalism as a way of furthering their political agenda is “watermelon environmentalists” (i.e., because watermelons are green on the outside, red on the inside).  As I mentioned, I’ve noticed time and again a similar phenomenon among atheists: atheism is either a conscious or an unconscious front for what would otherwise be characterized as progressivism.

Which is not to say that progressives have deliberately donned the atheist disguise as a way of forwarding their agenda (though it’s entirely possible); rather, most seem to suffer under the illusion that Progress is possible in an indifferent universe where living things are subject to evolutionary forces.  Diseases, man-made disasters, asteroids and a thousand other variables can destroy human life in its current state or extinguish it altogether. Whatever the case, it’s worth noting that two kinds of ignorance underlie the belief in divine Progress, making it seem rationally possible.

1. Philosophical ignorance.

There’s no better example than the pseudo-debate that goes by the title “religion versus science,” which has spawned a number of unfortunate tropes or commonplaces that are either pernicious half-truths or outright falsehoods.  One of the former is the idea that there is something called “religious belief,” which is supposed to be qualitatively different from “secular belief.” I’ll get into this false dichotomy another day. Suffice it to say for now that religious belief is supposedly sectarian and absolutist, while secular beliefs are neutral and inclusive. Obviously, the former lead to oppression and war, while the latter lead to universal happiness for all. This is nonsense, of course, because we all have “religious beliefs” insofar as we have fundamental values—truths we hold to be absolute—like the belief that others should be treated fairly and equally regardless of their particulars.  And we’re all willing to use force to defend and extend these fundamental beliefs—or at least to resist the imposition of the beliefs of others.  In a word, then, we are all fundamentalists of one kind or another.

For reasons I can’t really explain (beyond blaming the education system) we seem to have collectively forgotten the origin of the term “religion.” It comes from the Latin “religio,” which means “the ties that bind us” as individuals, including our obligations to family, the state, the gods or our friends.  The term “religion” was coined for the academic study of these phenomena, though its focus was human faiths, rituals, cults, institutions, the afterlife, souls, cosmologies, etc.  In other words, “religion” refers to an academic discipline that artificially and roughly broke off part of the human condition for study; it is not a logical or scientific term for certain kinds of beliefs.  In fact, what does and what does not fall under the purview of religion is an on-going controversy in religious studies.

In any case, the salient aspect of the false dichotomy between “religious and secular belief” is the possibility that religion can be transcended: that we are all of us moving toward a secular End Times when the evil forces of religion that have caused all human misery will wither and die, leaving only the happy, loving rationalists behind.  It’s this false belief above all else that provides the intellectual scaffolding for the atheist version of Progress.  Kick it away (along with a few others I’ll discuss another time), and the rest falls apart.

2. Historical and anthropological ignorance.

It’s not often that one can make generalizations about human nature.  But “religion” is one of those cases: no culture, anywhere, has ever built its collective spiritual house on the shifting sands of the immanent world.  The reason would be plain as day to anyone who’d lived before modern times and to many who live outside the West: the immanent world is capricious and fickle. Droughts and cold spells kill crops; locusts come from the sky without warning and devour one’s harvest; pestilence strikes oneself, one’s family and one’s livestock.  The sea is intemperate, rising up to drown those who’d make a living off of it; and mountains can burst and bury a whole civilization under molten rock.  Even loved ones, friends and brothers can turn against you in this world.

Now, it’s natural that we yearn to escape the tumult of our native land for tranquil shores (we wish for that in this life too, after all). But there’s a deeper reason than emotional solace behind the invention of the transcendent.  All anthropological evidence suggests that we are hardwired to order our thoughts, our lives and our societies in accordance with a permanent, unchanging order of things.  This is not the old atheist saw that says “things look designed to us, so we falsely infer that they are,” as if our mental architecture were a series of careful inferences that might be judged true or false on the basis of a decisive experiment.  It’s that we must live life in accordance with an understanding of its meaning and purpose, and the only constant source of purpose is a Cosmos—a rational order.  A world that evolved and remains forever in flux cannot offer us that; only a transcendent order can.

It’s this last point, I think, that explains why atheists—in spite of themselves—fall into what can only be described as ersatz religions like Progress or animism.  Like the rest of us human beings, they really can’t help but believe in the gods.

A bit of the wisdom in Genesis

As astute historians of the dismal science have appreciated, the first mention of the idea of scarcity is contained in the Old Testament. Adam and Eve’s expulsion from the Garden illustrates a basic fact about the human condition that’s been codified in the first principle of economics.  What few appreciate nowadays is the significance of this insight for our collective worldview in the West.

What the Expulsion says about Adam’s condition, and the human condition more generally, is this: it is not thy neighbor’s fault that ye must toil for your supper; it is not on your neighbor’s account that you are forced to till the soil and tend livestock. It is the condition of all men that they must toil for their supper.

Like good news, good ideas rarely make headlines. We forget how much this idea shapes us and the culture in which we live. To see its effect, one has to consider counterfactual questions: what would life be like if the Book had said something different?  What if Genesis had said that evil men have caused you to suffer, or that History and its material forces have caused you to suffer—or any number of other things that turn brother against brother?

Though it’s never been studied, indirect evidence is found in the weakness of the idea of what anthropologists call “limited good” in Western societies (i.e., weakness when compared to non-Western ones).  Marx and the influence of socialism granted, Westerners have been disinclined to accept the idea that their neighbors are to blame for their misfortune—that the human condition might be fixed if only so-and-so could be brought to justice.

To be sure, Westerners individually and sometimes collectively have succumbed to the belief that disparities of wealth and other intangibles have been caused by others (and sometimes they are). But our better selves—or, rather, our Biblical selves—tend to suppose that our neighbors earned what they have, not that they’ve taken it from us. It is a cognitive fact, I suggest, that this belief is only really possible once we’ve accepted that the natural state of things is scarcity, not plenty.

There is an interesting corollary here, of course: we would never have escaped our condition to the extent we have if it were not for this belief in scarcity. Indeed, we would have wallowed in squalor, forever seeking redress through redistribution, if we hadn’t internalized this belief long ago.

George R. R. Martin’s prose “functional”?

Daniel Kaszor wrote the following in a review of George R. R. Martin’s A Dance with Dragons over the weekend:

 “No one would call Martin’s prose more than functional…”

The NP’s standards for reviewers are slipping. From what I’ve read of it, Martin’s prose is some of the finest in all genre writing.  Granted, I’ve only read the first two chapters of Game of Thrones, but it’s impressive. I find it hard to believe that the quality of his prose has declined in the intervening years.

At any rate, I think it would have been better form for Kaszor to exemplify the faults and to offer a counter-example. I can’t really think of anyone who writes prose comparable to Martin’s.

The Civilizers and Self-Parody

In yesterday’s National Post Jeet Heer complained of the horrors of Rupert Murdoch, how his conservative and populist empire has debased us all.  Maybe it’s all true, maybe you believe Heer—that’s all beside the point for me at the moment. The interesting thing about his piece is that it perfectly illustrates an attitude that seems widely shared by journalists: the inability to avoid engaging in the very scurrilous invective they condemn on a regular basis.

First, I think it notable that he had to go back to the 1980s to find examples of bigoted remarks about minorities, which raises the question of what atrocities have been committed this century (do editorial remarks made 30 years ago still count?).

At any rate, note this remark by Heer as exhibit one in my case:

The Sun has a long-standing habit of referring to the French as “frogs,” a term that gets thrown around quite a bit as if it were a clever witticism worthy of Oscar Wilde.

Heer made that remark two paragraphs after this one:

The comedian and actor Steve Coogan, himself an alleged target of the News of the World’s phone hacking antics, described it as “a misogynistic, xenophobic, single-parent-hating, asylum-seeker-hating newspaper.” Coogan’s characterization might be extended to the Murdoch press as a whole, which tends to go after any group that doesn’t adhere to the ideals of middle-class white society.

Notice the double irony: not only does Heer engage in the sort of invective that he claims the Murdoch Empire specializes in, but he quotes the spleen of a comedian as if it were a “clever witticism worthy of Oscar Wilde.”

Given the proximity of these two remarks, one wonders whether the self-parody is intentional.

The problem with Hume’s “Of Miracles” (and heroes in contemporary fiction)

I’ve always been of two minds about Hume’s “Of Miracles” (it’s a short chapter in the Enquiry); and I’ve lately come to dissent from the modern criticism of heroism in literature for reasons not unrelated. So let me perform the marriage…

In “Of Miracles,” Hume argues that we ought to use our own experience of what is possible and plausible as a criterion for belief in the second-hand reports of others (which, of course, include historical reports). There’s a certain indisputable truth in that principle; indeed, the argument is almost too commonsensical to dispute.  When someone claims he’s been visited by his dead grandfather, for example, we should on principle find it harder to believe than when a man claims he’s been visited by the postman.

Yet this criterion—as a rule of thumb—is only as good as the experience of the person who applies it, which is where the contemporary angle and the counterexample come into view.  Consider that one of the most common criticisms leveled at the heroes of fantasy and action films is that their characters and their deeds are wholly implausible.  Nowadays, you won’t find many heroes in literary fiction—even the name has been replaced by fiat with the far less ambitious notion of a “protagonist.”

But when compared to real historical heroes, even the characters in fantasy can seem like pikers.  Anyone who knows the historical deeds of Belisarius or Hernando Cortes could hardly be impressed by even the most extreme hero-caricature of the typical fantasy novel. Aragorn, the hero of the Lord of the Rings, for example, has nothing on either of these real historical persons.  And yet we dismiss him as implausible—the stuff of fantasy—when he barely holds a candle to real heroes.

So what’s at work here? I think the same phenomenon that makes us assign fantasy heroes to fantasy also shapes our judgments of plausibility based on experience: our experience is circumscribed by (1) our democratic values and (2) our modern condition in a relatively peaceful, advanced industrial society. The belief in equality causes us to be prejudiced against the exceptional man, even though we may—as in all ages—naturally admire heroes for their derring-do, if only because we can live vicariously through them.

Similarly, our (mostly) safe and comfortable existence protects us from the sort of situations where heroes abide.  We are generally far more removed from the dangers, and from the heroes who meet them head-on (like our soldiers), than people were in earlier times.  In other words, we just don’t have much experience of the heroic on which to base our judgments of what is plausible for a human being to accomplish.

That brings us back to Hume.  It seems like Hume’s principle is involved in the kind of paradox that renders such rules of thumb moot points: if you do possess a vast historical knowledge, your experience is likely broad enough to follow his rule; but if you already possess this depth of knowledge, the rule is superfluous, since you’ll already know what the rule is designed to help you learn. If, on the other hand, you don’t have a vast historical knowledge, the rule will make you dismiss the very real historical exceptions that constitute the broad experience necessary to apply the rule successfully in the first place.

Many will no doubt dissent from this opinion for various reasons.  The writer of literary fiction, for example, will dissent for pragmatic reasons. He’ll respond (with justification) that he has to accept the world as it is.  The heroes of old don’t sell books, even if one wanted to write one.  Perhaps.  But has anyone even tried in the last 60 years?