Tuesday, January 25, 2011

Imagination in its Application

To my knowledge, there are two ways of doing science. The first resembles a child playing hide-and-seek. He has no rigorous system for finding the other players, other than perhaps looking in the place he just hid. His movements are sporadic, one moment peeling a curtain back, and the next going outside to check the bushes. He aims only for discovery, and, for him, anyone could be hiding anywhere. The world is the playing field. The other way of doing science is more methodical, something you might find an adult doing when he is coaxed into playing the same game. He systematically starts at the top floor of the house and moves his way down, eventually going outside to find the remaining players.

To our modern and adult minds, the latter option seems the most efficient. But as is my way, I am of the opinion that children always know best, even though they don’t know why they know best. Imagination comes naturally to them, and as a guiding principle for us imagination is particularly suited to address what I take to be a problem in current scientific inquiry.

Humans have a peculiar trouble: we think in paradigms. I often picture thoughts as bouncy balls stuck in a cube, that cube being a specific paradigm. If I ask a question, any question at all, the kinds of valid answers I can receive are limited by the inherent content of that question. When my mother asks me how classes are going, I don’t answer “yeah, only if you put cheese on it” or “the square root of 14.” There is a limited set of valid answers to this question: like “boring” or “great,” or I could even tell a long story about how my professor accidentally rubbed chalk on his face. Even this latter answer, though indirect, relates to the question in an important way.

The same can be said of science: if I ask a certain question, the answers I can receive are limited by that question.

The trick of it, or so it has been argued, is to learn how to ask the best question so as to elicit the best answers. A controversy over whether this is even possible could arise here, and though I could go down that track, for the sake of argument let’s assume what I’m saying is true.

Given that certain paradigms are more advantageous than others for solving issues within a specific arena, I want to move on to the arena where both the principle of a better paradigm and a childlike imagination come in handy.

Ironically, the study of prehistory is itself the result of a paradigm shift like the one I describe above. Prior to Darwinian evolution, the very possibility of a prehistory didn’t exist. Scientists and philosophers didn’t ask questions about origins prior to the 17th century—at least in the West. Genesis, after all, gives a clear indication that “in the beginning God created the heavens and the earth.” At the time, theology hadn’t considered what ‘in the beginning’ might mean beyond theological interests: things like the creation mandate and God’s aseity. Theologians, in other words, were not biologists or archeologists; they were theologians. Their theology naturally influenced the Christian scientists of the Medieval era, even on into the pre-Enlightenment.

With Darwin’s discoveries, however, came raw scientific data for which the scientific community of the day could give no rational account. In some of my reading, I’ve often noted the feeling of confusion and shock expressed by Christian scientists at the inability of theology to account for the new scientific data. For us, these sentiments seem both familiar and alien. Any post-Reformation worldview where solo scriptura dominates the psyche inevitably results in over-sensitivity to the exclusion of scripture at any point. To be honest, however, though I understand the initial feeling, many of these overactive sentiments pervade contemporary Evangelicalism to the point of being silly. Scripture is a type of authority; it is not the authority in all things. In a simple sketch: scripture does not contain all truth; it contains certain objective truths. Scripture is doubtless the authority when formulating a definition of love, but it is not the authority on underwater basket-weaving or Post-Kantian metaphysics, nor is it, as I am suggesting, on science. How can it be? (One may want to say that scripture inadvertently refers to everything. Really? We have to be careful in saying precisely what we mean. Anything is related to everything in some way. I could say that the label on my toothpaste inadvertently refers to everything, and it would be equally true. What is meant is that scripture gives an account of reality, not an account of everything in that reality.)

Contemporary atheists rightly point out the flaws in both the past and present Christian psyche for failing to make this distinction. Yet they, too, may be operating from a problematic paradigm. I commend Creationists’ desire to uphold a literal interpretation of the Genesis 1 account. They may very well be right, and, more importantly, they have somehow managed to insert teleology back into the scientific discussion. But they might likewise be misguided if all they want is to preserve a bit of theology that qualitatively goes unchanged whether the macro-evolutionary process is true or not. The miracle of creation remains a miracle whether it takes 7 seconds, 7 days, or 7 millennia. When Peter saw Jesus walk across the water, he didn’t think, ‘If only he had run across the water, now that would have been something!’

I return to paradigms and prehistory.

If we are going to construct a prehistory that resembles normal history in the most important way, the way of including facts, we can only do it on empirical grounds. This is the only reliable paradigm. To get reliable results, we can only employ the second kind of science, the methodical kind. Normal history is the construction of data over time from either eyewitness accounts or reliable hearsay. Whether the bias of the author alters the details of that history is, of course, another question (and one with great import on this subject). At present, I only want to emphasize that history needs facts to be history.

But what I find is mental gymnastics on the part of some of our contemporary scientists: I have in mind Renfrew and thinkers like him. They say they take the first approach, that of science in the older way, as an act of discovery. But they’re fooling themselves. What they do is construct a paradigm from which to ask questions, and then, given that paradigm, they begin to make empirical observations. It is an attempt at discovery through a lens. With their given story, they begin to make empirical inferences. But they’re not constructing the right kind of paradigm; they’re constructing a tale.

In other words, contemporary scientists still have the propensity for assimilating data with empirical validations, but they are finding that the quantity of data (with which to make assertions toward the Darwinian model) is lacking. To resolve the issue, they tell a believable story of what might have happened within, say, the process of natural selection, then manipulate the data to see if it fits that story. But what they have done is mistake the first view of science, discovery, for complete non-science, or narrative. In doing so, they have constructed a paradigm which limits the use of the data, not one that exploits it. In attempting to construct a paradigm for doing science, they construct a fairy tale for doing bad science, or what may be more accurately called non-science, or mythologizing.

As a result, we have professional scientists doing the work of the amateur novelist. Scientists and parts of the scientific story are stuck somewhere in limbo: between art and science. For example, apotheosis as a theory is an act of the imagination, but it is neither scientific (in either sense) nor aesthetic. Darwin stuck to science, the Romantics infused aesthetics into nature, but what are these scientists doing? Can it be that they are practicing the ancient art of myth-making, only leaving out the art? Their actions are logically equivalent to taking Tolkien’s Middle Earth myth, calling it prehistory, and inferring empirical data. If there's a bone, it probably belonged to an elf. If King Théoden had riches in his tomb, he was probably considered deity. In short, if the empirical accounts prove viable, 1.) they prove viable only in accordance with that fairy-tale, and thus, 2.) are narrow both in application and, to use narrow in a different sense, to the exclusion of the use of imaginative powers outside of that myth—a story that, as they know very well, may be mistaken. This use of the imagination is not the same as the child’s, though it seems like it. Scientists limit in order to control the data, whereas a child has neither limitations nor control. On this model, nothing is left to mystery until empirical science absolutely contradicts the story.

Then the story has to change in order to fit the data. These scientists have been doing this since Darwin, and have refused to change the model. One cultural upheaval arose about 5 years ago concerning the evolution of man in middle and high school textbooks. I forget the exact data problem, but the trouble was that science was being taught when it wasn't science. But science, by definition, especially empirical science, should never result in outright contradiction, only in modification. Newton's theory of gravity has kinks in it, and implications, but we still think that an apple falls from a tree for some reason. But we do not believe, as we used to, in the worldwide advent of Homo sapiens. Now, apparently, humanity started with 2,000 nomads coming out of Africa. How can both stories come under the name of science?

How odd. How strange that scientists would create a mechanism, a paradigm, in an effort to find truth when that very mechanism cuts off portions of data they have never imagined or encountered. One would think that, with the wild discoveries in quantum physics, that sort of approach would have been done away with.

I would prefer to operate from another sort of paradigm, one which allows for the free flow of the imagination. I don’t, in constructing a prehistory, want to look at a picture in a cave and infer art because it fits well with my given paradigm. I want to be able to look at a cave picture and infer either art, or a game, or a teaching lesson, or whatever. The fact of my agnosticism is better than a hope in what may or may not be a fact. In other contexts, this paradigmatic approach may be best. But it does not work best for constructing a prehistory.

Why? A paradigm for the construction of prehistory must depend on facts. (This is why Renfrew succeeds with material engagement in the sense of constitutive symbolism, and fails with material engagement as concerns the extended mind.) Otherwise, we are no longer talking in terms of history; we have switched genres over into mythology or novel writing. The only continuity between prehistory and history is facts. To exclude them is not to talk about history.

Only from the world of facts can one deduce the possibility for the unknown. The psychological phenomenon of the man who learns more and more but feels less and less intelligent should show us there is more out there than we could ever imagine. The activity of the imagination must be allowed to go wherever, without restraint, like the child. Prehistory exists, but we must be patient to find out what that means.

Wednesday, January 19, 2011

The Gum Predicament: Based On Real Events

You know when you’ve made your way from the parking lot to your car, how something felt strange, but you weren’t sure what it was? How, once you turn the car on, the feeling grows worse? Then you know how your right foot is sticking to the brake pedal, and how you tilt your head to the side, trying to make sense of it? Then, to your horror, you see a pink string extending from the closed doorway, over your left sandal, and tucked beneath your right foot? You know that sudden drop of emotion? And you know how you can tell it’s a fresh piece, and how you can tell by the circumference of the string that this particular piece is particularly large? You know how you try to take off that sandal, but the steering wheel gets in the way of lifting your knee, and you can’t reach down, so you have to open the door and swing your legs around? But you know how you reacted too quickly and disgustedly and only made the situation worse, and now the gum string that anchored itself to the doorway has pasted itself comfortably onto your left jean leg while you swung your legs? You know how, now, you’re really freaking out, and turn around to look in your car for something to peel it off of jean and sandal, and see a scrap piece of paper in the passenger seat? But you know how it’s too far away, how you imagine the attempt at reaching for it, but part of the scenario involves the likelihood of pasting the gum to the floor mat? You know that sinking feeling of defeat? How you wished it would all go away, or that you had seen the gum in the parking lot and stepped over it? Then you know how you return to reality, though still tragically harboring a mite of hope? How you look for a small stick on the ground somewhere in view, but can’t find anything? And you know how you begin to grovel, and mope a little, reaching between your legs and underneath your car, blindly scraping your fingers against the pavement for something, anything? And you know how you feel an object, so you take hold of it, and let an instant of hope back into your heart? You know how it’s only a crumbly leaf? And you know how the remnant of that poorly timed hope makes it seem that the leaf will be sturdy enough to give it a try, so you take your sandal off and flip it over? You know that face you give at the first sight? And that little wheezy grunt? And you know how you begin to wipe, but because it’s a leaf it tears open immediately and unexpectedly and your thumb touches the gum a little? You know how, in a moment of terror, you throw the leaf so as to expel whatever diseases you may have just touched, but how, unfortunately, because it’s a leaf it pauses six inches away from your hand, dropping bottom gum-heavy like an air strike directly atop your clean sandal? And you know how you can’t figure out whether you should clean your thumb or sandal first? How you stand up quickly in the excitement only to realize that you don’t know why you’re standing, so you sit back down? You know how, just after, you look around to make sure no one sees you? You know how, once you notice that the soccer mom over there saw you, you pretend like nothing’s happening, staring up at the sky and yawning, still secretly freaking out about the tragedy of the gum predicament?


the K.H.

Tuesday, January 18, 2011

Giddy Full Grown Men

Nerds get excited over things that will seem, to most people, trivial. Our Star Wars fans seem overly obsessive from time to time. But when a nerdy guy meets a nerdy girl who finds his Chewbacca imitation endearing, he knows he's found the one. Today i discovered a word for something that has troubled me for a long time, though i could not express it, for i did not know the word. And in the same way our Star Wars guy gets giddy and bellows in the tongues of men and aliens, so i too am now at last bellowing the word diachronic.

My Greek friends will doubtless notice the distinguishable "dia" and "chronic." What excites me is the combination, and by extension, that the word expresses something not often regarded in much of my reading. So far as the word is used in English, dia means 'through' or 'during' and chronic means 'time.' Fortunately, in my reading today, it was applied to an Aristotelian spiel concerning another Greek word, 'akrasia,' which deals with whether a person errs in spite of having a right judgment. You've heard of 'no one errs knowingly.' Well, yeah...

But my interest in diachronic as a word at present is tied immediately to the activity of the psyche. In particular, to the sporadic psychological activity of an author of literature. I want an account of what goes on in an author's mind diachronically, during the time he writes in a particular sitting, or across the various times he sits. Can we study this scientifically, or is it forever bound to the subjective realm?

I have read nothing on this subject, nor have i heard it addressed in hermeneutical discussions. It's almost like our ignorance of the subject renders a hush upon the academic community. I partly think the cause is a fear of what would result if we discovered any serious implications of diachronic activity. For example, authorial consciousness toward particular ends, or toward aesthetic accomplishments within a given text, might fall by the wayside--and with it centuries of shoddy secondary criticism.

We would, after many years of its absence, have to formulate criteria for why we can say an author is or isn't conscious of what he writes, or whether he was conscious at one moment and lost consciousness later, or whether for that matter he was conscious at all. Critics say "this is what this author intended" without ever having asked the question "did he intend this at all?" in the first place. It's a classic cart-before-the-horse scenario. We need a universally applicable methodology for determining authorial consciousness, and i have the gut feeling that we can, with the help of psychology and philosophy of language, come up with a scientifically verifiable model.

I'm not proposing an alternative to authorial intent, saying that all authors write willy-nilly. Obviously, that's ridiculous. But why, to the contrary, have people not discussed the equally willy-nilly conception that authors are overtly intentional?

We've all experienced the sensation of having said something a few years ago that only now are we beginning to understand. As a point of analogy, i can't see why what happens when speaking might not also happen when writing.

Maybe i'm wrong. Or maybe i'm reading all the wrong books. In any case, somebody's gotta start saying something.

That's all i have to say about that...

the K.H.

Saturday, January 15, 2011

Renfrew and Story Telling

I promised several people that i would communicate what i was learning insofar as time permitted, and particularly in terms understandable to a broader audience than, say, just us inhabitants of nerdedom.

According to my sources, Colin Renfrew sits comfortably among the archaeological elite. After reading "Prehistory," i see why. He has either invented or clarified a little something called "cognitive archaeology." Along with the broader stream of recent scholarship, he uses the recent discoveries in the philosophy of language--particularly semiotics--to begin making cultural inferences based solely on empirical, archaeological data.

Historically, attempts at telling stories of prehistoric man have shown themselves altogether ridiculous. Chesterton, writing some 100 years ago, says some pretty funny stuff concerning such scholarship, and his common sense has proved useful in my analysis of Renfrew: "Sometimes the professor with his bone becomes almost as dangerous as a dog with his bone."

Prehistory, by the way, refers to the era of time preceding written historical accounts. In the Judeo-Christian story, this used to mean the Genesis account--though now this seems more Modern than historical. For the Greeks it meant Herodotus in his work "The Histories," the very root of our word history. In short, if we have no written account, we have no history.

Renfrew borrows some rather complex work in philosophy of language--including names like Wittgenstein, probably the most important philosopher of the 20th century, and Charles Peirce, a founder of modern logic and semiotics. Man as we now understand him spends the majority of his time manipulating and interacting with symbols. Words like 'love' represent real concepts as they appear to an individual's conscious experience of the world. Christian books like to separate the word 'love' into the four categories of ancient Greece because the symbol love can nowadays apply to a hot dog as well as a hot wife. The former is a matter of taste, and the latter a matter of eros/agape. The further you distinguish, the less the symbol can inherently mean, because it ascribes itself to narrower and narrower concepts. As Heidegger once wrote, "Language is the house of being." The use of the symbol determines the level of consciousness toward reality, toward ontological significance. So much for abstract, intangible symbols.

Where Renfrew comes in is with what he calls "material engagement." The difference here is one of physical symbolism. He suggests, rightly, that physical objects carry meaning. To explain himself he uses the example of weight. One could not make up the meaning of weight without the experience of lifting light or heavy objects. Experience with physical objects thus offers a level of consciousness not previously possible. The existence of weights, therefore, i.e. of particular weight denominations (say, 10 lbs), demonstrates that though the measurement is itself arbitrary, it is at least consistent.

His example, however, comes packaged with evolutionary biological and geological assumptions. Carbon dating and DNA analysis both point toward an old cosmos. Immediately, then, my Christian sentimentality is piqued, wondering just how far he's going to take his Darwinism into the story-telling process. After all, he is telling a story: it's just a pre-story.

Not to my surprise, he makes some rather hasty remarks and poor inferences when it comes to ideas of intrinsic value, art, and, of course, religious practice. Sadly enough, and for all his precocity, he has fallen into the same swamp Chesterton warned against. Here's the run of it: he infers intrinsic value in, to use his example, gold because it proved institutionally useful for the economy; he says candidly and calmly that the paintings on cave walls are always art; and he infers deity from apotheosis, i.e. heavily adorned and well-treated individuals were soon infused into divine categories.

I will get to these examples shortly, but before that, a word on how Renfrew may be right. Cognitive archaeology, if applicable, may produce some fascinating results. If, for example, carbon dating really is reliable, it would positively demonstrate that the 70,000-year-old weights show a consciousness interacting with symbols (remember material engagement), and we would have, to whatever degree, something akin to a Homo sapiens. Of course, Renfrew tells much of his story to this very end. The only trouble, the only question to ask, is how far to take this principle.

I return now to criticisms. The mental activity of the archaeologist occurs within the creative and imaginative as much as it does the geographic. This is not a bad thing: it is probably the discipline's greatest asset. The trouble comes when archaeologists get impatient, trying to infer too much from too little.

As far as value goes, Renfrew, not being a philosopher, confuses intrinsic value with utility. If a hammer sits on a desk, it can be said to be valuable, particularly on the Judeo-Christian model, simply because it exists. Man's creation of the hammer is a derivative of God's creation of man's ability to make a hammer, and its value would rest entirely in being, not use. In other words, something is valuable because God created it. Doubtless, more value can be ascribed to it through use, but that is not necessary, nor, for that matter, would that new value be intrinsic to the hammer. Renfrew is conflating inherent usefulness with inherent value. A hammer is not useful for baking a cake, because it doesn't have the inherent properties for cookery, but it does have the inherent qualities of metal and handle, both of which are great for smashing a cake. Now, this doesn't mean that the early workers of gold were conscious of this schema; all it means is that there is the possibility that utility was not part of the equation at all, and therefore we cannot make reliable, scientific inferences from it concerning the value systems of prehistoric or early civilization.

The same principle of "other possibilities" transfers over into the artistic realm. A cave painting does not NECESSARILY mean art. It might mean a game of pin the tail on the donkey. It might mean, indeed, a school-master teaching what a donkey is, and therefore be a picture for use. Whatever the case, the inferences made in formulating a story that is pre-historical must depend on their logical necessity. Renfrew is telling a non sequitur story; a non sequitur being an illogical, unnecessitated inference.

Finally, one of his more elaborate stories derives from a non sequitur in reference to material engagement. He wants to say that vast amounts of wealth, burial arrangements--such as we find in Egypt's pyramids--power, and the like confer deity status upon individuals. He writes, "By this sumptuous treatment the burial did, in a sense, establish and document that quality. Here we see the material engagement process at its most sublime: in apotheosis, the very creation of divinity!"

Poor silly man. I will not presently bother with the more complex arguments involving the Numinous, or with the place of personified divinity within the ancient mind. Why can't we simply say that men adorned what they already thought to be divine? Material engagement is only appropriate where the antecedent necessitates the predicate. We have no evidence to say which came first. To quote Chesterton, "To say that religion came from reverencing a chief or sacrificing at a harvest is to put a highly elaborate cart before a really primitive horse."

His story may or may not be accurate, but that's just the trouble. "May or may not" developed into a story does not create history; it creates something much more akin to a myth. Presently, my concerns lie with the Darwinian mythos, with extracting what is true and discarding the rest. Christians today, or so i like to think, have the task of balancing their Judeo-Christian mythos with its apparent rival, the Darwinian.

the K.H.

Wednesday, January 5, 2011

Prophet as Poet

Here’s some good ol’ fashioned, down-to-earth pith.

The theologians say I’m just a myth:

Isaiah without his destination,

A prophet without the incantation.

But believe you me,

Plain as I can see,

You’re soon to wish I’d hush, and plead the fifth.


the K.H.

Saturday, January 1, 2011

A Different Tragedy

You'll notice i'm writing more than usual these days. Though the greater part of it has to do with needing the practice before school starts, you will be safe in assuming desire and boredom as part of the cauldron. For now, i'm going to skip over what i wildly suggested about the relationship between psychology and Orthodoxy, and save it for my next entry. A conversation yesterday reminded me of a theory on tragedy i've been harboring for a few years.

The short run of it goes as such: if we combine Murdoch's scandalous suggestion about tragedy with O'Connor's use of the grotesque, thus formulating an antithesis to Tolkien's eucatastrophe (‘good’ catastrophe), i think we may find a Christian aesthetic appropriate in an age where indifference rules. O'Connor's literary theory stands unequivocally at the top of modern Christian aesthetics. In her stories, reality barges in on our minds with violence, forcing and ravaging us with fact instead of distorted fantasy. She uses the grotesque as a tool to awaken us from distractions—Pascalian distractions (in my mind) taking the form of literary entertainment. She pummels her characters either by killing them or attaching some concrete aberration to our perception of them, and leaving things that way. The trouble with her stories' characters, a trouble i think she would readily admit, is that we do not love them. The length of her stories inhibits character development. But what I like, what I especially want to keep, is the leftover tragedy. The cliff-hanger, non-redemptive end.

By 'loving' her characters, i mean something very specific. I mean what Murdoch says of love in general, and its particular application to artistic tragedy. "Love," she writes, "is the extremely difficult realization that someone, other than oneself, is real." She insists, i forget where now, that only (or at least primarily) through tragedy can an external consciousness truly begin a process of literary empathy. It takes something very near what Kreeft calls an "empathetic imagination" toward alien philosophical positions to gather and feel and fear the object or person of pity that Aristotle would say any drama needs in order to remain tragic. Men and women do not identify with the ethereal; we identify with despondence. Three chapters of Milton's Paradise Lost will positively demonstrate the point. On the sentimental level, we care nothing of Heaven; but of Hell, of Satan, and his pitiable condition, we squirm with familiarity. Oliver Twist has our sympathies, Frodo our admiration, Raskolnikov our affection, because we get them. We are them. We love them. Art is a curious thing when broken down in the abstract. We find ourselves aligned with metaphysical pictures and events, and our sentiments attached to symbols.

And symbol, I would now contend along with Walker Percy, proves to be the engine behind the meaning in art. Naming, on his (and Noam Chomsky’s) view, means to be conscious of something, of a reality. You were not conscious of the “W” on your own keyboard until you read this sentence. Now that I’ve told you, you’ve had an expansion of consciousness—and it felt something very much like discovery. You did not ‘know’ it like you ‘know’ it now. Phenomenologists, Percy says, rightly point out that humans are not merely conscious, we are conscious of something.

This principle ties quickly and smoothly into Samuel Alexander’s ‘Contemplated’ and ‘Enjoyed’ distinction that Lewis and Tolkien so appropriately applied to their aesthetics. To ‘Enjoy’ something in Alexander’s terms is to engage a mode of consciousness distinguishable from the ‘Contemplative’ mode of consciousness. So we don’t have the normal unconscious and conscious dichotomy; we have the unconscious, the Contemplated, and the Enjoyed. Take the “W” on the keyboard example again. While you were reading my sentence, there was a moment of conscious expansion; within that event, you were operating from within the ‘Enjoyed’ mode of consciousness. You were, for lack of better terms, unconscious of your consciousness. Now, however, at this very moment while you are looking back (via memory) into that same event (and only that event), you are Contemplating it. The object is not original, but a copy of the original experience out of which you abstract according to a specific set of questions (guided by your particular interests or hermeneutical preoccupations). For example, the second wave in the “W” is a property of that “W.” Again, you only have one object of concern, but the modes through which you attach your consciousness to it change (with great rapidity) between the Enjoyed and the Contemplated. It’s the difference between listening to your favorite song and talking about it.

Now back to aesthetics. Though I think Lewis’ concept of a subconscious praeparatio evangelica is questionable—based on unempirical theories concerning the sub/unconscious, and the fact that semiotics may provide a better model for explaining our psyche’s interaction with art—I think he had the right idea. People can in fact receive elements of the gospel without knowing it. Hence his and Tolkien’s obsession with mythopoetics—with creating self-sustained, independent worlds. We can ‘Enjoy’ The Lord of the Rings without ‘Contemplating’ its meaning, like we can ‘Enjoy’ kissing without ‘Contemplating’ what kissing is or who it’s with. So either the subconscious manipulates the realities inherent to the text, letting them boil over into the conscious realm (i.e. the pity of Bilbo or the goodness of The Lord of the Rings are adopted into our own worldview without us choosing it), or “Homo symbolificus” is influenced by symbols more than he knows. Either way, Tolkien and tragedy now come back to the forefront.

The eucatastrophe Tolkien talks about in “On Fairy-Stories” stands in direct contrast to the dyscatastrophe. As far as his history of Middle Earth is concerned, I do not think Tolkien could have been a moral author if his story did not end in eucatastrophe. He is not presenting this world; he is presenting a world in and of itself. But if it is to have any relation to our world, if it is to include humans at all on the axiological or eschatological level, it must conclude with redemption; it must end Christian-like; there must be hope. Likewise, it must have a similar beginning: i.e. a creation myth. The same, I think, goes for Lewis’ Narnia. And notice that they both do. Both Aslan and the Ainur sing their respective worlds into existence, implying creation and harmony, and both Narnia and Middle Earth end in eucatastrophe. (I have in mind the end of the first age of Middle Earth.)

What of it then? I think a moral Christian story must end in eucatastrophe, yet I believe in and wish to defend the dyscatastrophe? I wish to combine a non-redemptive O’Connor ending with a Murdochian tragedy to accomplish a dyscatastrophe? Why?

I want to limit which stories must have eucatastrophes and which ones don’t need to. Tolkien’s Middle Earth must, on moral grounds, end in eucatastrophe because he is putting forward an entire world; he is writing a history. Redemption is inherent to his project because it is inherent to human dramatic categories, and thus the human condition. To exclude it is to exclude reality. O’Connor, however, succeeds morally in an arena Tolkien only touches on in doses—most successfully in his “The Children of Húrin”—because she does not have to end in redemption. She is not creating a world; she’s working with an already existing one in which redemption is already inherent. She succeeds entirely within the grotesque by sticking to dyscatastrophe, and it is morally laudable because of her success at shoving the consequences of evil and sin into our face. After all, how are we to be saved unless we know we are lost?

But where she fails in character development, Murdoch, if she were a Christian, would have succeeded—at least in theory. The closest examples I can think of are Greek, Roman, and Shakespearian tragedies. The trouble with these, of course, is that they are not thoroughly Christian. If we’re going to be sneaking truth and goodness into the reader’s mind, we may as well make them Christian. I am ultimately after invoking Aristotle’s fear and pity categories toward Christian axiological and eschatological interests. Imagine a character whom I identify with and love, who, through a series of poor choices (due to character flaws) systematically ruins himself and those he loves. Now imagine that his choices are poor because of his pride, and his pride leads him to kill his own wife, and then, upon discovering his wretched state, to kill himself. Even this rough sketch in the abstract shows quite simply that we fear his pride and pity the consequences of his actions. Put it in a story, make us love him, and what follows are the kinds of sentiments toward pride we want all people to have. In this sense, we would certainly succeed in a praeparatio evangelica, or even, within the believer, an appropriate unconscious and involuntary response toward pride. Fiction, like a knife, is a neutral object that can be used either for good or evil. We cut into the heart either to kill or to surgically repair.

Of course, the idea that art is valuable because it is useful is the wrong idea, and a whole other subject. All I will say here is that the Christian artist must balance his tastes and talents with reality.

the K.H.