How our attitude determines what the world reveals to us
On August 15, 1945, Japan surrendered to the Allies, having suffered a devastating defeat in the deadliest conflict in human history. The detonation of two nuclear weapons over the cities of Hiroshima and Nagasaki some days prior by the United States had resulted in hundreds of thousands of deaths. On January 1, 1946, Emperor Hirohito, descendant of the Goddess Amaterasu, Heavenly Sovereign, embodiment of the essence of the Japanese nation and representative of its body politic, renounced his divinity.
Whether Hirohito, and the lineage of Emperors preceding him, were God incarnate in the Japanese consciousness is debatable. The status of the Emperor’s relation to the divine certainly varied across historical periods. Prior to the Meiji Restoration of 1868, the Emperor was more of a ceremonial figure, with political power residing with shoguns, military lords who controlled areas of land. Between 1600 and 1868, Japan was ruled by a single shogunate, named the Tokugawa Shogunate after the general Tokugawa Ieyasu, who defeated his contenders and unified the country under his rule. By the mid-19th century, a series of events, both internal and external, helped precipitate the collapse of this feudal order. When the American Commodore Perry sailed to Japan exhorting the country to open its borders to trade, prominent players in the Japanese political scene noted the superior ships and technology of the Americans. Greater awareness of an encroaching outside world thus fomented political action from pro-Imperial factions who sought to end the isolationist policy Japan had adhered to for the previous two centuries. The Restoration effectively put an end to the shogunate’s feudal order of daimyo domains and instead consolidated political power in the Emperor. With the Restoration, Japan opened itself to outside trade and pursued industrialization in order to catch up to Western powers. From the Meiji Restoration up to WWII, the Japanese national identity underwent significant transformation, with the identification of the Emperor as God acquiring acute importance during the War as a vehicle of national identity. When Japan was defeated in 1945, under the pressure of the Allies the Emperor issued an imperial rescript in 1946 known as the Humanity Declaration. The Humanity Declaration has been subject to interpretation, with the following lines garnering the most attention:
The ties between Us and Our people have always stood upon mutual trust and affection. They do not depend upon mere legends and myths. They are not predicated on the false conception that the Emperor is divine, and that the Japanese people are superior to other races and fated to rule the world.
The claim of the renunciation of divinity is predominantly ascribed to this quote. Yet the quote does not explicitly divest the Emperor of his kami lineage (kami being the word for divine beings or gods in the Japanese Shinto religion). At the same time, it dispels the predominant 1930s view of the Emperor’s God-like nature.
Whether or not the status of Divinity was renounced or rescinded in any technical sense, the war had exposed an inadequacy in the notion, or in the human wielding of it. The atrocities of the war could not be reconciled with the immanent divinity of the Emperor. Perhaps the Emperor never intrinsically partook in the transcendent realm. Was the Emperor implicated in the moral and human failings both in the afflictions Japan sustained within and produced without? Could such failings be attributed to a God or the God-like? Was, then, the very ascription of the Emperor’s Divinity a hubristic human misattribution, complicit in the belligerent regime that had caused unspeakable war crimes? Perhaps some metaphysical gymnastics might salvage the doctrine of Divinity, if there was one. Yet surely, in some incontrovertible way, the war had defied the Divine, or shooed away whatever presence it had in our world to begin with.
The Relationship to the Sacred
In medieval Europe, and probably well into the modern period, broad as that designation may be, the predominant Christian worldview separated the realms of the secular and the sacred. The commonplace definition of the Divine holds it to be an attribute of God or the God-like. But the designation “of God” in the Oxford English Dictionary leaves much to be desired as regards a definition. The sacred, meanwhile, is defined as “of the Eucharistic elements” or “consecrated to; esteemed especially dear or acceptable to a deity”. So if the divine is of God, then the sacred is an object or activity that honours or pleases God. In our secular and largely materialistic worldview, these definitions seem arcane and superannuated (a fancy word meaning outdated through new developments).
Yet, it seems to me that even if we’re uncomfortable with the idea of God and may reject the notion of a deity, both the divine and the sacred appear too important to dispense with. Why? I venture to say because they appear to be salient aspects of the world we inhabit. How could this be if their definitions point toward a transcendent world?
There is a tradition in philosophy rooted in empiricism that defers ontological questions, questions about what sorts of entities the world is composed of, to the best scientific theory. The question then becomes, by what standard and criterion do we adjudicate the best scientific theory or theories? The philosopher Willard Quine, a proponent of this view, thought of science as an extension of common sense through systematic study of the world. Did Quine think that science must be constrained by some extra-scientific rational standard? He did not. He thought that rationality itself, just like the objects of science, must be open to revision and improvement by that very science. For Quine, ordinary language provides a schema of posits and objects about what there is in the world. Serious science, meanwhile, regiments the vocabulary of ordinary language into accordance with the available evidence that contemporary scientific techniques gather. This regimentation supplants networks of concepts like “sacred” and “divine”, but also “knowledge”, “beliefs”, “thought”, etc., with networks that admit only predicates applicable to the relevant theory. To take an example, concepts like “knowledge”, “belief”, “thought” might be replaced by tried and tested variables like “mental models”, which might be further refined by reference to neural circuitry and so on. Ordinary language, then, forms a vague shorthand for the more precise and more objective vocabulary of science.
If Quine is right, then notions like the “sacred” and “divine” may be consigned to the dustbin of history. Quine did not think that the ontology of science, namely what it ascribes existence to, was ever final or true in the colloquial sense of the word. In fact, he held that “physical objects are conceptually imported into the situation as convenient intermediaries not by definition in terms of experience, but simply as irreducible posits comparable, epistemologically, to the gods of Homer…the physical objects and the gods differ only in degree and not in kind.” In other words, the “sacred” and “divine” are simply replaced by more precise vocabulary that fits with the broader evidence that science has brought to light. It’s not that these concepts are wrong or false, just not consistent with the conceptual schema that yields the best predictions. With characteristic wryness, Quine quipped that “language is conceived in sin and science is its redemption”.
Notwithstanding Quine’s rather impoverished understanding of language (a limitation he shared with the logical positivists by whom he was influenced), one can deny that the vocabulary of the “sacred” and “divine” has been eclipsed by science. What do I mean by that? Certainly one can, even within the admissibility of Quine’s own views, retain the conceptual schema of the “sacred” and “divine” as part of a particular religious framework like Christianity, a pan-religious one that considers commonalities between religions, or even as notions to be explained in the context of religions as social phenomena. I hold, however, that the “sacred” and “divine” signify apart from the religious dogmas within which they are instantiated, such that rejection of such dogmas does not, by the same token, justify overlooking these notions. Yet I admit that I am not entirely clear about the terms in which I wish to rehabilitate them: within a naturalistic framework consistent with science, or within a vocabulary that works in the margins of understanding and the rationally sayable.
Harbouring affinities with Quine’s naturalism, nay, espousing some form of naturalism myself, albeit not identical with Quine’s, I’m inclined to seek natural processes as the underlying signified of the signifiers “sacred” and “divine”. And yet, I demur in carrying out this reduction. For part of the signification space into which these concepts expand demands to remain opaque, not rationally explicable. It is that element of opaqueness, the inability to exhaustively explicate what they mean, that imbues them with their power, proximate as they are to our source of anchorage in the world.
Where doth the Sacred Reside?
The philosopher Evan Thompson, influenced by the phenomenological tradition and the idea of autopoiesis propounded by the biologists Francisco Varela and Humberto Maturana, characterizes the human being as a self-generating, autonomous system that creates and maintains its own cognitive domain. An autonomous system is one whose activity is generated by an endogenous set of internal dynamics, rather than “external causes”. Autonomous systems are often characterized as a subset of “dissipative”, or thermodynamically open, systems. A paradigm case of the latter is a tornado. A tornado has an internal dynamic that reproduces a steady state far from thermodynamic equilibrium. When a system is far from thermodynamic equilibrium, the concentration of energy within its internal dynamics differs from the concentration of energy outside. In fact, such a system harnesses energy from the environment to reproduce its “steady state”. In this respect, both a living organism and a tornado are “dissipative”, far-from-thermodynamic-equilibrium systems. What’s the difference? The tornado is not autonomous because its internal dynamics are not independent of the environment. The tornado’s lifecycle terminates when its structural evolution prevents the inflow of warm air from powering it. A cell, by contrast, is a type of structure enwrapped in a set of relations that continually reproduce the boundary conditions between itself and the world. It does so by continually recreating a semi-permeable boundary that exchanges energy with its environment to maintain its identity. In simple organisms, this set of endogenous relations amounts to the metabolic cycle that produces homeostasis. Autopoiesis refers to a system that instantiates just these properties: a semipermeable membrane, an internal reaction network (e.g. a metabolic network that produces proteins), and their interdependence (the internal reaction network depends on the boundary to recur, and vice versa).
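The circular dependence just described, in which an intact boundary makes nutrient uptake possible and metabolism in turn rebuilds the boundary, can be caricatured numerically. The sketch below is a toy illustration under invented parameters and an invented update rule, not Varela and Maturana's formal model:

```python
# Toy numerical sketch of autopoietic interdependence (illustrative
# only; the parameters and update rule are invented for this example).

def step(boundary, nutrients, decay=0.1, uptake=0.5, cap=0.6, yield_=0.25):
    """One cycle: the boundary decays, nutrients enter *through* the
    boundary, and the internal reaction network spends nutrients to
    rebuild the boundary. Each process presupposes the other."""
    if boundary <= 0:
        return 0.0, nutrients           # disintegrated: no exchange, no repair
    nutrients += uptake * boundary      # uptake requires an intact boundary
    spent = min(nutrients, cap)         # repair requires nutrients
    nutrients -= spent
    boundary = (1 - decay) * boundary + yield_ * spent
    return boundary, nutrients

b, n = 1.0, 0.0
for _ in range(50):
    b, n = step(b, n)
print(f"boundary integrity after 50 cycles: {b:.2f}")  # settles near 1.5
```

If the boundary ever reaches zero, uptake and repair both cease: it is the mutual dependence of the two processes, not either one alone, that sustains the system's "steady state" far from equilibrium.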
Even though the minimal autopoietic system is a single cell, the argument goes, complex organisms such as us are also autopoietic systems, albeit scaled up. Some of our most distinguishing features, like consciousness and thought, are not unlike metabolism; in fact, they are continuous with it. In other words, there’s an intimate connection between cognition and autopoiesis. An autopoietic system necessarily exchanges energy and nutrients with its environment to maintain itself, but it also, ipso facto, exchanges “information”. Information at this scale consists entirely of physical processes, but in order for a cell to remain viable in its environment it must select the right compounds to assimilate into its metabolic processes. Thompson characterizes this orientation toward relevance as a normative relation to the world that begins at the cell and scales up to meaning-creation for human beings. There’s a deep continuity between the two: to be in the world is to be normatively coupled with the environment for viability. Phenomenologists like Husserl, Heidegger and Merleau-Ponty deeply grasped this point: the environment is brought forth and spun through the human being’s actualization of itself. Heidegger brings this into relief most starkly when he says that meaning or intelligibility is our most primordial relation to the world. Meaning, therefore, is not an exogenous exchange of “objective” information, but the endogenous (originating inside) product of the autopoietic system’s concern for its viability and flourishing.
Within the autopoietic paradigm, human cognition is both essentially temporal and embodied. It is temporal in the sense that information processing at the level of consciousness and thought situates the human being in a futural horizon toward which it stakes its becoming, and embodied in the sense that the human being progresses toward that futural horizon through a set of skillful activities that combine higher-order cognition with sensorimotor know-how. The internal norms of the human being form a complex and stratified set of embodied circuitry, specific to the organism’s historicity, that sets the stage for its future evolution against an environment that’s both informational and social. This radical temporality implies that the human being must continually, in fact incessantly, bring forth its environment in accordance with its internal norms, in a way in which norm and environment are immanent, that is, mutually inhere in each other. Immanence is not a given aspect of our being in the world; it requires effort, yet it is something our internal emotional and bodily rhythms recognize at a pre-reflective level. Immanence, then, is a state that can be summoned into the present as a certain orientation toward the future that attunes us toward flourishing and life-affirmation.
Immanence, however, implies transcendence. We cannot summon immanence without its implying a transformation, a welling outward of our identity into a new synthesis. Immanence, in this sense, is the actualization of the process of the human being’s incessant welling into novelty. Transcendence, meanwhile, is what immanence works out: it positions the organism in the plane of actualizing its preferred possibilities against the labyrinths of social and material uncertainty. Let me put it this way: when the organism does not proactively effectuate its own milieu, where it exercises some degree of mastery and sense of accomplishment, other actors in the world crowd its space and increase the uncertainty of the actualization of its preferred outcomes, and by extension, its viability. The metabolic metaphor does not sufficiently capture this because its adjustments are reactive, not proactive; hence the need to posit a counterpart to metabolic adjustment: allostasis, the process of regaining homeostasis (physical and chemical equilibrium) through predictive processing. The organism scours its futural search space through its cognitive tools (whatever they may be) to fashion a snug set of environmental parameters that do not interfere with its milieu generation. This implies that the organism reduces uncertainty by predicting outcomes, but also transforms uncertainty into immanence by hewing out its own environment. Transcendence, therefore, is a structural feature of the human being that Martin Heidegger recognized as its not-yet, its being always-already-ahead-of-itself. He says: “The ripening fruit, however, not only is not indifferent to its unripeness as something other than itself, but it is that unripeness as it ripens. The ‘not-yet’ has already been included in the very Being of the fruit, not as some random characteristic, but as something constitutive.
Correspondingly, as long as any Dasein [human being] is, it too is already its ‘not-yet’” (Heidegger, 288).
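The contrast drawn above between reactive, homeostatic correction and anticipatory, allostatic regulation can be made concrete with a toy control loop. This is an illustrative sketch, not a physiological model; the periodic disturbance, the gain, and the perfect internal model of the disturbance are all invented for the example:

```python
# Toy contrast between reactive (homeostatic) and predictive
# (allostatic) regulation of a single variable. Illustrative only:
# the disturbance schedule, gain, and internal model are invented.

def regulate(predictive, steps=100, gain=0.5):
    """A variable x is knocked off its setpoint (0) by a periodic
    disturbance. Reactive control corrects only after the error has
    appeared; predictive control uses an internal model of the
    disturbance to cancel it as it lands."""
    x, cumulative_error = 0.0, 0.0
    for t in range(steps):
        disturbance = 1.0 if t % 10 == 0 else 0.0
        if predictive:
            x -= 1.0 if t % 10 == 0 else 0.0   # anticipatory compensation
        x += disturbance
        x -= gain * x                          # reactive correction toward 0
        cumulative_error += abs(x)
    return cumulative_error

reactive = regulate(predictive=False)
allostatic = regulate(predictive=True)
print(f"cumulative error, reactive:   {reactive:.2f}")
print(f"cumulative error, predictive: {allostatic:.2f}")
```

With a perfect internal model the predictive regulator accrues no error at all, while the reactive one perpetually chases the disturbance after the fact; an organism sits somewhere between the two, trading the accuracy of its predictions against the cost of lagging correction.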
There is a deeper point here whose exploration is beyond the scope of this article. The organism, by virtue of its cognitive and material limitations, cannot ever, in principle, eliminate all uncertainty. That is to say, the organism is already attuned to the presence of uncertainty in a fundamental way: it prefers to confront it, to be in its midst, so that it can convert it into variables amenable to its ends. The organism cannot ever be satisfied in a space devoid of uncertainty, where it cannot propel itself in this process of converting the uncertain into a domain of knowledge. The sacred is exactly the enacting of this process. Ritual is both an acknowledgment of this deeper structure latent in ourselves, continuous with our evolutionary past, and a way of bringing this state of alertness and orientation toward the unknown into the fold of the present. The unknown is so much vaster than we can ever compute; no numerical value could capture its extent: it overwhelms all our meagre knowledge and focal points. The sacred, thus, brings us into confrontation with the divine, immanence with transcendence.
Some rites of Christianity, for example, define immanence as the expression or revelation of the hypostases (the underlying states that support all else) of God in the worldly realm. Many rites of Christianity celebrate the immanence of the triune nature (three in one) of God through the liturgical feast of the Theophany/Epiphany (the appearance of a deity in human form). Even though I’ve talked about the individual human being as an autopoietic system, human beings also summon immanence communally as an expression of their temporal character. Yet, in doing so, they celebrate the transcendence of God as a representation of the unknown, the vaster realm of uncertainty against which we as organisms swim. Collective affirmation of the mystery has been expunged from our culture, to the detriment of our collective targets, aims, and aspirations. Collective rituals that affirm the mystery, not externally induced rituals of consumption, help large collectives regain focus on the necessity of summoning immanence and the sacred in order to confront the transcendent. We cannot do any of this without some inkling, some smidgen, some scintilla of reverence. We must revere this process, be sensitive to its demands, and enshroud it with certain principles of inviolability, sheltering it from the destructive processes of entropy, whose numerical possibilities dwarf those of our viability.
Irreverence and the Obscene
The tenor of our lives fluctuates with seldom acknowledged irony between the momentous seriousness of our exertions and the farcical absurdity of their failures. These are two sides of the same coin. The seriousness of our exertions and consummate self-involvement befit ridicule and mockery. Some of us, having recognized this unresolved paradox, withdraw from the weightiness of our Sisyphean undertakings and pursue modest stewardship of our affairs somewhere in the margins, choosing to parse the unfolding of the world through the prism of levity. From this vantage point, the boisterous and irascible churning of the human spirit acquires an air of absurdity, its achievements revealed to be swaddled in layers of glibness, asininity and fatuousness. The hewing of cultural gems occurs precisely inside the flotsam and jetsam of this vaster wreckage.
It then behooves us, from time to time, to delve low and ridicule this condition, ourselves in its midst, but also our fellow humans, who seldom choose to peek at themselves to get the serene sense that, at the end of ends, it was all for naught. The crescendo of this movement we’ve been thrust into presages only momentary peaks and summits but spills ultimately into a torpedoing nadir. In letting go of all pretence, thus, we gain access to a feast of preposterous proportions: a lewd and debauched trail of unrealized desires, teetering disasters, the hilarity of despair, and the constant desecration of our loftiest ideals.
And so it is befitting, that Gargantua, the eponymous protagonist of the picaresque novel Gargantua and Pantagruel, creation of the Renaissance French physician Francois Rabelais (who, through his profession, must have surveyed the vast spectrum of the human condition), is birthed not from the womb, but from his mother’s ear; not crying, but demanding with a booming, masculine voice to be served spirits, and imploring the company around him to drink along in celebration. Far from the pious imagery of Scripture’s Nativity, we are offered its complement, that aspect of our reality that escapes codification into the holy, hallowed, and good. That aspect that we ignore, look away from, and, more often than not, omit from the tales we tell about ourselves.
Gargantua’s tumble into the world spawns a revelrous carnival of obscenity:
Gargantua from three yeares upwards unto five, was brought up and instructed in all convenient discipline, by the commandment of his father; and spent that time like the other little children of the countrey, that is, in drinking, eating and sleeping: in eating, sleeping, and drinking: and in sleeping, drinking and eating: still he wallowed and rowled up and down himself in the mire and dirt: he blurred and sullied his nose with filth: he blotted and smutch’t his face with any kinde of scurvie stuffe, he trode down his shoes in the heele: At the flies he did oftentimes yawn, and ran very heartily after the Butterflies, the Empire whereof belonged to his father. He pissed in his shoes, shit in his shirt, and wiped his nose on his sleeve: He did let his snot and snivel fall in his pottage, and dabled, padled and slabbered every where. He would drink in his slipper, and ordinarily rub his belly against a Panier: He sharpened his teeth with a top, washed his hands with his broth, and combed his head with a bole…oftetimes did he spit in the basin, and fart for fatnesse; pisse against the Sunne, and hide himself in the water for fear of raine.
When Gargantua’s father, Grangousier, returns from his military exploits in the Canary Islands, Gargantua relates to him how he has “by a long and curious experience found out a meanes to wipe [his] bum, the most lordly, the most excellent, the most convenient that ever was seen.” Gargantua, therewith, enumerates in minute detail these “meanes”, which include “a Gentlewoman’s velvet mask” that he found to be good because “the softness of the silk was very voluptuous and pleasant to [his] fundament”, “a March-Cat” whose “clawes were so sharp that they scratched and exulcerated all my perinee”, then with “sage, fennil, anet, marjoram, roses, gourd-leaves, beets, wool-blade” but also with “mercurie, nettles, and comfrey”, and with “sheets, curtains, cushion and Arras hangings.” While these present only a sample of what Gargantua enumerates, he concludes that “of all torcheculs, arsewips, bumfodders, tail-napkins, bung-hole cleansers and wipe-breeches, there is none in the world comparable to the neck of a goose” (Rabelais, 56).
So impressed is Grangousier with the “marvellous understanding of his sonne” that he procures the tutelage of schoolmasters for his education, which renders him a great fool, it being found that “it were better for him to learne nothing at all, then to be taught such like books, under such Schoolmasters, because their knowledge was nothing but brutishnesses, and their wisdome but blunt foppish toyes, serving only to bastardize good and noble spirits, and to corrupt all the flower of youth” (Rabelais, 59). Through Gargantua’s frame of reference, Rabelais thus turns the world as we know it topsy-turvy, subverting the common prism through which we refract it into the valuable and invaluable, the sacrosanct and blasphemous. By elevating the mundane, the obscene, the vile, the irreverent, he excavates the riches of the body and the human experience often swept under the rug by ecclesiastical bodies, and brings them into relief against the narrow prism of the codified and the hallowed.
In Rabelaisian vein, kindred spirit and falsely accused misanthrope Henry Valentine Miller, more than any other 20th-century writer, carries the revelrous torch of the fundamentally absurd and burlesque theatre of life. In the opening lines of his Tropic of Capricorn, chronicle of his American experience and eventual flight to France (the latter chronicled in the earlier, infamous Tropic of Cancer), he explains the origins of his attitude:
From the very beginning I must have trained myself not to want anything too badly. From the very beginning I was independent, in a false way. I had need of nobody because I wanted to be free, free to do and to give only as my whims dictated. The moment anything was expected or demanded of me I balked. That was the form my independence took. I was corrupt, in other words, corrupt from the start. It’s as though my mother fed me a poison, and though I was weaned young the poison never left the system. Even when she weaned me it seemed that I was completely indifferent; most children rebel, or make a pretense of rebelling, but I didn’t give a damn. I was a philosopher when still in swaddling clothes. I was against life, on principle. What principle? The principle of futility. Everybody around me was struggling. I myself never made an effort. If I appeared to be making an effort it was only to please someone else; at bottom I didn’t give a rap. And if you can tell me why this should have been so I will deny it, because I was born with a cussed streak in me and nothing can eliminate it. I heard later, when I had grown up, that they had a hell of a time bringing me out of the womb. I can understand that perfectly. Why budge? Why come out of a nice warm place, a cozy retreat in which everything is offered you gratis? (Miller, 10)
In accordance with this, perhaps, veiled apathy, this great hurt the world had inflicted upon him from the very outset, we get the dispassionate account of the death of his best friend as a boy, which, freed from the shackles of concern and, let’s face it, the sham of pretense that most of us are loath to relinquish, focalizes the hilarity of the human theatre:
…things were wrong usually only when one cared too much. That impressed itself on me very early in life. For example, I remember the case of my young friend Jack Lawson. For a whole year he lay in bed, suffering the worst agonies. He was my best friend, so people said at any rate. Well, at first I was probably sorry for him and perhaps now and then I called at his house to inquire about him; but after a month or two had elapsed I grew quite callous about his suffering. I said to myself he ought to die and the sooner he dies the better it will be, and having thought thus I acted accordingly: that is to say, I promptly forgot about him, abandoned him to his fate. I was only about twelve years old at the time and I remember being proud of my decision. I remember the funeral too — what a disgraceful affair it was. There they were, friends and relatives all congregated about the bier and all of them bawling like sick monkeys. The mother especially gave me a pain in the ass. She was such a rare, spiritual creature, a Christian Scientist, I believe, and though she didn’t believe in death either, she raised such a stink that Christ himself would have risen from the grave. But not her beloved Jack! No, Jack lay there cold as ice and rigid and unbeckonable. He was dead and I was glad of it. I didn’t waste any tears over it. I couldn’t say that he was better off because after all “he” had vanished. He was gone and with him the sufferings he had endured and the suffering he had unwittingly inflicted on others. Amen!, I said to myself, and with that, being slightly hysterical, I let out a loud fart — right beside the coffin. (Miller, 15)
The Tropic of Capricorn, like Miller’s other writings, is both an indictment of the civilizational burden and a cathartic return of the repressed, perhaps in dated Freudian fashion, that affirms life in the thick of its misery and failures, that pulls back the curtain on the machinations of human bullshit to reveal the comedy rather than the drama, the fatuousness rather than the gravitas. The sexual “exploits” recounted in his work are not gratuitous, but life-affirming; for Miller too, like the people he describes, more often than not, with callous detachment, is also an absurd actor in the theatre of the absurd.
Irreverence, as a mode of orienting ourselves in the world, thus opens up vast riches of experience as valid. It mocks our toiling spirits, and in doing so it permits us to take pleasure in failure, in the mundane, in the grotesque; it levels the playing field between high and low, sacred and profane; it takes us outside of ourselves, and in doing so it subsumes into the focal and salient a broader expanse of our experience.
The Dialectic of the Sacred and Profane
Did the Japanese Emperor, then, forfeit the divine for himself and his people?
As I’ve argued, the sacred and divine are concepts that encapsulate complex structural features of what we are. They are the highest expression of ourselves as individuals, but also as collectives, polities, and communities. The sacred and divine are not immutably tethered to some doctrine or other; the sound concepts enshrouded within can be dislodged from their embedment in a particular religion or dogma. But this does not mean that the sacred and divine do not entail moral imperatives. We know, in the depths of our being, through the complexities of affect or feeling (entangled as they are with thought), when we have prorogued the sacred. Yet we can re-summon it by heeding its sensitive terms. When we act in such a way that the sacred is summoned, we fortify our agency as proactive actors in the world. In doing so, we become more likely to spread the confidence it bequeaths upon us onto others by acting with goodwill and benevolence in our dealings at large.
The renunciation of Divinity by the Japanese Emperor was a renunciation of a compromised, contaminated evolution of Divinity as a possible expression of a people sanctioning dominion over another. Yet this does not mean that an extra-dogmatic expression of divinity cannot be rehabilitated and amended by purging its immoral implications. Knowing when our articulation of these concepts has gone astray (including denying their centrality in the human project) so as to yield human failings of the scale the war revealed, remains a task in progress even in our age of unrestrained economic activity.
Yet, our pursuit of the sacred-divine, even when we do not couch it in those terms, must give way inevitably to the profane. This is because the sacred and divine both rest upon and must arise out of the sea of the profane that is our lives. For the value of human activity to crescendo into an expression of immanence, we must first bathe in the profanity of our lowest drives and needs. And finally, the revving of our temporal character into the plane of immanence cannot be sustained without eventual release into the obscene, the coarse, and the crude. Reverence, then, must give way to irreverence, and vice versa, in a dialectical motion. The monk must ridicule his religion at times and make jokes at its expense, lest he lose his mind. Our reverence must never be such that we cosset anything as beyond reproach. Therein lies the danger. Investing the sacred and divine exclusively within formal power leads inevitably to their corruption.
Perhaps the most acute expression of this notion was embodied in the Festum Fatuorum, or Feast of Fools, in medieval France. Celebrated by clergy, the feast involved a reversal of social order, whereby subordinate clergy would assume the roles of higher ones, and liturgical practice would be parodied by electing a false Pope or Bishop. The true character of the feast has been obscured by misrepresentation and exaggeration, for the alleged extravagances and abuses perpetrated during it were rarely committed to writing. Persistent condemnation by the Church led to the eventual eradication of the ritual, yet it is precisely the lack of documentation and codification that bears noting. Rabelais and Miller are figures who fill the void in our civilizational dialectic for these releases and extravagances that subvert formal power and at the same time replenish our appetite for the sacred.