Friday, December 19, 2014

Christmas in a Time of Culture War

It is that time of year again: Christmastime. To paraphrase Dickens (though not from A Christmas Carol), it is the best of times, and also the worst of times. The time when holidays are exhausting, the people you spend it with tiring, and the inevitable return to work depressing. Just like every other time of year, in fact, only more so.

It is also the time of year when a small but vocal sub-section of the population laments the fact that no one seems to remember 'the reason for the season.'

You may know to whom I refer: the cultural warriors.

They are the ones who want to put Christ back into Christmas, to rescue it from rampant consumerism. Or they want to take Christ out of Christmas because it has become much too commercialized.

The semantic argument neatly parallels the cultural criticism. If Christ is in Christmas, that is because the word itself contains a historical trace of things that once were, namely, the celebration of Christ's mass--or Crīstesmæsse in ye Olde English. And if Christ has been lost to the celebration of Christmas, that is because people now find themselves animated by baser motivations, like the desire for new toys, new clothes, and, with a little less frequency, books. New game consoles, electronic devices, or perhaps just straight-up cash, to be spent however the recipient so desires, are also possibilities.

The rest of the population--tucked comfortably in their secular beds in the feigned hope that a rotund gentleman in a red suit, the memory of an Eastern Orthodox saint revived by a major soft drink producer in the first half of the 20th century, will shimmy down the chimney--wonders what all the huff and puff is about. Live and let live, they say. It's all just stories, anyway. What really matters is that you enjoy the time you have, and with family, if you have one.

Welcome to Christmas in a time of culture war. The culture warriors wag their collective finger at the rest of the population, who happily mold the holiday to their secular ends. They pit the virtues of faith against the indifference of reason; the will to believe against the tepid bath that is common sense. The battle is fought almost exclusively, and certainly ineffectually, from one side, to the great annoyance of the other.

Christmas in a time of culture war is more accurately described as Christmas in a time of material plenty. The capitalist organization of the economy is the real miracle here. (No really, it is.) The competitive organization of the marketplace has freed persons to improve their material lot through their own creative industry. This translates, in turn, to an increase in financial resources, which can be invested back into one's own business, or can be used to purchase things like food, clothing, shelter, or, in line with our theme, Christmas presents.

Consumption, even conspicuous consumption during the holiday season (when, we are reminded, business either makes or breaks its yearly budget), is the engine driving the entire operation. Consumption drives a cycle of economic expansion, which rains blessing on the masses. But, if the culture warriors are to be believed, the cycle is one with vicious moral and spiritual consequences. Market economics and capitalist finance have endowed their beneficiaries with more personal freedom than has hitherto been known anywhere on earth. We no longer have to wait on others to give us gifts, which we don't deserve anyway. Now, like the proverbial self-made man who pulls himself up by his own bootstraps, we can buy gifts for ourselves--the sort that we really do deserve.

The consequence of material plenty is the schizophrenia of culture war.

Think it over for a moment. The old Christmas stories are typically set in economically trying times. This is as true in the Gospel narratives as it is in Dickens' A Christmas Carol. The stories are about the needy and vulnerable, just trying to eke out an existence on the edge of civilized life. Or think of the Christmas Truce on Christmas Eve in 1914, when 100,000 British and German soldiers decided singing Christmas carols and playing football (i.e. soccer) with the enemy was better than depriving him of breath and life. The context makes the act of giving especially poignant.

At Christmas in a time of material plenty, calls to serve God instead of Mammon amount to the pot calling the kettle black. There are poor people surrounding us on all sides. But they are not the ones vocalizing their displeasure with consumerism. The ones who feel the pin-prick of conscience are the ones who raise their voices.

But the Christmas story is more 'secular' and so more materialistic than the cultural warriors allow. The accounts of Jesus' birth in Matthew 2 and Luke 2 contain no message condemning material goods, nor any call to stand apart from the rest of consumerist culture. Quite the opposite. In Matthew 2, the Magi bring expensive gifts of gold, frankincense, and myrrh to the boy-child in Bethlehem. In Luke 2, his parents dutifully go to Bethlehem to register in the Roman census, but, as the story goes, find no room in the local inn, and are forced to settle for a manger in a stable. The image painted is rustic, not ascetic. It is not a matter of denying oneself the goods one can afford, but of being grateful for the gifts that one cannot.

And finally, there is the general question of what the Christmas story is supposed to amount to. The Gospel of John does not have an explicit account of the birth of Jesus. But its prologue (John 1) does retell the creation narrative in Genesis 1 in the light of Jesus' coming. It describes how 'the Word became flesh' and 'made his dwelling with us.' The latter phrase could also be rendered as 'tabernacling' or 'tenting' with us. There is no escaping that the meaning of the phrase is thoroughly materialistic. God 'tents it' with humanity precisely by taking on humanity's ruder, material nature. In other words, God is the quintessential consumer who likes stuff just for the sake of liking stuff--the stuff we are made of, the stuff we have, and so on.

The lesson in all of this, I suppose, is that if the cultural warrior finds their material needs more than satisfied this Christmas, they should donate something to charity and/or shut up. Bemoaning the fact that other people either have forgotten about or don't know the 'reason for the season' is only to put one's own self-righteousness on display. And no one wants that.  It is Christmastime, after all.

Saturday, March 08, 2014

Quebec Charter Troubles

A closer look at Bill 60, the so-called 'Quebec Charter of Values', has me scratching my head. I don't claim to be a legal expert. I probably don't have the correct technical vocabulary at my disposal. Still, I can't help but think that the Charter's proposed amendments to the Preamble and Section 9.1 of the Quebec Charter of Human Rights and Freedoms (1976)--amendments adding a lengthy reference to the values it promulgates, like 'state secularism' and 'religious neutrality'--have overstepped some legal limit or violated some legal precedent. From the French Declaration of the Rights of Man and the Citizen (1789), to the American Bill of Rights (1789), to the Universal Declaration of Human Rights (1948), to the Canadian Charter of Rights and Freedoms (1982), to the Quebec Charter of Human Rights and Freedoms itself, no one seems to have thought it necessary to enshrine the secularity of the state.*

That is very strange. That is so strange, in fact, it deserves a moment's pause. Every single one of these other documents is a bona fide 'secular' document. They are 'the real deal' as far as pieces of secular legislation go. So why the difference?

My honest answer is that I don't know. I have my suspicions, of course. What I don't have is the knowledge or resources at hand to provide an exhaustive literature review. The best I can do is list a few things that come to mind.

My leading suspicion is that in amending the Quebec Charter of Human Rights and Freedoms, the PQ government has not the foggiest idea of what bills and charters of rights are traditionally intended to do. Such documents were drafted for the protection of individual citizens against the arbitrary depredations of the state's representatives. Such documents are made law so that everyone plays by the same rules--and so, by extension, persons in positions of authority don't abuse the authority the state delegates to them to see that its business is carried out.

It would never occur to the drafters of these bills and charters of rights to define the state as secular because their character is already intrinsically secular. It is in the very thing to which they give formal definition: that every single person possesses something like 'inherent dignity' and 'inalienable rights', which cannot be altered by any 'external' consideration--like a physical or mental handicap, or physical appearance or choice of clothing, or the sorts of groups they associate with, to name but a few possibilities.

By adding reference to the secularity of the state to the Quebec Charter of Human Rights and Freedoms, the Quebec Charter of Values seems to fall into the error of treating the state as if it were an individual, whose dignity and rights needed protecting. But protection from what or whom? Other individuals?

When a state has enshrined its own secular character alongside the dignity and rights of individuals, it seems to me something has gone very wrong. The state outsizes all the other persons in the room. In practical terms, the law no longer protects the 'inherent dignity' and 'inalienable rights' of individuals qua individuals, but privileges individuals qua some external consideration or other.

While I have no objection to the Quebec government legislating for the protection of the French language and culture, using the Charter of Values to amend the Charter of Human Rights and Freedoms seems to be the wrong way to go about things.

The latter Charter is a great deal less free, after all, when the former Charter requires that the following caveat be made:

“In exercising those freedoms and rights, a person shall also maintain a proper regard for the values of equality between women and men and the primacy of the French language, as well as the separation of religions and State and the religious neutrality and secular nature of the State, while making allowance for the emblematic and toponymic elements of Québec’s cultural heritage that testify to its history.”


*A friend of mine pointed out that the French Constitution of 1958 actually does make explicit reference to the secular character of the state. I note it also makes reference to the French language. The more I think about this, the more 'French' the Quebec Charter of Values looks, and the more Anglophone my thought processes appear.

Thursday, March 06, 2014

The PQ's Catholic Hangover

The Quebec Charter of Values, otherwise known as the controversial Bill 60, is a peculiarly Catholic document. The irony ought not to be lost on anyone. The document is essentially a lengthy reflection, in the form of a piece of proposed legislation, on the nature of secularism. It holds up as an ideal the separation of 'religion' and 'state', but ends up perpetuating a very Catholic notion of secularity. The long arm of the Catholic Church, it seems, still exerts its influence long after the Quiet Revolution. The evidence is there for everyone to see in Chapter 1, Section 1, in which the state's secularity is affirmed except in instances related to the elements of Québec's 'cultural heritage that testify to its history'. But that is only a superficial matter. The influence of Catholicism runs much deeper.

The idea of religious belief propagated in the contemporary media is of truth claims that can be easily disproved by the methods of natural science. Evangelical creationism falls into this category, as does a general belief in miraculous occurrences. Other examples of this way of characterizing religion can easily be found. But this is to misrepresent broad sweeps of religious belief and practice. The idiom in which religious beliefs, in particular the 'ethical monotheisms' of Judaism, Christianity, and Islam, more naturally make their home is that of political theory, of law and contract, of judgment and negotiation. The practical expression of what are sometimes undoubtedly abstruse metaphysical claims respecting God is found in relations between persons. Hence sacred texts stipulate morality, lay out a general order for relations between rulers and subjects, and advocate for the materially disadvantaged.

Different religious traditions are nonetheless constituted by different emphases. In the Western world, for example, there are, very generally speaking, two forms of atheism: Protestant atheism and Catholic atheism. These engender two very different responses to religion. The basic difference comes down to how authority is understood to be mediated to the individual, whether in the form of a text (like the Bible) or in the form of a person (like a priest or bishop). On the whole, Protestantism tends to be more textually oriented, whereas Catholicism has a much more clerical focus. Hence, when persons with a Protestant background embrace atheism, it tends to be an atheism of a more cerebral sort, which homes in on the 'irrationalities' of religious belief. When persons with a Catholic background reject the faith, by contrast, their rejection tends to assume an anti-clerical form.

The same sort of logic applies to how one understands secularism. Protestant secularism tends to be more abstract, to concern itself with how a person thinks, rather than with how a person appears. As long as a person's outlook regarding the public sphere lines up with the rest of us secularly-minded folk, they can think what they want (within reasonable limits) and wear what they want (at the risk of drawing stares). On the other hand, Catholic secularism assumes the aforementioned anti-clerical form and fixates on the manifestation of authority. The differences between Catholic and Protestant secularism, of course, are not hard and fast. But I find the difference of priorities between Francophone and Anglophone communities too uncanny to pass by without comment.

The sorts of things being proposed in the Charter should now come into clearer focus.

The main task of the Charter is to legislate how the representatives of the sovereign authority shall manifest its secular nature. It identifies both the covering of the human face and the overt display of religious symbols as fundamentally contrary to state secularism. Persons in the employ of the provincial government, who are the visible manifestations of its authority, are instructed to remove any offending garment or decoration in the prosecution of their duties.

Notably, I think, the Charter does not bother to define what secularism is, nor what religion is, nor what the religious neutrality of a secular state is. The only concrete statements it makes regard how the employees of the provincial government ought to appear while they discharge their duties. The rest is simply assumed.

So Catholicism continues to leave its negative impress on Quebec politics. Proof is found in Anglophone exasperation over what is perceived to be the unfair targeting of Muslim women under the thin veil of a disinterested secular state. Anglophones cannot understand why what a person wears matters as much as it does, nor why it commands as much support as it seems to have among Francophones. Whereas persons like Premier Pauline Marois and Minister Bernard Drainville seem genuinely perplexed why anyone would oppose the idea of employees of the provincial government conforming to some basic secular standard of dress.

Wednesday, March 05, 2014

Another Quebec Referendum in the Works?

In the distant recesses of my memory, there is a vague recollection of 100,000 Anglophones traveling to Montreal to tell Francophones that a united Canada needed the province of Québec in order to be whole.

The large-hearted gesture seems to have had the desired effect. The referendum on Québec sovereignty in 1995 was settled with a narrow 51% voting against leaving the federal union of Canada. I remember commenting to one of my high school friends that a second defeat meant the sovereignty movement was now likely to disappear entirely from the Canadian political landscape.

At the time, I had no notion that I would ever make a home in Québec. The province, for all intents and purposes, was a foreign country. But it was something I was taught about in Canadian history class or learned about reading Canadian history books. In my mind, as a Canadian--which, as an Ontarian, is how I thought of myself--I identified with Quebecers because they too were Canadian.

It did not matter that I had never (at least to my knowledge) met a Quebecer. The stories were enough to sustain the mental connection. On the Plains of Abraham, my heart was with the valiant Montcalm, not perfidious Wolfe; just as it was with stalwart Brock on Queenston Heights, not the American invader. That Canada was an anachronism in both these instances mattered not one bit. Montcalm the Frenchman and Brock the Englishman stood for what would become Canada. So also with the French habitants and the Loyalist settlers in the Maritimes and Ontario. They stood for the Canada I inherited, and so they were both my figurative ancestors. (I cannot be the only person who grew up thinking this way.)

My Canada is bilingual and multicultural. My Canada embraces more than my lingually-challenged and culturally flat-footed self could ever be. At the same time, living in Montreal the past 4.5 years has allowed my idea of Canada to mature. I now recognize my Romantic idea of Quebecers was the idea of an Ontarian--an Anglophone's vision of what Francophones should be.

Yesterday Premier Pauline Marois called a provincial election on what nearly every observer and political commentator agrees are identity issues. The PQ has staked its political fortunes on the so-called Charter of Québec Values. The reasons appear entirely cynical. After the PQ won its minority government, its movement in the polls was flat. And then, for causes that appear inexplicable to an Anglophone, the party's prospects immediately improved after it found its wedge issue.

Whether Marois' thought processes were essentially cynical is a chicken and egg question. Which came first? The PQ's Charter of Québec Values, or the voters' desire for something like it? By fixating our attention on the actions of a few individuals on top, we risk misunderstanding the dynamics on the ground.

For the moment, the Charter seems to have done its work. The time has been judged right for the PQ to seek its majority in the provincial legislature. That, by calling an early election, Marois has backtracked on a resolution from last year to fix the date of the next election is immaterial at this point. Whether she is perceived to have used the election to avoid testifying before a legislative committee about her involvement in her husband's alleged misuse of government funds may not be. The Liberals and the CAQ are both likely to use these barbs to great effect.

But, as strange as it sounds, Québec is the one place in North America where it's not the economy: it's identity, stupid! Campaigning on the virtues of the Charter also seems the perfect way to test the waters for a future referendum. It is a way to change the political conversation, and even to generate future support. The PQ can also count on the fact that the rest of Canada's attitude towards Québec has changed significantly in the last two decades. One hundred thousand Anglophones are not making the trip to Montreal this time around.

Monday, December 23, 2013

A Christmas Story: The Coronation of Charlemagne

On Christmas Day in the year 800, the Frankish king Charles the Great, who is better known as Charlemagne, was crowned the Holy Roman Emperor.

Christmas stories are various and sundry. There are children’s stories like Rudolph the Red-Nosed Reindeer, the scriptural narratives of Jesus’ birth, or morality tales like Charles Dickens’ A Christmas Carol. The story of Charles’ coronation stands apart from the rest by being both factual and mainly a matter of academic discussion.

Unlike Rudolph or Scrooge, Charles is an actual historical figure. Unlike Jesus, he does not inspire cultural crusaders to draw up battle lines over the contested meaning of his earthly life. Though impressive in his own right, Charles cuts a rather mundane figure across the backdrop of human history. With so many other impressive figures from which to choose, he easily slips from view.

But this is not to say his story should not be taken down from the shelf and dusted off once in a while.

The details of his coronation are uncontroversial. Charles entered St. Peter's to celebrate Christmas mass. While he knelt before the altar, Pope Leo III placed a crown on his head and proclaimed him the Holy Roman emperor.

No one contests that a coronation took place on Christmas Day in the year 800. Owing to the distance in time and the scanty amount of evidence presently available, however, the meaning of the coronation is not entirely clear. Was the coronation the brainchild of the king or the pope? Which party stood to benefit? His biographer Einhard tells us that Charles was caught unawares when the pope named him emperor. Doubtless this was a well-crafted piece of political theatre. The humble king could be seen by all not to have sought this high office. The pope, as the exalted head of the Church, the Vicar of Christ, raised another up to oversee the mundane, worldly affairs of Christendom—its administration, defense, and the like—so that he could get back to the business of shepherding men’s souls to their eternal home. The ends of both Church and State were served.

Now, since he already had a kingdom, Charles gained nothing but a title in his coronation. The oldest son of Pepin the Short, he was co-ruler of the Franks with his brother Carloman from 768 to 771. Just as war between the siblings seemed about to break out, Carloman died from what were apparently natural causes. The source materials give us no reason to suspect any misdeed on Charles’ part. Nor would we expect them to. It is almost inconceivable that any medieval historian would begin the narrative of the reign of so successful a king as Charles with betrayal and murder. Lacking evidence, we may only gesture into a speculative void about what was actually the case.

But it is fairly easy to infer from the events of his life that Charles wanted an empire--and by implication the title that went along with it. Much of his life was spent in the saddle. He saw action in present-day France, Spain, Italy, and Germany, as well as on the islands of Corsica and Sardinia. Everywhere the borders of his realm were extended, northwards into Saxony, southward beyond Barcelona, and eastward towards the Danube. To his credit, Charles was also responsible for a series of economic, monetary, educational, and ecclesiastical reforms. These efforts were collectively responsible for the brief flowering of Frankish culture known as the Carolingian Renaissance. They could not have been effected without the support Charlemagne lent to the Church and to its institutional reform.

Ties between the Frankish court and the papal curia grew apace. The Church was a source of clerical support in both senses of the word: religious and administrative. So when Pope Leo fled Rome in 799, he went to Charles for help regaining the papal throne. Charles made his way to Rome in November of 800 at the head of an army. And a little more than a month later, at Christmas mass in St. Peter’s, Leo crowned him Imperator Romanorum, ‘Emperor of the Romans’. The moment had a double significance: it restored to the West the imperial authority that had departed three centuries earlier to Constantinople in the East; and it formally severed ties that had already been severed in practice with the Roman (read: Byzantine) Empress Irene in Constantinople. The material resources of the Eastern Roman Empire had long since become insufficient to maintain the temporal holdings of the Church in Rome proper against the depredations of the Lombard lords. The Western Church needed a Western champion.

Einhard claims Charles had no knowledge of the pope’s intention, and would not have agreed if he had: ‘[H]e at first had such an aversion that he declared that he would not have set foot in the Church the day that they [the imperial titles] were conferred, although it was a great feast-day, if he could have foreseen the design of the Pope.’

As we all know, every Christmas story ought to have a moral. That being the case, we must search for the moral of the story of Charles’ imperial coronation.

Perhaps we might see in Charles’ coronation a caution against confusing the jurisdictions of Church and State. The pairing of Charles and his biographer Einhard is strikingly similar to that of Constantine and his biographer Eusebius. The latter pair has come under considerable criticism in recent centuries for drawing politics and religion closer together than our modern liberal sensibilities are comfortable with. We might also draw a salutary lesson about the grand pretensions of political theatre. The vociferous claims Charles made about not having wanted an imperial mantle do not pass the smell test. The formal imperial inauguration only confirmed in theory what had already come to pass in practice.

These assessments contain a measure of truth in them; but they both go against the grain of human history. Time always moves forward, even as we look back and assess from where exactly we have come. The past was no different. Whatever judgment we pass against Charles must be made with this in mind.

So we ought to judge Charles against his predecessors, just as we must judge ourselves against our predecessors—one of whom was Charles.

The appropriate comparison is made with the pharaohs of Egypt, the kings of Babylon, and the emperors of Rome. Set alongside their imperial rhetoric, Charles’ denial of having sought an imperial mantle is startling. It matters very little, in this light, whether Charles’ display of humility was only pretense. What matters is that the pretense was necessary in the first place. Over the course of a millennium, the basic forms of political legitimation had been entirely inverted. The rulers of the ancient empires claimed to be gods or sons of gods. At least in theory, even if theory was not completely realized in practice, they were the all-powerful manifestations of divinity on earth. They were the soul animating the body politic; the lives of men were theirs to dispense with as they saw fit. The degree of divinity to which a ruler in the ancient world could lay claim, in fact, appears a function of the density, size, and sophistication of the civilization. The larger the political entity, the more precarious its existence; which meant that the measure of authority required to maintain a political entity increased exponentially, until it became only natural for kings to liken themselves to gods.

A clear division between Church and State is the first indication that something fundamental had changed in the transposition from the Ancient into the Medieval worlds. The authority of the gods over men is no longer concentrated in a single person; it is differentiated into separate forms, which serve to limit each other. The key to the transposition is in the pretended humility. The basic form of authority in the Ancient world was of the gods over men. Whereas the basic form of authority in the Medieval world was of God become man, and more precisely, a vulnerable infant. The actual exercise of authority may still entail the command of one man over the rest, but, to borrow a phrase from Yeats, it now slouches towards Bethlehem.

Sunday, August 25, 2013

Funding the Humanities

I won't bother to claim to be a dispassionate or disinterested proponent of the humanities. A program of religious studies, like the one in which I am enrolled, is about as humanistic as it gets. Not everyone would agree with the characterization, of course, but the inference is a sound one. Faculties of religious studies got their start as programs in the secular or scientific study of religion. They were supposed to be non-confessional; and so were much less concerned with the nature of divinity than they were with what this or that belief in divinity said about the human beings who held it. Focus was on the one thing about human beings that is difficult to explain away on other terms: why we believe what we think to be the case about X (where X can be anything under or over the sun), rather than merely what we think about X.

So my numbness to the fact that we in our collective wisdom have decided the humanities simply aren't valuable in the broad scheme of things thaws a little when the economist Christina Paxson offers 'The Economic Case for Saving the Humanities' over at the New Republic. The piece is an effort to turn the tables on the standard arguments against funding the humanities. If we in our collective wisdom deemed the humanities valuable, some of the monies pouring into faculties of science and medicine would be reassigned to history, philosophy, and the fine arts. The federal government would apportion a lot more money to research grants in African literature or Asian antiquities. And employers would eagerly hire persons demonstrating a capacity to learn, critically analyze, and achieve research and/or other goals.

But money isn't pouring in. Paxson points out the rationale for governments to invest in the so-called S.T.E.M. subjects--science, technology, engineering, and mathematics--is a simple matter of connecting the dots. The payout is calculably predictable, much like the sort of stuff dealt with in the subjects themselves. The same cannot be said of the humanities. Figuring out the monetary value of a study of the relationship between the two parts of Miguel de Cervantes' Don Quixote, for example, is like tilting at windmills. A post-structuralist reading of Xenophon's portrait of Socrates suffers from similar pecuniary under-determination. These cannot be quantified in the same way the matter of the S.T.E.M. subjects can be quantified. The consequence is that public servants, who must give an account of their funding decisions to their respective political constituencies, err on the side of caution and control for those variables which can be measured. And for the time being, the humanities live off a dwindling institutional memory of better days.

So we need to learn how to argue, Paxson says, 'there are real, tangible benefits to the humanistic disciplines—to the study of history, literature, art, theater, music, and languages.' No doubt she is right. We do need to learn to argue for the tangible benefits of humanistic study. Obviously we, especially those of us in the humanities, have forgotten how to make such an argument.

The 'economic' character of Paxson's arguments is problematic. Their weakness may be seen in how they haphazardly circle around the point. Here's a sampling:
'[I]t is evident that many of the men and women who were exposed to that curriculum went on to positions of genuine leadership in the public and private sectors.'
'[W]e do not always know the future benefits of what we study and therefore should not rush to reject some forms of research as less deserving than others.'
'We should be prepared to accept that the value of certain studies may be difficult to measure and may not be clear for decades or even centuries.'
The first argument appeals to anecdotal evidence, to contingent circumstances, not necessary conditions. The second and third arguments bring in epistemic considerations about the inability of our metrics to predict the shape of the future. Most notably, these aren't peculiarly economic arguments. All three appeal to a rough and ready practicality. Well aware of the reasons offered for why the humanities ought not to be funded, Paxson skirts around the question of why we ought to fund them.

Let me take a stab at answering the question. The strongest argument to be made for increasing funding to the humanities is that they, like so many of the other things we value in our lives, have no obvious, measurable, practical purpose. As paradoxical as this may seem, it gets at something essential to being human. The immediate payout from reading a good novel is almost non-existent. More likely, you spent money in order to purchase the novel. The same goes for conversations in coffee shops, reading the newspaper, or watching the news. The list goes on. We do these things because we want to, because, for whatever reason, we enjoy doing them, not because doing so has an obvious dollar value attached to it.

The idea that an entire human life ought to be subject to market discipline revolts even the most hard-nosed of capitalists. (Hence they spend extravagantly on the so-called superfluous aspects of their own lives.) For that reason, and that reason alone, the humanities need a humanistic defense grounded in what it means to be human, not an economic one pegged to balance sheets and bottom lines. The proof is near and dear to every single one of us. The latter concerns cannot be ignored, of course, but they have their particular place in a well-lived human life, rather than the other way around.

Where do you look for the basic inspiration behind such a reordering of priorities? Usually in religious texts, among other places. The first chapter of the Book of Genesis describes the creation of the world, and the creation of human beings in God's image. No reason is offered for why God created the world. The only thing the reader can make out is that God did. The consequence is that human life, existence itself, is best understood as the product of a supremely pointless divine act. Not to despair, though. Things don't end badly for the human race. The text of Genesis finds a reflection of God's supremely pointless act in the human being, a creature created in the image of its Creator.

The creation of humanity in God's image is one of those catch-phrases, like those insisting every human being possesses an intrinsic dignity and is invested with certain rights merely by virtue of being human, which illuminate the rest of the world. We reason from them towards some conclusion, not towards them from other premises. Like so much of human life that cannot be rationalized on the strict terms of the hard sciences, things are because they are--or, more precisely, because we want them to be.

The image of the humanities as a beleaguered bastion of light holding out against an assault of bankers and bean-counters won't pass a smell test. The problems facing studies in the humanities are much bigger than mere institutional arrangements and the immediate problems of funding allocation. Fiddling while the humanities slowly burn to the ground is something we have collectively determined to do, including persons claiming to work in the humanities. Stanley Fish comes immediately to mind. The malaise of a modern education is subtle and pervasive; it goes much deeper than individual figures, deep down into our basic assumptions about the way things are.

The demise of the humanities follows upon our collective failure to see human life as anything more than what an individual can make of it. We live in communities, of course, but we have forgotten how to think about life as if it were lived in the community of others. So we fiddle while Rome burns, and pretend not to understand that those things each of us individually desires for ourselves--a roof over our heads, clothes on our backs, food on the table, the company of family and friends, and a modicum of freedom to explore this short life's possibilities--aren't also collectively desirable.

In the end, the demise of the humanities isn't merely about a small number of academic disciplines. 

Friday, August 23, 2013

The Interfaith Identity Crisis

About a week ago, the Washington Post argued the nature of interfaith endeavours has shifted with demographics. A more diverse population means interaction between religious groups is no longer restricted to the clergy. In fact, a typical practitioner can now be expected to have some sort of contact with persons of different faiths.

Children who grow up and go to college or university today have very different experiences than their parents did. Interfaith used to be something people did. Now it is something people live daily. Though there now exist twice as many interfaith groups in the United States as a decade ago, making the generational transition has been difficult for many. Old assumptions are being challenged, and questions about new priorities must be raised.

In a Huffington Post article, Rev. Donald Heckman, Executive Director of Religions for Peace USA, suggests the interfaith movement must rebrand itself. The term means too many things to too many people to convey anything definite to the wider public. In response to a growing number of persons who do not identify with any particular religious tradition, he says,
'I think we may need to cede the term "interfaith" to the small but growing number of people who see faith, religion and spirituality as boundary-less enterprises of exploration and who allow for multiple affiliations. And the more narrow technical term "interreligious" needs to be co-opted to cover the broad arc of things that are multi-, inter- and intra- for -faith, -religious and - spiritual.'
But is the problem really just about branding? If it's about religion, doesn't it go a whole lot deeper than the question of what a person calls themselves?

Heckman is asking the right questions. The way he is asking them, however, leaves something to be desired. The deepest motivation of the interfaith movement has always been to bring people together. And that makes the wisdom of more carefully parsing the names we apply to ourselves doubtful.

The problems the interfaith movement presently faces are perennial problems, which have taken on new forms in a new context. Seen in that light, answers to questions about how to move forward should become more obvious.

The basic problem has always been how one engages persons of other faiths while remaining true to one's own faith. How can I both be a Buddhist, Christian, Hindu, Jew, Muslim, etc. and engage constructively with persons of other faiths?

There seems to be an assumption, especially in certain Evangelical Christian communities, that the logic of religious identity is ironclad: one can be either this or that, but not both. And the only reason to talk to members of other faiths is to convert them.

Rather than rebranding, the interfaith movement should be retooling. Since more and more people are living the interfaith movement on a daily basis, what is needed more than ever is to equip and teach people to find inspiration for interfaith engagement within their particular religious traditions.

I don't mean glossy presentations of the things religions share in common, though that must be a part of it. I mean encouraging Christians to think on what it means to see everyone as being created in the image of God, Muslims on what it means to be Allah's representatives on earth, Hindus on what it means to be jivas, and so on.

Our religious traditions, without exception, classically wrestled with the dignity and misery of being human. They set out to achieve the impossible goal of reconciling the entire human race to each other. They also cautioned against presuming too much about one's own abilities to accomplish that goal. The labels we gave ourselves, in this picture, matter a whole lot less than actual flesh and blood.

The interfaith movement needs to see itself not as a solution to a problem everyone else has. If that were the case, then rebranding would be all that's needed. The interfaith movement needs rather to see itself as taking part in the very thing people have been working at for many millennia. Only then will it catch up to the truth that people are living interfaith lives every single day.

Tuesday, August 13, 2013

A Review of Arvind Sharma's Gandhi: A Spiritual Biography

Here is a question worth pondering. Has a biographer really done his subject justice when God appears in a life’s story as an actual actor, and not just as a literary device, inspirational thought, or private conceit?  At stake in the question’s answer is truth. Not THE TRUTH, mind you. Not what truth is; but much more importantly how truth is told.  Has a biographer told the truth of his subject if the divine majesty is allowed to skulk between every line of every page?

The truth is, or ought to be, it seems, much more mundane.  In truth’s unvarnished form, readers confront the cold, hard stuff of the real world. Right?

The question’s answer cannot be so simple, however, when a biographer sets out to write a spiritual biography.  The Yale University Press has just published Gandhi: A Spiritual Biography (2013) by Arvind Sharma of McGill University in Montreal, Quebec. With the opening lines, Sharma warns, ‘History is more than the biography of those who make it’, and immediately counters, ‘Nevertheless, some people leave their mark on history in such an elusive way that historiography perpetually fails to capture it.’

Gandhi was such a person, Sharma suggests, along with Moses, Jesus, the Buddha, and a small number of others. Most biographies of Gandhi are written about Mohandas Gandhi. They refer to Mohandas with the honorific Mahatma, or ‘Great Soul’, but are concerned with events and people, politics and social processes. A spiritual biography of the man takes Mahatma Gandhi as its subject, and looks at what it means to be a mahatma.

Sharma’s credentials certainly qualify him to write such a book. The Birks Professor of Comparative Religion at McGill’s Faculty of Religious Studies, Sharma uses his specialization in Hinduism as a bridge to much more general topics, including religion and feminism and religion and human rights. He is the author of One Religion Too Many: The Religiously Comparative Reflections of a Comparatively Religious Hindu (2011). The book is Sharma’s spiritual autobiography, chock-full of wry observations about growing up a Hindu and encountering other religious traditions along life’s way. After the Gandhian fashion of marrying faith to social activism, Sharma has also convened two international conferences looking at religion and human rights: World’s Religions after September 11 in 2006 and the Second Global Conference on World’s Religions after September 11 in 2011. A third and final conference is now in the works for the second half of 2016.

Every one of Gandhi’s biographers must confront the question about the source of his power to inspire. The ends of spiritual biography, Sharma’s argument runs, are much more appropriate to Gandhi’s fundamental motivations than are other sorts of biography. It goes to the heart of the matter, so to speak, to the place where word intersects with deed. ‘Gandhi’s claim was made upon our conscience; he demonstrated that spirituality is to be found at the core of our humanity.’

Sharma’s discussion is lively. At points, even if a little dialectical and didactic, the prose dances off the page into the reader’s imagination. Spiritual biographers risk falling into hagiography, but Sharma demythologizes Gandhi in order to preserve his saintliness. Gandhi demythologized himself, Sharma points out, by attributing his larger-than-life accomplishments to God. If he was a saint, his saintliness was in part due to his willingness to own the flaws of his character. Sharma examines a number of them in the course of the book.

Which God did Gandhi serve precisely? Good Aristotelians the lot of us, we may argue over the specific nature and attributes of the divine majesty—or whether it makes sense to speak of God existing or as existent. Whether, in our intensely analytic moments, we master our language or it masters us remains to be seen. We also stand to miss the point—that, at any rate, was the point I took away from Sharma’s book. Gandhi died with three bullets in his chest and the name Rama on his lips. He identified Rama with Truth, wherever it may be found, but especially through introspection and selfless service.

God as Rama as Truth could never be a mere propositional statement. The reality of God must be lived in order to be known. The insistence on identifying word and deed, Sharma points out, led Gandhi to his death. He was assassinated because he insisted India fulfill its promise of a third payment to Pakistan: India had given its word. The fact that the two countries were then at war could not change his mind. Gandhi took it upon himself to see the promise fulfilled; the name Rama on his lips, his final gesture was one of forgiveness to his executioner.

Gandhi: A Spiritual Biography divides neatly in half. The first half treats significant episodes in Gandhi’s life. The second looks at significant themes in his thought. The book does not propose to be an exhaustive study, though it most certainly qualifies as an illuminating and instructive one. The author may be forgiven, therefore, if readers find themselves wondering how Gandhi got from point A to point B, or what motivated him to make the move. The scarcity of this sort of information is easily compensated for by the depth of Sharma’s treatment of Gandhi’s psyche: his thoughts on sex and celibacy, British imperialism, his own spiritual heritage, and the caste system are just a few of the topics he covers.

The book draws me to one conclusion: other modes of biographical writing aside, a spiritual biography of the life of Mahatma Gandhi cannot fail to testify to God. Absent the divine majesty, Gandhi’s intentions had no purpose, his actions no end, his thoughts no object. Absent God there could be no Mahatma.

Monday, July 29, 2013

Reza Aslan's Zealot: The Life and Times of Jesus of Nazareth (Updated)

Let's take a break from the blog series on Isaiah and talk about Reza Aslan's Zealot: The Life and Times of Jesus of Nazareth (2013). Since an interview on Fox News with a host who was not able to get past the idea of a Muslim writing on Christianity, Aslan's book has sold briskly on Amazon. Not that it was doing poorly before; only now it is at the top of the charts.

On my shelf sits his No god but God: The Origins, Evolution, and Future of Islam (2006), from which I developed a healthy respect for Aslan's acumen. It seems appropriate that a Muslim should write a book on Jesus, since Islam claims Christ as a prophet who brings the Gospel, like Moses brought the Torah, David the Psalms, and Muhammad the Qur'an; affirms his virgin birth; and proclaims his return on the Last Day. Only the orthodox Christian formulation about the two natures, divine and human, in one person is absent from the Islamic account. It betrays the ignorance of the Fox News host, and everyone else who thinks what Aslan wrote is fundamentally objectionable, to suggest Aslan has absolutely no business writing on Jesus. Unlike persons, religions are not discrete entities; they overlap, interweave, and mix in the heads of persons down through the course of human history.

But Aslan has taken a lot of heat from certain quarters for his newest book--understandably, though regrettably, conservative Christian quarters in the main. The most intelligent criticism I have read so far comes from First Things blogger Matthew Franck, who points out 'Reza Aslan Misrepresents His Scholarly Credentials'. I say 'intelligent' because the article is more than mere opinion. The author did a little bit of digging around to develop the piece. But the argument may not be entirely fair. Franck places more value on form than on content, on the external things, which should only be regarded as of secondary importance. He argues Aslan misrepresents his scholarly credentials; the implied suggestion is that we should therefore doubt his contribution to the broader conversation. But a scholar who spends his life reading texts about religious beliefs and writing books on religious topics ought to qualify as a scholar of religion, in my estimation, regardless of what his current academic title is or what his dissertation was on. Franck disagrees. You can read his piece for yourself and form your own opinion.

The obvious point to be made, apparent in the title of the book, is that Aslan's Jesus is not the Jesus of the New Testament Gospels. The Gospels are fairly careful to distinguish the sort of messiah Jesus was supposed to be from other Jewish claimants to messiahship around the same time. Jesus' kingdom is not of this world. The kingdom of God is within you. Render unto Caesar what is Caesar's, and unto God what is God's. And so on and so forth. Short little catchphrases may be found throughout the Gospels, all of which relativize the importance of transient worldly success. (It is transient, after all.) The message of the Gospels is subversive in a bend-over-and-take-it-on-the-backside kind of way. Jesus ends up going to die on the cross--willingly.

The inch-deep, mile-wide cultural commentary out there fails to grasp that Aslan's basic hermeneutic for reading the Gospels does not come from Islam, but from 19th century European seminaries. Very intelligent theologians, for reasons peculiar to the place and time, decided the Gospels' portraits of Jesus were not historically reliable. They drew a fundamental distinction between the Christ of faith and the Jesus of history. Believers could believe whatever they wanted. Scholars, on the other hand, had to restrain themselves from saying anything beyond the surface of human history. This scholarly attitude lives on in such organizations as The Jesus Seminar.

Aslan's Jesus is a rebel of sorts seeking to effect some worldly change. So Aslan rejects the final implications of the Gospel portrait. As he says in the opening pages of his book, 'If we expose the claims of the gospels to the heat of historical analysis, we can purge the scriptures of their literary and theological flourishes and forge a far more accurate picture of the Jesus of history.' Divide faith from history, like Aslan does, and the willingness of the historical Jesus to go to death in order to effect a victory, not over mere human powers, but over sin, death, and the denizens of hell, no longer makes much sense.

There is nothing new in Aslan's arguments, and certainly nothing worth losing our heads over--nor compromising our resolve to love our neighbours as ourselves, even and especially when they may disagree with us. There is nothing especially offensive in his presentation either. It is entirely in line with a Christian confession of belief that Jesus Christ was the Son of God that a non-believer doesn't believe the same. Whether Aslan has mined the Gospels for all the viable 'historical material' that can be had from them--well, that is another question.

Sunday, June 30, 2013

The External World

Among the perennial questions of the Western philosophical tradition is one about the existence of the 'external' world. In its most basic form, the question asks: Do things exist apart from our thoughts about things? Is it true, in other words, that to be is also to be perceived?--to borrow a phrase from the 18th century Anglo-Irish philosopher Bishop George Berkeley. Does the tree exist because you think about it, or does it exist prior to your thinking about it?

But the answer to the question is not as obvious as it first appears. The longer you think about the question, in fact, the more obscure it becomes.

We have good reasons for thinking things exist apart from our thoughts about them. To begin with, we fall asleep at night and wake up to find the world much the same as we left it. We travel familiar routes to home, school, or work, navigating by means of familiar landmarks. The continued presence of objects in our physical environment provides a very strong reason to think they exist apart from our conscious perception of them.

As far as a naive faith in the external world goes, philosophers seem the worst of the bunch. They can always be found talking about Kant's view of this or Heidegger's view of that, as if Kant and Heidegger, and their views of this or that, were out there waiting to be looked at, thought about, and discussed at great length. Which, of course, they are--recorded for us in books.

We are very comfortable with the thought that an external world exists apart from our thought about that world. It helps us make sense of learning, discovery, and being in error. Something 'external' to our thinking provides a standard against which to measure the truth of our thought. Our thought runs up against it, tries to comprehend it, arrives at a provisional understanding, makes a decision as to its adequacy, and so on. We presume the existence of an external world whenever we communicate our thoughts to others. At least, those of us do who have not yet figured out how to communicate directly, one mind to another. Not only do we make use of the external world as a medium of communication, much of our communication has to do with calling others' attention to some object found there.

Not everyone is happy with the language of an external world, nor the implied idea that the world is one thing and thought about the world another. (The aforementioned Kant is a good example.) The philosopher Daniel Dennett has coined the term 'Cartesian theatre' to capture how strange the idea is of thinking about the world as external to ourselves. The most obvious reason why the idea just doesn't measure up, of course, is that we find ourselves in the external world: our bodies. We are, in some very real sense, our bodies. As our bodies move, so we move. As our bodies grow, so we grow. Where our eyes look, our conscious attention seems to follow--or does it lead? Dennett enjoys mocking persons who think of themselves as looking at themselves (their body) from an undefined location (their mind). The mind is not the brain, after all. The brain is something that can be seen, picked at, pulled apart, sliced into sections. The same cannot be done to the mind, per the definition of mind. But if it can't be observed and studied, it seems legitimate to wonder whether the thing exists at all.

I haven't much time for Dennett's endless refusal to say anything positive about this thing I call myself, though I find his line of questioning to be a helpful foil. Thomas Nagel has it exactly right when he says that Dennett merely redefines consciousness as an external property, ignoring the essential problem, which is the subjective first-person perspective that each of us occupies, and no one else does for us. Indeed, it's the individual's first-person perspective (which, if reified, is called an immaterial soul, the life of the rational animal) that makes the external world a problem in the first place.

The individual first-person perspective throws a monkey wrench into any abstract formulation--whether it's Berkeley's to be is to be perceived or Dennett's critique of the 'Cartesian theatre'. Certainly the logic of these positions can be tried and tested; but logical analysis aims at universal applicability, which is precisely not a first-person perspective. If the world exists only because I perceive it, the rest of you have a real problem. Likewise, if a first-person perspective is nothing, or at least nothing worth thinking about, then we, each individually, all have a real problem.

Bishop Berkeley had an answer. To be can still be to be perceived, even if no human being is perceiving every single object in the external world all the time, Berkeley claimed, because the being we call God perceives everything, which allows objects to exist apart from partial human perspectives. That is not a solution open to Dennett, at least not one he thinks is open to him. So he runs away from the first-person perspective; and, we might say, trips over the elephant in the room--himself.

The idea of a world external to ourselves, it seems to me, helps us all make sense of our individuality. It allows me to say your perspective on things may differ from my perspective on things by creating a buffer zone between the part of me only I have access to and the part of me the rest of the world gets to see. You are external to me. We can talk things out, but we won't necessarily come to an agreement, or even an understanding. And that is okay.

Saturday, June 29, 2013

Richard Fletcher, Historian

Richard Fletcher was a rarity among historians. A medievalist, Fletcher published books on Anglo-Saxon England and on Moorish and Christian Spain prior to the actual beginnings of the Reconquista in the 11th century (though it is conventionally dated from the 8th century). Another of his impressive scholarly accomplishments was The Barbarian Conversion (1999), which looked at Christian missions into the dark heart of Europe between the fall of the Western Roman Empire and the Reformation, with an eye to happenings in the Eastern Roman Empire, the Middle East, and North Africa. It is not hard to imagine that Fletcher thought of himself as picking up where Edward Gibbon left off, only with a much less jaundiced eye towards events and persons who didn't obviously exude the material greatness and organizational power of the Antonine Dynasty.

It is my experience that history books can all be arranged along a single axis stretching from a purely objective perspective on the historical subject matter to an investigative perspective that gives readers a glimpse of the difficulties historians encounter trying to interpret their sources. Most historians fit into the former category. They may talk a good talk about the multiplicity of perspectives from which the sources can be studied; but they rarely reflect on the limitations imposed on historians by the limited availability of materials. History textbooks assigned in undergraduate classes, as well as most survey texts, fit into this category. They tell what happened when, and why things happened the way they did. Narrative threads are woven together presenting 'the present state of the field of study'. Specialized historical studies also follow this general pattern. In the introductory chapter, the historian usually tells you what other historians have written, what new evidence they have found, and how it confirms what we have already discovered or how it should radically change how the field of study is understood.

Fletcher's Bloodfeud: Murder and Revenge in Anglo-Saxon England (2002) is one of those rare histories that lets you follow a historian reading texts, trying to discern where all the pieces fall. The roughly five centuries stretching from the establishment of Anglo-Saxon rule in 577 until the Norman Conquest in 1066 comprise England's participation in the Dark Ages. The earlier in the period one finds oneself, the more scarce the evidence becomes. Even in the last leg of the period, from the Danish Conquest in 1016 until its conclusion, much is left to be desired.

The northernmost English province of Northumbria was ruled by Earl Uthred, celebrated with the title 'the Bold'. In 1016, Uthred came to pay his respects at the court of the Danish king Canute (or Cnut) at a place called Wiheal. The location of the meeting, Fletcher indicates, is part of the mystery. We don't presently know where it is. Uthred and forty of his clients and retainers died that day. His death set into motion 'a bloodfeud that lasted for three generations and almost sixty years'.

Bloodfeud patiently sifts through what evidence remains in an effort to discern the motivations of the different persons involved. Sometimes all we have to go on are single sentences carelessly dropped into The Anglo-Saxon Chronicle, a document contemporary to the period in question. Often we are drawing imaginative inferences from what we know generally about what life was like, from studies of comparable materials drawn from elsewhere, from the commonly accepted rules under which bloodfeuds were prosecuted, and so on.

Fletcher's gift was to convey the difficult constraints any historian, especially one working at such a great distance in time, must work under. The gift is rare. The problem I want to think through is why the gift is rare.

A few reasons come immediately to mind. The first is that most people are not trained as historians. Those who do pay some small amount of attention to the human past by reading survey texts or specialized studies are more likely to assimilate the historian's conclusions than the historian's experience of coming to those conclusions. This occurred there and then; or this happened because that happened; but not our lack of certainty on this or that point. The second is that the immense amount of material published on any one place and time in human history is likely to shore up erroneous assumptions about just how much evidence is available. Readers don't necessarily contemplate the fact that single lines in an ancient text can generate an exponentially growing amount of commentary, none of which can get around the simple problem of a lack of additional evidence needed to corroborate this or that interpretation. The third is that a majority of people, if they are interested in the past at all, are more likely to be interested in the recent past. And it is precisely in the recent past, especially the very recent past, that we encounter a glut of material evidence.

Put together, these give rise to what I will call an 'empirical fetishism'. For every question, there should in principle be an answer. If there isn't an answer, we allow ourselves to hypothesize about a 'best fit' answer. Empirical fetishism means that our knowledge of the world ought to be a seamless whole. We don't like holes in our seamless whole, so we fill them. Fletcher points out that the village of Wighill has been suggested as the location of Earl Uthred's murder at Wiheal, along with a number of other candidates whose names begin with W. Wikipedia names Wighill as the location of his murder, in fact, but without any comment on the interpretive dilemmas of identifying this particular place with that particular name in that hoary tome. History, it seems, like nature, abhors a vacuum.

Let's not make fun of Wikipedia on this point. They are only doing what most everybody else does in their situation: drawing conclusions, filling in blanks. Because of the impossibility of constructing a consistent account of the whole body of our knowledge about the world and its past, empirical fetishism itself gives rise to perspectivism. Everyone has their own perspective on things. You can think about things in as many ways as you want, of course. The interpretation of the human past, even the immediate past, but especially the distant past, however, often leaves a person with nothing to have a perspective on. That sort of empirical sensitivity is why we need more historians like Richard Fletcher, because it's very easy to assume that a perspective on things can replace due attention to the things themselves.

Thursday, June 27, 2013

Religion and Canadian Secularity

The very few people who find their way to this page (like my father; or Tyler, who may find the later part of the article interesting because it reflects two very different legal cultures) may be interested to read an essay published by an old classmate and housemate of mine: 'Bringing Religion into Foreign Policy' by Robert Joustra.

The essay is a compressed version of Rob's thesis on the public debate around the establishment of an Office of Religious Freedom in Canada's Department of Foreign Affairs. The Canadian discussion has been divided between two rival versions of secularism: Laïcité and Judeo-Christian Secularism. The former sees secularism as a rejection of any role for religion in politics, while the latter sees a secular political sphere as the special creation of a Judeo-Christian outlook on life. What happens when an office of religious freedom is charged with monitoring religious freedom around the world as part of Canada's foreign policy? Obviously the two secularisms come to blows. Laïcité secularists get uncomfortable about an overlap they think shouldn't exist. Judeo-Christian secularists see opportunities to promote their own Judeo-Christian outlook on life. The story is naturally a little more complicated than I have just described. Rob does an excellent job detailing the holy mess. He concludes with a quote from Jacques Maritain discussing how the committee drafting the Universal Declaration of Human Rights was able to achieve consensus despite its members being of very different intellectual persuasions. If any of this interests you, the article is well worth a read.

My quibbles with the article are a little more esoteric. The meaning of terms like religion and secular has changed over the centuries, as Rob rightly points out at the beginning of the article, and that's where things begin to fall apart. It seems that Rob has adopted the problematic phenomenological language of 'neutral' description of 'historical entities'.

Here are a few examples:
  • Scholars need to do 'a better job of making sense of a thing called religion'.
  • 'These efforts to engage with religion are motivated by the misguided belief that the inclusion of religion encourages a more peaceable global order.'
  • 'But it does not necessarily follow that there is no historical thing as religion and its freedoms.'
  • 'Only once we clarify these meanings can we decide if we are prepared to truly acknowledge religion’s contested nature in the structure and aims of our foreign policy.'
A few points to note: religion is a thing, religion has the capacity to encourage, religion has a contested nature. Turn these phrases over in your mind a few more times. The more you think about them, the stranger they get. Religion is a 'thing', but I bet no one has ever seen it. Religion encourages peace, which seems to imply that it has a capacity to act as a human being acts. Religion has a contested nature--but then again, so do most invisible entities that you can't see or touch, but that encourage you to love your neighbour and seek the welfare of your fellow human beings.

While Rob does a really good job dissecting different versions of secularism, he doesn't really clarify what he means by religion. If I were in an uncharitable mood, I might venture to suggest that the initial talk of religion being an essentially contested concept is just a smoke screen to cover over...something. I am not sure what.

Rob himself uses the term religion in two distinct and readily discernible ways. The first is as a way of talking about communities. When politicians want to engage with religion, that usually means engaging with religious communities through their clerical leadership. Whenever we talk about religions in history, more generally, we aren't actually talking about something called religion, but about a profoundly powerful way of organizing communities. The second is as a way of talking about how people think about themselves in relation to other things, other people, and ultimately the world at large.

Instead of telling his readers how he uses the term religion, why does Rob hide behind postmodern rhetoric about the fluidity of meaning? I mean: anything is better than talking about a spooky something that you can use to make people do things, which is the upshot of calling religion a thing, trying to figure out its uses, what it does, how it works, and so forth.

My suggestion is that Rob is actually a Laïcité secularist. My evidence? Talk about humanity is incredibly sparse. In Laïcité secularism, religion is something added onto human nature in the course of human history, and it can be excised from human nature via the application, for example, of scientific thinking. So the Laïcité secularist talks about religion as if it were an object out there that human beings can talk about, but which has no necessary relation to the human being.

And that's exactly what Rob does. As I pointed out above, it's obvious he is talking about religion either as a form of community defined by clerical leadership or as a way persons think about themselves in relation to the wider world. But he doesn't draw those conclusions. My best guess as to why he does not is that he thinks about religion as something extraneous to the human being. If he had thought about religion in relation to humanity, he wouldn't be hiding behind the smoke screen of an 'essentially contested concept'--which, I note, is exactly what it seems to be: an idea in a person's head, a way of thinking about things, and maybe even a way of thinking about ourselves in relation to the rest of the world.

My conclusion: Rob is practically an atheist. Emphasis on the practically. Cheers, Rob.

Wednesday, June 26, 2013

Catholic Ecumenism

I was reminded of how difficult it is to determine what motivates people to do the things they do while reading an article in the May/June issue of Foreign Affairs on 'The Church Undivided: Benedict's Quest to Bring Christians Back Together'. The author, Victor Gaetan, does a very fair job describing the Catholic Church's idealism, its desire to be reunited with old friends and foes alike. He describes Pope Benedict's deliberate steps towards reconciliation with the Lutheran, Anglican, and Eastern Orthodox churches. The perception of a papal misstep in the now infamous Regensburg Address, in which the pope appeared to disparage the Muslim faith, is ably shown to be faulty.

In Gaetan's description of Benedict's papacy, there is something of the spirit of the Renaissance Cardinal Nicholas of Cusa's idea that there is only ever one religion for all human beings, even if the rituals are various and sundry. A concluding comment on the tenor of Francis' papacy sees more of the same.

What (for lack of a better term) caused or 'gave rise' to these new efforts towards ecclesiastical reconciliation--and even inter-religious understanding? This is where the narrative gets a bit thin. Gaetan notes two contributing factors to the contemporary rise in ecumenism: the need to respond to the marginalization of religion in a secular age and a shared sense of vulnerability in response to an escalation of violence.

It's at this point that I scratch my head. Are those the only two things that could be mentioned? Both are external causes acting on the Christian community, forcing it to react to a new situation. Like the theory of natural selection, they are environmental factors determining the development of the social organism. That means internal motivations, like a common confession or the biblical testimony about the desirability of unity, aren't treated on par with the external causes, which would surprise Benedict and Francis, and their counterparts in the Lutheran, Anglican, and Orthodox communities.

Gaetan's explanation also seems to me to be too narrow. Once the importance of internal factors is discounted, or at least demoted in order of importance, those broad inter-generational shifts that sweep everyone up as they make children collectively doubt the wisdom of their parents also get missed. Of course, Gaetan is aware that the age of polarizing ideologies has been over since the Berlin Wall fell in 1989. The story of John Paul II's papacy cannot be told without reference to that epochal event.

But something I have noticed while reading works published through the 20th century and growing up in the last two decades is that we no longer take our ideas as seriously as our parents and grandparents once did. We are no longer idealists in the high modern sense of the word. And this has a number of obvious consequences. We no longer think of communities as wholes to which we belong. We don't love abstractions like humanity or a nation like we used to. The boundaries between us and them are being recast along different social fault lines. Where those will lie is not yet clear. Certainly socio-economic divisions, in North America, for example, will be more pronounced than they were in the middle of the 20th century.

That sort of shift in attitude cannot fail to have an effect on the broader Christian community. In my own life, I witnessed the bottom fall out of a belligerent indifference to other Christian denominations in the Christian Reformed Church. The more Evangelical among the members found common cause with a wider Evangelical community, while the more intellectually inclined became more sympathetic to the Roman Catholic and Orthodox Churches. Some regretted the loss of theological distinctiveness; but that only confirms my thesis that we no longer take our heady ideas so seriously.

I am not claiming we have stopped thinking, only that we think differently about ourselves. It seems to me the papacy's push for ecclesiastical reunification makes sense in the context of our mounting disillusionment with old intellectual idols. The election of a pope like Francis, a pragmatic servant of the people, is entirely of a piece with the intellectual climate.

Sunday, June 23, 2013

My Discovery of the X-Files

I missed the X-Files in its heyday. The nine seasons running from 1993 to 2002 corresponded almost exactly with my teenage years. But I was too busy watching Star Trek: TNG, DS9, and Voyager, too busy reading the classics of science fiction and fantasy, or quickly paging through the latest literary addition to the Star Wars universe. Though the literary quality of the latter, it needs to be said, went quickly downhill after Timothy Zahn's Thrawn trilogy.

For the longest time, the X-Files lay just over my cultural horizon. Until this summer, actually. Netflix offered a free month's subscription a week before its installment of a fourth season to the Arrested Development franchise went online. Watching the new episodes took three or four days, which left the greater part of a month on the subscription. The X-Files was on my recommended list. Netflix had followed the path I wandered through its offerings of movies and television shows. By the infinite wisdom of a selection algorithm, I discovered the truth really is out there.

I am now almost through three seasons. The show exercises a strange sort of persuasive power over its viewers. Which speaks to its quality, since it can no longer fly on its innovative cinematographic techniques alone.

The backstory has the US government continually suppressing evidence of extra-terrestrial life. Each episode uses the pretext of an FBI investigation to chase some conspiracy theory down a rabbit hole. The name of the show is taken from the name of a supposed FBI office of investigation. The names of its only two assigned agents, Fox Mulder and Dana Scully, have found their way into a grab-bag of references that help us navigate webs of cultural significance.

I admit I did not quite get the show until the scriptwriters used the latter part of the 2nd and 3rd seasons to develop Dana Scully's Catholic background. Until that point, Mulder's willingness to entertain the strangest of explanations played off against Scully's rabid faith in empirical explanations. Between the two characters, the limitations of the methods of scientific study were poked and prodded. The point, I take it, was to show that credulity is an attitude a person takes to the evidence, not something produced by the evidence.

With Scully's Catholic background, the possibility for comparison opens up considerably. We discover Scully's willingness to believe the sorts of things faith requires--but to believe on faith, which means an open, questioning attitude towards the things required by faith. Whereas Mulder believes the sorts of things Ockham's Razor allows him to believe. In the absence of definitive evidence, the simplest explanation may not be the expected terrestrial answer.

Written through the contrast between the two is a fairly profound disagreement about the nature of human intelligence. What standard is it measured against? Measure human intelligence against a divine standard, and Scully's anthropocentric convictions follow as a matter of course. But if the divine standard is absent, Mulder's speculative suggestions become much more plausible. With a God above, the human being finds meaning within themselves. Without God, we want to look further afield.

The genius of the X-Files is to leave the viewers to decide whether what they saw was real. (Yes, I know, aliens all but parade across the television screen.) Even when the existence of actual alien life is all but confirmed with an appearance on screen, the viewer can still take credible refuge in Scully's skepticism. Aliens might instead be the unfortunate victims of genetic experimentation. The suggestions are not always made explicit. They do not need to be. Scully's incredulity makes it possible to question what you believed you saw.

Most of the way through the third season, I am not eager to discover how the show gradually declines into mediocrity after the fifth season. 

Saturday, May 04, 2013

The George W. Bush Presidential Library and Museum

Last week saw the dedication of the George W. Bush Presidential Library and Museum, situated on the campus of Southern Methodist University. The project is the result of half a billion dollars in fundraising. Its dedication was attended by every living president, from James Carter, through a wheelchair-bound George H.W. Bush, to a spry, and comparatively young, Barack Obama. For a brief moment, it occupied national and international attention, with most major American news sources adding their particular take to a very well worn story. Here is a sampling of journalistic fare from the Washington Post, Foreign Policy, Mother Jones, the New York Times, and The Atlantic.

Reporters seem to have gravitated towards telling one of two stories. The first looked at the "Obama angle". Yes, the current POTUS was in attendance, and he was also able to set aside ideological bickering across parties for a very brief moment. This story is favourable towards Obama, but rides on the idea that the Office of the President stands above the Washington fray. Bush escaped largely unscathed. The second story looked more closely at the content of the library and museum, which will house around 43,000 artifacts and millions of documents from the 43rd president's tenure. Reporters chomped at the bit of journalistic objectivity to raise obvious questions about whether the history told by the library and museum will in any way reflect reality. The Mother Jones article linked to above lists eight things you won't find in the new library and museum facilities--the eighth being evidence of the existence of WMDs in Saddam's Iraq. (Because there wasn't any.) In this second story, Bush assumes the form of an object of scorn.

Of the two stories, I find the second intrinsically more interesting. With the first, we get to watch the great game being played out in a highly controlled environment. Instead of being allowed to criticize their opponents explicitly, politicians have to score points by appearing to play nice. Obama and Bush are cast as figureheads for much larger trends in American society. Their ability to play nice for a brief moment is indicative of the fading memory of a common destiny for all Americans. Compared to the immediacy of the first story, however, the intrinsic charm of the second story is found in its impoverished rendition of Winston Churchill's audacious claim, 'History will be kind to me for I intend to write it.' Churchill wrote rather glowing accounts of the role he himself played in WWII in a six-part book series on the conflict. Consider the many hours invested in literary production alongside the achievements of Bush, who has taken up painting. Bush solicited wealthy friends to put up millions of dollars to have someone else do what can only be described as whitewashing a rather tarnished public image. Intellectually lazy only begins to describe this latter-day attempt to resuscitate a legacy.

My own take on the efforts of Bush and friends will be obvious from the tone of the preceding. The outrages of the GOP alumni against historical scholarship, however, interest me more as an illustration of larger problems associated with the interpretation of human history than they do as examples of individual failings. The default assumption, or what can best be described as the common sense way of thinking about things, is that the study of human history gets at something objectively out there waiting to be discovered, in the same way that the fundamental features of matter or new species of animals are out there waiting to be discovered. Hence Bush is confident that the 'facts', maturing with the passage of time, will reflect well on his tenure as president. Once all the facts are known, or have come to light, or what have you, they will show him in a much better light than his detractors are presently willing to admit. Not surprisingly, those detractors are convinced the same set of facts, in due course, will prove otherwise.

Which raises the question, What is meant by the term 'fact'? The dedication of a temple to Dubya's prowess raises questions about whether and in what sense the study of human history, or the study of the humanities more generally, is comparable to the natural sciences, like physics, chemistry, or biology. Is the historian's object, say, Bush's tenure as president, objectively available in the same sense that the natural scientist's object is available? This is not something scholars and scientists spend much time fretting about. The construction of the modern university discourages comparison between academic disciplines (even as it encourages something called 'interdisciplinarity'). Both groups can go about their day without giving much thought to where they stand vis-a-vis the other.

The fundamental criteria for factuality are that the object in question be observable, and that others can verify what was observed, in order to confirm the success of a theoretical framework in accounting for what was observed. The theory of evolution is one such framework, within which are organized the relationships between different species of animals--or 'the facts' derived from the observation of fossils and living organisms. On this definition, Bush's tenure as president fails one of the fundamental criteria of factuality. While there is initial observation of the object, there is no possibility of verification. Bush's tenure is a one-time, unrepeatable affair--thankfully.

But so is the evolution of this or that species of organism, would seem to be the obvious objection. That's true; but there's a second consideration that will further complicate the comparison. The material evidence for evolution and the material evidence for Bush's tenure as president are fundamentally different. Scientists theorize about processes operating in biological materials independently of creative human input. Human beings didn't guide the long process that led to the evolution of human beings. The material evidence for the evolutionary process is there to be studied, theorized about, maybe even interfered with, tweaked, 'improved' upon--but that's all. The historian, on the other hand, studies a body of material evidence that could have no existence apart from creative human input. It is impossible to conceive of all those textual, audio, and visual artifacts attesting to Bush's tenure as president arising through non-human agencies, which is what evolutionary processes are. The historian never escapes the circle of humanity.

So I raise a Shakespearean equivalent to the middle finger and call a pox down on all their houses. How Bush's tenure will be judged in the long term won't come down to something called 'the facts', however broadly or narrowly that might be interpreted. Bush and friends need to take a page out of a Confucian playbook about the inevitability that future generations will judge you. Future generations will judge your actions, not against a set of objective facts, but for your humanity. If, to cast it in terms of extremes, you were a tyrant or a dandy, don't expect to be looked upon favourably. If the suffering of the mass of humanity increased under your watch, don't expect to be lauded. It is one of the features of the interpretation of human history that the next generation is not likely to agree with your own assessment of yourself, especially if it's trumped up way out of proportion. And if the next generation doesn't, then the generation after that will--or the generation after that, and so on, and on, and on, and on.

The lesson is that one does better measuring oneself against one's fellow human beings than against some objective standard or abstract goal. We are all, every one of us, in this together.

Monday, April 22, 2013

Teaching Philosophy

The study of philosophy, or the study of what other people have said that gets categorized under the heading 'philosophy', has the potential to leave one feeling a perpetual student. Expertise can be acquired in a so-called "field" of study. Along the way, however, students will have realized their "field" of study looks less like an actual field and more like a narrowly defined section of a bookshelf in a stuffy library. They will have also realized that one never finishes studying philosophy.

It's an interesting thought experiment to put the shoe on the other foot. What if you had to teach philosophy? Where would you start? Any student can give a non-committal answer to questions about what they are studying. They are on their way to knowledge, and, in any case, there is too much to capture in a single phrase. A teacher, on the other hand, lacks that bohemian luxury. They must say something.

The first thing I would do is set aside any latent Socratic inclinations. The Ancient Greek philosopher Plato cast his teacher, the famous Socrates, in the role of questioner, leading his conversation partners to the realization of truths they already knew, but had not quite been able to articulate on their own. My approach would instead include a discussion of a famous philosophical text. We would never have known Socrates, after all, had Plato not cast him as a character in his philosophical dialogues. Regardless of how much talking philosophers do, they are always falling back on texts, each of which preserves a small portion of a tradition of reflection that never quite comes clearly into focus.

More to the point, then, which text would you start with? And let's say, for the sake of argument, it could only be one text. Not one thinker. Not one philosophical school. Not one series of texts. One text.

My choice would be Rene Descartes' Discourse on Method (1637). There are a few reasons why.

First, the text is relatively brief. It contains six chapters in total, all of which can be digested in a single sitting, or over a number of sittings without much difficulty.

Second, the purpose of the text is not to take a position on anything in particular, but to introduce the reader into a way of thinking about things. You are invited to follow Descartes on his philosophical journey, to think through why he came to the conclusions that he eventually did.

Third, Descartes' philosophical observations are woven through a personal narrative allowing for historical commentary. That means a teacher can put "flesh" on the bones of the argument, placing arid speculative suggestions in a more recognizable human context.

Fourth, the Discourse contains recognizably contemporary intuitions about the nature of things. It takes the form of a personal narrative. It is playfully experimental with the ideas it presents. It wonders about the nature of the human self. And, most importantly, God has been displaced from the center of inquiry. The entire world of human experience is no longer assumed to come from God and return to him. (It still does, of course. Descartes holds God to be the Creator of all things. He is not willing, however, to start with that as the presupposition of his inquiry.)

The basic argument of the Discourse is that all those things Descartes had formerly held to be true he had discovered many reasons to doubt. His experience of violent and destructive discord among different Christian sects during the Thirty Years' War had led him to seek a more certain basis for knowledge. The revelation of God could not be trusted, as it was mediated by human beings. The same went for the teaching of the schools, by which was meant the abstruse logic-chopping arguments of the late medieval world.

So Descartes resolves to doubt all that can possibly be doubted, and in the process doubts not only what other people have told him, but also what his own eyes and ears "tell" him. He even suggests, for the sake of experiment, that it is possible to conceive of oneself existing without a body. After "emptying" his mind of all those doubtful thoughts, Descartes arrives at the one conviction he cannot shake: I think therefore I am--"that is to say, the mind by which I am what I am, is wholly distinct from the body, and is even more easily known than the latter, and is such, that although the latter were not, it would still continue to be all that it is."

All of this is ripe for discussion. Is the format of philosophical dialogue effective in conveying the author's intention? To what extent is one's thinking conditioned by one's lived circumstances? What would it mean to begin one's thinking with God rather than oneself? And vice versa? Is it possible to doubt everything? Can we ever be certain of anything? What might it mean that our minds are completely distinct from our bodies? Does any of this even make sense?

The closer you look at a text like the Discourse, the more perplexing it becomes. Descartes is perhaps best described as not quite modern enough. Which makes his Discourse the perfect choice to begin teaching philosophy.

Does anyone else have other suggestions?

Friday, April 19, 2013

Menial Labour

I am no stranger to what is elsewhere called 'menial' labour. Growing up in rural Ontario, my first jobs were both physical and monotonous. The same tasks had to be performed day in and day out. The jobs were, almost without fail, dirty jobs--especially when I was cleaning things. I was good at these menial jobs. I wasn't great at them. I could perform adequately the tasks required of me, though I was unlikely to perform them expertly or to take much initiative of my own. The sort of thinking required to see solutions to very rural and/or blue-collar problems was not in my possession.

I also have exposure to white-collar 'menial' labour. The most recent bit of experience I can cite comes from last night, invigilating a chemistry examination. I thought I would try invigilation out this year, so I threw my name into a pool of potential hires. A single hour-and-a-half training session a week in advance and a 15-minute pep talk before the exam were supposed to make our tasks straightforward and obvious. Then a 25+ person team was sent in a number of directions, with an examination 'package' in hand, but more or less without support.

Sent to the room with the examination package, complete with examination papers and instructions for their distribution, as well as a half hour to spare, I realized immediately that someone getting paid a lot more than me had failed to assign the necessary second person to the room. Unable to raise my supervisor on the phone, I started prioritizing tasks. The examination 'circulator' eventually made their way to the room that I was in and realized much the same thing. For some reason, though, it was my fault that things weren't getting done the way they were supposed to get done.

A second person was sent to the room a half hour after the examination had started, which had itself been delayed by ten minutes. Having been told repeatedly to follow every step on the invigilation instruction sheet, I relished the opportunity to cut corners where corners could be cut. It wasn't my fault, you see. I did the best with what I was given. If my best wasn't good enough, don't blame me for doing my best. Blame my superiors for their incompetence.

This most recent experience with white-collar 'menial' labour impressed upon me the dreadful impenetrability of bureaucratic structures, in particular that of those in immediate authority above you. The experience also raised some questions, in my mind, about why the exercise of authority proceeds so differently in a rural and blue-collar world than in a white-collar world (though my observations would also apply to a highly structured factory environment).

As I said above, I was a good worker, but not a great worker. Those persons whom I worked under, whether in farming, landscaping, moving, or construction, seemed to understand as much. I put in long days of work, and only once or twice over the course of a decade do I remember being belittled for a failure or mistake. More to the point, those persons with whom the responsibility ultimately lay usually went about fixing the mess that I had made without too much complaint. There is a certain inevitability to mistakes, was the guiding sentiment. Try to prevent them, but deal with them as humanely as possible when they do happen.

I was surprised how vigorously my supervisors made it plain to me that their failings were ultimately my responsibility. There is a certain rationale for doing so, of course. In the moment, I am the one who has to perform in order for their program to be put into action. But the bureaucratic structure falls to pieces when those in charge fail to anticipate an obvious problem and also vigorously protest the smallest exercise of independent judgment in the matter. The bosses not only think you are stupid and incompetent. They treat you like it too.

Why the difference between these two sorts of bosses? It may be that what I am describing is merely a function of the size of the organization. But I have to think it is also a consequence of the sorts of materials being worked on. In the rural and blue-collar trades, you work with particularly stubborn, resistant, and in every case non-rational materials. Fields of wheat do not rebel against you, nor do skids of lumber and brick talk back at you. Persons assigned to do a specific task in highly structured, rationalized processes, on the other hand, are expected to comprehend and implement a set of instructions in very short order. They are also instructed not to think for themselves, which, if something should go wrong, has a real potential to allow things to go from bad to worse in very short order.

So I wonder if facing stubborn non-rational resistance necessarily inculcates a very different sort of response from bosses than does facing the apparent irrationality of the menial wage labourer in a highly structured working environment. Why do we expect something different from persons than we do from the non-human sorts of materials that we work on? Arguably, human materials are more difficult to shape to our wishes.