Tuesday, August 12, 2008

The Lion and the Buffalo

In a few paragraphs I'm going to go out on a very unpopular limb.

A Georgian embassy official accused the Russians of “attempting to conquer Georgia” yesterday. Ian Traynor in The Guardian suggests that this conflict is a mere cynical power play to bolster Vladimir Putin’s political strength, and is the result of long antipathy rather than some instant crisis - an antipathy Putin traces to “hundreds of years” of apparently benevolent Russian influence in the Caucasus. Zbigniew Brzezinski (whose name I shall not write again) has compared the invasion to Stalin’s invasion of Finland, and the Swedish foreign minister compared it to the dismemberment of Czechoslovakia. These sorts of comparisons began as soon as the invasion started late last week. At the time I thought them hasty and irresponsible - almost as irresponsible as every Russian official calling Saakashvili "insane" whenever remotely prompted. A ceasefire has been reached just now, brokered by the egregious M. Sarkozy, that is little more than a rubber stamp on Russia's total victory - if it holds. Georgia is ready to take it, with the knowledge that there is no one there to save them.

I have been thinking a lot about democracy lately. Not just democracy, but liberal democracy, the broader inculcation of the sort of society that is the staple of the modern world (and will be a staple of the postmodern world). It's not an easy subject, but there's one conclusion I have drawn with confidence: democracy is excruciatingly fragile. I remember playing a version of the game Civilisation not that long ago, in which you control and build up a society from the dawn of history. In it, selecting democracy for your nation confers extraordinary benefits - but it is also dangerously volatile, collapsing into anarchy if unrest persists for more than one turn. Real democracy works not much differently - and so far I am only speaking internally.

Democracy is constantly under threat from predatory regimes. This seems self-evident, but in the case of Georgia it is especially important to understand how. The earliest modern democracies, namely the United States and Great Britain, were countries with some kind of decisive geopolitical advantage. The United States was very far away from the centers of power, and was thus able to create one in the New World; Britain was invulnerable to the huge land armies of autocratic France, Spain and Austria, and was able to build up a significant navy (which requires a large population base but a small standing force) precisely because of the prosperity its nascent democratic traditions allowed via a surging middle class. The Dutch maintained a democracy too; but, with all of Britain's weaknesses and none of its advantages, they eventually succumbed, half to invasion and half to the lure of autocracy as a response to it.

Why? Simple. Democracies tend to be rich. There is a non-trivial correlation between the strength of a democracy and its wealth and prosperity (that is, the amount of money it has and the extent to which its benefits extend throughout society), a correlation which does not work the other way around. (Well, some think so; but I can think of many old, poor dictatorships and very few old, poor democracies.) Even if they're not rich, democracies are bad influences on an unfree people: they display a contrast, and an unpleasant one. Democracies tend to get better. Dictatorships tend to get worse.

If we accept the idea that dictatorships tend to be unresponsive to the needs of the people and thus promote poverty and erode wealth (a country with a free press doesn't have famines, after all), but we also accept the idea that a dictatorship must at least prevent itself from gaining the unwavering enmity of the people, then we must find a relief valve. Nationalism, that bastard creation of democracy itself, is an easy way out - but "[people X]ians don't allow other [people X]ians to starve in the streets." So how to use nationalism? Direct it against somebody else, a foreign enemy on whom woes can be blamed. Preferably a small country, in land area if not in population, preferably weak, even better if it's rich - or whose people are at least demonstrably better off than yours. Otherwise pick one big and outwardly menacing but either distant or only superficially powerful (e.g. as Prussia used France).

So democracies were powerful or dead. Given that fact and their unique susceptibility to shifting public opinion, they were also paranoid, which inspired much the same imperialistic instinct as did the cloistered rule of the great autocrats. Democracies were, at least, generally clever enough to note that other democracies made poor meals (Poland, for instance, being easier to swallow than to digest) and otherwise were not threats; and so democracies, like authoritarian regimes, tended to spread themselves. As they did - and as authoritarian regimes hurled themselves at the emergent democracies and were defeated each in turn - the world got broadly safer, if less predictable. Smaller nations were able to enter the stage and establish a credible existence. In 1900 there were only a third as many nations as in 2000, and a much larger percentage of those countries were capable of significant power projection or were under the protection of a larger entity. The triumph of democracy, despite the menace of the Soviet Union, allowed peoples around the world to have a chance at a destiny. (Though in previous writings I have taken issue with the wisdom of their choice, they have at least had the choice; given the events in Georgia I hold to that point.)

And I come to Russia. There is no better example of how feeble democracy is and how seductive authoritarianism truly is than Russia, where the influence of a doddering and suicidal drunk was enough to snuff out the democracy he had tried to create just five years earlier. Make no mistake that it was gone by the time Putin arrived; Putin was just better at his job. And now he's in a new one, acclimating to a change of position (if not one of influence). A renewed demonstration of power is necessary to prove to everyone (not least the Russian people) where control still lies, and why Russia is to be considered a "great power." Unfortunately Chechnya was no longer suitable as a whipping boy - it had already been eviscerated. What was really needed was some minor blunder. Democracy, unfortunately, is uniquely susceptible to such tiny mistakes. Saakashvili made that mistake, and at a uniquely poor time, out of ham-handedness rather than stratagem. Putin and Medvedev got their chance, and now everybody knows who is in charge.

Democracy is fragile - and democracy has always relied for defense less upon itself than upon the aid of other democracies. It was so in the Peloponnesian Wars and it was so in the Second World War. It remains so today. The clash of powers today unfolds much like the Battle at Kruger: the few, sinister autocrats are lions, waiting constantly to prey on the weakness of others. The many democracies are buffalo: devastatingly powerful in a group but blithely unaware of it, vulnerable and skittish and prone to bad decisions individually. Georgia is surely not yet a mature democracy; it has a long way to go, and the process is never fully complete. But that does not make it less worthy of help - indeed it makes us more culpable in its plight. America's idiotic missile shield (perhaps no longer so idiotic) and our head fake about NATO membership were the crucial hesitation in the face of the Russian lion - and now Georgia, a child among nations, is stricken and on the verge of being slain. We listen to it cry out in fear and pain, and we demand that Putin halt and show proper restraint and for God's sake be reasonable, as a lion would find it reasonable to give up dinner for the sake of cross-species equity. He is not reasonable, not by liberal democratic standards, because he has no interest in peace. He is not one of us. He is trying to devour us. And if we sit idly by he will grow stronger and come back for a bigger meal, sooner or later.

And now the limb. It must not be allowed. Putin should be forced to withdraw from Georgia - all of it - and if he does not he should be tossed in the air by our collective might. War in such a situation is not an option but a necessity and the risks are infinitely worth it and infinitely right. We do not leave our own behind. Not when we can still hear Georgia's cries for help, not when it needn't be so - not when it means that we are next.

Friday, March 14, 2008

Westphalian syndrome

I have been trying my hardest to write a post about terrorism. I've not been able to work the intro quite right - a number of things have inspired the topic - but something remains missing. Perhaps a conclusion? I am still not certain what I think of the subject. I will get back to it, eventually.

But for now, you may have noticed recently that the postwar cascade of countries has taken on its next microstate, the new 'nation' of Kosovo. It is the latest in a series of emergent nations - slowed, but not stopped, by the turn of the millennium and the advent of the global new order (I do not necessarily say that in a conspiratorial sense, mind; its architects take no pains to keep themselves or their objectives hidden anyway). Lately East Timor, Montenegro and now Kosovo have taken a big step forward in their quest for...

Well, in their quest.

I remember vaguely an article in Time, roundabout the end of 1999, which asked if by 2100 the very concept of the nation would even be meaningful anymore - and if it was, whether nations would be so small and fractious as to be irrelevant. As Kosovo joins, haltingly, the community of world states, I think that the question grows in legitimacy. With the notable exception of the United States, which (until fairly recently, anyway) possessed a homogeneous and well-assimilated population, the tale of the nation-state has invariably been a story of the creation of immutable and intransigent borders to bracket the greatest mass of a certain people, by which they would then realize "self-rule."

It was with unparalleled shock, after the First World War and into the present day, that the world discovered what a silly idea this was. Ethnic strife had long been a feature of European politics, especially amongst the Austro-Hungarians and Ottomans; the First World War touched off as a result of it. But for Hitler the problem would still have confronted policymakers eventually, though in the shape of a long slow burn rather than the heinous convulsions of the postwar era. But Hitler did want to take over Europe (despite what the British Foreign Office believed), and the resulting struggle ruined not only Germany but most of the colonizing powers. Of the five - Britain, France, Belgium, the Netherlands and Portugal - only Portugal was unscathed by the war, and in tandem with a military dictatorship managed to hold on to its colonies into the mid-1970s. The rest, shattered and bankrupted by six years of Clausewitz's wet dream, began casting off most of their colonies just as soon as those colonies proved they could walk. (Obviously, with a number of exceptions.)

Here the problems started. The ironic thing is that in Africa, for instance, British and French diplomats were less worried about a given country's relationship with the Soviets than about their opposite numbers in the Entente Cordiale. That agreement, which itself emerged as a result of the scramble for Africa, completed the partitioning of the continent and solidified the borders as they stand today. Despite that new cordiality, when the time came for the postcolonial nations to make their own destinies there were few cases in which collaboration between nations resulted in a rational national border encompassing a group or groups that might properly be called (and might properly consider themselves) a nation. In fact I cannot think of one. In the best cases nations emerged shaped by the contours of certain navigable rivers or other geographic quirks (often ones that determined colonial ownership in the first place). More often they were simply lines on a map inspired by the whim of imperial history or, worse still, sheer caprice.

The next fifty years would amply demonstrate the folly of the postcolonial nation-state. In Europe, where nation-states had developed only over long and bloody centuries, there were still constant difficulties over this diaspora or that, here or there; but at least there was a general settlement over the political situation. (Not enough, surely, to prevent Poland and Germany being shifted 100 km westward; but a stability, to be sure.) Most of Africa and Asia inherited none of this stability; in few places had there been what might properly be called a nation-state at any time in history, and a concept a thousand years in the making could not simply be projected across the world and its multifarious political cultures. Chaos inevitably resulted. The strongmen and warlords who seized power in the postcolonial regimes were the unavoidable result of independence (though that is no reason independence should have been withheld - in few cases could an argument hold up that a particular place was somehow unready, though you could make a case here or there). The new nation-states, however, cobbled together with only their name and European recognition to sustain them, legitimated and amplified these struggles by ensuring that the only way to live in security was to possess political power for yourself.

And yet after this blood-letting - after Sierra Leone and the African War, after Uganda and Bosnia, after East Timor and India and Pakistan (over and over again) - not only is political statism not seen as outmoded but, indeed, it remains the future. Now every ethnic group with a historical gripe or a modern grievance believes that it, too, has not only the right to a nation but the need for one. The fundamental error of basing political representation for groups around geography has not only not been realized, it has been enthusiastically embraced amongst the political elite in the first world. And why shouldn't it be? It is a crucial part of the continued dominance of the third world by the first.

This seems counterintuitive, that "independence" should somehow more deeply shackle a place to outside rulers than occupation does. But it makes sense. Take the example of the British or French. The British and French did not take over Africa because they particularly cared for it - rather they feared simply that the other would. As Voltaire quipped about Poland, Africa was easier swallowed than digested - eventually the cost of defending and administering these new lands became clear, and colonialism became in many places a subsidy. Being freed of them not only saved Britain and France money by divorcing them from their obligations - a crucial point, as a nation-state by definition is freed from obligations to or from another save those entered into explicitly - it opened new diplomatic opportunities. Rather than ruling, say, Benin explicitly, and incurring perhaps a cost of millions of francs per year in administering it, the granting of independence allowed administration to be turned into "foreign aid," which was then held like a knife to the throat of the newly "independent" nations.

In turn, the new nations would be quickly shuffled into every international organization under the sun, not so their voices could be heard but so that France or Britain could bolster their voting blocs. Incidentally, in this way an organization like the UN can be rendered toothless - the bigger it is the less capable it is of international consensus, and the less likely it is to do anything. And the third world is essentially enslaved to this system, because by divorcing itself from its obligations there the first world can instead make use of its own former neglect to continue to exert its authority indirectly. Independence divorces a nation like Britain from the responsibility it would otherwise have incurred to improve the lot of the peoples in its colonies, and instead allows it to extract concessions from local leaders (more often than not comically weak democratic opportunists or local strongmen) using as leverage the same money and assistance it should be providing anyway. At the same time it very roughly pushes a people - heterogeneous and often with greater animus towards other local ethnic groups than towards the colonial power - into the very harsh world of Westphalia, and opens the door to all manner of exploitation by companies and nations alike, all too happy to take advantage of the tenuous economic situation of such countries by extracting from them natural resources or unfinished goods for completion and export abroad.

Your initial reaction to this argument might be that it is simply a modified white man's burden. On the contrary, my argument is precisely that the system of nation-states means that the independence granted in its form is false and fleeting, and simply a way for colonizers to get out of paying what is owed. True independence would be that colonization had never happened. It ought never to have happened - but it did. There should have been the assumption that if you break it, you've bought it, and that a responsibility then existed to invest your own resources to improve the lot of the peoples on whose peace you had thoughtlessly trampled. It is not about obligation but rather reciprocity for the results of colonialism.

The second reaction is stronger, and expands the debate into the present day. The argument here is less about colonialism (though that still plays a role) than about the benefits of self-rule to a people: that independence is more conducive to a responsive, responsible government than that provided by an occupying or distant power, however legitimate its rule. Along with this is the Amartya Sen principle - that there are no famines (and a host of other social ills) in a rigorous democracy with a free press. This is all true, and local rule is generally better than distant rule (India, certainly, would be a textbook example of that). But unfortunately the world gives scant leeway to efficient generalities; the fact is that few governments in the third world possess the resources necessary for responsive rule, if they do not give way to petty dictators first. Despite positive advances in the past few decades, the prognosis for responsible government in the third world (responsible to the people, not in a general sense) does not appear good. Even if it were better, the emergent nations like Kosovo, Montenegro, East Timor &c. generally have small populations and few resources, with even less in the way of advanced economic capabilities. Building those takes time; unemployment and resource exploitation are immediate.

This gets into the question of viability. Advocates for the independence of new states generally think little or nothing about the economic viability of their new country. This is often so because their principal concern is the power that independence would give their elite or, if they are more romantic, some misguided historical and cultural ideal; these are worth the price to separatists even if it comes at the expense of a better deal in their current status. I wish to deal with that in two parts in order to expand my argument to the first world. Vocal minorities in the first world tend to be subsidized by national governments wishing to avoid sectarianism or secession; hence Scotland is comparatively overfunded by the London government, which is why the Scottish Nationalists are able to spend money profligately without worrying about such pesky issues as defence (or foreign aid!). At the same time you're brought to the perverse place where many English want Scotland to go, under the perception that the Scots are acting as freeloaders and receiving more for it (indeed, the Scots and Welsh are better represented at every level than the English, though more government is not necessarily better government). Ask the Scots what they'll rely upon for funding without the expansive tax base of southern England, and their talk invariably turns to oil. More on that in a moment.

This is not necessarily so in the third world; the Kurds, for example (or the Kosovars, for that matter), can hardly be said to be the recipients of much special treatment, and have legitimate goals to which independence might lead. They are more legitimate candidates for independence in that such a move might immediately and concretely improve their lot. However, they risk falling into the same aid and trade traps as the decolonized nations I discussed above; this is at least partly why Russia and China have been so vociferously opposed to the independence of Kosovo, fearing it will be yet another vote for the West in the UN (among other bodies). Indeed they do risk becoming puppets to a stronger regional power.

Worse, economic viability really must be a factor in these calculations: their people will not necessarily be better off, especially if GDP per capita decreases through independence and there is no sound basis by which it can be restored. It doesn't really surprise one that independence movements from the Acehnese to the Kurds to the Scots always emphasize a single exportable primary good (almost always oil). Because, in this day and age, what you really want most is a country whose economy is based on oil as a long-term strategy. Even oil companies are beginning to move away from oil as a long-term commodity; and in any event they do not make their money off the crude itself but rather from its refinement into secondary goods, something the Kurds will not be able to do to their advantage. And even if you can secure independence on the basis of a natural resource, how long will that last you? Rarely is a plan articulated past that point to establish the economic viability of a new nation.

I risk limiting independence here to only wealthy, economically prosperous and yet aggrieved nations. I assure you this is not my point. The criterion for "independence" is always that it is the desire of a certain polity; international recognition is only an adjunct to that, albeit a critical one. What I do here is emphasize the many problems associated with that concept, especially the fact that independence is not really what it purports to be at all. There should be a more rigorous standard for admission into the community of nations, especially where it would involve letting a colonial or occupying power off the hook for its misrule. I have not outlined the generally negative impact of the nation-state itself; suffice it to say I believe in that impact, and I think the arguments above can be extrapolated to that end quite easily.

What could be done? Well, I am enthusiastic about a few developments. The first is multinational political groupings. The African Union immediately springs to mind. The AU has been muscular, forceful and broadly effective on issues like Darfur and Zimbabwe, despite its relative youth as an organization (I believe it is no more than six years old). Multinationalism is the only way to engage the problems brought on by transnational ethnic groupings and rivalries, and the only effective way to resist depredations by other powerful national and multinational groups. Part of the reason corporations are so strong is that they are more monolithic than governments and are encumbered by no borders. For the third world to resist, it must learn to mimic some of these features. Connected to this is the evolution of existing national groupings based on shared history, such as the British Commonwealth. Initially it was essentially a mouthpiece for British diplomacy; but now it possesses a more independent spirit as a body and is not leashed to London's beck and call. That is not to say the organization is a lasting or universally positive one; but Sierra Leone proved that it is not always bad to have a great power watching your back, however belatedly. In any event it is better than an organization like NATO or SEATO, which are self-evidently military and coercive.

Eventually, nation-states have got to begin to evolve beyond themselves. Colonialism cannot be undone; neither can decolonization. Both were evils in their own ways, the former for what it was and the latter for how it was done. But it is done, and no amount of foreign aid will make right what transpired. What is required is political and diplomatic innovation, a fundamental redefinition of what sovereignty itself means. I can't speak to what it should be, though I think a move away from the geographical and spatial and towards the social, societal and cultural is where it will and should go. I think it will happen, too, simply out of necessity - the nation-state is outmoded and has far outlasted its usefulness, and a combination of multinational corporations and advances in global communications means that the nation-state will either evolve or go under. My fear now is that it will evolve too late for the many nations that Westphalia left behind.

Sunday, February 17, 2008

Going Up?

I have changed the title of this blog - that is, the web address. I do so as a public service. Trattoria Italiana is a restaurant in the District of Columbia that has probably been closed over a health violation by now. If not, I issue a warning to you all: don't eat there. You'll have better luck getting Italian food at Oktoberfest. Or, for that matter, at a Chef Boyardee factory.

That having been said, I want to talk about elevators.

I have spent a considerable amount of time on or waiting for elevators, as have most urban dwellers in the US (and I would imagine Western Europe). This has been in a number of contexts - both in residences (be they dormitories, apartments or condominia) and in commercial or academic buildings. Over time I started to notice the oddest little behaviors associated with elevators and elevator travel. People have a system. It's thoroughly bizarre.

There is a geography to elevators. Watch for it the next time you enter an elevator with more than one person. The principal position is standing either to the left or the right, just behind the doors. Whether the position is on one side or the other depends upon where the buttons are; the first person on an elevator will invariably take up a dominating position over the buttons.

The second person on an elevator will usually - though not universally - move to the rear. Regardless, they will stand on the opposite side from the Button Person. How they deal with the Button Person is largely a matter of personal preference and the druthers of the Button Person themselves. A courteous Button Person will ask another entrant which floor they would prefer; the more surly ones do not. The Second Person then has the unenviable choice of asking for a floor or invading the space of the Button Person and pressing their floor themselves, drawing an angry glare from the Button Person.

As more people board, the elevator fills at the back and sides (the side away from the Button Person first) and then from there to the doors. As soon as a Button Person deboards - even if there's only one other person on the elevator - another will immediately take their place and in that role re-press their own floor (and perhaps every other selected floor) and/or mash the "Door Close" button repeatedly until the recalcitrant elevator obeys. There is no situation in which the buttons are left unattended, and there is no situation in which an elevator with more than one occupant has each floor called only once. It simply does not happen. You could probably quantify it: for every n people aboard an elevator heading for a given floor, the button for that floor will be pressed n-1 times more than necessary. You could add x as an obnoxiousness multiplier: n people will press the button n-1 more times than are necessary, multiplied by x, which is the sum total of every irritation, lost minute or missed deadline suffered by the group of people that day.

Obviously that's a very simple equation. You can design your own.
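If you wanted to be thorough about it, here is a minimal sketch in Python of that joke formula, taken entirely at face value (the function name, the default multiplier and the example numbers are my own invention, not the product of any actual fieldwork):

    def redundant_presses(n, x=1.0):
        """Wasted button presses for n riders bound for the same floor.

        The first rider's press is the only necessary one, so the other
        n - 1 presses are redundant; x is the 'obnoxiousness multiplier,'
        the accumulated irritation, lost minutes and missed deadlines of
        the group that day.
        """
        if n < 1:
            return 0.0
        return (n - 1) * x

    # Five tired people heading to the same floor on a bad day (x = 2.5):
    print(redundant_presses(5, x=2.5))  # 10.0 pointless presses, give or take

Garbage in, garbage out, naturally; the hard part is measuring x.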

Not everyone can use an elevator, either. Not at all. There seems to be a sliding scale in the social acceptability of elevator usage. In any building where there is a gap of more than two floors between the ground and the top, people going to the floors closest to the ground are expected not to board if there are others waiting. I get dirty looks taking the elevator from the first floor to the second in my building - where the highest floor is the third (incidentally, the people giving me the looks always walk up to the buttons and mash '3' as soon as I step off the elevator). When I lived in an eight-storey building (a ground floor, a basement and six floors above), I was resident on the third floor. I could not count on both hands the number of times I was given shit for taking the elevator to the third floor.

And I would follow certain rules, not always consciously: I would rarely take the elevator down. I would not take it at all at rush hour. I would not take the elevator if someone I didn't know was waiting, or if I did not think I could board it and leave before a person I didn't know could reach it. There was clear glass between the elevator banks and the doors at the front desk, around which people would have to curve to get to the elevator - I became expert at timing the elevator so I could get away before someone from, say, the seventh floor could reach me and begin to bitch. And despite my irritation at being denied full elevator rights though I paid the same as everybody else, I always slagged off anyone from the second floor (or worse, the basement) who tried to take the elevator.

Usually an irritated glare would do. Sometimes others weren't so shy. I have been verbally attacked for taking the elevator up a mere two floors, and I'm not the only one. It's really an epidemic of social violence, but it is regrettably outside the scope of this examination. Needless to say somebody should do something about that.

Elevator decorum where boarding and deboarding are concerned seems to follow the same procedures as buses or subway trains. People get off first and then get on; but the convention is weak and distinctly American. I discovered this, for instance, which details the same behavior in China. One comment describes boarding a subway as "something like the offensive and defensive line, right after the ball is snapped in real-style football." His description almost reminds me of Thomas Hobbes. The thread was gloating, too: comment after comment smirked about how small and slightly-built the Chinese were, and how easy it was for an average Westerner to excel at their door game.

Meanwhile, an article by a Westerner living in Azerbaijan enlightened me to some unique departures from American elevator custom there. Apparently it is both common and acceptable there to hit the down button when you want to go up, if the descending elevator is closer. I don't know if it has ever occurred to an American to game the system this way; it certainly didn't occur to me. (The same search, incidentally, found me this website; it seems... completely egregious, and unlike my project is not explanatory but normative. I am not here to tell you how to take your elevators.)

There was a brief comment about gendered elevator etiquette at a website entitled, ironically, "Nasty, Brutish and Short." That and the Azerbaijan article both talk about the appropriateness of letting women off the elevator first, unless a crowd prevents it. I have never heard of this, and frankly I think it quite sexist (this from a man who has been holding doors for all and sundry since he turned ten). I have never observed it in my life, either, or observed any obvious or subtle social cues that women should be let out first. I checked out the blog's about page; the author is from Cincinnati, Ohio, so perhaps there is a midwestern gentility I'm missing out on (I don't know many midwesterners - if you are one, have you heard of this?). Alternatively they might just be inspired by their famous forebear.

I thought about it today: a little metal box, soaring high in the air, no way out except those entrances and exits which are carefully prescribed long in advance by forces far beyond your control. Different customs and behaviors depending upon where you came from and how you were raised and where you are, and yet you can't just get away from the ones that are unfamiliar - you're stuck. Certain people (often the ones who got there first) are always in control, and they behave as though it is natural that this is so - and there is a fairly rigid system of behavior that requires the utmost reaction should it be violated. Some fellow travellers are polite and benevolent, some are angry and bitter, and some are just completely lost. And the entire trip - the entire affair - is just an unenviable, undesirable way station, a brief utility in the service of a bigger and broader goal.

Elevators. Who knew?

Monday, December 31, 2007

New Year's False Start

I apologize for the delay in my reflections on the new year, my greater-than-or-equal-to-zero readers. It is perhaps better this way, as I take a very dim view of that holiday; by now you've had your fun and nothing I say about it will be able to ruin things. Let me say at the very beginning that I have no idea why we have this holiday when we do. I'm not even sure why we've got it at all.

You know, the Romans did not originally have months during the winter. There was something on the order of sixty days that fell after the final month (then December), which were considered some sort of flux time. Given the barrenness of winter this makes sense - originally the principal utility of a calendar was that it helped farmers know when to plant and harvest, none of which goes on during (at least) two months of the winter.

One of the ancient kings was supposed to have added two extra months to the end of the calendar, these being (as they now are) January and February. Why it would be necessary to suddenly define all of this otherwise uncategorized time I don't know; there is little left from the era to suggest it was for administrative purposes. (The king said to have added the two winter months, Numa, was known also for creating some of the first rigorous systems of Roman law. It may simply have been that this disorder was not to his liking - this assuming he even existed, as the "seven Kings of Rome" would each have to have had reigns of several decades unless there were interregna or more kings.)

The problem, it seems, was that the Roman lunar calendar did not line up perfectly with the revolution of the Earth around the sun, the latter being more crucial to the agricultural purpose of the original calendars. The solution was to occasionally chuck extra days into the year, and the first day of the new year was moved from the ides of March (the ides falling generally towards the middle of the month and supposed to correspond to a full moon; this was either the 13th or the 15th) to the Kalends of January (that is, the first day of that month).

This did not universally continue; time has always been a national political thing, and in medieval Britain, under the regime of the Julian calendar (which came down directly from the reforms of Caesar), the New Year was on the Feast of the Annunciation, colloquially called "Lady Day," which fell annually on 25 March. This continued through to 1752, when the English new year (along with the English calendar as a whole) shifted to the Gregorian system.

I do not understand why it is that the beginning of the year was shifted from the middle of March to the beginning of January. I can only assume it was administrative; but why that day is better than any other is beyond me. It seems logical that rather than bisect a season (and the bleakest and most miserable one at that) one would set the beginning of the year at the conclusion of one season and the beginning of the next. The ides of March were always supposed to be quite auspicious (the murder of Caesar notwithstanding - sic semper tyrannis), and in addition to being the beginning of the agricultural year spring is both viscerally and metaphorically the brightest and most optimistic of seasons - the new beginning of the natural world. Naturally I suggest we move it back, and as soon as I have obtained unchecked power over the peoples of the Earth it will surely be so. Except I'll work something out for the southern hemisphere. Maybe October.

As much as I'd like to think that changing the logical basis of the new year (or introducing one) would necessarily induce an end to the maudlin hand-wringing that accompanies it, I'm not holding my breath. Academic discussions aside, people require landmarks to punctuate the otherwise endless and ruthless flow of time, and to take an opportunity to reassess who they are and where they're going. Sadly, the inspiration provided by an easily-digestible chunk of time appears to exclude most real reflection. A perfunctory Google search turned up a few things of immediate interest: one was usa.gov, which I did not know existed despite having taken an entire class on the government and the Internet. (My education is money well-spent, no?) Of the thirteen "top New Year's resolutions" they listed, no doubt the result of a detailed statistical analysis, all but one were thoroughly self-involved and most were so nebulous that they're impossible to really justify (and therefore to uphold). This from our government, mind you. Hopefully a very very small part thereof.

The second was an article from the Huffington Post, a site I read occasionally as it caters to my smarmy leftish tendencies. This particular piece was by Nora Ephron, who I find all right, and discusses (rhetorically) what her resolutions would be for 2008, given the fact that those she had for 2007 - losing two pounds and cooking some dish I'd never heard of - went bust. I don't necessarily agree or disagree with any of them, though I can't speak to William Kristol's latest sins and I wouldn't send Kiefer Sutherland anywhere on my behalf. Rather what I find telling is the jocular, outlandish focus of the article - and the mocking tone that is all too appropriate to what constitutes our new year's "resolutions." Not to be outdone, I assure you, the University of Maryland provides a handy/ridiculous guide to maintaining your resolutions. I didn't really read it through. I don't think I could bear it at this hour.

There are generally two problems with our hand-wringing new year's behavior. The first and most obvious is that if we were serious about resolving to effect serious change in ourselves - or better yet the world - then the date we put those resolutions into action wouldn't matter. 1 January is no better a time to resolve than 15 March or 20 September or any other day. Indeed those decisions which most form our lives usually do not come on any convenient or inspirational day. Therefore the decisions we do finally take don't come on our utterly arbitrary holiday - and the ones that do come on that holiday end up being as hollow as they are shallow.

The second, more immediate problem is that these resolutions are utterly without thought or basis and so wither like a flower out of soil. Why do you want to lose ten pounds? To improve general health? Fit back into that dress or those pants? To be more attractive, to get dates, to find love? The way weight loss is gone over in popular culture (along with the other common fodder for resolutions, like "niceness," smoking, and stress), it would seem that such change is intrinsically good. It is not, and to "resolve" anything this way - and to do so at a certain moment simply because it's a socially-ordained day - not only makes a mockery of the very idea but sets up such a resolution for failure.

The date probably has no influence in this - in fact the two subjects are probably apples and oranges, the one included to make my observations on the other sound less Chicken Soup for the Student's Soul. But I think that it's just possible that, divorced from the wake of three major holidays built around food and profligate expense (and the attendant stress and heartache), and placed in a season itself inextricably associated with rebirth and life, we might be able to take the New Year landmark and the reflection for which we intend it with a bit of gravitas. The pathetic fallacy is what it is; but nice weather really does have a positive influence on moods and the conditions of thought. The basis of a resolution should be reflection on our flaws, not the immediate desires of the moment.

Failing that, however, we could fix the immediate problem: stop making New Year's resolutions and start making simple resolutions. Tomorrow is as good as any other day; and there is no bad time to make any positive change, be it momentous or minute. Indeed New Year's may be the worst time of the year to resolve anything: everyone, yourself included, expects you to fail. Belief is usually a prophecy that fulfills itself.

For now, however, I've resolved to lose ten pounds. But don't worry: it's for a good cause. I'm going to donate it to a starving village in the Hollywood Hills.

Sunday, December 23, 2007

Atheist Fundamentalism

Thus far I've learned that I need to be more careful so it doesn't seem like I'm trying to parody myself when I write. So even this early the effort isn't wasted. Let's press on, shall we?

I ran into this article on the BBC website. It's fairly short and I recommend you read it through, but the Cliff's Notes version is that the Archbishop of the Church in Wales released a statement warning of 'atheist fundamentalism' and, of course, expressed concern about its effect on the Christmas holiday. For the life of me I've not been able to come across the full text of his statement, if there is one, so this may simply be a headline-grabbing press statement rather than some sort of proclamation that carries specific religious force.

This is the first I've ever come across the term - and the Right Reverend suggests that it is a "new phenomenon" - but apparently it's popped up now and again in the past couple of years. "Atheist fundamentalism" doesn't seem to really mean anything on the face of it; in fact it seems like an oxymoron. But I think that when you cut right down to it there is, at least, a case to be made for the existence of atheist fundamentalism. After all, you can have fundamentalism if you have a religion - and atheists certainly have that.

That in itself seems weird to say, and in a sense it is - to call something a religion (and indeed a faith) when it is predicated on the vocal rejection of all of that seems unfair. But if you ask an atheist what it's all about, and boil it down quite thoroughly, they must invariably admit that atheism is about God. However you define atheism this is a bedrock that will eventually be struck, and to assert that there is no God is not a different statement in kind from asserting any other fact about God, as the conventionally religious traditions do. God is all-loving; God is one; God is many; God is there. God is not there. If you think of religious thought as a tree, with each separate belief setting off onto a separate branch, atheism is one of the very lowest branches. But it's on the tree.

This isn't quite so cut and dried, of course (if it were I'd have to shut up forever, and how sad that would be - for me). As far as I understand it Buddhism is an 'atheistic' faith; Buddhists have no gods, at least not gods like ours, and in any event certainly do not have the idea of God we do. I'm willing to take the obvious cop-out and say that in speaking of fundamentalist atheism I mean Western atheism; I do not believe the Right Reverend Llandaff was concerned about an epidemic of Buddhist polemics in the article above. Ironically, though, that may be an exception that proves the rule - atheism in the West is not merely a matter of rejection of God anymore, if it ever was. Western atheism is a separate tradition with its own forebears and thinkers and dogmas.

It may seem to go without saying that first among these dogmas is a certainty of the lack of a God. But it has to be said precisely because it's so easy to overlook. This excludes atheism from being a mere political position. Agnostics are not atheists, which they would be if atheism were taken to mean simply 'those who do not positively believe in God.' Agnostics are that; but they do not actively assert the absence of a God, as atheists do, and there is an iron-bar separation between non-believing and simply abstaining from belief. That separation doesn't exist between non-believing and believing; they are two sides of exactly the same coin. This is especially true since both beliefs operate without factual justification - that is, atheists, like believers, take their stance based on faith rather than evidence.

A Western atheist probably won't agree to this (if they would agree to any of it). Oddly enough it is in the nature of all Western religious traditions to be both exclusive and elitist, and atheism is no exception, as the comparison with agnosticism shows. It proves its superiority to other faiths (as they would have it, 'faith' generally) on the following basis: if the primary justification for God is the well-ordered, mysterious nature of the universe, and science can account (or can plausibly be expected to account at some later date) for these facts without recourse to some 'first cause' or 'prime mover,' then belief in God is neither necessary nor justified. If you're familiar with Carl Sagan's Contact, you'll remember the use of Ockham's Razor to defend atheism - that is, the simplest explanation is generally the right one. Part and parcel of that is that if some element of an explanation is unnecessary, it should be left out. Enter (or exit, more aptly) God.

To take God as a simple metaphysical tool (that is, as a device to explain what underlies the observable universe) almost completely misses the point of God, but I admit that there is no other way in which God will speak to an atheist. (Certainly there's not a great deal of revelation at work in someone who rejects the existence of God, if indeed there is revelation in the life of anyone, believer or not.) The problem that atheism still faces is that there has to be some other principle or principles to take the place of God. They may exist, I grant; but science is nowhere near obtaining them. String theory and its related subfields prove that conclusively: the latest developments in physics are not much better substantiated than the ruminations of the Greeks were. This is not to beat on string theory, far from it; but if science is suddenly so lost even on the number of universes in existence, much less how they come about, whether they persist, how the metauniverse beyond them works or even whether any of this is reasonable (and if not, why our own universe came about in the first place), it is not terribly convincing to suggest that science will somehow get there given enough time. It may well; but it is not plausible to suggest we're on our way. That being the case God is no more arbitrary (even if no less arbitrary) than any other suggestion on offer. God at least has the benefit of surviving the test of time.

This being the case, atheism itself is based at least in part on faith motivated by belief in a dogma, a basic principle about the way the universe is. The fact that that basic principle relates to the divine makes it religious; and all of this makes it possible, at least, that the tradition could lend itself to fundamentalism. When I say this, of course, I don't mean that atheists are planning to engage in sectarian violence or anything of the sort; fundamentalists are not necessarily violent. But there is in fundamentalism a belief that one has hit upon the absolute truth, as well as a concurrent intolerance of other points of view; and there are atheists who possess that in spades. I found this article when I attempted to feel out the atheist rebuttal to the charge of fundamentalism; though certainly erudite, the author seems to think that his position is some sort of "truth fundamentalism" as opposed to all other fundamentalisms. (Incidentally, the issue this author is replying to surrounded a conference being put on at Harvard by atheists and agnostics in an attempt to stop the appearance of rising fundamentalism in the atheist community. If you're interested I suggest reading the entire summary there, as the dispute gets a bit technical and the write-up is better than any I could offer.)

So atheist fundamentalism is, at least, theoretically possible. But is it actually happening? The targets of the 'atheist fundamentalism' attack appear to be the usual suspects, such as Christopher Hitchens and the ubiquitous Richard Dawkins. If you don't know them, Hitchens is a major public intellectual who is notorious for his book on Mother Teresa, which was, shall we say, less than a glowing tribute. Hitchens is out flogging his new book God Is Not Great, whose title at least seems to be an attack on Islam specifically (he, incidentally, is a big believer in the idea of Islamofascism). Dawkins carries a bit less political baggage and sits squarely on the left, unlike Hitchens; primarily known for his work in evolutionary biology and genetics (specifically The Selfish Gene), his The God Delusion has been out for some time. Both books have been enormously successful. My interaction with both is limited, as I shy away from polemics and these are certainly that; but it is clear from interviews with both, and indeed from common sense, that this is not simply an attempt at ego masturbation for fellow atheists. This is an attempt to expand the tent; and obviously they and others are of the opinion that it is practically and morally better to be an atheist than a believer. That is, fundamentally, fundamentalism.

So perhaps you can empathize with the Archbishop. I imagine it's easy for him to look at falling church attendance, falling rates of clerical ordination, and the vastly changed social mores everywhere in the Western world; and then, when people like Dawkins stick their heads over the parapet, to blast them for his faith's decline. To be sure, the Archbishop is almost totally off where his specific examples are concerned. The Guardian debunked most of them; the comic nature of the "war on Christmas" in the US, and the cottage industry of literature it inspired, doesn't even bear further examination. By signing on to the overblown tripe about "Winterval" and fake stories about the banning of nativity scenes, he very nearly sinks a real case about an emerging and rather nasty side of atheism. Indeed the write-ups that I've seen elsewhere - apparently I'm behind the curve even four days later - range from the contemptuous to the dismissive. Despite his determination to shoot himself in the foot, though, I think the Right Reverend is onto something.

Whether atheist fundamentalism is justifiable - or even a "bad thing" - is farther than I'm going to go. It's worth saying that though we are used to thinking of fundamentalism as an evil, it may be the sign of a maturing faith tradition, for better or for worse. In the end it is not the doctrines of that particular end of the spectrum that count, but what its adherents do with them; whether they are an excuse to punish non-believers or an inspiration to explain and promote the faith is a choice a person makes, not an ideology. At the very least those susceptible to the charge of fundamentalist atheism have been interacting with the religious rather than ignoring or disdaining them. If this leads people to a more enlightened faith - atheist or not - that may not be a bad thing.

Saturday, December 22, 2007

Bloggito Ergo Sum

I've decided to do the blog thing. I have always hated blog people - the blogging class, that ultrahip, black-rimmed-glasses-wearing, Death-Cab-for-Cutie-listening cabal - but neither they nor I appear to be going anywhere, so I suppose this is my way of making common cause. Anyway I write very little these days, and at present all of that is for myself (and most of it is rubbish), so perhaps the added burden of a potential audience will light a fire under me.

I will explain two things as a preface. The first is the web address for this little project. After spending more than a quarter-hour trying to find something that hadn't been taken - this page's rather apropos title was claimed by a two-paragraph effort from 2003 by a woman called Laura - I lost my temper, and the result was deemed acceptable. I left it.

The second is the title of this post itself; I thought myself very, very clever for thinking of it in a vacuum. I could not resist temptation, however, and googled the phrase. Suffice it to say this particular pun is a bit worn; one of the results was even a spurious (but convincing) etymology for the word 'bloggito.' (Incidentally, the Latin is looser than the pun suggests: 'cogito' is the first-person singular, 'I think,' while the infinitive 'to think' is 'cogitare.') The reference, of course, is to Descartes' famous thought experiment from the Discourse. I nevertheless decided to keep the title, as it was witty whether or not I was the first to cogitate it - as it were. It also speaks to what I wanted to address in my maiden speech.

A first post - like any first effort - is an issue of identity. Whatever the task at hand, the first thing you do in any new environment sets the tone for how you will be seen and indeed for what you will be. It's never just what it is - the first task at a new job, the first crisis under a new head of state - judged as a mere accident of time. It's a statement, whether you like it or lump it. In the case of this entry, I will attempt to make specific what the Hell this is all about, but there's more to it than just what I say. It will set the tone of my effort, to me and to you, my dear reader(s? no, never mind); it will also determine whether you'll ever look at this again. No comment.

So when I say "I'm starting a blog" (or a "blog thing"), what does that mean? This is not a journal and it is not a diary. The differences between the two - or even whether there is a difference - are not for me to elaborate on; both are explicitly personal, at least in the sense I'm using (obviously I don't mean Hansard- or Congressional Record-style journals). I do not intend to talk about myself any more than is absolutely necessary to make a point (I have been rather more profuse than I'd have liked here); frankly I am quite boring, and ideally that is not a part of this effort.

That I am putting this up for public perusal rather than simply writing it on my computer and filing it into some bottomless subfolder means that I intend to say something I hope will be of some global value. I am not nor have I ever been of the opinion that a person has the right to a public hearing - morally or practically - simply because they exist. You have to have something to say; otherwise you're wasting everybody's time (or creating the potential for such wasted time) and abusing the public sphere as a result.

Thus I intend as far as possible to avoid bullshit generally, especially the personal or emotional variety (a daunting task for me, I assure you). That says little about what I am intending to do; unfortunately I am hard-pressed to be more specific. The title of the blog is meant to reflect the fact that I come into it with no specific intentions. I don't mean to say it will lack a theme - or, in some banal way, that its lack of a theme will be a theme. One may well come to light, and I may in time come to identify with and operate under it. I may write eighty entries in a row on floor lamps (or, in an homage to an old friend, foam padding); but that does not mean the eighty-first will be, or that I am trying to produce the definitive work on floor lamps. A theme may in time emerge, to be sure, but I am not working towards one - themes prove restrictive more often than they prove helpful. There is time enough to identify an essence after the thing actually exists, assuming it holds my interest long enough to get that far.

I will offer the next best thing, however: an example I shall hope to emulate in the person of Michel de Montaigne. His writings were simply an earnest attempt to comprehend everything from the divine to the dirty as best he could. His essays ran the gamut from the problem of certain knowledge to cannibalism, but were always inquisitive and skeptical (perhaps to a fault). Indeed he coined what we know as the 'essay': essai is French for an attempt, from essayer, "to try." Generations of schoolchildren may hate him for it, but the project is a noble one. I do not necessarily agree with how he wrote or with what, but his project is well worthy of that sincerest form of flattery. I am certainly not he, I grant you: but I try. That is my objective insofar as I have one.