
Saturday, April 4, 2009

A post-Christian America?

A Brief Synopsis of Jon Meacham's Latest Article


Jon Meacham, editor of Newsweek and author of the book American Gospel, has written an interesting piece on the apparent decline in the number of Americans who identify themselves as Christians. This comes on the heels of the recent American Religious Identification Survey (an issue I have discussed on my other blog in the past), which revealed a dramatic drop in the percentage of Americans who consider themselves Christian, while also pointing out the marked increase in those who classify themselves as having "no faith."

In his article, Meacham cites the statistics of the aforementioned survey, along with the opinions and observations of a number of religious leaders whose concerns over America's Christian piety have grown in light of this and other recent surveys. Meacham quotes R. Albert Mohler Jr., president of the Southern Baptist Theological Seminary, who stated:
A remarkable culture-shift has taken place around us...The most basic contours of American culture have been radically altered. The so-called Judeo-Christian consensus of the last millennium has given way to a post-modern, post-Christian, post-Western cultural crisis which threatens the very heart of our culture.
For Meacham and others, this "cultural crisis" is a blessing in disguise. Meacham writes:
While we remain a nation decisively shaped by religious faith, our politics and our culture are, in the main, less influenced by movements and arguments of an explicitly Christian character than they were even five years ago. I think this is a good thing—good for our political culture, which, as the American Founders saw, is complex and charged enough without attempting to compel or coerce religious belief or observance. It is good for Christianity, too, in that many Christians are rediscovering the virtues of a separation of church and state that protects what Roger Williams, who founded Rhode Island as a haven for religious dissenters, called "the garden of the church" from "the wilderness of the world." As crucial as religion has been and is to the life of the nation, America's unifying force has never been a specific faith, but a commitment to freedom—not least freedom of conscience. At our best, we single religion out for neither particular help nor particular harm; we have historically treated faith-based arguments as one element among many in the republican sphere of debate and decision. The decline and fall of the modern religious right's notion of a Christian America creates a calmer political environment and, for many believers, may help open the way for a more theologically serious religious life.
But Meacham isn't quite ready to come all the way out of the closet and declare the death of Christian America:
Let's be clear: while the percentage of Christians may be shrinking, rumors of the death of Christianity are greatly exaggerated. Being less Christian does not necessarily mean that America is post-Christian. A third of Americans say they are born again; this figure, along with the decline of politically moderate-to-liberal mainline Protestants, led the ARIS authors to note that "these trends … suggest a movement towards more conservative beliefs and particularly to a more 'evangelical' outlook among Christians." With rising numbers of Hispanic immigrants bolstering the Roman Catholic Church in America, and given the popularity of Pentecostalism, a rapidly growing Christian milieu in the United States and globally, there is no doubt that the nation remains vibrantly religious—far more so, for instance, than Europe.

Still, in the new NEWSWEEK Poll, fewer people now think of the United States as a "Christian nation" than did so when George W. Bush was president (62 percent in 2009 versus 69 percent in 2008). Two thirds of the public (68 percent) now say religion is "losing influence" in American society, while just 19 percent say religion's influence is on the rise. The proportion of Americans who think religion "can answer all or most of today's problems" is now at a historic low of 48 percent. During the Bush 43 and Clinton years, that figure never dropped below 58 percent.
And while these numbers may very well indicate a decline in religion's place in America's collective consciousness, it would be foolish, as Meacham points out, to assume that the nation is entering a bona fide "post-Christian" era. As history teaches us, America has been at this crossroads before. Meacham writes:
To be post-Christian has meant different things at different times. In 1886, The Atlantic Monthly described George Eliot as "post-Christian," using the term as a synonym for atheist or agnostic. The broader—and, for our purposes, most relevant—definition is that "post-Christian" characterizes a period of time that follows the decline of the importance of Christianity in a region or society. This use of the phrase first appeared in the 1929 book "America Set Free" by the German philosopher Hermann Keyserling.

The term was popularized during what scholars call the "death of God" movement of the mid-1960s—a movement that is, in its way, still in motion. Drawing from Nietzsche's 19th-century declaration that "God is dead," a group of Protestant theologians held that, essentially, Christianity would have to survive without an orthodox understanding of God. Tom Altizer, a religion professor at Emory University, was a key member of the Godless Christianity movement, and he traces its intellectual roots first to Kierkegaard and then to Nietzsche. For Altizer, a post-Christian era is one in which "both Christianity and religion itself are unshackled from their previous historical grounds." In 1992 the critic Harold Bloom published a book titled "The American Religion: The Emergence of the Post-Christian Nation." In it he cites William James's definition of religion in "The Varieties of Religious Experience": "Religion … shall mean for us the feelings, acts, and experiences of individual men in their solitude, so far as they apprehend themselves to stand in relation to whatever they consider the divine."
To be sure, America's Christian roots are alive and well. Perhaps they have been severely scorched in recent years, but the simple fact that roughly two-thirds of Americans still consider themselves Christians poses a real dilemma for anyone advocating a "post-Christian" America in the here and now.

But not all hope is lost for those "post-Christian America" apologists. Near the conclusion of his article, Meacham points out where he believes America's "Christian Nation" movement has gone terribly wrong:
If we apply an Augustinian test of nationhood to ourselves, we find that liberty, not religion, is what holds us together. In "The City of God," Augustine—converted sinner and bishop of Hippo—said that a nation should be defined as "a multitude of rational beings in common agreement as to the objects of their love." What we value most highly—what we collectively love most—is thus the central test of the social contract.

Judging from the broad shape of American life in the first decade of the 21st century, we value individual freedom and free (or largely free) enterprise, and tend to lean toward libertarianism on issues of personal morality. The foundational documents are the Declaration of Independence and the Constitution, not the Hebrew Bible and the New Testament (though there are undeniable connections between them). This way of life is far different from what many overtly conservative Christians would like. But that is the power of the republican system engineered by James Madison at the end of the 18th century: that America would survive in direct relation to its ability to check extremism and preserve maximum personal liberty. Religious believers should welcome this; freedom for one sect means freedom for all sects. As John F. Kennedy said in his address to the Greater Houston Ministerial Association in 1960: "For while this year it may be a Catholic against whom the finger of suspicion is pointed, in other years it has been, and may someday be again, a Jew—or a Quaker—or a Unitarian—or a Baptist … Today I may be the victim—but tomorrow it may be you—until the whole fabric of our harmonious society is ripped."

[...]

"The worst fault of evangelicals in terms of politics over the last 30 years has been an incrediblenaiveté about politics and politicians and parties," says Mohler. "They investedfar too much hope in a political solution to what are transpolitical issues and problems. If we were in a situation that were more European, where the parties differed mostly on traditional political issues rather than moral ones, or if there were more parties, then we would probably have a very different picture. But when abortion and a moral understanding of the human good became associated with one party, Christians had few options politically."

When that party failed to deliver—and it did fail—some in the movement responded by retreating into radicalism, convinced of the wickedness and venality of the political universe that dealt them defeat after defeat. (The same thing happened to many liberals after 1968: infuriated by the conservative mood of the country, the left reacted angrily and moved ever leftward.)

The columnist Cal Thomas was an early figure in the Moral Majority who came to see the Christian American movement as fatally flawed in theological terms. "No country can be truly 'Christian'," Thomas says. "Only people can. God is above all nations, and, in fact, Isaiah says that 'All nations are to him a drop in the bucket and less than nothing'." Thinking back across the decades, Thomas recalls the hope—and the failure. "We were going through organizing like-minded people to 'return' America to a time of greater morality. Of course, this was to be done through politicians who had a difficult time imposing morality on themselves!"
So, is America entering a "post-Christian" era? I think not. But at the same time one cannot help but wonder what might be on the horizon, especially in light of these recent trends.

Friday, April 3, 2009

Washington Orders Inoculation

One of the most stirring scenes in HBO's miniseries John Adams is the inoculation of Abigail and her children. Compared with today's methods (which still cause many to squirm and even pass out), colonial inoculation was far more barbaric. The procedure involved cutting the patient's flesh and introducing the smallpox virus directly into the blood and tissue through the wound. As can be imagined, many within the colonial community saw inoculation as an insane method of treatment, and there were even debates among medical practitioners as to its effectiveness.

What most people don't know about inoculation during this period is that General George Washington actually ordered the soldiers of the Continental Army to be inoculated. Washington was a strong supporter of inoculation, believing that the procedure would greatly reduce the chances of infection. Though it had many skeptics, Washington firmly believed that the benefits of inoculation far outweighed the risks. In fact, he became so worried about the spread of smallpox during the early years of the war that inoculating the troops became something of an obsession. During the siege of Boston, his concern led him to issue an order stating that no soldier could enter the city unless he had already had smallpox.

Washington's own experience with smallpox in his youth was probably the primary factor in shaping his opinion on inoculation. During a trip to the Caribbean, Washington was infected with smallpox, and he carried a few pockmark scars on his face to remind him of this nearly fatal encounter. His experiences during the French & Indian War also confirmed to Washington that inoculation was essential for any army: during the war, he witnessed several British raids fail because of the army's depleted manpower.

In his highly acclaimed biography His Excellency, historian Joseph Ellis makes the claim that Washington's decision to inoculate the Continental Army was one of his finest moments:

Washington understood the ravaging implications of a smallpox epidemic within the congested conditions of the encampment, and he regularly quarantined patients that were infected with the virus...And although many educated Americans opposed inoculation, believing that it actually spread the disease, Washington strongly supported it...When historians debate Washington's most consequential decisions as commander in chief, they are almost always arguing about specific battles. A compelling case can be made that his swift response to the smallpox epidemic and to a policy of inoculation was the most important strategic decision of his military career.
Today we enjoy the benefits of modern medicine and a scientific understanding of infectious disease, which makes the decision to be inoculated a "no-brainer" of sorts. For colonial Americans, however, it was very much a roll of the dice. Fortunately for the Continental Army, Washington was brave enough to take the gamble.

Thursday, April 2, 2009

Large Hadron Collider Rap

One of the coolest inventions of mankind, the Large Hadron Collider, operated by CERN on the border of France and Switzerland, will be coming online in the near future. Though I am no physicist and have less than an elementary understanding of particle physics, I have been following the progress of the LHC for the past 10 months or so. It is absolutely fascinating what they are doing over there and what they may discover. If all goes to plan, we could have a completely new understanding of the universe, the origin of matter, etc. Or, if you believe what other "experts" say, the collider could create a black hole and completely destroy Earth and possibly the universe. Sadly, many religious fundamentalists have been actively criticizing what the scientists at CERN are doing. Since many of their findings go against the biblical teaching of a seven-day creation and the notion that the earth is only a few thousand years old, many religious leaders are crying foul over the scientific work at CERN.

Here is a funny (but pretty accurate) rap done on the CERN particle collider:

[embedded video]

Here is a brief explanation of what they are doing at CERN:

[embedded video]

Book Review: Remembering Partition

Remembering Partition: Violence, Nationalism and History in India. By Gyanendra Pandey. (New York: Cambridge University Press, 2001. Pp. xiii, 202).


In recent years, most historians have agreed that the partition of British India was a messy and convoluted event that set off a chain reaction of violence, nationalistic uprising and intense political debate. In his highly acclaimed book, Remembering Partition, historian Gyanendra Pandey takes an in-depth look at how partition was viewed and understood by different communities within India, and how the “rupture of violence” triggered an ultra-nationalistic movement among rival communities within former British India.

Pandey’s thesis is made clear right from the start. As he states in his introduction, the book’s purpose is to focus “on a moment of rupture and genocidal violence, marking the termination of one regime and the inauguration of two new ones.” And, “It seeks to investigate what that moment of rupture, and the violent founding of new states claiming the legitimacy of nation-statehood, tells us about the procedures of nationhood, history and particular forms of sociality” (1). In addition, Pandey endeavors to explain how this moment of violence and fervent nationalism caused rival segments of the population, who were formerly under the same British banner, to move in opposition to one another and seek to legitimize their respective claims to national independence.

To set the stage for the impending violence, nationalistic surge and mass migrations to come, Pandey breaks Indian partition down into three separate, smaller partitions (24-25). The first of these was the Muslim League’s demand for an independent Pakistani state. As Pandey notes, this was to be a Muslim-majority state free from Hindu influence and control (26). For years, Muslims living under British rule had watched the Hindus’ growing strength and influence within the Congress, and as a result sought a sovereign state of their own, free from the growing Hindu majority.

The second of Pandey’s smaller partitions is the acceptance by Hindu and Sikh leaders of the partition/quasi-annexation of the Muslim-majority provinces of Punjab and Bengal. Both Punjab and Bengal were to be divided, with the Muslims controlling one half while the Hindus and Sikhs controlled the other. The division of these Muslim-dominated areas was contentious, to say the least, and Pandey points out that it essentially set the stage for much of the violence that was to come.

The third and final of Pandey’s partitions, and the most important, was the systematic forced removal, massacre, rape, torture and forced conversion of hundreds of thousands of people (35-39). Pandey argues that it was during this stage that nationalistic lines were drawn and allegiances were tested. Violence became the medium through which national pride evolved, and it helped to trigger the mass exodus of people to areas where their respective religion was “accepted.”

Through these mini-partitions, Pandey argues that the partition of India was not a straightforward event in which the “keys” were simply handed over by the British in 1947. Instead, partition had a deep cultural and nationalistic history dating back at least a few years before the actual “transition” of power. National, religious and cultural allegiances had been tested in the fires of violence and forced migration, all of which created a highly tense and volatile period in Indian history.

Throughout the remainder of the book, Pandey explains how historians who have studied Indian partition have tended to take an all-encompassing, macro-level view of the events leading up to, and in their minds concluding in, 1947 (50). For Pandey, this simplistic view of the history of partition ignores fundamental issues that are unique to the development of Indian nationalism in diverse locations throughout the country. For example, he points out how events in particular localities (like Delhi and Garhmukteshwar) became the “standard” that historians then applied to the entire national landscape and historical dialogue, failing to understand that many of these events were highly localized in nature (147).

Along with this conflation of the local with the national, Pandey also points out that historians have misinterpreted what partition meant to the individual. As he states, the violence of partition was partition for many of its participants. A large number of people were forced either to stand defiant in the face of the violence or to make huge compromises (like converting to another faith) in order to survive (190). Pandey argues that it is these horrors, experienced by the people who actually lived through partition, that are left out of the historical record. As a result, Indian partition is remembered by many of its participants not on the grand nationalistic scale, but on the local level, where violence, rape, etc. are forever interwoven with partition.

As future historians attempt to dissect the national (and local) story of Indian partition, Pandey’s Remembering Partition will likely serve as an effective benchmark against which to judge their research. Remembering Partition is an invaluable addition to the historiography of Indian partition, one that changes the reader’s understanding of an event which, on the surface, seems straightforward. By helping to shed light on the true nature of Indian partition, Pandey’s work is likely to stand as a bright beacon of insight into this often misunderstood historical event.