About Corazon

Thursday, May 31, 2012

Why America Lost (and Caused) the War of 1812

The United States is a wonderful nation.  The United States may be the greatest nation in the history of history.  It is good to love and revere the United States.  But the United States is not a perfect nation...far from it.  In fact, our history is full of ugly skeletons that we would rather ignore or sweep under the rug.  The War of 1812 happens to be one of those skeletons.

As unpopular as it may be to say, the United States both caused and lost the War of 1812.  It was a horrible war.  A stupid war.  A war of idiocy and greed, and we were to blame.

And it isn't just historians of the modern era who feel this way.  The reality is that the War of 1812 was an incredibly unpopular war in the eyes of those who witnessed it.  In the official congressional declaration of war, the House voted in favor 79-59, while the Senate voted 19-13.  This was the closest vote on a declaration of war in American history.  Of the 50,000 slots authorized for the U.S. Army, only 10,000 volunteers came forth.  In many states (particularly in New England) people flew the flag at half-mast, closed up shops, and protested in the mob-like fashion that was typical of the early 19th century.  Even Massachusetts Governor Caleb Strong attempted to conduct secret negotiations with England, and suggested that the northern states should secede from the Union.

So if the War of 1812 was so unpopular, why did we fight it in the first place?  The answer is simple: Greed.

At the beginning of the 19th century, the United States was a nation that was beginning to flirt with what would eventually become the doctrine of Manifest Destiny.  The lands to the west seemed like an endless source of wealth, resources and prosperity just waiting to be plucked.  In addition, the lands to the north (Canada) and Florida (nominally Spanish territory, though controlled in large part by Native Americans) were equally enticing.  For many Americans there was a real sense of entitlement to these lands.  In Congress, influential leaders like Henry Clay (who was Speaker of the House) and John C. Calhoun led a crusade to claim these neighboring lands at whatever the cost.  Nicknamed the "War Hawks," these congressional leaders ignited a fever for war among the Democratic-Republicans by invoking the "savagery" of the Indians and America's rightful claim to neighboring lands.  As historian Walter Borneman states in his book 1812: The War That Forged a Nation:
These twin issues of Indian unrest and a lust for additional territory beyond the Great Lakes heated the pot of war sentiment on the western frontier.  Thoughts of quelling Indian influence for good and ousting Great Britain from Canada became the rallying cry for Henry Clay and his close-knit circle of political compatriots who came to be called "war hawks." 
[...] 
Nationalistic in policy, prompt with a dueling pistol when polite discussion failed, the war hawks were the young Turks of the era: too young to remember the devastation of the last war and certain of their invincibility in the next. (Pp. 28-29).
The arrogance of the "war hawks" is one of the most underrated aspects of the War of 1812.  Case in point: Secretary of War William Eustis stated publicly that America would "take Canada without soldiers.  We only have to send officers into the province and the people will rally round our standard."  John C. Calhoun echoed those sentiments when he said that America would win "in four weeks from the time that a declaration of war is heard on our frontier, and the whole of Canada will be in our possession."  Henry Clay arrogantly boasted: "I trust I shall not be deemed presumptuous when I state that I verily believe that the militia of Kentucky are alone competent to place Montreal and Upper Canada at your feet."

Another justification that is regularly cited as a cause of the War of 1812 was the alleged impressment practices of the British Navy.  During the 18th and 19th centuries, it was not uncommon for nations to impress (forcibly conscript) sailors they encountered into their own fleets.  For many Americans, the thought of U.S. merchant sailors being compelled to join the British navy via impressment was unacceptable.  But just how prevalent was this practice?  According to Smithsonian magazine's Tony Horwitz and Brian Wolly, these allegations were greatly sensationalized:
One of the strongest impetuses for declaring war against Great Britain was the impressment of American seamen into the Royal Navy...President James Madison's State Department reported that 6,257 Americans were pressed into service from 1807 through 1812.  But how big a threat was impressment, really? 
"The number of cases which are alleged to have occurred, is both extremely erroneous and exaggerated," wrote Massachusetts Sen. James Lloyd, a Federalist and political rival of Madison's.  Lloyd argued that the president's allies used impressment as "a theme of party clamor, and party odium," and that those citing as a casus belli were "those who have the least knowledge and the smallest interest in the subject." 
Other New England leaders, especially those with ties to the shipping industry, also doubted the severity of the problem.  Timothy Pickering, the Bay State's other senator, commissioned a study that counted the total number of impressed seamen from Massachusetts at slightly more than 100 and the total number of Americans at just a few hundred.
Needless to say, the notion that impressment was a legitimate cause for war was more the stuff of sensationalized rhetoric than actual fact.

Regardless of the unpopularity and the ridiculous rhetoric, President James Madison and the "War Congress" took the nation into a war that had no legitimate justification.  It was a decision that would come to haunt the United States for a generation.  American forces learned almost from the start that the war wasn't going to be a walk in the park.  Attempts to invade Canada failed in spectacular fashion.  General William Hull, who commanded the primary American invasion of Canada, surrendered his entire army to the British at Detroit without firing a single shot.  Hunger, cold, and the superior forces and tactics of the British had backed General Hull into an impossible corner.  In addition, Canadian (British) citizens proved not to be as willing to join the American cause as the War Hawks had assumed.  Canadians opposed American forces at virtually every opportunity.

The massive failure to capture Canada was only part of the story of how the U.S. lost the War of 1812.  Throughout the course of the war, British forces systematically dismantled American forces throughout the countryside, leaving towns and communities devastated in their wake.  In 1812, the budding communities of Detroit and Chicago (Fort Dearborn) were captured, and in 1813 Buffalo and large portions of New York were burned to the ground.  In 1814, almost all of Maine was captured by the British, who forced the citizenry to swear an oath of allegiance to King George.  Later that same year, the British marched through Maryland and burned the public buildings of the capital city of Washington, including the Capitol itself.  In fact, President James Madison barely made it out of town before the city fell.  In short, the superior forces of the British had virtually strangled the United States to death.

So why did the British stop?  The answer is simple: Napoleon.  Though the British had virtually dominated the War of 1812, they had bigger fish to fry in Europe.  As a result, an offer of peace was extended by the British.  With the threat of invasions of Boston, Richmond and New Orleans looming, President Madison and the now subdued War Hawks accepted the invitation to cut their losses and conclude their stupid conflict.  The only saving grace of the Treaty of Ghent was that it restored relations between the two nations to status quo ante bellum (the state in which things were before the war).  All of the lost lands and cities were restored to the United States, and British forces, who were desperately needed in Europe, left without resistance.  In essence, this treaty allowed the United States to call the war a draw, when in reality the war was anything but.  Sure, the United States had a few small victories to call their own, but they were largely insignificant.  Oliver Hazard Perry's naval victories had little impact on the overall outcome of the war, just as Andrew Jackson's defense of New Orleans (which came after the Treaty of Ghent was signed) was more of a moral victory than anything substantial.  Even the defense of Fort McHenry (where the Star-Spangled Banner was born) was more a survival of a bombardment than an actual victory.  The "rockets' red glare" and "bombs bursting in air" reveal that the British onslaught was severe, but fortunately "the flag was still there" at the end of the conflict.  Whew!

And though it is clear that the United States lost the War of 1812, we can take solace in the fact that much good came from the conflict.  This "second war of independence" helped to unite a nation that was still in its infancy.  It gave birth to patriotic symbols like our National Anthem (which didn't become our anthem until 1931), Uncle Sam and Andrew Jackson.  With all of that said, the War of 1812 was an American disaster. It was a war of greed.  A war of pride.  A war of stupidity.  We were lucky that things didn't turn out worse than they did.  I've often wondered why the War of 1812 wasn't given a better name.  Could it be due to the fact that we cannot put a positive spin on its outcome?  On its origins?  What else would we call it?  The War of American Idiocy?  The "Bit Off More Than We Could Chew War?"  It's time that Americans face the facts: the War of 1812 was largely a waste.

Monday, May 28, 2012

The First Memorial Day Celebration

Happy Memorial Day, everyone!

On this day, Americans from all over the nation pay homage to our brave men and women who made the ultimate sacrifice for freedom (and no, that isn't just some cliche thing that we say but is the literal truth).  This is a solemn day of reflection, reverence and remembrance that should inspire every citizen of this nation to be a better and more grateful person.

Most Americans are probably unfamiliar with the history of Memorial Day, a history that dates back quite a ways in our nation's book of remembrance.  Officially, Memorial Day (which was originally called Decoration Day) began in May of 1868, just three years after the end of the American Civil War.  General John Logan, national commander of the Grand Army of the Republic, declared May 30th of that year to be a day set aside for the "decoration of graves with flowers for Union and Confederate forces at Arlington National Cemetery...and all other cemeteries of the nation."  This first "Decoration Day" was a day to remember the high price that the nation had paid in the cause of freedom.

And make no mistake, this first generation of Americans to celebrate "Decoration Day" knew very well the high price of war.  The American Civil War, unlike any American war before or since, gave our nation a front-row seat to the carnage of war.  With as many as 750,000 dead (more than all other American wars combined), Americans everywhere had cause to mourn.  This massive loss of life was an obvious reality for every American in every corner of the still-young nation.  Celebrating a memorial/decoration day only made good sense.

But the story of General Logan and the first "official" Memorial Day celebration of 1868 was not the precedent-setter for this national holiday that so many have come to accept.  The very first Memorial Day is actually a beautiful (and forgotten) story that deserves recognition.  The story takes place in Charleston, South Carolina, where by the end of the Civil War the town lay in virtual ruins.  The city had been abandoned by White citizens and Confederate troops and was on the verge of falling to the Union.  Finally, in February of 1865, Union forces, including the 21st U.S. Colored Infantry, took the city and accepted the official surrender of Charleston.

A few months later (on May 1, 1865, to be exact), thousands of Black Charlestonians, most of them former slaves, held a series of memorials to those who had paid the ultimate price for their newfound freedom.  Scores of Black citizens made their way to Charleston's horse race track, the Washington Race Course and Jockey Club, which had been converted into a prison for captured Union soldiers.  The conditions in the prison had been horrific, and at least 257 men perished from disease.  Most of the dead had been hastily buried in mass graves just months prior.  On this day, this group of Black citizens worked tirelessly to see that all of these deceased Union soldiers received the proper burial they deserved.  The grounds of the race track were also repaired, cleansed and given a sense of reverence, all to honor a small group of fallen heroes.

This simple act of kindness, in memory of a group of "enemy" soldiers, spawned a massive movement that captured the entire city of Charleston.  As Yale historian David W. Blight points out:
Black Charlestonians in cooperation with white missionaries and teachers, staged an unforgettable parade of 10,000 people on the slaveholders' race course. The symbolic power of the low-country planter aristocracy's horse track (where they had displayed their wealth, leisure, and influence) was not lost on the freedpeople. A New York Tribune correspondent witnessed the event, describing "a procession of friends and mourners as South Carolina and the United States never saw before." 
At 9 am on May 1, the procession stepped off led by three thousand black schoolchildren carrying arm loads of roses and singing "John Brown's Body." The children were followed by several hundred black women with baskets of flowers, wreaths and crosses. Then came black men marching in cadence, followed by contingents of Union infantry and other black and white citizens. As many as possible gathered in the cemetery enclosure; a children's choir sang "We'll Rally around the Flag," the "Star-Spangled Banner," and several spirituals before several black ministers read from scripture. No record survives of which biblical passages rung out in the warm spring air, but the spirit of Leviticus 25 was surely present at those burial rites: "for it is the jubilee; it shall be holy unto you… in the year of this jubilee he shall return every man unto his own possession." 
Following the solemn dedication the crowd dispersed into the infield and did what many of us do on Memorial Day: they enjoyed picnics, listened to speeches, and watched soldiers drill. Among the full brigade of Union infantry participating was the famous 54th Massachusetts and the 34th and 104th U.S. Colored Troops, who performed a special double-columned march around the gravesite. The war was over, and Decoration Day had been founded by African Americans in a ritual of remembrance and consecration. The war, they had boldly announced, had been all about the triumph of their emancipation over a slaveholders' republic, and not about state rights, defense of home, nor merely soldiers' valor and sacrifice. 
According to a reminiscence written long after the fact, "several slight disturbances" occurred during the ceremonies on this first Decoration Day, as well as "much harsh talk about the event locally afterward." But a measure of how white Charlestonians suppressed from memory this founding in favor of their own creation of the practice later came fifty-one years afterward, when the president of the Ladies Memorial Association of Charleston received an inquiry about the May 1, 1865 parade. A United Daughters of the Confederacy official from New Orleans wanted to know if it was true that blacks had engaged in such a burial rite. Mrs. S. C. Beckwith responded tersely: "I regret that I was unable to gather any official information in answer to this." In the struggle over memory and meaning in any society, some stories just get lost while others attain mainstream dominance.
We are fortunate to have the history of this first Memorial Day for all to enjoy.  The imagery of newly freed Black Americans reverently and humbly providing a proper burial for Union soldiers is a reminder of just how precious freedom really is, and of the high cost that we are sometimes required to pay for it.  On this Memorial Day, I am grateful to the God of Heaven for the freedoms I enjoy.  God bless this great land that we live in!

Sunday, May 27, 2012

Getting a Dull Shave With Occam's Razor

“The aim of science is to seek the simplest explanation of complex facts.  We are apt to fall into the error of thinking that the facts are simple because simplicity is the goal of our quest.  The guiding motto in the life of every natural philosopher should be ‘Seek simplicity and distrust it.’” – Alfred North Whitehead

Most people are familiar with the philosophical principle known as Occam's Razor, which suggests that whenever we are faced with competing hypotheses for a particular problem, the one with the fewest and simplest assumptions is probably best.  Occam's Razor implies that there is an inherent virtue to simplicity, even from a scientific or philosophical perspective, and that by taking a minimalist stance toward a given problem the truth can become more clear.  Occam's Razor has become a staple for theological skeptics and nominalists who prefer a more deliberate and palpable view of the metaphysical world.  In many respects, Occam's Razor has been wielded as the ultimate dagger against those who put their faith in the intangible.  Actress Jodie Foster's famous scene in the film Contact is a case in point.

The portrayal of Occam's Razor in the movie Contact is probably the best-known allusion to this philosophical principle in modern culture.  In fact, when most people refer to Occam's Razor they usually end up quoting the very lines that Jodie Foster used in the film: "All things being equal, the simplest explanation tends to be the right one."  And though the idea behind Occam's Razor seems simple enough, the reality is that it is far from being the Holy Grail of all logical pursuits, and in many respects is an outdated relic of a time gone by.  Of course, by no means am I suggesting that Occam's Razor is completely worthless.  I personally find real value in appealing to simplicity.  However, Occam's Razor, like any blade, can cut both ways.

**********

The origins of Occam's Razor date all the way back to the early 14th century, when a brilliant man named William of Ockham began to challenge some of the standard orthodoxy of his day.  William was, without question, one of the greatest and most important thinkers of the Middle Ages.  With the possible exception of Thomas Aquinas, there are few who can claim to have shaped Western philosophy and Christian epistemology more in that era than William of Ockham.  His ideas gave birth to a more deliberate, logical and nominalistic interpretation of philosophy and religion, much of which continues to this day. 

As a member of the Franciscan Order, William had become well-acquainted with the strict orthodoxy that persisted in much of Christianity.  Pious priests and monks had faithfully maintained the status quo with little resistance to the church's central teachings.  Most of the faithful had grown accustomed to the ritualized lifestyle of Medieval Catholicism, complete with its emphasis on faithful discipleship through humble acquiescence to heavenly guidance and passive acceptance of papal supremacy.  And while William of Ockham had no apparent problem with church authority (he was, after all, a devout Franciscan friar), he did have one basic character flaw: he was a genius.

It didn't take long for William of Ockham to begin questioning and revising some of his personal beliefs.  As a man who prided himself upon logic and reason, William took issue with some of the doctrinal aspects of his faith, particularly surrounding the Trinity and the church's growing dependence upon wealth.  As William himself stated in his famous Summa Logicae:
Plurality ought never be posited without necessity.
And:
It is futile to do with more things that which can be done with fewer.
William was never a fan of the convoluted doctrine of the Trinity.  On many occasions, he argued that neither the Bible nor logic and reason would support such a view.  In addition, it troubled William deeply when he saw the massive accumulation of wealth that was being enjoyed by the chief officers of the church in his day.  As a result, William embraced a minimalist view of theology in which logic and reason were seen as tools to purify one's personal faith, and he is often hailed as the father of Medieval Nominalism.  Needless to say, many of William's ideas landed him in trouble with the church, and eventually led to his excommunication.  But these developments did not change the fact that William's ideas were here to stay...for the long haul.

And even though William of Ockham's contributions are praised for their reliance upon logic, reason and the pursuit of basic simplicity, it would be wrong to say that he had no room for accepting matters of faith.  As William himself stated:
Only faith gives us access to theological truths. The ways of God are not open to reason, for God has freely chosen to create a world and establish a way of salvation within it apart from any necessary laws that human logic or rationality can uncover.
These don't sound like the words of a man who supposedly believed that the simplest ideas are always the best.  In fact, William of Ockham seemed to be less interested in the ideas of Occam's Razor (the philosophical principle that was named after him) than most people want to believe.  While it is true that he maintained many nominalist ideas, I disagree that William of Ockham was truly a nominalist at heart.  It is presumptuous for us to say that William's dependence upon logic and reason somehow negated his belief in faith and the intangible.  It did not.  Occam's Razor may be based in principle upon many of the teachings of William, but the end substance of this philosophical concept is far from being in harmony with the man whose name it now immortalizes.

William of Ockham would never have foreseen the day when the pursuit of objective reason and logic would somehow be put in conflict with a life of faith.  As a result, I wonder if it is even right for us to name Occam's Razor after William of Ockham.  After all, the phrase wasn't coined until 1852, by Sir William Hamilton, more than 500 years after William of Ockham's death.  Since that day, Occam's Razor has evolved to become something that William would never have embraced himself.  For scientists and philosophers today, Occam's Razor has been employed as a heuristic (a general guiding rule) in the development of theoretical models, rather than simply as an arbiter between conflicting theories.  In other words, Occam's Razor has become a nearly irrefutable principle of logic that no objective scientist would dare to question.

But the fact of the matter is that Occam's Razor is not a crystal ball for all logic and objectivity.  In fact, there are quite a few problems with this supposed gem of philosophical thought.  The bottom line is that the validity of a theory and its simplicity are not automatically related.  Whether an idea or a set of facts is littered with complexities or is stripped down to its absolute bare simplicity has no bearing on its veracity.  Sure, the simpler concept may be easier to understand, but it is not inherently more correct than a complex theory.  The danger of "appealing to simplicity" is that there are many cases in which well-established scientific theories and ideas are incredibly complex.  The theories behind quantum mechanics and general relativity, for example, are so complex that appealing to Occam's Razor wouldn't be practical.  As a result, Occam's Razor can become, at times, a logical fallacy.

As one science blogger aptly illustrates, Occam's Razor can regularly fall victim to a number of pitfalls in science:
Occam’s Razor is actually a vestigial remnant of medieval science. It is literally a historical artifact: William of Ockham employed this principle in his own 13th century work on divine omnipotence and other topics “resistant” to scientific methods. The continuing use of parsimony in modern science is an atavistic practice equivalent to a cardiologist resorting to bloodletting when heart medication doesn’t work.

And it is in the life sciences where Occam’s razor cuts most sharply in the wrong direction, for at least three reasons.

1) First, life itself is a fascinating example of nature’s penchant for complexity. If parsimony applies anywhere, it is not here.

2) Second, evolution doesn’t design organisms as an engineer might – instead, organisms carry their evolutionary history along with them, advantages and disadvantages alike (your appendix is the price you pay for all your inherited immunity to disease). Thus life appears to result from a cascading “complexifying” process – an understanding of organisms at the macroscale will be anything but simple.

3) Third, we know that even the simplest rules of life can give rise to intractable complexity. Unless you’re a biophysicist, the mechanisms at your preferred level of analysis are likely to be incredibly heterogeneous and complex, even at their simplest.

[...]
Thus, the utility of Occam’s Razor is highly questionable. Theories which it would soundly eliminate are usually questionable for other reasons, while useful theories might be discarded for a lack of parsimony relative to their over-simplified competitors. The theory which states “height determines weight” can do a reasonable job of providing evidence that seems to support that theory. And it’s highly parsimonious – Ockham would love it! But the theory which says “nutrition, exercise, and a collection of more than 100 genes predict both height and weight” is highly unparsimonious, even though we know it’s better than its competitor theory. Statisticians have quantified the appropriate penalty for various theories based on the number of variables they involve, but the more theoretical modes of quantitative science have yet to catch up.
In other words, Occam's Razor is wonderful for grasping the low-hanging fruit that is easy to reach, but offers little in terms of understanding many of the complex realities of the modern world.  Sure, we would all love to have simplicity reign supreme.  It makes life easier.  But sadly, this cannot always be the case.  No matter what Lynyrd Skynyrd has to say on the matter, there really are no "Simple Men."

Or simple solutions to all of life's problems.
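
As an aside, the parsimony penalty that the quoted blogger's closing paragraph alludes to really has been formalized by statisticians, most famously in the Akaike Information Criterion (AIC).  A minimal Python sketch (with made-up numbers, purely for illustration) shows how the criterion charges a model for every extra parameter, so that a complex theory must fit substantially better to beat a simple one:

```python
import math

# Hypothetical (height_cm, weight_kg) pairs -- illustrative numbers only.
data = [(150, 50), (160, 58), (170, 68), (180, 80), (190, 95)]

def aic(rss, n, k):
    """Akaike Information Criterion for a least-squares fit:
    lower is better; the 2*k term penalizes extra parameters."""
    return n * math.log(rss / n) + 2 * k

# Model A: "height determines weight" -- a 2-parameter line, fit by least squares.
n = len(data)
xs = [x for x, _ in data]
ys = [y for _, y in data]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in data) / sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar
rss_a = sum((y - (slope * x + intercept)) ** 2 for x, y in data)

# Model B: a pretend 10-parameter model that fits only slightly better
# (its residual error is assumed, for illustration, to be 10% smaller).
rss_b = rss_a * 0.9

aic_a = aic(rss_a, n, k=2)
aic_b = aic(rss_b, n, k=10)

# The simpler model wins unless the complex one improves the fit
# enough to pay for its extra parameters.
print(aic_a < aic_b)
```

Here the ten-parameter model fits a bit better yet still loses, which is the whole point: in modern statistics parsimony is a quantified trade-off, not the absolute rule the razor is so often made out to be.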

Saturday, May 26, 2012

Native American Origins

Challenges to a Long-held
Archaeological Assumption


A fascinating article in the New York Times reports that archaeologists are beginning to challenge some of the traditionally accepted explanations for the origins of Homo sapiens in the Americas:
For many decades, archaeologists have agreed on an explanation known as the Clovis model. The theory holds that about 13,500 years ago, bands of big-game hunters in Asia followed their prey across an exposed ribbon of land linking Siberia and Alaska and found themselves on a vast, unexplored continent. The route back was later blocked by rising sea levels that swamped the land bridge. Those pioneers were the first Americans.

[...]

Over the years, hints surfaced that people might have been in the Americas earlier than the Clovis sites suggest, but the evidence was never solid enough to dislodge the consensus view. In the past five years, however, a number of discoveries have posed major challenges to the Clovis model. Taken together, they are turning our understanding of American prehistory on its head.

The first evidence to raise significant questions about the Clovis model emerged in the late 1970s, when the anthropologist Tom Dillehay came across a prehistoric campsite in southern Chile called Monte Verde. Radiocarbon dating of the site suggested that the first campfires were lighted there, all the way at the southern tip of South America, well before the first Clovis tools were made. Still, Professor Dillehay’s evidence wasn’t enough to persuade scholars to abandon the Clovis model.

But in 2008, that began to change. That year, researchers from the University of Oregon and the University of Copenhagen recovered human DNA from coprolites — preserved human feces — found in a dry cave in eastern Oregon. The coprolites had been deposited 14,000 years ago, suggesting that Professor Dillehay and others may have been right to place humans in the Americas before the Clovis people.

The Clovis model suffered yet another blow last year when Professor Waters announced finding dozens of stone tools along a Texas creekbed. After using a technique that measures the last time the dirt around the stones was exposed to light, Professor Waters concluded, in a paper in Science, that the site was at least 15,000 years old — which would make it the earliest reliably dated site in the Americas.
These remarkable archaeological discoveries are only augmented by genetic evidence: markers in the DNA of modern American Indians and their predominantly Asian forebears reveal that both shared a common ancestor who lived more than 16,000 years ago, roughly 3,000 years before the migration posited by the traditional Clovis land-bridge hypothesis. 

So where does this leave us?  It is difficult to say.  There is still much about the Clovis model that remains attractive to archaeologists.  With that said, it is crystal clear that we are far from certain when it comes to explaining the ultimate origins of Native American peoples.  The most likely explanation is that scores of people from all over the world (with Asian colonizers being the dominant party) made their way to the Americas over a very long period of time, one much longer than we previously believed.  What is very clear is that these early colonizers were fully developed Homo sapiens, predominantly from Asia, who made their way to the American continent in a variety of ways.  But, in the end, nobody can say for sure, and these new discoveries actually give us more questions than answers; questions that we will probably never fully answer.  Archaeology, particularly ancient American archaeology, has a lot of hurdles to overcome.
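
As a side note for the curious, the radiocarbon dating mentioned in the Times article rests on simple arithmetic: carbon-14 decays at a known rate, so the fraction of it remaining in organic material implies the material's age.  A quick Python sketch (the sample fraction below is back-calculated for illustration, not data from the actual Oregon study):

```python
import math

# Carbon-14 has a half-life of about 5,730 years, so the fraction of C-14
# remaining in a sample implies its age:
#   t = (half_life / ln 2) * ln(1 / fraction_remaining)
HALF_LIFE_C14 = 5730.0  # years

def radiocarbon_age(fraction_remaining: float) -> float:
    """Estimate an age in years from the fraction of C-14 left in a sample."""
    return (HALF_LIFE_C14 / math.log(2)) * math.log(1.0 / fraction_remaining)

# A sample retaining about 18.4% of its original C-14 dates to roughly
# 14,000 years ago -- on the order of the age reported for the coprolites.
age = radiocarbon_age(0.184)
print(round(age))  # roughly 14,000 years
```

(The Texas creekbed site used a different technique, optically stimulated luminescence, which dates the last exposure of sediment to light rather than organic decay.)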

Friday, May 18, 2012

"Separate" Does Not Mean "Equal"

Revisiting Plessy v. Ferguson
in the 21st Century

I am always fascinated to hear people today complaining about the Supreme Court.  For whatever reason, it seems as though a large number of Americans these days esteem our Supreme Court as a group of corrupt, disinterested socialites who care more about individual status than about delivering justice.  And while I am certain that some Supreme Court justices of the modern era have given a less-than-stellar performance while in office, I firmly believe that the past 2-3 generations of Americans have been blessed to have (overall) a strong Supreme Court.  Of course, I am not suggesting that our judges (and their decisions) have been perfect.  Far from it.  Mistakes have been made, and I am sure that with the 20/20 hindsight of history, future generations will come to question a number of the court decisions made in our day.  With that said, I again maintain that the past couple of generations have been very fortunate to have the justices and court decisions that we have seen. 

Sadly, the same cannot be said of past generations.  During the 19th and early 20th centuries, for example, Americans witnessed first-hand how the decisions of the highest court of our land could utterly devastate a nation and its people.  Consider Elk v. Wilkins in 1884, which essentially held that Native Americans could not become American citizens and were considered "less human" than Whites.  Or the 1927 case Buck v. Bell, which granted mental health institutions the right to sterilize the "unfit" and "mentally retarded" for the "protection and health of the state."  And then there is the infamous 1857 case, Dred Scott v. Sandford (in my opinion, the worst Supreme Court decision ever), which essentially held that African American slaves were to be considered "property" rather than people, and that any fugitive slave was to be returned to his/her rightful "owner" without question.

And today we have the honor (or better put, the responsibility) to recognize another shameful decision from our nation's past.  116 years ago today, the Supreme Court rejected the petitions of one Homer Plessy, who years earlier had attempted to travel from New Orleans to Covington, La. on a "Whites Only" railroad car.  Plessy, who was considered an "octoroon" (someone of seven-eighths Caucasian and one-eighth African descent) by his contemporaries, refused to be segregated and protested the railroad's policy of separating its passengers by skin color.  Eventually, Plessy was escorted from the train and booked into jail, where he began a campaign to eradicate the budding practice of racial segregation in the South.  Long story short, Plessy's case made it all the way to the Supreme Court in 1896, where sadly his appeals fell on deaf ears.

In what has become one of the most important and atrocious legal cases in American history, Plessy v. Ferguson stated that there was nothing unlawful about a state, business or institution choosing to separate members of different races, so long as they provided the same goods/services to all.  In what became known as the doctrine of Separate but Equal, the Supreme Court ruled that segregation was not in violation of the 14th Amendment (which prohibits state and local governments from depriving their citizens of life, liberty and property without due process) as Mr. Plessy had claimed, but that the railroad company (and anyone else who wanted to follow suit) was completely justified in choosing to keep the races apart from one another.  Needless to say, Plessy v. Ferguson paved the way for extreme racial inequality to once again rear its ugly head in the South.  And though the ruling stipulated that all separate goods/services must also be equal, the reality is that Southern governments refused to provide anything resembling equality for Blacks.  In short, racial segregation and inequality became standard operating procedure in the South.

For nearly 60 years, Plessy v. Ferguson and its gospel of "Separate but Equal" kept the South from seeing things in any other way but Black and White.  It wasn't until 1954 and Brown v. Board of Education that the chains of segregation would finally start to come off.  And as we are all aware, the struggle to eradicate segregation from America took more than a Supreme Court decision to accomplish. It was only after decades of petition, protest, blood, hate and pain that the scars of segregation began to fade away (some rightfully maintain that those scars are still visible today).   This was the shameful legacy of Plessy v. Ferguson.

But thankfully we live in a more "civilized" world today...

...right?

After all, we would NEVER think of repeating those painful lessons of "Separate but Equal."

Or would we?

--When we suggest "separate but equal" health care for any patient in need, we are forgetting Plessy v. Ferguson.

--When we implement "separate but equal" laws for illegal immigrants, we are forgetting Plessy v. Ferguson.

--When we demand "separate but equal" schools and/or funding for affluent neighborhoods v. the inner city, we are forgetting Plessy v. Ferguson.

--When we recommend "separate but equal" tax rates for the rich and the poor, we are forgetting Plessy v. Ferguson.

--When we believe in "separate but equal" restrictions for those of a different religion than our own (i.e. the New York mosque), we are forgetting Plessy v. Ferguson.

--When we preach "separate but equal" laws for those in the LGBT community, we are forgetting Plessy v. Ferguson.

In short, whenever we seek to divide humanity because of our perceived differences, we will be sure to reap our own hell.  Life is hard enough.  Why would anyone want to endure it all alone?  Sorry, but you cannot "divide" and "conquer" at the same time.  We don't have the luxury of simply changing the rules for those we don't like and/or understand.  Such an action is the epitome of bigotry.

Wednesday, May 16, 2012

David Barton Lies About George Washington

Pseudo-historian and Christian Nation Advocate Extraordinaire David Barton has been caught in a lie.  A bald-faced lie, to be exact.  For a man who prides himself on knowing the "true" history of the American founding, Barton's latest historical faux pas is so blatantly false that it reveals either his woeful ignorance of how to conduct basic historical research or the fact that he is a flat-out liar.  The following is Barton's latest offense:
 

I know that many of us have seen the "Prayer at Valley Forge" painting and probably find it very inspiring.  And to the citizen who may not be as familiar with American history, I don't blame them for accepting the painting at face value as historical fact.  But for David Barton to do so is unacceptable, and even worse, to preach it as fact is downright shameful.  I have actually blogged about the history of the "Prayer at Valley Forge" in the past, so I won't rehearse the history here.  The bottom line is this: the story of the "Prayer at Valley Forge" is a myth that anyone with half a brain could recognize.  For a "historian" like David Barton to not recognize this reality (or to simply not give a damn about the truth since he knows his audience won't investigate the matter) is reprehensible. 

I have tried to be patient with David Barton.  I have even given him the benefit of the doubt on many occasions.  He has ZERO training as a historian, and it shows.  I have excused his errors by pointing to his desire to link his Christian faith with American history.  It's a flawed but honest endeavor.  But this recent lie (and yes, I am accusing David Barton of lying) is so in-your-face that it has become obvious that Barton no longer cares about finding the truth.  Barton is hell-bent on proving his agenda, and he won't allow TRUE history to get in his way.  As a result, I believe it is now time to declare an intellectual jihad on David Barton.  To borrow from the words of historian John Fea (one of my favorite bloggers), "Is it time to gather Christian historians together to sign some kind of formal statement condemning Barton's brand of propaganda and hagiography?"  Hell yes it is time, Dr. Fea.  I hope the historical community will brand this man as the fraud he is...the sooner the better!!!

Tuesday, May 8, 2012

The Medieval Origins of Capitalism


I've never been a huge fan of economics.  In my opinion, the difference between most economic theories and practices is predominantly one of semantics.  In the end, all systems of exchange can be reduced to their common denominator: the rich get richer while the poor get poorer. No one system is really all that preferable to another (in my opinion). With that said, studying the history and evolution of economics does help to shed light on the changes and advances that have been made in society, and the efforts to even the playing field for all of humanity.

And when it comes to the study of economics, no system is more important to the modern Western world than capitalism.  For many Americans, capitalism is every bit as important a component of the founding of their nation as the Declaration of Independence or the Constitution (even though the Founding Fathers never really put capitalism on their radar).  And though there is much to say for the more modern conceptualizations of capitalism (i.e. Adam Smith, Max Weber, etc.), the origins of capitalism hail back to a time before the "New World" had even been discovered.

The world of 14th century Europe was a world in constant flux.  Severe political, religious, social, economic and health problems plagued (literally) the landscape.  These destabilizing factors brought with them sweeping tides of change that helped to redefine European society.  For instance, the Black Death, along with the Great Famine of 1315-1317, ravaged the countryside, claiming at least a third of the populace in the process.  The massive loss of laborers caused a dramatic change to the Manorial and Feudal systems in almost all of Europe.  This shortage of laborers created new opportunities for the peasantry to move about and benefit from additional markets.  In addition, the development of newer agricultural technologies revitalized the markets of a suffering Europe.  Much later, the emergence of Calvinist doctrines, particularly the view of worldly success as a symbol of God's favor, encouraged further growth, all of which gave rise to the earliest embryonic form of capitalism known as Mercantilism.  Needless to say, these advances fit nicely with the discovery of the "New World" in the following century, and eventually evolved to become a staple of the Western world.

Of course, I am not suggesting that our modern understanding of capitalism existed in the Middle Ages.  Far from it.  But it is fair to say that an infant form of the system was beginning to emerge during the middle part of the 14th century.  Improvements in naval travel helped to augment the trade markets to and from Europe, and increased the demand for goods.  As a result, an emerging class of specialized laborers found themselves with access to a measure of wealth that had never before existed under feudalism.  Skeptics will, of course, point out that improvements in trade and the emergence of new markets don't necessarily equate to capitalism, and they are right.  But there is a large body of evidence for commercial activity in the Middle Ages, particularly in the Mediterranean, which deserves to be recognized for its enterprise and sophistication.  Mediterranean, and particularly Italian, merchants traded in high-value luxury goods like spices, gems, dyes, and exotic metalwork.  And although goods like these had circulated the seas for centuries, the volume and value of this trade increased dramatically in the wake of the struggles of the 14th century.  It is very unlikely that such an expansion would have occurred under the old systems of manorialism and feudalism, which were self-sufficient and relatively localized in scope.  The expansion that took place in the 14th century should therefore be seen as the result of the many social and economic changes that had taken place.



As you can see in the map above, European and Middle Eastern traders were active across a wide swathe of the Mediterranean world.  To this end, the major Italian cities established trading colonies to protect their interests abroad and monopolize the sources of desirable goods.  These cities included Amalfi, Naples, Genoa, and of course, Venice.  The merchant-imperialism of these cities went hand in hand with the complex ways of investing in and launching trading missions organized by the merchants themselves.  In addition, it was this expansive trade system that eventually allowed Arabic literature, architecture, mathematics, etc. to make their way into the European heartland, thus helping to ignite the Renaissance.  It's not a stretch to suggest that without these advances, Europe may never have had its da Vinci.

In conclusion, what we can glean from the history and origins of capitalism (or any other economic system for that matter) is that it didn't come into existence overnight.  It took a great deal of time to evolve into what we have today, and frankly, it's still evolving.  Economic systems are not static, unchangeable concepts, but rather fluid and ever-changing.  This is certainly the case with capitalism.  From its birth in the Middle Ages to its existence today as the predominant means of exchange in the Western world, capitalism has had a long and interesting history.  Will it last?  I have no idea.  As I said at the beginning of this post, I don't believe there is all that much difference between rival economic systems to begin with, but then again, I never lived in Feudal Europe.

Monday, May 7, 2012

Hydatius: The Medieval World's Doomsday Prognosticator

Human beings have always been fascinated with "doomsday" stories.  For whatever reason, the idea that humanity might come to an end via alien invasion, a killer comet, nuclear war or religious apocalypse has caused almost every generation and civilization to predict where, when and how the end of days might play out.  I have actually blogged about this phenomenon before.  American culture is full of examples of doomsday practitioners who tailor their rhetoric to invoke the desired reaction from their target audience.  Whether it takes the form of fire-and-brimstone televangelists, doom-and-gloom political pundits, awe-inspiring Hollywood films or mysterious Mayan predictions, we Americans seem to have a love/hate relationship with all things apocalyptic.

But we Americans are far from alone in our apparent affinity for the end of days.  Virtually every civilization in every corner of the world has their doomsday stories.  One of my all-time favorites comes out of Medieval Europe, from the 5th century to be specific. 

Along the Iberian Peninsula, in what is today Spain, a man named Hydatius lived a life of faithful devotion to the emerging religion known as Christianity.  In fact, so great was his piety that in 427 Hydatius was made Bishop of Chaves, where he labored extensively to establish the church in that particular part of the Late Roman Empire.  Hydatius had a reputation for rooting out any and all forms of Christian heresy and pagan loyalty.  As a result, his name was revered by many of the chief figures of both Rome and the church.

Despite his tenacity and zeal for the work of the church, Hydatius was forced to come to terms with the changing world around him.  The Western Roman Empire was dying a slow, painful death that was only being made worse by the intrusion of northern "barbarian" tribes who were eager to feast on the rotting carcass of the once great empire.  For Hydatius, this reality was an extremely bitter pill to swallow.  Rome and the church were the palpable reality of God's kingdom on Earth.  With Alaric's sack of Rome in 410 still fresh in the minds of many (not to mention the other barbarian incursions and the mounting political instability of the Western Roman Empire), the idea that the Roman Empire might disappear completely was a painful future to consider.

Due in large part to the emergence of the Christian faith along with the rapidly approaching demise of the Roman Empire, men like Hydatius were quick to assume that not only might Rome come to an end, but that the world itself might be nearing its conclusion.  Beginning with the creation story from the Book of Genesis, Hydatius sought to place all of human history within the context of a linear progression, starting with Adam and Eve and ending with the Second Coming of Jesus Christ, which Hydatius believed to be right around the corner (some sources specify the date of May 27, 482).  In fact, Hydatius could easily be considered the "father" of the Christian end-of-days phenomenon, an example that virtually every succeeding generation has followed.  As a member of the social elite, Hydatius had access to a number of chronographic and historical sources, and he cited them extensively in his forecasts of the end of the world (though he often exaggerated the historical records or simply made stuff up to fit his agenda).  As a result, Hydatius gained quite the following, even among some in the upper class.

In addition to establishing the precedent of fitting a world apocalypse within the construct of Christianity, Hydatius was a pro at depicting the end of the world as a doom and gloom event.  Much in the same way that a Glenn Beck or a Harold Camping of today spins their rhetoric to invoke fear and terror of the future, Hydatius was a master of fear mongering.  For example:
Such are the contents of the present volume, but I have left it to my successors to account of the Last Days, at that time at which they encounter them...famines run riot, so dire that driven by hunger human beings devoured human flesh; mothers too feasted upon the bodies of their own children whom they had killed and cooked with their own hands...And thus with the four plagues of sword, famine, pestilence, and wild beasts raging everywhere throughout the world, the annunciations foretold by the Lord through his prophets came to fulfilment.
As you can see, Hydatius didn't have to look far to find his "ammunition."  All around him, examples of the crumbling Roman world were to be found, and Hydatius was like a kid in a candy store.  The animalistic, heathen barbarians, intent on rape, pillage, plunder, destruction, enslavement and conquest, were the perfect characters for any and all devil/anti-Christ roles that could be imagined.  In short, Hydatius portrayed his life, and that of his contemporaries, as a miserable, hopeless, decrepit and evil existence, but all was well because the end was coming...and coming SOON!

All of this raises the question: were things really as bad in the 5th century as Hydatius makes them out to be?  The quick answer is a resounding, "No."  Sure, the Western Roman world was a world of change and constant flux.  Political strife and social decay, coupled with the rise of "barbarian" northerners and the Christian religion, made for a very unpredictable world.  But this does not mean that the world itself was hanging by a thread or that good, innocent people were living in a constant state of panic.  In fact, the overwhelming majority of commoners probably never heard of, let alone cared about, the type of rhetoric that Hydatius was spinning.  For most peasants, coloni, etc., life was pretty much the status quo existence of farming, socializing within a very limited and localized structure, praying to god(s), and so on.  Hydatius' message was not one that got a ton of airtime, and he was clearly embellishing things to advance his apocalyptic agenda.  From historian E.A. Thompson's book, Romans and Barbarians:
The entry of the barbarians into Spain in 409 was an event which made an impact, but not a resounding impact, on the chroniclers of the outside world.  Most of them speak of it, but they do so briefly -- only in a few words...For Hydatius, on the other hand, it was a calamity which deserved as much space as the Fall of Rome itself...a disaster which dumbfounded the civilized world.
With that said, Hydatius' accounts, though sensationalized and often misleading, provide some important glimpses into the 5th century history of Spain.  As such, they are an invaluable treasure.  Of course, much of it needs to be taken with a grain of salt.  After all, we now know with the blessing of hindsight that the world didn't end in 482 (far from it), nor was the "barbarian invasion" into the dying Roman Empire the end of the world.  In fact, it marked the beginning of the emerging Medieval societies of Europe, not to mention the future greatness of Christianity as the single most influential force of the next 1000+ years.

Tuesday, May 1, 2012

Paul Revere, the "Immersion" of Jesus, and the Complex Nature of Early American Religion

This past month, officials with the Library Preservation Department of Brown University uncovered a rare engraving (seen on the left) from our nation's founding period, which I believe illustrates the complexities of early American religion.  This engraving, which was completed by none other than Paul Revere, is a depiction of the baptism of Jesus Christ by John the Baptist.  As you can see, the engraving illustrates Christ's baptism as being done through immersion. 

Paul Revere was well known in his day for several of his artistic engravings, the most famous of course being his depiction of the Boston Massacre.  As an artisan, silversmith and dentist by trade, Revere was exceptionally gifted with his ability to create these artistic engravings, all of which helped to gain him notoriety during the early years of the American Revolution. 

But this particular engraving of Christ's baptism is noteworthy not just because of the artist who created it, but because it also sheds light on some interesting aspects of early American religion and the personal creed of Paul Revere himself.  As the son of a French Huguenot who had immigrated to Boston, Revere was raised in a very devout Protestant home.  The family's primary loyalty rested with Christ's Church (Old North Church), where the children were raised in the traditional orthodoxy of their day.

And though orthodoxy was an important component in the lives of many early American colonists, the sweeping tides of the First Great Awakening had brought about new ideas regarding humanity and its place with the divine.  For a young and intelligent boy like Paul Revere (who seemed to have an inherent attraction to revolutionary ways of thinking), these new ideas struck a chord.  Though originally drawn to the teachings of the Church of England, Revere eventually began to align himself with the West Church and its controversial pastor, Jonathan Mayhew.  Mayhew's provocative brand of preaching, particularly his support of resistance to civil authority and his opposition to British "tyranny," had earned him a large number of supporters within the Boston area, including the young, fifteen-year-old Paul Revere.

Needless to say, Revere's newfound faith did not sit well with his extremely orthodox father.  In fact, Revere's decision to give ear to the radical Mayhew ended with him on the receiving end of a severe beating at the hands of his father, which caused the young lad to "repent" of his error and return to his family's church (though he stayed close friends with Mayhew).  But it wasn't Mayhew's political views that angered Revere's father.  According to Joel Miller, author of the book The Revolutionary Paul Revere, Revere's father wasn't upset over Mayhew's political rhetoric but rather over his "heretical" teachings:
Mayhew's politics weren't as radical as they might seem. Mayhew was speaking from what was by then a long tradition of civil resistance, primarily from the Calvinists. While John Calvin himself opposed rebellion, his Huguenot heirs in France penned treatises defending it: François Hotman, Theodore Beza, and Philippe du Plessis-Mornay and his famous Vindiciae Contra Tyrannos. Ditto for Calvin's Puritan heirs like George Buchanan, Samuel Rutherford, and John Ponet. These writers shaped Puritan and Huguenot ideas about civil power and rights and were hardly radical to those standing in their stream. John Adams spoke glowingly of them. "The original plantation of our country was occasioned, her continual growth has been promoted, and her present liberties have been established by these generous theories," he wrote, specifically referring to Ponet and the Vindiciae.  All this matters because Paul's family was Calvinist. His dad was a Huguenot refugee from France and married into a Puritan family in Boston. Mayhew's politics wouldn't have been radical to him at all, and preachers all over Boston echoed Mayhew's political sentiments. The problem for Revere's dad was the rest of Mayhew's theology. Mayhew was a winsome, exciting preacher -- and also a heretic. He denied some basic Christian teachings, such as the Trinity. From my reading, Paul got the beating for lending ear to a heretic. Mayhew's politics were actually pretty orthodox for their time and place, which was one of the reasons Boston so quickly fell into their resistance against England. (My emphasis).
It was Mayhew's infamous unitarianism, mingled with Christianity, that angered Revere's family so much.  Resistance to some distant king or some foolish tax was one thing, but resistance to the Holy Trinity or God's one true faith was quite another.  This is why I find the engraving above to be of such interest.  As already mentioned, Revere was raised to embrace a very orthodox view of Puritan Christianity.  As a result, one has to wonder why Revere chose to depict the baptism of Jesus as being one by immersion, when the Puritans/Congregationalists taught baptism by sprinkling (particularly at infancy).  Could it be that Revere was once again challenging the faith of his father? 

Of course, it is difficult to say with absolute certainty why Revere chose to make this engraving.  Perhaps, like many others of his faith, he believed that Jesus was baptized by immersion but that the same was not required of his followers.  Or perhaps he was simply trying to profit from the growing revivalism in the early years of the Second Great Awakening.  After all, we know that Revere had profited handsomely from the growing demand for church bells, becoming America's best-known bell caster.  Could engravings like these also have been the result of his desire to make an extra buck?

I don't think so.  First off, this engraving is only one of five known in existence today.  In addition, there is zero evidence that the engraving was published in any of the books or pamphlets of the time.  Instead it appears that Revere made a relatively small number of these engravings and sent them to close associates.  As a result, it would stand to reason that these engravings were more for sentimental value than anything else.  This makes sense when we consider the fact that Revere elected to further his studies of "infidel" Christianity at the hands of Mayhew and others. 

With that said, it is important that we be careful not to classify Paul Revere as a unitarian, closet unitarian, etc.  Revere maintained a very close alliance with Congregationalism throughout the course of his adulthood.  Boston's New Brick Church was like a second home to Revere, as he was a regular at Sunday church services.  Clearly Revere maintained a love for his family's orthodox faith.  As a result, I have no problem with those who wish to classify Revere as a devout disciple of Christian orthodoxy.  Even so, I do think that these apparent "heathen" blips on the radar are noteworthy because they reveal the fact that almost nothing about early American religion (or any religion of any era, for that matter) is cut and dried.  Like many of his time, Revere was questioning and thinking about matters of faith.  Was God really the totality of an obscure Trinity?  Is infant baptism/baptism by sprinkling really a requirement for heaven?  Is there really such a thing as "the one true faith?"  In the end, these are questions that are just as relevant today as they were 200 or 2,000 years ago, which proves that Paul Revere was a pretty stereotypical Christian of his time.