I pose this question because it seems many people I associate with at the church I attend would very much like to assert that the United States is a Christian nation. In my own mind, I could not disagree with them more strongly, and here's why.
How do we define Christian? It is historically true that many early American colonies were established as religious settlements by people escaping the tyranny of the Church of England and persecution from the Vatican and other church hierarchies that sought to force conversion. But it is also true that the very tactics of oppression these colonists fled were the same tactics they employed against the indigenous peoples and nations already established in North America.
How could these early colonists, who claimed to be Christian, use oppression, murder, and coercion to morally justify the establishment of a nation? What defines the U.S. as a Christian nation when there is little or nothing Christ-like about how we formed this country at the expense of others, from our founding and throughout our history? How can we call this a Christian nation when we have a history of exploiting Chinese Americans, wrongfully imprisoning Japanese American citizens, and thumbing our noses at those seeking a better life here than the oppressive conditions they are escaping in, say, Central America, Mexico, Colombia, or Africa? How can we call ourselves a Christian nation when we subjugated millions of Africans against their will and held them in slavery for centuries?
How does a nation that claims to be Christian condone state-sponsored killing in other nations through unjustifiable wars based on unreliable intelligence, then turn its back on veterans, or cut funding and resources for education, veteran and senior care, the homeless, the disabled, the mentally challenged, the addicted, and so on, all for the sake of building a larger prison population or military arsenal?
What part of Christ is in this idea of the United States being a Christian nation?
Do we love our enemies? Do we love our neighbors? Do we care about the least among us as much as we care about the upkeep of the top 1%?
To call ourselves a Christian nation is an insult to true Christians, and it points to a straying from our primary priority. The only kingdom or nation we as true Christians should be concerned about is the Kingdom of God, and only Jesus at the Second Coming can bring that to us. It is not about politics or the works of man; we will always fall short. It is about God's will, and about believing that by the grace of Jesus we will somehow be redeemed and made worthy of the kingdom as God created it.
Our laws and the framework of our Constitution may share some Judeo-Christian commonalities with the Ten Commandments, and our country's founders may even have had Judeo-Christian affiliations, but this no more makes us Christian than walking into a garage makes us mechanics, or walking into a hospital endows us with the ability to perform brain surgery. We may have a Christian heritage, but that is as far as I would go. We are finding Jesus, but seldom are we following Jesus. Just look around. We have invested so much energy in hating each other and being suspicious of "others" that we have neglected love and compassion. We ignore the two greatest commandments Jesus spelled out for us: love thy God and love thy neighbor.