Is America a genuinely Christian nation? Is American culture uniquely Christian in its attitudes, values, principles, practices, and beliefs? What does a “Christian nation” look like? What would a uniquely Christian culture look like? Is American Christianity dissimilar from Christianity in other parts of the world? If so, how is it different? Perhaps a better question is why it is different. Furthermore, is American Christianity really Christianity at all, or is it something else altogether? These are good questions for any Christian living in present-day America.
Before I begin, I am not suggesting that the Christian community become a group of insurrectionists. Nor am I suggesting that, in order to be Christians, Americans have to disavow their country. What I am suggesting is that an essential component of discipleship in American culture is missing altogether: cultural catharsis.