

Is America a Christian Nation?

In what sense can we describe a nation as being “Christian”? By its leaders, its citizens, or maybe its laws? By all of these measures, America would not be considered a Christian nation. And yet, I frequently hear Christians speak earnestly about how America is a Christian nation and how we need to get back to our Christian roots, almost as though the United States holds a special place in God’s heart. For many evangelicals in America, their Christian faith is directly tied to their American identity. In fact, I have many memories of saying the Pledge of Allegiance at my Christian elementary school and singing the “Star-Spangled Banner” in church. But the truth is, America is not a Christian nation, and I’m not sure it ever really has been one. It’s always difficult for me to hear certain Christians talk about the founding fathers as proud, wholehearted Christians. It’s often said that they stood on biblical, Christian principles. I have even heard some say that they built this nation on the teachings of Jesus and the laws found in…
