THAT is the question! This country has never been a "Christian Nation" in the same sense that many European countries were at the time of the founding, specifically England. We were never a traditional Christian nation.
We were founded as a Christian nation in the sense that this nation was founded by Christians (with many Christian ideas and ideals heavily influencing the Constitution) and for a religious people, who at the time were mostly Christians of one variety or another. So we were a Christian nation inasmuch as we were a nation made up of Christians, with a federal government informed by Christianity but not subscribing exclusively to it. That is why Christianity is very much a part of our national traditions.
This is in contrast to other countries and governments that came about shortly after the U.S. Most of them were influenced by French Enlightenment thinking and specifically rejected religion, and Christianity in particular, creating essentially purely secular governments.
America was unique in that it embraced Christianity and religion like England did, but also worked to keep the national government from influencing or exerting power over religion, or from creating a national religion.