Evangelical leaders: US no longer a Christian nation

In a statement issued Tuesday, the National Association of Evangelicals said that when it surveyed selected evangelical leaders about whether the United States was a Christian nation, 68 percent said no.
"Much of the world refers to America as a Christian nation, but most of our Christian leaders don't think so," said Leith Anderson, the association's president. "The Bible only uses the word 'Christian' to describe people and not countries. Even those who say America is a Christian nation admit that there are lots of non-Christians and even anti-Christian beliefs and behaviors."