Sunday, August 14, 2011

Did evangelical Christians found the United States as a Christian nation?

No. Evangelical Christians did not found the USA, nor did the Deists who did found it intend it as a Christian nation. Most of the leading founders were deeply skeptical of orthodox Christianity, including Thomas Jefferson and Benjamin Franklin.
