September 29, 2021
For religion to be national, it must first be personal
What does it mean for a nation to be Christian? Does the United States of America fit the description?
At its founding, the United States was undoubtedly a Christian nation. Even so, to foster a society of religious freedom and pluralism, the Founding Fathers deliberately declined to establish a national religion and took care to separate the domains of church and state in our country's founding documents.