Many seem to believe that our country's progenitors intended to enmesh Christian values in our government's practices and constitution, based on letters and essays they wrote at the time. I've read that Jefferson, Adams, et al. were Deists, not Christians, so the matter seems moot. For the sake of argument, though, I'll concede that they were Christians.
If this is true and the drafters intended to form a Christian society, why did they not explicitly say so in the Declaration of Independence or the Constitution? The latter does not mention God at all, and the former alludes to a God and a Creator nonspecific to any one religious sect. It concludes "with a firm reliance on the protection of divine Providence," again an ambiguous reference connoting nothing related to any particular religion.
So why do so many vehemently endorse Christianity in government? Wouldn't the ultimate course of this line of thinking lead to a theocracy? Is that what people want?
I don't mean to lambaste Christianity by any means. I was raised Catholic and, for the most part, have only known kind, altruistic people in the church who keep their faith to themselves and are content with a private, personal relationship with their religion. Isn't that how a person's faith is supposed to be practiced, in private? It doesn't seem so these days.
So I can't help but question the motives of people who wear their religion on their sleeves and are so pharisaic as to believe that their values should be adopted by all. Morality and virtue are not dependent on Christianity. Empathy needs to be a bigger part of our society.