I had a most interesting conversation with a friend recently that brought to light an issue to which I want to devote a number of articles. As the title of this article sets out, my concern is whether the U.S. is indeed a "Christian nation" or whether it is a nation of Christians.
The question I am posing is whether Christianity and Christian values are an essential part of what America stands for, or whether the country is simply one whose citizens have largely been Christian. Since virtually all of the founding fathers were Christian, there was no need at that time to factor in differing value systems.
This issue arose in a conversation in which a friend said (I am paraphrasing here): "America was a Christian nation at its founding; but it is no longer so. Christians are being discriminated against and all evidence of Christian values is being removed from American society."
My knee-jerk reaction was that she was wrong, and I told her so. However, as I started thinking about it, I began to understand it from her perspective. As a Christian, she feels defensive when Christian values are attacked in the public arena. I, on the other hand, was not raised as a Christian, so the ongoing debate is not "personal" to me.
I intend to address this topic from various perspectives as issues arise that shed light on it. In the meantime, I would point out the definition of nation, as set forth on dictionary.com: "A people who share common customs, origins, history, and frequently language." Please note, however, that this definition does not include "religion" or "values."
So, what is it that defines what it means to be an American? This is an important issue, one that I intend to investigate and talk about in great detail.