What do you think?

Whether you are pro-religion or anti-religion, you may have noticed a decline in religious values in America today.

This is a spiritual blog, and these changes might impact you in some way.

This post looks at the decline of religious influence in contemporary America and what it might mean.

Didn't America's beginnings go back to the Pilgrims fleeing religious persecution under an English king? They wanted to worship on their own terms.

Let's take it from there.

The foundation of America lies in freedom, and freedom of religion was the first right to come to the forefront.


Religion In America

To be clear, the majority of the Founding Fathers were not strictly Christian; their beliefs were intertwined with Deism (see The Faiths of the Founding Fathers). Deists believe in a God who created the heavens and the earth but hold that God left his creation alone afterward.

So religion was, in essence, an integral part of America's being from the start. But something has changed since then.

In what ways has religion in America seemed to lose ground, such that this could be said at all?

1. Displays of the Ten Commandments and prayer are forbidden in some schools.

2. There have been debates about removing the phrase "In God We Trust" from currency.

3. Under Obamacare, some companies are required to cover abortion for their employees, which adds to their costs.

4. In some places saying "Merry Christmas" is forbidden in favor of saying "Happy Holidays."

While some of these things might sound insignificant to some people, they still reflect an overall trend.

One does not have to look very far to see a decline in morals in America. For example, the rates of divorce, crime, and single-parent families are all rising.

A legitimate question is whether freedom of religion should mean more or less religious influence in daily life. For example, does freedom of religion mean there should be no exposure to religion at all?

What Does This Mean?

The trends mentioned above (divorce, crime, single-parent families) have real consequences for a society over time. It is not hard to see why.

If a child grows up in an environment of drugs and stealing, he or she is more likely to do the same things.

In the fight with Al-Qaeda, one thing its followers clearly believe is that America is declining in terms of morals. People on the other side of the world do not like America's influence on their societies.

This "decay" of American values motivates them.

Technically, America was founded on freedom from religious persecution, but with Deism intertwined.

Prayer and "In God We Trust" would fall under Deism. The Ten Commandments would as well, indirectly, since the God who created everything would want something good for his creation, and they serve as general guidelines.

Jesus Christ might not be the Son of your God, but you still have the right to follow one.

People with strong religious practices are likely to make for a stronger nation overall, and America is in a general decline.

Is religion a part of your life?

If you enjoyed reading this, please subscribe to the blog near the upper right of the page. That way you’ll never miss ways to re-invigorate your life!


Author's Bio: 

Active member at an Eastern Orthodox House Of Hospitality and Non-Profit Operations Manager. I started blogging about the Meaning Of Life after a Spiritual Awakening.