Q: What does the Bible say about America?
Best Answer

* The Bible is influencing America, but less and less every day. We should always be influenced, but looking at the world, it is getting worse, and Jesus' return is coming soon.
* The Christians and some other people in America are influenced, and that is a good thing. It helps lead you onto the path of God and, someday, to Heaven.
* The Bible influences the minds of Christian Americans (most religions have some form of a Bible: The Book of Mormon, etc.) to make what they believe are decisions right in the eyes of God. Most Bible readers vote a certain way; for example, in the last election, statistics showed that more than half of Anglican Christians who read the Bible voted for Bush. Bible readers affect America, not the Bible itself. It is no more stupid to be influenced by an inspired book than by a screaming politician who says it is morally sound to kill an unborn human life. This is also a matter of perspective: you could believe that the Bible has "brainwashed" the minds of Christian Americans, and I could believe that Atheists are more "brainwashed" by their anti-Christian beliefs. Overall: yes, the Bible influences America, because of the people who choose to read and believe it. However, there is always another way to say just about everything.
* It isn't obsession or brainwashing, but the choice of an individual to believe in religion and which sect of religion, or to be an Atheist or an Agnostic. As with anything else in our society, there will be those who use religion to do terrible things, but the majority of Christians are good people, and we need more of them!
* The Pilgrims came here because they wanted freedom of religion, and the Bible was a vital part of the formation of the US. However, there is danger in going to the extreme of thinking of the U.S. as any kind of Christian Republic, especially if it is seen as the counterpart of an Islamic Republic. We are not a Christian Republic.
* Definitely yes, but whether it was for good or for bad is anybody's guess.
* The cynic in me wants to say no. We ("Christians") go to church on Wednesday and Sunday, listen to the preacher give a sermon, and that's it. That is the extent of our exposure to the Bible. Perhaps we'll change.

Wiki User

14y ago
More answers

Wiki User

14y ago

The Bible was written in the Middle East in ancient times; the final version of the Bible as we know it emerged in the 300s AD. Nobody in the Middle East then had any idea that America existed, so it is not mentioned anywhere in the Bible. If Christian Europe had known that America existed because it appeared in the Bible, it would not have waited until 1492 to visit it.


Related questions

What does the Bible say about the fall of America?

Nothing. America did not exist as a nation until roughly 1,500 years after the Bible was written.


How many followers of the Bible are there in the world?

A LOT. That's all I have to say. Not as many as there once were, but there are still a lot, mainly in America.


How best can you explain America in Bible terms?

You can't; the word "America" is not mentioned in the Bible.


Which are the Bible Belt states in the south of America?

The Bible Belt generally refers to the Southern states where evangelical Protestantism is especially strong, including Texas, Oklahoma, Arkansas, Louisiana, Mississippi, Alabama, Georgia, Tennessee, and the Carolinas.


Does it say in the Bible to be charitable?

Yes, the Bible says to always be charitable. Always read your Bible and say your prayers.


Did the Bible really say it's OK to own a Canadian?

No. The Bible contains nothing about Canada, which did not exist as a national entity until the colonization of North America (1600-1900 AD). The closest thing to a Canadian in the Bible would be the Canaanites.


What is the oldest Bible college in America?

Moody Bible Institute


Does it say in the Bible that you are a liar if you say you know everything about the Bible?

Not that I'm aware of. But there is a place in the Bible that says you are a liar if you say you have not sinned (1 John 1:10).


Does the Bible say fat women are pure evil?

No, the Bible does not say this.


Where in the bible does it say that God would destroy himself?

It does not say this in the Bible.


In what Bible does it say that the devil is handsome?

The Bible does not say the devil is handsome.


Where in the Bible does it say that you will not be loved?

Nowhere in the Bible does it say you will not be loved. The whole Bible talks about how much you are loved.