I have been doing some research into morals and how we come to hold the morals we do. I know most religious people claim it is their god that gives us our morals, but I just can't see that. The main problem I see with this claim is that Christian morals change along with society, when by their own claims they shouldn't change at all. I like the example of how laws and morals began to change after women got the right to vote here in the USA. Once women started voting, many politicians began to enact laws that helped women. It has taken many years, but people's morals have changed when it comes to many crimes committed against women. There are other areas where religious people claim the moral high ground but their actions don't support their views, such as charity, murder, rape, and war. They talk about how their god is all about peace, yet they will condone all of the above as long as it is done for their god.