Fingerprint, I know some Christians have done good work. I was a Christian for years, and I know that according to the Christian church, sin is inherent in man. However, it's abhorrent to me to brand children sinners from the moment they are born. Christianity seems to mean guilty until proven innocent - and that verdict only comes after you're dead - and then only if you're lucky.
Jumping to your answer to Chakka. Christians seem to want it all ways. They say the bible is the literal word of god, but as soon as something that doesn't entirely fit their ideology rears its ugly head, they introduce a 'but' and say it doesn't really mean that, or that it needs to be interpreted by a theologian because we mere mortals are incapable of understanding it (which, I might say, I find rather patronising). Well, theologians have different opinions, so which do you opt for? I've studied the bible, both at church and alone, and I know what it contains - and whichever way you look at it, it's not all milk and honey, that's for sure!