kurdt318
2006-08-03, 20:10
I have just recently spent this last year dismantling all that religion has taught me. I don't understand why modern religions demonize everything natural about the world. Animals masturbate, yet it is a sin in many religions; aren't humans animals? Women being inferior to men? Women are the givers of life, but according to religion they are the whores that will destroy the world.

I'm interested to hear what you all think.