Political debates used to be the norm around here, but we haven't had one in a while. I think this issue is important enough, however.
I hate to get so serious here, but frankly I don't know what the hell has happened to America. Things have really gone downhill. It used to be that certain natural rights were simply understood and went completely unquestioned. Rights over the bodies we own, for instance.
A body you possess is something you are supposed to have complete control over. That includes when reproduction does or doesn't happen, or yes, even life and death (if you really must use such emotionally charged language about something that isn't fully human).
Isn't it bizarre to think that people in the past had more rights than we do? But that is indeed the case, thanks to evil, meddling men in government concerned only with politics and getting their own way, forcing their morals on everyone else. Nobody asked me, and it's extremely unfair, especially in less affluent Southern states and rural areas where the economy depends on the availability of such services.
I realize this is all just useless bitching on the internet, but still, it just has to be asked: why the FUCK is it a crime for me to own slaves?????