So I've been seeing this "rape culture" term floating around over the last couple of months, and I'm not sure what to make of it. Feminists in particular seem to be using it more than anyone else, while decrying those like myself who question it. Maybe I'm naïve, but is "rape culture" really a thing? It's been claimed that something like half of all women will be raped by the time they finish college. Can that possibly be true? I find it highly unlikely, so much so that when I first came across the term it really took me aback. My gut reaction was that this was just another media ploy to further alienate the public from itself. Something else I saw recently that ties into this was an article about men masturbating at women in public, which it claimed is something all women have to deal with at least once a year. That just sounds like craziness to me, and I call BS.
What do you guys think?