Why is it that in every one of these threads, feminism (or in this case women's empowerment) and the idea of women wanting to be treated better get vilified or treated as one of the big reasons? I'm sorry, but I'm not going to blame women for wanting better treatment and for not expecting to become stay-at-home moms (or to be with men at all; it's always nice to see the implication that gays and we WLW are to blame) just because society says so. I'm going to 100% blame that rotten society and its economic system for creating the misogyny in the first place.