Possibly the biggest point of contention with respect to feminism is whether the movement is still needed in Western nations like the United States. Are there still any significant rights that women are denied? Does perception reflect reality when it comes to how people think women are treated by society? Are remaining gaps between men and women (e.g., the wage gap) due exclusively or largely to discrimination, or are there other variables that can account for them?
Let me just say it up front: feminism is nothing more than the position that men and women deserve equal rights. It is called “feminism” because women are the traditionally underprivileged cohort; feminists can certainly advocate on behalf of men in areas where they are treated unfairly. However, as anyone who has ventured into this morass surely knows, it’s never as simple as that. “Feminism” is an umbrella term encompassing a plethora of nuanced positions, some of which even conflict with one another. Thus, it is extremely hard to determine what someone’s position is merely from their application of the label “feminist” to themselves. In addition, the label has developed quite a stigma and negative connotation, to the point where activists spend more time explaining their position than actually trying to find common ground with people or articulating their goals. Although I do consider myself a feminist, I personally choose not to wear the label for this reason.
In this clip from our Secular Roundtable discussion on feminism, I go into depth on this subject, and discuss the complex nature of this debate.