Over the years we have come a long way: as a whole, women have the right to vote and slavery has been abolished, but we still have a long way to go. Women have a lot of things we didn't used to have, like the right to vote, good pay (usually; I'll come back to this later), and a place in our political system. However, there is still the wage gap, and some people still believe that men are better and more deserving of all good things in the world than women.
The wage gap is one of the most widely known issues related to sexism. Women are actually slightly more likely than men to graduate from college. Still, women earn around 20% less than men for doing the same work, or about 80 cents for every dollar. Also, according to one study, the poverty rate for women would be almost cut in half if they received equal pay.
Now I want to touch on something that people often use to justify sexism. Feminists are not people who believe women are better than men. They are people fighting for equality and justice, not domination. The official definition of feminism is "the advocacy of women's rights on the grounds of political, social, and economic equality to men." Now, there are some people who do believe that women should be in charge and that they are better than men. However, do not judge an entire movement based on just a few people.
If you would like to learn more about the wage gap or feminism, visit
Photo by stevendepolo
Sexism in the U.S. by Lexi is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.