Why am I supposed to believe women are being mistreated in America?
Seriously, based on what's going on, you'd think women were being slaughtered in the streets.
Just because Western women have more rights today than they did hundreds of years ago does not mean sexism is over. Malcolm X's line about pulling the knife out six inches from a nine-inch wound applies to women as well.
They were treated as chattel, with no rights, for literally thousands of years. There were never any reparations. Men simply started improving their treatment of women because they were FORCED to do so by other men who didn't want their own wives and daughters subjected to the same treatment.
There are a lot of parallels between sexism and racism that should give men a unique understanding of the feminist view, but somehow sexism gets treated exactly the way most whites treat racism: as invisible.
To most whites, racism no longer exists, and to most men, sexism no longer exists. Both beliefs are wrong, but trying to explain this to either group seems to fall on deaf ears.