Honestly, I'm really not surprised. It's not like schools offer an "FYI: Feminism" course. Even in history class, we don't learn about it. Most community colleges don't even have an intro to women's/gender studies course. Plus, most women in the United States think we've already made it.