In this discussion I want to focus on the term patriarchy, which refers to male dominance, a system in which women are treated as inferior to men. Throughout history, men and women have played different roles in society, which has led to controversy over whether we still live in a patriarchal system today. Men have traditionally been viewed as the alpha males and breadwinners of the family, while women have been relegated to the role of housewives responsible for caring for the family. Being a woman in America brings an awareness of oppression and of how women have been subordinated by men for many years; Black women, in particular, have also been oppressed by white supremacy and subjected to stereotypes. Women have not received the same benefits or treatment as men, and they were considered irrelevant in the workplace and in politics until the 1940s. Men were often ignorant and arrogant, failing to acknowledge women’s needs and feelings, which were overshadowed by their own. Feminists advocate for women’s rights and gender equality. Betty Friedan’s book The Feminine Mystique challenged the idealized image of what women were supposed to be, inspiring women in the 1960s to pursue education and jobs traditionally reserved for men. The mindset of women changed dramatically as they became more determined to break into male-dominated fields such as manual labor.