Wiki User
∙ 11y ago
No. Women are the majority in the US, though they remain minorities in certain fields (especially business management). Women are no longer the minority in colleges; they are now very much the majority in earning college degrees, receiving over 60% of associate's degrees, 57% of bachelor's, and 60% of master's. In 2008, women took the lead in doctorate degrees for the first time, receiving 51%. Certain occupations still have virtually no women (carpenters and auto mechanics are less than 2% women).
As for degrees that actually require intelligence, I'm gonna go with 13%.
45%
Women
Women 59%, Men 41%
Georgia Female College (now Wesleyan College), in 1839.
As of 2020, approximately 53% of doctoral degrees in the US were awarded to women. This represents an increasing trend in the proportion of doctoral degrees earned by women over the years.
According to the American Council on Education, women make up the predominant percentage of college students.
Approximately 56% of college students in the United States are women. This percentage has been steadily increasing over the years, with women now outnumbering men in higher education enrollment.