It’s no secret that today’s women are killing it when it comes to their education. For the first time since 1940, women in the U.S. are more likely than men to have a bachelor’s degree, and research shows women are even better students. Still, despite all the seeming advantages we ladies have going into today’s workforce, the wage gap remains an ever-present dark cloud looming over all of our heads, and it affects us far earlier in our careers than we may realize.
New statistics show men make more than women, even right out of college. According to a new report released by the Economic Policy Institute, male college graduates earned 8.1 percent more in 2016 than in 2000, while female college graduates earned 6.8 percent less than in 2000. This suggests that as women in the workforce, we’re falling behind in our careers right out of the gate.
But it’s not just about the money, though that certainly is the big factor at the root of this huge issue. Here are six other things you should know about the gender wage gap.
On average, women earn about 79 cents for every dollar men earn
And that figure hasn’t improved significantly since 2000.
The wage gap is bigger for minorities
According to the American Association of University Women, in 2014, African-American women earned 63 percent of what white males earned, while Latina women earned just 59 percent — both far below the 79-cents-to-the-dollar average.
The wage gap adds up
In 2014, the median income for men was $50,383, according to CNN Money. Women, meanwhile, earned a median of $39,621. That’s a disparity of $10,762 a year, or nearly $900 a month.
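For readers who want to check the math, here’s a quick sketch using the 2014 median-income figures cited above (the variable names are just for illustration):

```python
# Quick check of the disparity implied by the 2014 medians cited above (CNN Money).
men = 50_383    # median annual income for men, 2014
women = 39_621  # median annual income for women, 2014

annual_gap = men - women          # dollars per year
monthly_gap = annual_gap / 12     # dollars per month
cents_per_dollar = women / men    # women's earnings per dollar men earn

print(annual_gap)                 # 10762
print(round(monthly_gap))         # 897
print(round(cents_per_dollar, 2)) # 0.79 — matching the 79-cent figure
```

Note that the same two medians also reproduce the 79-cents-on-the-dollar ratio quoted earlier in the article.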
The wage gap affects more than just women
In 2014, four in ten households relied on the mother as the main breadwinner for the family, according to Pew Research. Low wages end up affecting children, and in many cases, spouses and partners, as well.
The gap is bigger in high-paying industries
In the medical field, for example, women represent 1 in 3 doctors, yet still earn just 69 cents for every dollar their male counterparts earn.
Women actually bring wages down
According to an article published in The New York Times, as women start to take over male-dominated fields, the overall pay for the industry actually decreases.
Paula England, a sociology professor at New York University, believes sexism is to blame. “It just doesn’t look like it’s as important to the bottom line or requires as much skill,” she explained, according to the article. “Gender bias sneaks into those decisions.”