Is it true that Black men earn more than white women and Black women in the American workplace? Where I've worked in corporate America, I've seen more white women in positions of power than Black men.