See, this is what I don't understand. If things are overall better than they were in the past, what does it matter if white people are "at the top"? Furthermore, even if things aren't better than they were in the past, why is it such an unpardonable sin if white people are "at the top"?
Black people were the first to be at the top; they were civilized while white people were still enduring the Ice Age in caves (as some people here never tire of reminding us). Various Asian, Indian, and Arab empires were also uncontested in their time. Rome and Greece had their day, then the Muslims had theirs while Europe was in the Dark Ages. Now white people are at the top again.
Civilizations wax and wane, just like everything else in the universe. What's so bad about white people enjoying their time at the top?