These views persist in many (though not all) white people, but they have morphed into a colonialist spin. They justified stealing land from Native Americans, colonizing India and Africa, etc., based on the idea that the original inhabitants weren't using the land right, so the white man had to take it and make it better.
Have you guys noticed how many Republicans attack Obama for allegedly having an "anti-colonial" mindset? What does that say about them? They support colonialism. They aren't even hiding it anymore.
Language. The words shift, but the motive remains the same.