America is a white supremacist country. At what point in its over-250-year history has it not been? On July 4th, 1776, when it celebrated independence while still holding slaves? I mean, the fucking Nazis used Jim Crow-era America as a model for what they would implement in Germany during Hitler's rise!
America is turning into a fascist white nationalist country. Trump, by any measure, is a fascist and a white supremacist who called neo-Nazis "very fine people." A good percentage of white Americans relate to Mein Kampf and to white supremacist thought and values. A bunch of Nazis were even brought to America after WWII. When has this country fought a war against a white country in recent history? It only entered the war in Europe because Hitler's Germany declared war on it after Pearl Harbor! And now we have people actively arguing over the civil rights of Nazis and what they should properly be called!
When it's all said and done, the American Empire will go down in history as the most evil empire, and the most evil country, ever to exist.