What have Black people ever done that they need to work together with "others" to improve race relations? People don't tell rape victims they need to work with their rapists to change the world, so why do Black people accept the disrespect? Black people have been nothing but victims of systemic racism, so why do we even tolerate being talked to as if things are partially our fault? The only mistake was not mashing on anyone who was not African who set foot in Africa talking about anything other than trading non-human commodities.

