Just wanted to see if you were one of the delusional people I've met on here who think violence and cruelty began with Europeans, and that Africa and the Americas were paradises of peace before the white man came, especially considering the sheer amount of anthropological evidence demonstrating that murder and war have been a staple of human existence on every habitable continent and among every race.
That being said, this is not to excuse what the Europeans did to innocent men, women and children.
I blame the rise of Christianity myself for their behavior. It gave them the false pretext that their actions were sanctioned by God. We still see today, in Islam, just how powerful that belief can be.
During the Greek and Roman civilizations, pre-Christianity, there were still excursions around the world, but it was nothing different from what other empires around the world were doing. What made it really hateful was the introduction of "fighting for Christ". For example, if you read about the Norman invasion of the British Isles, the Normans thought the Anglo-Saxon tribes and other tribes present were savages who deserved to be enslaved for their lack of belief in Christianity. The tribespeople were stripped of their land, made serfs, and their women were raped. All in the name of conquest and "the Lord".
This also happened during the 1800s and 1900s with the UK and their treatment of the Roman Catholic Irish. They raped, stole, and indiscriminately killed, all because they felt their brand of Christianity was superior.