As I have mentioned, there was a time in the Americas when most slaves were white, a documented fact that our public schools avoid teaching. Any comments? I'm guessing that if it were taught, it might somehow level the field, and we know that's something our liberals avoid like the plague. Just my opinion. Please, no "indentured servant" replies; that got old a long, long time ago. They were slaves.