Is America a racist country? More and more so every day.
“Americans have become incredibly racist while remaining convinced that they are anti-racist. Anti-racism has taught a generation that the less racist they are, the more unconsciously racist they must be, and that the more publicly racist they are, the less secretly racist they become.”
The Left has made America incredibly racist, from segregated campuses and workplaces to constant racial invective and paranoia. This is done under the guise that racism, renamed "anti-racism," will somehow defeat racism, which itself has been redefined from hating people on account of their race into a problem suffered only by white people.
In this video, I discuss the destructive hypocrisy and racial cynicism at work here.