You don’t live in the South, I am guessing. In the South, white people definitely feel as if they have an inalienable right to oppress and dominate people of color, and certainly Black people.
This isn’t as often the case in other parts of the country, but real talk, vigilantism against African Americans is deeply a part of Southern American culture. Have you ever heard of the Klan? Of course you have…and now a Netflix doc is showing how the Georgia Klan killed a ton of Black kids in the ’80s and framed a Black man for it. White folk in the state have been engaged in deeply sick and twisted activity for quite some time.
That most white Americans aren’t psychopathic killers of people of color is beside the point. In the South, historically, MANY have been. And nothing has really changed…and really, this has nothing to do with “wokeness,” except maybe for you waking up to some information you don’t really want to process.