Americans (or Democrats) have gotten more racist and conservative in the past 25 years, or at least have dropped any facade of having values that would now be considered ‘progressive’
All part of the plan since 1991, turn America into an openly fascist Reich.
That's like turning Nazi Germany into a Nazi state, mf. When wasn't America openly racist?
Openly fascist. As in, embracing the label.
I don’t think Americans, on an individual scale, were as openly racist, say, 30 years ago as they are now. This is my personal opinion.
What happened in '91?
The fall of the Soviet Union.