By Apparition of Jack
in Society, Politics, Government, Environment, Current Events
So the argument goes: “we need to crack down on illegal immigration because everyone coming over is a violent drug-dealing rapist.” That’s the logic underlying what’s happening.
The problem is, this isn’t true. Numerous impartial studies have found that migrant communities commit crime at lower rates than native-born ones, but those supporting mass deportations will never accept this because their worldview won’t allow it.
And the reason for this is psychological projection.
White America as a whole is a lot less “civil”, “law-abiding” and “professional” than it would have you believe. There is a deep-seated, pathological culture of betrayal, corruption, egotism and outright criminal activity within white society that it hasn’t yet grappled with. This is most patently evident in the fact that their arch-messiah Donald Trump is quite literally a criminal conman who uses mob tactics and violent intimidation to maintain power.
I am of course not saying that individual migrants can’t be involved in crime, and obviously anyone who commits a crime should be punished, but that’s not the point I’m trying to make here.
What I’m saying is that everything White America fears about migrants, brown people, black people and so on is, at its core, a terrified reflection of itself.
I don’t know what went wrong and frankly I don’t care to find out, but somewhere along the way White America as a whole (not individual white people, mind you) lost its way and turned to greed, hatred and envy instead of decency, justice and humanity.
If you’re a white person reading this, I would strongly urge you to find your sense of common humanity, love and tolerance in these trying times, in whatever way you can. If you’re truly committed to peace, growth, human flourishing and consciousness, it only makes sense that you look for the best in others and don’t become blind to the wrongdoings committed by people of your own race. America, to me, was founded on the principle that all people - white, black, Hispanic, man, woman, gay, straight - are endowed with inalienable rights that can never be taken away by tyranny.
If you truly believe in America, wouldn’t you believe in this? Upholding those ideals is simply your duty as a patriotic American. What else do you think America stands for? Money? Brad Pitt? Diet Coke? That’s not America, you can have those anywhere. What makes America different from every other country? You tell me.