Americans, Democrats and Republicans alike, by and large no longer believe society can even function without a government to kneel before.
So why is it, in the one nation on Earth founded on the principles of classical Liberalism, that Liberalism means exactly the opposite of what it means practically everywhere else in the world?
To everyone who demands that this nation change into something it is not, I offer the only invitation that makes sense: go away.
If these people cannot be bothered to reclaim something as simple as a word, how do they fool themselves into believing they can ever reclaim their nation?
Should we stop to examine the assumptions we've held our entire lives, or allow those in media, movies, music, and politics, and behind golden microphones, to tell us what we should and shouldn't believe and, more importantly, why we believe anything at all?
In a nation founded by daring to question centuries-old political and cultural assumptions at the heart of society, how have we become a nation so ready to cry wolf at the barest scent of nonconformity?
Only through clear-eyed hindsight do the one-to-one connections between who we gave voice to and this election become obvious.