Americans, whether Democrats or Republicans, by and large no longer believe society can even function without a government to kneel before.
So why is it, in the one nation on Earth founded on the principles of classical Liberalism, that Liberalism means exactly the opposite of what it means practically everywhere else in the world?
If these people cannot be bothered to reclaim something as simple as a word, how do they fool themselves into believing that they can ever hope to reclaim their nation?
In a country founded upon Locke's "life, liberty, and property," we have become shockingly comfortable with policy rooted in "From each according to his ability, to each according to his need."
Our society depends on objective measures, physical and moral alike, and there exists a single set of standards against which the words and deeds of men and women can justly be judged.
Only "country" is evoked almost as often as god among the smoldering rubble of mankind's bloodshed and mayhem.