America Is Not A Center-Right Nation: Don't Believe The Republicans, Fox, Or Other Media. See The Evidence
Most Americans hold similar views about life and what is important to them. It is the politicians, the media, and the talking-head pundits who would have you believe that America is a center-right nation, and that is unequivocally false.

Most people want to live and let live. However, in our midst are the uninformed, the ignorant, the uneducated, and the brainwashed, who troll social media with phantasmagoric rationales and an internecine, narcissistic deluge. The politicians and the media feed on them for their own gain.
Twitter @sheriffali