We live in the United States of America, the country we have been raised to believe is the land of the free and one of the most caring and compassionate on earth. We have been told about our exceptionalism, our Christian roots, and our big hearts, and yet over the past 30+ years, rather than building on the most awe-inspiring moments of our progressive past, we have detoured backward and pulled out the ugliest themes in American history, embracing and retooling them to usher in one of the most extreme, miserly, and mean periods in our history -- the one we are suffering through now. We have warred on women and children, minorities, the poor, the disabled, the elderly, and the very veterans we profess to hold so dear. All the while, we have boasted from pillar to post, from Washington to the tiniest town, that we are a Christian nation. We have invaded foreign lands that never warred on us, launching hellishly conceived American-Christian crusades and imposing absurd, impossibly flawed nation-building nightmares abroad.
Posted on: Fri, 28 Mar 2014 18:05:43 +0000
