It's a bit of an unpopular opinion, but I believe the Bible should be taught in public schools at an academic level. There is a way to teach the Bible and other religious texts objectively, and given that the Bible is the most important book in Western culture, it should be taught, albeit delicately, alongside other important Eastern texts. Secular society likes to preach progress and tolerance, but lately all I see is religiophobia and the beginnings of a new conservatism.
Posted on: Mon, 14 Jul 2014 18:12:16 +0000