My opinion is: bring religion back to schools! We are a nation founded under God. God and teaching should go together for the betterment of our country and the world. Whether people choose to believe it or not, at least it's a good history lesson, right? You can't argue with that if you're up for learning about every other god except the one you're intimidated by.