It seems the Christianity evangelicals know is changing. With the advent of postmodernism, this philosophy has gradually infiltrated the church: the “wolves in sheep’s clothing,” it would seem. Has anyone else noticed this? The Brian McLarens and Rob Bells of this world would have you believe that Christianity itself is changing to suit the culture and that we need to change with it. The fact is, ours is a religion founded on the unchanging and immutable nature of the One True God. I ask, then: how can the Bible change when it is God’s word handed down to us? There is no need for evangelicals to change with the culture; rather, there is a greater need now than ever for us to actually live the Christian life and thereby change the culture.
