Have you heard believers criticize the church for compromising with the world? The church is supposed to stand for eternal values that never change. Does that mean any change on the part of the church is a bad thing? Yet the church has changed over time. Setting aside the question of compromise with the world, do you see the church's influence on the overall culture and society as increasing or decreasing? There are valid reasons for the church to change that have nothing to do with the surrounding culture. I told a story about wineskins: new wine requires new wineskins.
Related Resources:
How to Hear God's Voice