All the language concerning the "transformation of culture" that we hear in ecclesiastical circles today gives rise to the question, "Just where did this idea come from?" According to Hart, it is part and parcel of pietist Christianity. He writes:
"Throughout the twentieth century, evangelical and mainline Protestants have assumed, thanks to their pietist heritage, that religion has immediate relevance to all walks of life.... [T]he legacy of pietism is a this-wordly form of devotion that... manifests 'the passion to hammer down history, to touch the transcendental, to earth the supernatural in the mundane.'" (Hart, Lost Soul, xxx).The intended result of pietism, whether liberal or evangelical, is results. When the poor are fed, abortion is criminalized, and X amount of souls are converted, then the gospel has done its (tangible and utilitarian) work.
Biblically speaking, where does this idea that the ministry of the gospel must produce a visibly better society come from? And if this is not the point, then what is?