1/12/15

'post-Christian society' (part 1)


It is often said that what was once called Western society is now 'post-Christian'.  But that phrase is not meant to define what contemporary (Western) culture has become.  Rather, it is an acknowledgement of Christianity's apparently insignificant influence in contemporary culture.  The above image is a photograph of a section of wall in an abandoned church somewhere in Russia.  The image is powerful in its simplicity.  The upper and lower halves of the photograph convey extreme tension: the detailed drawing in dark tones in the upper portion suggests the faded realism of a Christian worldview; whereas the starkness of simple graffiti spray-painted on a bright, formless background in the lower half strongly suggests the purely subjective contextualism of postmodernism.  "Punk's not dead", but Christianity is, the image seems to say.



Is Christianity dead in America?  Most would probably say that Christianity is very much alive in America.  In particular, the proliferation of very large so-called mega-churches during the last twenty years or so may be taken by some as evidence that Christianity in America is flourishing.  Mega-churches indeed are flourishing, and a great variety of supposedly Christian media are flourishing; there is a greater abundance of 'Christian' materials and venues than ever before in history.  But none of that necessarily means that Christianity, as such, is thriving in America.

In fact, genuine Christianity in America is in a state of profound crisis.  The cavalier manner in which so many speak of a 'post-Christian society' in America (as if that were a foregone conclusion) suggests that the phrase, and the idea, may in fact express the prevailing consensus.  Indeed, it seems that it is mostly professing Christians themselves who appear to be in denial concerning the reality of the situation.  Mega-churches notwithstanding, the number of regular church-goers in America represents only a fraction of the entire population.  Add to that the fact that most church-goers in America do not demonstrate a Christian worldview and lifestyle in their personal lives.  To be sure, non-Christians understand that American society has left traditional Christianity far behind -- even if it may not appear so to most church-goers themselves.

The question of the Christian Church's influence upon popular culture is a matter of the greatest importance.  Obviously, that question cannot be answered apart from careful analysis of (American) society's institutions and cultural habits.  When one examines those various institutions -- government and law; business and finance; education; healthcare -- it is only possible to conclude that the Christian Church has failed miserably to substantially influence any public sector in accordance with the precepts of Scripture.  Moreover, the Church's failure to significantly influence popular culture is also painfully obvious.

Is it true, then, that America is now a 'post-Christian society'?  Emphatically, it is not.  American society is not post-Christian; it is anti-Christian.  The difference is not merely one of semantics.  The term 'post-Christian society' implies that some other form of 'society' is possible -- without Christ.  But we know from the Word of God that created beings are incapable of living in true society without Christ.  Does anyone suppose that demons, for example, live in society with each other?  Among demons, the stronger rule the weaker through fear and violence; there is no love among devils.  True society has only ever been possible where a population has largely consented to live in covenant one with another, in the bonds of the Gospel.  History reveals that such episodes (as opposed to epochs) of true human society have existed, though only briefly.  Early American society was one such example.

Some evidently believe (or at least hope) that a sufficiently large proportion of Americans may somehow be persuaded to become Christians -- to live according to a truly Christian worldview.  Within the past couple of decades or so, a militant doctrine has arisen which now appears to predominate among many mainstream Protestant churches: it is variously called Reconstructionism, Dominionism, Kingdom Now theology, or the Manifest Sons of God movement.  These movements essentially teach the same thing: that Christ is coming back to rule on earth -- but not until the Church has won over the whole world through preaching the gospel . . . or, as some suppose, by overtaking and restructuring institutions -- which sounds to me more like revolution than Revival!

But I do not see how that is at all possible.  God does not coerce anyone to love and obey Him.  And to suggest that God has not all along been trying by all means to lead Americans to heartfelt repentance would be to accuse God of unfaithfulness to His own Word.  The Bible says that God is "not willing that any should perish, but that all should come to repentance" (2 Peter 3:9).  But God also said that His "spirit shall not always strive with man" (Genesis 6:3).

I do not foresee any wide-ranging Revival for America.  Instead, I believe that America has already come under severe judgment, and that more severe judgment is coming.  If that truly is the case, then what is the true Body of Christ to do?  What is the individual Christian to do?

We'll take up that question in the next post . . . .
