Recently I was forwarded a link to an excerpt from a book entitled "When a Nation Forgets God." I'm not familiar with the author, Dr. Erwin Lutzer, but it's published by Moody Press, if that matters to anyone. I resonated with several of the points made in the section I read, but there was one underlying premise that seemed to be guiding the direction of the book, and it can, I think, be summed up in the following quote:
"When truth is rejected in the public sphere, the state will either turn to some semblance of natural law or more ominously, to lies. Secular values will be imposed on society, and it will be done in the name of 'freedom.'"
(The section in question is titled "When God is Separated from Government, Judgment Follows.")
I have a couple of big problems with this view.
Now, first, the author's appeal to Hitler's Germany as his primary example of a secular state hostile to Christianity is unfortunate, because Hitler is dragged out as a scare tactic so often that the comparison has become a bit tired. Yes, we all know Hitler was a terrible leader and that under his rule Germany committed horrible atrocities. Yes, we need to make sure America doesn't end up like that. Point taken.
But this approach, as the above quote indicates, implies that we, in the U.S., were actually providing a clear portrayal of truth to begin with. And that is precisely what I think all of us, especially Christians, should question.
What do I mean? Well, first of all, we need to be honest with ourselves and accept the fact that America has NEVER been free of 'secular values.' In fact, those very values, albeit in a modified form, were the major impetus for our nation's creation nearly 250 years ago. The idea that America was founded as a 'Christian' nation needs to be put to rest as a type of revisionist history - a history that has developed, in my view, to make us Christians feel less responsible for the 'corruption' of our nation's values.
But the simple fact is, America has always been a negotiation between the religious and the secular. We have always been a nation founded upon 'freedom,' and I'm pretty sure the founders did not mean freedom in Christ! The fact that our Constitution guarantees the free exercise of religion while prohibiting its establishment - the basis for what we call the separation of church and state - is a clear indicator of that dialectic; an attempt to provide all believers - even those who are not Christians - with the opportunity to worship however they choose, while at the same time protecting the people from any one religious view becoming official by gaining government approval.
(And there were/are good reasons for that. If you want to know what they are, read one of the many books about the religious wars and persecutions in Europe throughout the 16th-18th centuries.)
But, back to my main point: If America has always been a negotiation between so-called 'Christian' and 'secular' values, then the notion that America was, at some point in the past, a nation founded on 'truth' that has since been completely eroded by secularism is a false dichotomy. At best, America has been a nation with a strong religious heritage, a great deal of which stems from the Christianity of its early settlers. But that religious heritage has always been in tension with the desire for the necessary 'freedoms' that make for a modern, secular state.
As such, we Christians have to admit a very uncomfortable truth: To a great extent, if the Christian religion is America's primary religious heritage, WE are responsible for the erosion of our own Christian beliefs, and for the encroachment of secular beliefs within our society. Why? The answer is simple. If Christians had not initially been content to negotiate with secular values when founding the United States, we would not be seeing the flourishing of those same values in our present day. You don't grow what you don't plant.
Now, it may be that there was simply no way for America to even have a chance at becoming a nation in the absence of the religious-secular dialectic. Perhaps there is no other option. That's fine -- I actually don't mind, because I fear that any sort of theocratic government would end up being at least as corrupt as whatever secular state may instill fear in the hearts of believers today.
This leads to the second problem: If America was founded as an awkward marriage between the religious and the secular, and if Christians have been at least partially responsible for flirting with the secular, then the idea that placing Christians in positions of political power, or establishing laws founded upon Christian principles, will lead our nation back to the truth, is utterly flawed.
Why? Because Christians (and yes, I'm including myself here) haven't been proper bearers of truth to begin with - why do we assume we will get it right this time? Isn't it more likely that Christians will continue to revel in their flirtatious trysts with secularism, albeit under a new guise? If Christianity has led us to America as it currently stands, why do we think that MORE Christianity will solve the problem?
Now, here I need to provide a critique of my own statements thus far. For two things have yet to be resolved.
1) What do I/we mean by Christianity? Perhaps the problem is that we are working with an improper definition of what it means to be a Christian.
2) Am I not simply being cynical? Isn't any attempt to recapture the truth of the Gospel better than letting our nation slide into decay?
To respond quickly: As far as what it means to be a Christian, I believe this is precisely where we, as believers and followers of Christ, need to be having our discussions. Until we are able to determine what it really means for us to follow Christ as the American Church (I'm thinking broadly here - Catholic, Protestant, non-denominational, tiny home group and mega-church), our attempts to enact social change that reflects the Gospel will be muddled at best.
Indeed, it seems fairly clear that part of the reason secular values have gained so much momentum is that Christians have been fighting each other (or very specific political battles) for decades and we really don't have much of a compelling, articulate, alternate narrative for people to consider. I would go so far as to say that since great numbers of Christians can't even seem to agree on what it means to follow Christ, Christianity has essentially lost its place in the cultural discussion. Christians are so concerned about being "relevant" but can't even get along with each other. No wonder people don't take us seriously.
As to the second point, I hope I am not being cynical. I do agree very much with one of the main antidotes Dr. Lutzer offers to the situation, which is also, I think, an antidote to cynicism:
"It is time for us to reread the New Testament book of 1 Peter, written specifically to believers living in a hostile, pagan culture. They had no representatives in government to plead their case; they had no power to 'vote the bums out' as we do in America. They did not have courts that would give them a fair hearing. There was just persecution, intimidation, and deprivation. And sometimes death.
To them Peter wrote, 'Dear friends, do not be surprised at the painful trial you are suffering, as though something strange were happening to you. But rejoice that you participate in the sufferings of Christ, so that you may be overjoyed when his glory is revealed. If you are insulted because of the name of Christ, you are blessed, for the Spirit of glory and of God rests on you.' (1 Peter 4:12–14 NIV).
When confronted with these challenges we are tempted to do the wrong thing — to react with judgmental anger which will only entrench those who are on the other side in this culture war. We must do the opposite: respond with humility, love, and gracious courage. We will neither win these battles simply with politics (however important that is) nor by argument. Every Christian must regain the high ground with credibility, winsomeness, and yes, with joy. We must stand our ground giving thanks to God, even as it shifts beneath our feet."
To that, I say 'Amen.'