"I wonder just how long the rest of us can count on bein' free." - Merle Haggard
I just don't see how the US can be called a "Christian" nation. Can anyone help me understand that? I was raised with all the "one nation under God" stuff and the religious right and Peter Marshall and Ralph Reed and the Christian Coalition. I heard it non-stop at church. I even remember when Carman's "America Again" was performed at our 10,000-member church on July 4 to thunderous applause. Now this stuff just eats at me.
Someone tell me what to do with slavery. Explain to me how getting rich off the blood and sweat of Africans can be identified with a gospel cause. Why do African-American evangelical Christians tend to distance themselves from the Christian right? And why were we singing racist hymns in our churches?
What about how we went to war with England over taxes, slaughtering thousands of image-bearers? Did that reflect the love of Christ on the cross? Or did it fit the selfish ambitions of a minority (the war was opposed by an estimated 2/3 of the colonists)?
What can we say about Manifest Destiny and the Native Americans? What a joke. We stole. We murdered. That's what there is to say.
All this for a godly purpose? I'm less than convinced. The crusaders said the same things. I can't understand how Christianity's place as the unofficial civic religion negates all the bad.
There is little distinctly Biblical language in the founding documents of this country. Why is that, if it was supposed to be an overtly Christian nation? Does it really make a difference to Christians? Where does Scripture say we should fight to be made comfortable in our culture? Where does it say that restoring Christianity as the civic religion of our country is part of the Great Commission? This kingdom will fail. It's of the world, you know.
I mean this seriously. Can anyone help me reconcile all the bad things with the cross of Christ?