Stop Saying “Toxic”

Every week on social media I seem to see something new described as “toxic.” Toxic has become the word of choice, it seems, to describe something that you feel is bad but that resists more precise condemnation. This word is everywhere. I’ve used it myself. Everyone seems to know what “toxic” means even though the word is applied to a staggeringly diverse group of maladies. Here’s a sampling just from my own reading:

  • Evangelical culture is toxic
  • YouTube comments are toxic
  • Jordan Peterson is toxic
  • Political discourse is toxic
  • The New York Times is toxic
  • Pornhub is toxic
  • John Piper is toxic

I know exactly what each of these statements is supposed to make me feel: loathing, disgust, avoidance, etc. The problem here is that “toxic” seems to be a stand-in for other words, other descriptions, and those other words probably wouldn’t mean the same thing if you applied them to everything else on the list. John Piper may be toxic in your view, but nobody would say he’s toxic just like Pornhub is toxic. The New York Times may be, according to you, a toxic institution, but it cannot be toxic for the same reasons that YouTube or my church are toxic. So that leaves us with the impression that toxic really just means bad.

So…why is toxic so much better/cooler/woker to say than “bad”? Where are the essays about bad masculinity? What does “toxic” reveal that bad doesn’t?

A couple theories:

1) We seem to be at a point, at least in online discourse, where the more imprecise a moral judgment is, the better. The obvious example is how loaded conversations about identity are with words like “oppress” or “bigot” or “right side of history.” If you say something like, “Bigots are on the wrong side of history,” everyone knows what you’re saying is true, even if you decline to define the words “bigot,” “wrong side,” or “history.” The word toxic is a nice shorthand because it carries with it the necessary negative connotation but does not contain in itself the object of moral scorn. If you say that such-and-such pastor is misogynistic, that’s an equally loaded term, but now you’ve advanced a claim that can be evaluated based on the meaning of words. But if you say that such-and-such pastor is toxic, you can mean that the pastor is misogynistic (and the right audience will know this) while not risking a potentially defeating response from someone who evaluates your claim.

This benefits the speaker, obviously. But it also benefits the audience by allowing feelings of disgust and icky outgroup-ness to be shared among people who may not have any idea why they’re supposed to feel this way. “Trust me, this person is toxic” is very freeing for folks with particular kinds of ambition and tribal sensibilities to hear.

2) The word “toxic” does not technically describe anything’s nature. It describes an effect. Polluted air is toxic if you breathe it. Rat poison is toxic if you ingest it. This leads me to wonder if a lot of people describe something as toxic as a way of signaling how it makes them feel, or how they believe it makes other people feel. Obviously very few people see the word toxic and make this instant connection, so the word is used as though it does describe something’s nature—e.g., something that’s immoral, prejudicial, oppressive, etc.

Thus, you get think pieces like, “Let’s Talk About Netflix’s Toxic New Show,” or, “The Oscars Have Hired an Unbelievably Toxic Host.” More substantively, you might get something like, “Conservative Evangelicals Embrace Toxic Theology.” In each example the upshot of the article is clear from the headline: Netflix needs to dump the show, the Oscars need a new host, and conservative evangelicals need to believe something different. But if these things are toxic, why does anyone support them, either a Netflix producer or the Oscars or evangelical theologians? Clicking through to the article will likely explain that toxic things are done by toxic people, and that the reason one needs to be sure to keep up with all the new emerging toxicity is so you can avoid toxic people at all costs. Kick them out of your life before they intoxicate you.

You’ve probably heard someone say they got out of a “toxic relationship.” In many cases what they mean is that the other person was mean, rude, selfish, possibly abusive. I’m not sure why describing a relationship as “toxic” is actually better than stating those reasons. It seems to me that if someone is truly cruel or manipulative, that moral character is worth describing truthfully, and calling them “toxic” is letting them off the hook. Perhaps the flip side is true too: perhaps some people say “toxic” when they really mean, “I didn’t enjoy this and it wasn’t what I wanted.” In that instance it’s pretty clear that describing another person as toxic lets you off the hook.

Imprecise moral judgments are valuable because they cast a wide net. Precise moral judgments can be pushed back upon by people who would seem qualified to do so. For example, if you accuse a person or group of being racist, a member of a different race could theoretically complicate your accusation by disagreeing with you. The way around this is to ascribe a moral but fluid negative characteristic to the group, so that people who are inclined to agree with you can do so and those not inclined are in danger of walking into your description by failing a standard they don’t know.

“Toxic” then seems to be the perfect word to describe the sin of not being the Right Kind of Person. It’s a conversation ender, a debate finisher, a slammed door. The only way to not be toxic is to not be toxic. The racist could repent, the misogynist could change, the slanderer could make a U-turn. But a toxic person cannot de-toxify. They don’t even know where to start.

A resolution for 2021: Don’t say “toxic” when you mean something else. Say what you mean, so that what you mean will be worth saying.

Engaging Culture From Ahead, Not From Behind

Why Christians shouldn’t let elite journalism set the agenda anymore.

Let me describe an experience that has become very common for me over the years.

I’ll navigate to a well-trafficked Christian blog or publication. The major headlines are almost exclusively devoted to other headlines, from a secular newspaper, journal, or magazine. You see, the entire purpose of this Christian site is to recapitulate what else has been published in mainstream journalism, and to offer a theological or political commentary on it. Whether the topic is “throuples” in Manhattan, the latest ritual at Burning Man, or a tenured professor’s tweets, the conversation is always started by the consensus of prestigious journalism institutions on what we need to be talking about.

Based on my experience, this is what a lot of evangelicals mean by “engaging culture.” Like the cast and crew commentary on the Special Features section of a DVD, this mode of engaging culture adds Christian words to a preexisting perception of the world. “Here’s what the editors of The New York Times, CNN, The Atlantic, and the BBC want you to be thinking about. Here’s commentary from a Christian point of view to accompany your thinking about these things. Now you can go and think!”

There is certainly something valuable in offering believers this kind of resource. Especially for Christians whose career puts them in close contact with thoughtful unbelievers, being able to intelligently answer questions has massive evangelistic implications. It’s also true that many American Christians lack the training these resources offer.

But lately I’ve wondered whether something is insufficient, not merely with the kind of commentary being offered but with the genre of writing itself. Does this kind of cultural engagement presume something potentially untrue—namely, that Christians should be thinking heavily about the kinds of stories featured in the pages of elite media? Behind that question lies another, perhaps more complex one: Does what we read in the pages and watch on the screens of American media actually represent our “culture,” or does it just represent the ambitions and imaginations of media moguls?

The 2016 presidential election raised serious doubts among many that mainstream American journalists understood their own nation. In fact, in the shocking aftermath of Donald Trump’s victory, many of them said so. Trump’s victory was unthinkable to any whose vision of society was shaped by the stories and ideas promulgated by national media outlets like The Washington Post or Forbes. Some self-reflective members of the media concluded that their work culture was insular and severely disconnected from the concerns and convictions of a huge chunk of American voters. I think that’s a reasonable conclusion.

Evangelicals have sometimes thought of “culture” as a monolith, a coherent ambience that is the sum total of Hollywood, education, the bestseller lists, and journalism. In my experience it’s common for Christians to talk of “the” culture without any effort to specify whose culture is being talked about. This is evident in something I’ve talked about here before: The tendency of a lot of Christian literature to offer over-generalized aphorisms and observations that don’t take into account how different people in different places need to hear different things.

We often talk about purity culture as if there is only one kind of purity culture, and every single evangelical in America experiences that singular purity culture in the same way. But even a minute’s reflection will reveal that to be spectacularly untrue. Evangelicals raised in rigid, homeschooling environments have a particular experience with the doctrine of chastity that another Christian with a background in nominal religious culture won’t necessarily have. One church in evangelical Christianity uses Scripture to shame and brutalize teen girls over their sin, while another church sweeps the adultery of the minister or the pornography of an elder under the rug. “Culture” is multifaceted.

If culture is not a singular, omnipresent thing, then it makes sense to suppose that perhaps it needn’t always be engaged at face value. Here’s what I mean: What evangelicals mean by “culture” when they talk about engaging culture is in a very real sense a product, something created by an individual or a group and traceable to them. It is therefore a mistake to suppose that whatever ends up in the longform section of The New York Times necessarily represents “where culture is going.” The longform section of the NYT isn’t created by “culture,” it’s created by individuals and groups that want to manufacture something: an idea, a fad, etc.

The reason this matters is that engaging culture by centering one’s intellectual orbit around what comes out of elite journalism can lead Christians to perpetually express the public implications of our faith in the direction of people least likely to heed our message, and on current events least likely to be urgent in actual churches. In other words, if your idea of culture is dictated to you by The Atlantic, you might think the most important thing you can talk about as a Christian is why polyamory is sinful, or why Drag Queen Story Hour is a moral outrage. Assuming, though, that your local church is unexceptional, the odds are incredibly good that suicide, depression, smartphone addiction, and sexless marriages are much bigger issues for you than those. If, however, the agenda for Christian thinking is being set by elite media, concentrated in affluent coastal bastions of progressivism, the witness of evangelicalism is always from behind—reactive—and never from ahead.

***

What would it look like to engage culture from ahead rather than behind? Simply put, it means fostering Christian publications and ministries and writers who are able to think at a theological and anthropological level rather than merely a journalistic one.

A great example of the potential for Christians to set the intellectual agenda for others is technology. Secular society for the most part sees nothing at all moral in the newest developments from Silicon Valley. But there is a growing number of secular Americans who nonetheless feel that something is being lost in the omnipresence of screens. This is a tremendous opportunity for Christians to supply unbelievers with the language they seek but lack. Christians believe in the inherent goodness of the created world, but also in the indelible tendency of fallen humans to curve the resources of this created world toward sinful, selfish ends. The reason many Americans feel alienated by the technocratic culture is that we are not designed like robots, but in the image of a relational, rational God with real presence. To be disconnected from the physical world is to become less like the God in whose image we are made, thus, to become less human.

On the issue of technology and human flourishing, Christians have the ideas and categories that explain why things are the way they are. And here’s the upshot: Almost everyone in your church, neighborhood, school, workplace, or family has a smartphone. Almost everyone is connected to the liturgies of the internet. Compare that to how many people you personally know who are sending their kids to Drag Queen Story Hour, and you have an idea what is actually relevant to your culture. True, The New Yorker is much more interested in genderqueer libraries. That doesn’t mean you should be.

Take another example: Depression and suicide. There is absolutely no doubt in my mind that anxiety, loneliness, and self-harm are among the most pressing issues facing believers in the West today. The numbers are staggering, the testimonies are too numerous to count, and the severity of the problem is only rising. People are dying of despair. Lots of people. People in your town, in your church. Maybe in your own home.

You’ll get a different answer as to why this is depending on whether you listen to economists, sociologists, doctors, activists, or journalists. Shouldn’t those who believe in a God of life, a God who puts the lonely in families, a God who wipes away tears and will live inside sinners like an explosive spring of water—shouldn’t we be setting the agenda here? Deaths of despair in a rich, affluent time are not surprising to people who know the real condition of the human heart. Are we speaking to this the way we could? Or does this story’s lack of political leverage make us bored, uninterested, or even apathetic? Sometimes I wonder if labeling everything as “culture war” makes us blind to actual death.

The point is that by engaging culture from behind, we shrink our world and our mission field. Being unable to tell the difference in urgency between the carryings-on of coastal trust fund socialites and the silent cries of those sitting right next to us is a colossal miscalculation. It is, actually, failing to engage culture at all. It doesn’t “engage” because it usually fails to persuade (and honestly isn’t meant to). And it mistakenly identifies as “culture” what could probably more accurately be described as “anticulture.”

Engaging culture from ahead begins with a careful posture of learning and discernment. It prioritizes life and death rather than language and signaling. And it seeks to speak into a specific need rather than a news cycle. It’s not as lucrative, and it’s frankly not as easy. But it’s obedient.

Josh Hawley and the Need for Tech Stigma

Josh Hawley, the junior US Senator from Missouri, is waging a small war against Silicon Valley. Twice this summer Hawley has introduced legislation that targets social media corporations’ outsized role in the lives of Americans. His latest bill is perhaps the most straightforward legal challenge yet to the biggest social media firms. The SMART Act would tightly regulate social media technology, forcing developers to make specific changes that dilute the addictive and omnipresent qualities of the apps.

In a May lecture that was published by First Things, Hawley lays out his case against Silicon Valley. He warns that Big Tech firms are pocketing obscene profits by maximizing addiction and carefully overseeing a monopoly on news and information. All the while, the American workforce is being populated by users diagnosed with elevated rates of depression, anxiety, and inability to focus. Hawley concludes by reflecting that the culture being shaped by social media technology is an “economy that does not value the things that matter.” In his words: “That, I want to suggest to you, is something that we cannot afford. It is something that we cannot allow, and it is within our power to change it. And that is the great challenge and task of our time.”

David French, an evangelical columnist at National Review for whom I have great respect, dismisses Hawley’s legislative prescriptions as a misguided attempt to control consumer habits from Washington. French believes Hawley’s bills do address real problems, but establish a dangerous precedent for a “Republican Daddy State.” Writing in First Things, Jon Schweppe rebukes French and other conservative critics of Hawley’s proposals: “Historically, our politicians have determined that government should have a role when corporations exploit consumers by putting their physical or psychological health at risk,” he notes. “This is especially true when those consumers happen to be children.”

***

It’s hard to resist evaluating Hawley’s proposed laws and the debate over them in light of the larger, intra-conservative kerfuffle (also starring First Things and French!) that’s emerged in the Trump years. On the surface this looks like yet another installment in the “What is the proper role of government in the formation of virtuous citizens” question, an issue that takes on radically different shape depending not just on your politics but on your ecclesiology. Because I think David French is right about justification by faith and the mission of the church, and I think the editors of First Things are mostly wrong about them, I tend to gravitate toward a Frenchian perspective on statism.

But Jon Schweppe is right about something crucial: The question is not whether government will regulate the behavior of the citizenry, the question is how. If a legal minimum age to drink alcohol is an acceptable manifestation of a “daddy state” (and to Schweppe’s point, I don’t think any conservative columnists are arguing otherwise), why not proportionate regulations on a consumer product (social media) arguably even more omnipresent and accessible to children than alcohol?

French is right that overreaching regulation, even to fix a serious cultural malaise, could and probably would have long-term consequences. On the other hand, we’re almost certainly already signed up for long-term consequences from the overabundance of digital technology. Worse, functional monopolies held by Apple and Google make it almost impossible for creative solutions to supplant existing business models. “Digital literacy” programs lend the moral and legal authority of government to the benefit of manufacturers, even as sites like YouTube, extolled as educational tools, oversee an algorithm-based disaster that targets children with disturbing content.

Though I share French’s view of federal intervention, “Daddy State” is an epithet that fails to reckon with how consumer habits are conditioned and even constrained by the complex relationship between Silicon Valley and the information age. The latter is an unchangeable revolution; there is no rewinding the clock on the internet, and nostalgia is not a synonym for virtue. The former, however, is nothing more than a corporate culture that should be viewed with no less skepticism than the pornography industry. What Hawley understands is that our experience of the information age has become cripplingly dependent on a fistful of companies that use jargon and confused lawmakers to exploit loopholes. Michael Brendan Dougherty (writing in National Review, no less!) was exactly right to say that Facebook is a media and publishing company, regardless of what its executives say or the exemptions and allowances they request.

***

But there is something missing from Hawley’s agenda. The senator is eager to handcuff developers with laws about “infinite scroll” and time limits. This is interesting, but it plays into Big Tech’s hands. The problem with targeting granular technologies is that such technologies are always on the cusp of changing anyway. What does infinite scroll look like in, say, an augmented reality channel? Unless you’re well versed in the psychology and coding of this tech, you probably have no idea, and if there’s one thing Mark Zuckerberg proved, it’s that befuddling aging congressmen with terminology almost any 13-year-old would recognize isn’t that difficult.

What Hawley’s efforts lack is an element of stigma. Rather than trying to play the developer’s game, legal efforts to address our tech addiction should try to put a social stigma on always looking at your phone, or spending hours on YouTube, or anonymous message-based sites that foster radicalization. There should be a social shame to digital addiction comparable to the stigma around pornography, which is mediated through age-gate laws, laws prohibiting the depiction of minors, and other statutes, as well as practices in the private sector (such as cordoned-off “adult” sections). While of course most of us would say that the social stigma around pornography is far too weak, since pornography is still too common and accessible, there is reason to think that promoting a stigma around tech sickness would be better and more effective than targeting the zeroes and ones of software.

In a brilliant essay almost twenty years ago, Roger Scruton pointed out that the contemporary West has introduced law and politics as a replacement for stigma and custom. This is decidedly not how societies past operated:

In almost all matters that touched upon the core requirements of social order, they [generations past] believed that the genial pressure of manners, morals, and customs—enforced by the various forms of disapproval, stigma, shame, and reproach—was a more powerful guarantor of civilized and lawful behavior than the laws themselves. Inner sanctions, they argued, more dependably maintain society than such external ones as policemen and courts.

Stigma is not effective at eliminating a social ill. But that’s precisely the point. There are some social ills that cannot be radically destroyed, and efforts to do so may seriously damage the underlying social fabric. Scruton uses sexual morality as an example of a communal virtue that protects the vulnerable when it is enforced from within, but tends to turn abusive and corrupt when such enforcement is outsourced to the governing authorities. Of course, there is no hard and fast dichotomy between social stigma and the law, because the law teaches as well as restricts; thus, men who abandon their families must pay child support under the threat of the law.

Might the same principle work for responding to the crisis of digital addiction? Restricting social media to legal adults, for example, would not eliminate its addictive qualities or even fully prevent children from using the services, but requiring a credit card number or some other age-verification tool would create a “mature audiences only” stigma that highlights social media’s addictiveness and tendencies toward vice. Another stigma-building measure would be requiring that smartphones or internet-capable tablets not be sold to anyone under 16, and requiring parents purchasing such equipment for minors to sign informational disclaimers about addiction, psychological development, distracted driving, etc. Without restricting speech, such laws would introduce moderate hurdles to using such tech, making it especially difficult for children to have their own private digital lives.

We need a digital stigma. Rather than assuming that mobile, interactive technology is inherently valuable, we should assume that Silicon Valley’s products are comparable to cigarettes and alcohol: Not for children, not for habitual use, and certainly not for tax exemptions and public school programs. This of course doesn’t solve the problem of distraction sickness, nor does it even guarantee that parents would have the will to protect their kids. But it would strike a blow in the cause of cognitive and emotional flourishing, and put Silicon Valley billionaires off the pedestal of philosopher kings and into the corner, where they belong.

Surviving Our Humanity

Bird Box, just recently released on Netflix, bears an obvious resemblance to John Krasinski’s A Quiet Place. The latter is a superior movie in almost every way, but that’s not my point. My point is that Bird Box and A Quiet Place are strikingly similar in how they ask the audience to consider how much less human we’re willing to become in order to survive. Each film is a horror-parable about our own humanity’s being weaponized against us.

In A Quiet Place, apocalyptic monsters have taken over and almost invariably kill whoever and whatever speaks above a whisper. In Bird Box, the same idea is turned to a different sense: Sight. Unseen monsters put whoever glimpses them, even for a second, into a lethal trance that ends in suicide. Thus, the heroes of both tales have to live without a part of their normal human functions: Sandra Bullock and her two children are blindfolded even while boating in rapids, and the family in A Quiet Place verbalizes nothing above ground. Human beings are threatened by the very things that make them human. The monsters are of course the problem, but they are quasi-omnipotent; they’re not going away. The real enemies are sight and speech.

I can’t help but wonder if these stories are connecting with audiences at a spiritual level. Might we think of many of the problems of contemporary life as a felt conflict between human flourishing and human nature? Take consumerism. Consuming is a natural human impulse, yet isn’t there a palpable sense right now that our consuming nature is at odds with our desire for meaning and transcendence? Or consider the setting of A Quiet Place, a world in which it is dangerous to speak. Ours is the age of near endless speech, amplified by mobile technologies that allow us to live intellectual and emotional lives out of our phones. Amazingly, this technology has been most efficiently leveraged to make us depressed, insecure, outraged, distracted, and lonely. Perhaps A Quiet Place resonates as a horror film because its premise is actually true for us right now—our sounds invite the monsters.

A similar idea emerges in Bird Box. I was disappointed that the movie’s screenplay didn’t explore the monsters and their power a bit more. For example, most of the people who see the monsters immediately commit (or try to commit) suicide. But there are a few who, instead of killing themselves, become quasi-evangelists for the monsters. They violently try to force blindfolded survivors into looking, chanting stuff like “It’s beautiful” and “You must see.” What’s the reason for the difference between the suicidal and the possessed? Regrettably the movie never comes close to saying. It’s fascinating, though, to consider Bird Box’s theme of becoming what we behold through the lens of the monsters’ creating both victims and victimizers. Those who look at the monsters and live only do so because they are actually dead on the inside. They survive the monsters by becoming the monsters. That’s a pretty potent metaphor for the era of “call-out culture” and strongman politics, not to mention the modern shipwrecking of the sexual revolution that is #MeToo.

In both movies, death comes through the body itself, through the senses. This is a provocative way to think about what Lewis famously dubbed the “abolition of man.” Lewis’s essay warned that the death of binding moral transcendence and the subjugation of nature would not liberate mankind, but merely re-enslave it to itself. “Man’s conquest of Nature turns out,” Lewis wrote, “in the moment of its consummation, to be Nature’s conquest of Man.” This is the world depicted by both A Quiet Place and Bird Box, a world in which nature, especially human nature, has been weaponized against us. In both films people must find ways to live below their own full humanity, because it is the expression of their full humanity that brings violence.

To me, this is a stirring poetic summary of how divided we feel from ourselves in a secular age. The indulgence of our nature in the affluent postwar glow of the late 20th century failed to slake our thirst for righteousness. Now, slowly awakening from nihilism, we find our own humanity turned against us, especially through technology’s power to shape the mind. To look at modern life, in its pornographic despair, kills the soul, and to speak above a whisper invites the demons of doubt and shame.

It’s interesting to me how both films center on kids. Each story’s drama mostly concerns whether the adults will be able to save their children. Why is this? Perhaps it’s because children are a common literary stand-in for renewal of innocence. But also, perhaps it’s because one of the few motivations left in a world of living beneath one’s humanity is to protect those whom we hope may not have to do so. Perhaps it’s also because such a world inevitably slouches toward new life, one of the final touchstones of grace in a disenchanted world. I sometimes wonder whether protecting children is the closest an unrepentant mind can come to true faith, as if to say, “I cannot become like a child, but I will preserve those who still can.”

 

Have I Sinned Against Unbelief?

Why Christians should take suffering that inflames unbelief far more seriously

While reading a remarkable book titled Christianity: The True Humanism, I was bowled over by this passage by J.I. Packer and Thomas Howard:

It is clear that many humanists in the West are stirred by a sense of outrage at what professed Christians, past and present, have done; and this makes them see their humanism as a kind of crusade, with the killing of Christianity as its prime goal. We cannot endorse their attitude, but we can understand it and respect it…

We, too, have experienced in our own persons damage done by bad Christianity—Christianity that lacks honesty, or intelligence, or regard for truth, or biblical depth, or courtesy, or all of these together. No doubt we have sometimes inflicted this kind of damage, as well as suffered it. (Lord, have mercy!) We cannot, however, think it wrong for anyone to expect much of Christians and then to feel hurt when they treat others in a way that discredits their Christian commitment. Since Christianity is about God transforming us through Jesus Christ, high expectations really are in order, and the credibility of the faith really is undermined by every uncaring and uncompassionate stand that Christians take. Loss of faith caused by bad experiences with Christians is thus often more a case of being sinned against than of sinning and merits compassion more than it does censure.

I instantly realized this was close to the opposite attitude I have had for many years. Instead, I’ve often been so occupied with undermining unbelief, with critiquing the spirit of the age and tearing down the intellectual and existential reasons people give for not following the Christ of the Bible, that I have utterly failed to take seriously the connection between being sinned against and unbelief. If Packer and Howard are right—and I believe they are—this is a major failure.

Why have I been failing here? I can think of two reasons.

First, there is a palpable cultural mood that reduces everything about life to the sum total of one’s experiences. This is the “my story” epistemology that I’ve written about before. Because there are no agreed-upon central, transcendent truth claims in a secularized public square, the most truth that anyone can arrive at is their truth, and their truth often consists of deeply subjective interpretations of relational and social events. This mentality is powerful, and it is destructive; it blinds people to the absolute nature of our most important questions. It empowers confirmation bias. It can make people unteachable and difficult to reason with. It’s bad news.

So I think I’ve been caught up in refuting this mood so much that I’ve lost sight of the legitimate relationship between experience and objective belief. I’ve tried to swing from the one extreme of “experiences are all that matter” to the other extreme of “You should be able to think and live wholly independent of what people do to you.” Both extremes are logically impossible, though one feels more Christian than the other at this cultural moment. But Packer and Howard get to the heart of the matter when they say that unbelievers are right to have high expectations of people who claim to be actually reborn by the Spirit of Jesus. They have those expectations not because of Christians but because of Jesus! Thus, to ignore the failures of people who say they are born again to image the One in whose name they are supposedly reborn is to ignore the moral glory of Christ himself.

The second reason I think I’ve failed here is that I have consistently underestimated the power of suffering. It’s an underestimation that comes straight from my not having suffered very much. But it also, I suspect, comes from my not having listened very closely to the testimonies of people who have suffered much. This is inexcusable, and I’m sure it has in some way damaged my connection with others.

I’ve said before that virtues like modesty and chastity have attending practices that can help us grow in them. This is how I feel about stuff like the Billy Graham Rule, for example. But I think I’ve neglected the fact that empathy is also a virtue, and that like other virtues, it too has practices that must be picked up if the virtue is going to flourish in my life. What if one of those practices is not arguing all the time? What if another one is listening carefully to people who may not validate my assumptions?

Now here’s an important point. I don’t think the main reason to cultivate empathy is to become less decisive or more “open-minded.” The problem with open-mindedness is that it’s not a virtue. Its desirability depends entirely on what is trying to get into the mind. But empathy is a virtue that cuts across whether people are right or wrong, whether people believe or disbelieve. Rejecting the claims of Christ is wrong. Yet it is possible to compound a wrong by sinning in response to it. It is possible to drive a thorn deeper. Neglecting or minimizing the power of suffering, or lowering the bar of expectations for believers, are both sins against unbelievers. To the degree that I have done so, I’m sorry, and by God’s grace, I will grow in this.

One final thought. All of this applies very much to the way we Christians talk to people about the suffering of others. If we minimize trauma or excuse a lackadaisical response to it, for the sake of making some tribal theological or political point about someone not in the room, we are broadcasting a false view of God to the world. We are propping up a graven image in people’s minds. We are, in other words, acting in the same unbelief as those we are trying to convert.

Poverty, Dreher, and Story

Rod Dreher, a writer for whom I have a lot of admiration and respect, nevertheless has a tendency to overstate things, especially when those things pertain to his lived experience. Likewise, he has an unfortunate tendency to assume the worst intentions of people who push back against the conclusions he draws from his experiences. Those two flaws—which I shamefacedly confess to sharing—were on full display in the minor kerfuffle over this post. I won’t recap the mud-throwing, but suffice it to say that I think Rod’s critics are right in their substantive critiques (Jemar Tisby’s, in particular), and that this whole episode might have been avoided by pondering for a few minutes longer the wisdom of defending transparently bigoted remarks by a transparently bigoted politician.

But there’s another contour to this thing that’s worth a very brief reflection. Part of what Rod was getting at in his original piece was that political correctness often runs counter to what people actually experience. This is a familiar beat to Rod, and it would be a mistake for people to assume that Rod has a vested emotional interest in punching down on poor people. If I’m reading him correctly, I think what Rod resents is the deliberate turning away from reality in favor of sentiments that play well with people who have no (literal and figurative) skin in the game. I think there’s something to say about that, and in an era of actual “reeducation” by our culture makers, the effects of Rightspeak are worth contemplation.

But I think what I’ve come away sensing is that Rod, and plenty of others, have not given it enough contemplation. Instead, they’ve intuitively normalized their own experience of poor communities and downtrodden cultures into an argument. Rod’s desire to look for truth through experience is further confirmed when one considers the letters that he’s publishing as responses, as well as some responses to the responses. I think the best course of action is not only to reconsider tropes and stereotypes about the poor, but also to ask sharp questions of our tendency to equate experiences with an argument.

A lot of people have had negative encounters with poor people or communities. And many of them choose to reason from those negative encounters to much bigger ideas about the moral quality of those in poverty. The problem with this is that one’s experiences are not worthy of such intellectual power. Yes, our experiences matter, and they can powerfully shape us, body and soul. But it doesn’t take much imagination to see how reasoning from experience is an awfully selective and unfair enterprise. If your only experience of poor Americans is being accosted by panhandlers, you’re likely going to reason from that experience that poor people are poor because they’d rather stand on the side of a highway off-ramp than find a job. Is the problem then that you haven’t had more experiences with poor people? Perhaps! But even if additional, more positive experiences broaden your horizon, continually over-relying on your experiences to inform you about the world will simply manifest itself in some other wrong, prejudiced, or naive way.

We see this everywhere right now. People who experienced judgment in a church might start a blog in which their experience of a relatively small number of people is extrapolated into huge, sweeping ideas about Christianity or the church. People who experience unexpected illness or health might extrapolate such experiences into big, specious notions about what is healthy and unhealthy (how do you think the essential oils business runs?). The point is not to discount our experiences entirely. We couldn’t do that even if we tried! The point is that piecing together our experiences and coming to a true knowledge of anything requires more than just gathering as many experiential narratives as we can.

The truth about American poverty lies far beyond the possibility of my experience, because it is indelibly rooted in history and ideas. I cannot visit the south side of Chicago for a weekend and come away with authoritative knowledge about poverty or urban policy. Nor can I justly conclude that a friend, associate, or Uber driver’s testimony is warrant enough for me to be dogmatic about an issue. My experiences and the truth are not coterminous.

I’m convinced that if Christians are going to coherently carry their witness to Christ and him crucified into future generations, we have to insist on this fact. I know Rod agrees with me, because I’ve read him long enough to know he does. I hope that he’ll apply this principle as liberally to issues of poverty and race as he does to modernism and confession.

My Year in Books

Let’s get this out of the way: Year-end reading lists are usually more helpful for making us feel guilty about what we didn’t read than making us thankful for what we did.

My own year of reading was certainly no exception; the pile of books that I read this year seems so small compared to that of others. Yet, I think it’s important to actively fight against this feeling. There is probably a place for reading to have read, but it’s a place that is often far more prominent in my ego than it needs to be. Reading at whim and for pleasure is, all variables being equal, vastly superior to reading to keep up. The former can, and often has, turned something in my soul. The latter usually just confirms my preexisting insecurities and arrogances.

With that prologue finished, here are the books I spent the most pleasurable time with this year. This isn’t an exhaustive list of my reading (though I won’t pretend that the exhaustive list would be much bigger), nor is it a definitive breakdown of everything I liked this year. Rather, these are the books that stayed with me the longest after I read them, the books I thought about the most, the books I marinated in the deepest. Most are from 2017, though not all.

 

-Brian Jay Jones, George Lucas: A Life. A compulsively readable biography. While it doesn’t offer quite the psychological insights I hoped for, Lucas’s eclectic, unlikely career is vividly told, with lots of fascinating new anecdotes.

-Rod Dreher, The Benedict Option. If you haven’t read the book, you don’t quite know the argument.

-Tom Nichols, The Death of Expertise. An accessible and unpretentious assessment of a major cultural development. An essential read for anyone trying to understand the impact of the internet on how we think. Speaking of which…

-Alan Jacobs, How To Think. One of my most underlined books of the year. I like to think of it as a long essay about the epistemological consequences of social media. I can hardly think of a more timely work.

-John Stott, The Cross of Christ. This was my first foray into a Christian classic. Stott’s defense of penal substitutionary atonement is beautiful—so much so that it’s odd to even call it a “defense.” Of all the nonfiction I read this year, this one drove me to prayer and worship the most.

-Graham Greene, The Heart of the Matter. Greene’s psychological novels dig deep into my soul. This story about a duty-bound English police officer and his crisis of faith and marriage kept me up late into the evening. The ending is one of the most spiritually moving pieces of fiction I’ve read.

-Chinua Achebe, Things Fall Apart. An exquisitely written novel about some of the most fundamental human experiences. Aspiring storytellers should know this book.

-Sarah Shin, Beyond Colorblind. This excellent work is a rare thing: An evangelical treatise on race, white privilege, and community that is both thoroughly Christian and unflaggingly level-headed.

-James K.A. Smith, You Are What You Love. Probably the second-best book I read this year. On that note,

-Joe Rigney, The Things of Earth. My #1 read of 2017. I will be re-reading this book regularly. It has given me something for which I’ve longed for a while: A theological perspective on enjoying what God gives, and why doing so doesn’t conflict with enjoying who God is.

The Death of Expertise

Remember that famous scene in Peter Weir’s “Dead Poets Society,” in which professor Keating (played by Robin Williams) orders his English literature students to tear out the introductory pages of their poetry textbook? Those pages, Keating explains, are the soulless pontifications of a scholar trying to explain verse. Nonsense, says Keating. Poetry isn’t what some expert says it is. It’s about “sucking the marrow out of life,” about spontaneous utterances of the subconscious and chasing your dreams and sticking it to your parents and headmaster. Forget the experts, boys; carpe diem!

As a misguided defense of the humanities, “Dead Poets Society” is easy enough to dismiss. The bigger problem is that Keating’s heedless disregard for truth and knowledge is a pretty accurate picture of how many Americans think and live. That’s the contention of Tom Nichols in his new book “The Death of Expertise,” a brief yet persuasive work that laments our generation’s factual free-for-all.

Americans, Nichols believes, are not just subsisting on a low amount of general knowledge. That wouldn’t exactly be a new development. Rather, Nichols is disturbed by the “emergence of a positive hostility” to established, credentialed, and professional knowledge, one that “represents the aggressive replacement of expert views or established knowledge with the insistence that every opinion on any matter is as good as every other.”

According to Nichols, what White House press secretaries might call “alternative facts” have become common cultural currency. If love means never having to say you’re sorry, the Internet means never having to say you’re wrong.

For many people, a first-person blog post is (at least) as authoritative as a peer-reviewed study, and a Facebook link offers truth too hot for professional journalists and fact checkers to handle. This ethos doesn’t just promulgate wrong information, which would be bad enough. Nichols argues that, even worse, it fosters a deep distrust and cynicism toward established knowledge and credentialed communities.

Nichols’s book puts the symptoms of the death of expertise on a spectrum. Some effects are clearly more harmful than others. It’s no revelation that “low-information voters” feel as vehement as ever about a plethora of fictitious things. More worrisome, however, is the growing public comfort with dangerous conspiracy theories. Both of these trends owe much to the “University of Google” (to borrow one celebrity’s self-proclaimed credentials for rejecting vaccinations). With so much access to so much information available to so many people, the web has seriously undermined the responsible dissemination of verified facts and blurred the distinction between truth and talking point. Nichols writes:

The internet lets a billion flowers bloom and most of them stink, including everything from the idle thoughts of random bloggers and the conspiracy theories of cranks all the way to the sophisticated campaigns of disinformation conducted by groups and governments. Some of the information on the Internet is wrong because of sloppiness, some of it is wrong because well-meaning people just don’t know any better, and some of it is wrong because it was put there out of greed or even sheer malice. The medium itself, without comment or editorial intervention, displays it all with equal speed. The internet is a vessel, not a referee.

Nichols doesn’t lay all the blame on the internet. Higher education has contributed to the death of expertise, Nichols writes, both by churning out poor thinkers from its ranks and by defining education itself down to mean little more than payment in exchange for a degree. “When everyone has attended a university,” Nichols observes, “it gets that much more difficult to sort out actual achievement and expertise among all those ‘university graduates.’” Similarly, public trust in professional journalism has been harmed by incompetence on one end and clickbait on the other. All of this, Nichols argues, combines to foster an instinctive cynicism toward expertise and established knowledge. When experts get it wrong, well, of course they did; when they get it right, there’s probably more to the story.

One issue that seems relevant here, and one that Nichols lamentably doesn’t really parse, is the triumph of subjective narrative over objective arguments. Americans have always loved a good story, but what seems unique about our time is the way that story and first person narrative have assumed an authoritative role in culture, often to the contradiction and exclusion of factual debate. Instead of trading in truth claims, many simply trade in anecdotes, and shape their worldview strictly in line with experiences and felt needs.

The privileging of story over knowledge is a glaring feature, for example, of much contemporary religion. While real theological literacy is alarmingly rare, what are far more common are self-actualizing narratives of experience. These authoritative narratives take all kinds of forms—they’re the diaries of the “spiritual but not religious” Oprah denizens, and they’re also the cottage industry of “ex-[insert denomination/church name]” watchdog bloggers. In both cases, when jolting stories about the problems of the religious expert class collide with more established doctrine or practices, the tales triumph.

What’s more, young evangelicals in particular seem more likely to get their theological and spiritual formation outside the purview of pastors, churches, and seminaries (a triad that could be considered representative of a religious “expert” class). Blogs, podcasts, and TED Talks seem to offer many American Christians a spiritual life more attractive than the one lived in institutions like the local church and seminary. Indeed, a casual disregard for formal theological education seems to be a common marker amongst many young, progressive evangelicals, several of whom enjoy publishing platforms and high website traffic despite their near total lack of supervised training. A Master of Divinity may be nice, but a punchy blog and a healthy Twitter following are even better (you don’t have to think long or hard before you see this dynamic’s potential for heterodoxy).

Perhaps we ought to consider this the “Yelp” effect on American culture. In an economy of endless choices, “user reviews” are often the first and most important resource that many of us consult in making decisions. Putting trust in the aggregated consensus of the crowd is likely more endemic in our daily experiences than we think. It’s how we decide where to have dinner, which car to buy, what insurance company to rely on–and, increasingly, whether or not to inoculate our children, and which interpretation of the New Testament to accept. When the self-reported experiences of our peers are just a couple clicks away, and our first instinct toward expertise and credentialed wisdom is suspicion of bias and elitism, it’s not hard to see how we got here.

So what’s the solution? Unfortunately, Nichols’s book doesn’t offer many answers for the death of expertise. This is somewhat understandable; there are only so many different ways to say “epistemic humility,” after all. There is obvious need for self-awareness, both among laypeople and among the expert class. As Nichols notes, credentialed experts should “stay in their lane,” not risking credibility in order to appear omni-competent. Likewise, readers should acknowledge the inherent value in professional training and the processes of verification and review. While these things do not make expertise infallible, they do make expertise more reliable than sheer intuition.

But in order for this epistemic humility to take place, something else needs to happen first, and that is the re-cultivation of trust. Trust has fallen on hard times. Mutual trust in the public square is increasingly rare. In many cases, good faith dialogue and hearty debate have been exchanged for competing “righteous minds” that suspect the worst of ideological opponents. The “death of expertise” is, in an important sense, the death of trust—the death of trust in our public square, the death of trust in our institutions and cultural touchstones, and even the death of trust in each other.

Believing in the inherent value of experts requires us to accept fallen human nature in its limitations. It requires us to admit that individuals with a laptop and a curious mind are limited, and that “limited” does not here mean “stupid.” The value of experts—whether professors, doctors, theologians, architects, or, gasp, even government officials—is value that we see when we accept that time and training and accountability and certification are helpful precisely because they impose a control on individual passions and abilities. The fact that not everyone is an expert is a good thing, because human flourishing comes not when, as the joke goes, “everybody is above average,” but when people learn from each other in striving for the common good.

Expertise is not an infallible panacea. Nor is it a politically motivated trap. It is the natural consequence of being made in the image of a knowing God, who gives gifts and graces to each, for the good of all. Humility to sit under this kingdom economy is the key to resurrecting a culture of trust—and with it, a flourishing, mutually beneficial age of experts.

There Are No Secrets Anymore

Modern technology has made immorality easy to do, and impossible to keep secret.

Disgraced politician Anthony Weiner has been disgraced yet again…and again, it’s all about some raunchy texts. I can’t really laugh at him, because it’s obvious that he’s dealing with some life-deforming demons that I know too well. My prayer is that he would reach to the heavens for the rescue he desperately needs.

In a brief piece at National Review, Charles C.W. Cooke makes an interesting point about technology and immorality. Years ago, this kind of infidelity was hard to keep secret, because it required physical presence. Then, with technology, it got really easy to keep secret. But now, with the way that modern smartphone technology tracks and archives everything, secrecy is impossible yet again:

By the 1950s, everybody had a car, which they could use to get to the next town — or farther. Motels popped up everywhere, as did their discreet proprietors. And the analog telephone provided a means by which those who were up to no good could communicate instantly, and without leaving a substantial record. So fundamentally did this transform American life that traditionalists complained openly about the deleterious effect that modernity was having on conventional mores…

[I]s this still true? I think not, no. Now, there are cameras everywhere. Now, most people carry cell phones and drive cars that track their movement by satellite. Now, most communication is conducted via intermediate servers, and spread across multiple devices. In 1960, the average American could make a sordid phone call without there being any chance that it would be taped. Today, with a $3 app, anybody can record any conversation and send it anywhere in the world in a few seconds…Put plainly, it is now nigh on impossible for anybody to get away with infidelity, especially if one is a public figure.

Maybe we could put it like this: In the age of the iPhone, doing something lascivious while no one is watching is the easiest it’s ever been–but doing it without anyone ever knowing is virtually (pun not intended) impossible. At the very least, those naked pictures and crass text messages are being stored somewhere, on technology that someone with a name and two eyes built and maintains.

Surely, as Cooke writes of Weiner, we know this to be the case. So why is there so much explicitness on cloud servers? I can think of two answers.

First, sexual temptation is stronger, always has been stronger, and always will be stronger than logic. This is why Solomon urges his son to not even walk down the street where the adulterous woman lives.

Second, though: Is it possible that many in Western culture are actually OK with the idea of people they’ll never meet having access to their naked bodies and lewd messages? Could it be that our pornified consciousness has actually numbed us to the point where, even if we know that our texts and pictures stop belonging to us the moment we press “Send,” we don’t really care? Have we, as the prophets warned, actually become the very smut we love?

Be Better Than the Bengals

In sports, character and self-control matter every bit as much as talent. You won’t see many demonstrations of that truth more glaring than Saturday night’s NFL playoff game between the Cincinnati Bengals and Pittsburgh Steelers.