This is Probably What We Needed

Will a pandemic tie us closer to social digital technology, or expose it as empty?

As we continue to try to wrap our minds around the surreal events of COVID-19, one thought has been recurring for many:

This will destroy the last motivations in our society to actually interact with other human beings.

I understand this fear. It’s reasonable. As the jokes and memes attest, social distancing was happening well before it was mandated. Loneliness has been pandemic a lot longer than coronavirus. It’s logical to imagine that a society voluntarily isolating itself to death would interpret mass quarantine as a validation of the wisdom of living online. I think this fear is probably what kept many churches in the US from closing their doors this past Sunday. People already ask what the point of waking up to go to church is when they can find a world-class preacher in the podcast store. If we start asking them to stay home, it’s over. Right?

Maybe. But maybe not. There’s another version of this whole story that keeps playing out in my head and I can’t stop thinking about it. I can’t shake the feeling that an oppressive pandemic might actually be the one thing that disrupts the unthinking embrace of virtual social behaviors. When the toxic dust settles, I’m wondering if we’ll find that the punishment fit the crime, and that the anxiety of not knowing when we will see the people we love in real life is sadder than getting a new “Like” is fun.

I’ve written previously in this space about Facebook, and how over the past decade Facebook has made a series of design and functional choices that drop even the pretense of trying to connect people to each other. I’ve been without an account for almost a year now, and even I’m surprised how little I’ve missed it. When I look at my wife’s personal feed, here’s what I see: influencer post, influencer post, meme, link to an article, influencer post, somebody trying to sell something, etc., ad infinitum. In other words, Facebook has shifted from a tool to facilitate contact among friends, to a platform by which individuals can communicate with the masses, preferably to help turn a profit. The friendship ethos is totally gone.

People know this, which is why just about everyone you know under the age of 30 has either deactivated their account or gone to Instagram. That’s why predictions that places like Facebook or Twitter will just become more and more omnipresent until they’ve essentially totally replaced communities have never been compelling to me. A lot of us are addicted, yes, but that doesn’t mean we cannot tell when the grass is greener somewhere else. Twitter has the advantage of monopolizing the journalist class and therefore being the substance of choice for “informed” people. But people leave Twitter too, and the odds are good that if Jack Dorsey keeps it up, they’ll keep leaving Twitter. Eventually the same will happen to Instagram.

Before COVID-19, most of us held the assumption that when these companies decline, it’ll be because their users find some other platform. But that’s what I wonder about.

If people in the West are, as expected, confined for several months to an absolute minimum of social contact, holed up within their homes and cut off from classmates, church members, concerts, and sporting events, then I think it’s more than possible that social media will fail the cultural test that is given to it. In the coming months social media will be asked to fill a void that is fundamental to who human beings are. Count me among the number who believe that it will fail that test because it cannot do otherwise.

It is not merely difficult for digital technology to replace human contact. It is impossible. Silicon Valley advances not just tools for harnessing human nature, but an alternative belief system about what human nature is. That belief system is sort of like the prosperity gospel—it works as long as it doesn’t have to work. When the infrastructure of normal life crumbles, when suffering and sea billows roll, the check always bounces. I understand the fear that people will emerge from their quarantine wondering why they ever left their living room in the first place. But I see another question coming: “Why did we ever bury ourselves with our machines in the first place?”

The logic of tech addiction has been so powerful in part because it almost never feels like we’re losing control. We’re so agile, so upwardly mobile as a society that the literally limitless options available to us make retreating into our screens feel like a necessary act of self-care. The infinite potentialities of self-expression in offline life make online life feel like an accessory rather than a replacement. What COVID-19 is about to strip away is the illusion of options, the illusion of total control over what our tech does to us. We are faced with several months of having little else aside from our screens. There’s a gut check coming. And a lot of us will decide we don’t want to live that way.

Maybe this is what we needed. I’m not talking about death, obviously. The deaths of thousands from coronavirus don’t serve the “higher” purpose of rehabbing a culture off technological delusion. I’m talking specifically about those who survive, and go on after this crisis is over to live relatively normal lives. For us, maybe this is the only thing that could really trigger change. I’m optimistic that it will. Once upon a time meditating on death was a spiritual discipline that wise believers said would fortify against complacent worldliness. Hardly anyone remembers their death until they have to. That’s human nature.

The Hot Potato

Invoking relativism has never been more performative and less genuine in culture than it is right now. So why do we keep doing it?

One of the ideas that Douglas Murray comes back to time and again in The Madness of Crowds is the backward relationship within contemporary progressivism between confidence and evidence. When it comes to identity politics, Murray points out that a lack of scientific or biological or mathematical data is not a problem for many activists. They simply ignore it; or, they pit experience and felt needs against cold, inhumane information, and ask audiences which of these is more likely to keep people oppressed. In the emerging social justice culture the level of certainty always exceeds the amount of rational justification. This is why people can lose their livelihoods or even be criminally prosecuted for believing in something that no serious consensus disputes—the categories of male and female, for example.

Certainty is a strange thing. In a religiously postmodern era it is fashionable to cast certainty as the enemy. Aren’t we all just giving our best guess? Yet as postmodernism gives way to the kind of New Morality that Murray documents in his book, one wonders whether the death of certainty hasn’t been greatly exaggerated. From where I’m sitting it doesn’t look like people are capable of living without it. Even the most irreligious groups need an unshakeable framework that imputes meaning to their lives and beliefs. For many who’ve been shaped by western higher education, that framework is social justice.

Yet even the critics of social justice mania cannot cleanly critique certainty. For one thing, almost every opponent of outrage mobs and shout-down chants believes with certainty that free speech is an absolute good. At the very least they believe with certainty that people ought to be able to hold unpopular ideas and a job at the same time. Now those open-minded philosophers who extol doubt and portray the life well lived as one of never knowing what’s above are in a real pickle: If certainty is the enemy, then we can’t be certain that those who wield certainty to suppress their opponents are certainly wrong.

I thought about this after listening to a “deconversion” narrative by podcasting duo Rhett and Link. Alisa Childers also listened to it and she put the issue well:

After poking holes in Christianity, Rhett offered no plausible alternative to explain reality. When he jumped the Christian ship, he didn’t jump into another boat, but into a “sea of uncertainty.” His Christianity has been replaced with what he calls “openness and curiosity.” He describes how liberating it’s been to let go of the “appetite for certainty.” To the careful observer, it’s evident that Rhett has traded in one worldview for another: Christianity for postmodernity, with all its skepticism, denial of absolutes, and relativism.

Except it’s not actually relativism, right? As Rhett rattled off the list of scientific discoveries that his childhood Christianity could not explain, he didn’t sound very relativist. He didn’t sound like a true patron of uncertainty. The scientific research he had read was not vulnerable in the presence of “openness.” Despite the plaintive word picture of a person exchanging rigid dogma for a peaceful, relaxing float on the river of discovery, what’s actually happened is that Rhett is certain of something he wasn’t certain of before: that Christianity isn’t true. The buzzwords of openness and curiosity obscure the reality of a door slamming shut.

What’s most interesting to me is not how often people decide that the religion they were raised in is not true. It’s how often people sand over their definitive religious or ideological transformations with language about “openness” and “curiosity.” It reminds me very much of transgender activists on a college campus, screaming “Who are you to judge?” while they campaign to get an administrator fired and harass his or her family. The nod to relativism has never been more performative and less genuine in culture than it is right now. So why do we keep offering it?

My only theory is that in the current intellectual climate, the best way to bring someone over to your side is not to try to convince them they’re wrong, but convince them they’re irrelevant. “Look, I’m not saying I’m right, I’m just saying I’m open and curious and you’re not.” Who wants to hear that? Ours is the era not of debates and arguments but of epistemological hot potato: whoever carries the baggage of certainty for too long loses. Here’s how you win: Say you’re not sure, then act as if you are. You’ll avoid the existential crisis of having no center, and best of all, you’ll seal yourself off from almost any critique. Define yourself as open, and everybody else gets closed automatically.

The Magic of Secularism

Contemporary paganism meets modern individualism.

Alan Jacobs recently linked to a concise summary by Charles Taylor of “buffered selves” vs “porous selves.” The dichotomy is crucial to Taylor’s thesis in A Secular Age. Taylor argues that the essence of secular modernism is the transition from an enchanted world—which creates porous selves—to a disenchanted one, which creates buffered selves. Here’s how Jacobs puts it:

The porous self is open to a wide range of forces, from the divine to the demonic; the buffered self is protected from those forces, understands them as definitively outside of it. The attraction of the porous self is that it offers a rich, multidimensional cosmos that’s full of life and saturated in meaning; but that cosmos also feels dangerous. One’s very being can become a site of contestation among powerful animate forces. The buffered self provides bulwarks against all that: it denies the existence of those forces or demotes them to delusions that can be eradicated through therapy or medication. But the world of the buffered self can feel lonely, empty, flat. “Is that all there is?”

Sometimes this contrast is referred to as the difference between paganism and modernism. Paganism is a magical worldview, where spirits roam the cosmos, elixirs heal body and mind, and the self is open to the influences and effects of a whole spectrum of metaphysical forces. Modernism is the age of science, logic, and rational belief. Spirits do not roam, bacteria and viruses do. The world is disenchanted in the sense that it can be explained without reference to things beyond the material.

Almost everyone would say that we live in a rationalistic age. We are buffered against the superstitions of our pre-modern ancestors through modern science, medicine, philosophy—the heritage of developed Western thought.

Or are we?

Consider a thought exercise. Ask yourself this: Is the individualistic, self-determining, self-expressive ethos of modern society a porous worldview, or a buffered one? In other words, is “follow your heart” the logical sum of rationalism or is it closer to a mystical mantra? If you’ve read many books on this topic, especially books about identity, you probably got a good dose of the “From Descartes to Nietzsche” history of ideas. These books argue that Western rationalism created the modern self. I think that’s probably true and as reliable a narrative of how-we-got-here as we’re going to get.

But here’s where it gets interesting. There is a cultural rise right now in actual paganism (witchcraft). In this Atlantic piece, the author and her interviewees frame the contemporary pagan self in explicitly fashionable terms. It seems there’s a seamless continuity between being a witch and “telling your truth.” Stay for the final sentence:

Now 38 years old, Diaz remembers that when she was growing up, her family’s spellwork felt taboo. But over the past few years, witchcraft, long viewed with suspicion and even hostility, has transmuted into a mainstream phenomenon. The coven is the new squad: There are sea witches, city witches, cottage witches, kitchen witches, and influencer witches, who share recipes for moon water or dreamy photos of altars bathed in candlelight. There are witches living in Winnipeg and Indiana, San Francisco and Dubai; hosting moon rituals in Manhattan’s public parks and selling $11.99 hangover cures that “adjust the vibration of alcohol so that it doesn’t add extra density and energetic ‘weight’ to your aura.”

…To Diaz, a witch is “an embodiment of her truth in all its power”; among other magic practitioners, witch might embody a religious affiliation, political act, wellness regimen, “hot new lewk,” or some combination of the above. “I’m doing magic when I march in the streets for causes I believe in,” Pam Grossman, a witch and an author, wrote in a New York Times op-ed.

“I’m doing magic when I march in the streets” is about as clean a summary of the post-Christian West as you could ever read. The porous theology of Wicca is transposed onto politics-as-religion, and the essence of telling one’s pagan truth is to become an activist. The only question is whether it’s the secular, political identity that is masquerading as a witch, or if the paganism is putting on an activist front. I think somebody committed to the Cartesian theory of self-expressionism would say the former. But what if it’s the latter? What if modern witches reach for political self-actualization precisely because the Modern Self is not a rationalistic creature but a mystical one?

There are clues that point me toward that theory. For one thing, the liturgical and creedal nature of social media culture strongly suggests that many in the contemporary West are becoming less shy about acting on impulses and habits that are religious in shape. For another thing, the ecclesiastical personality of spaces like university campuses—featuring excommunication, defenses of orthodoxy, etc.—reveals not so much a secularized public square but a religiously redirected one. Yes, in one sense formal religious affiliation is thinning, but in another sense, religious practices have arguably never been more mainstream.

I’m reminded of a great blog post a few years ago in which Ross Douthat flagged a feature essay in Elle magazine about a woman’s experience with mediums. Pointing out that irreligious Americans tend to show interest in things like spiritualism and astrology, Douthat argued that the best way to understand modern secularization is not as a negation of the numinous, but as an ambivalence toward it. I think that’s a compelling theory, and it stands as a challenge to Christian observers of culture not to get too sucked into a “transition” narrative—from porous to buffered, from pre-modern to secular—without accounting for the ways in which human nature falls back into contradiction in order to meet its felt needs. As tidy and seamless as the line from Descartes to Disney may seem, there are complications along the way.

And one of those complications is the impure alchemy of many modern worldviews. If the dogma of modern paganism is to be “an embodiment of your truth in all its power,” then we should ask whether the porous selves of the witch coven are plagiarizing expressive individualism, or whether the whole time expressive individualism was actually plagiarizing the porous Self. It could very well be that the arc of post-Christian history doesn’t finally point toward the scientific laboratory or transhuman technology, but toward Amazon, Oprah, and activist witchcraft.

Jeering the Devil

I have a new piece up at The Gospel Coalition today on the power of sanctified laughter. With the help of Peter Hitchens and a very bad novel, I make the case that some sin deserves mockery rather than hand-wringing solemnity.

Here’s an excerpt:

I get why the suggestion that sometimes we ought to laugh at sin sounds errant, perhaps even mildly heretical. Shouldn’t we be killing sin? Isn’t laughing at sin what millions of Americans do during primetime TV sitcom hours? There is, however, a tradition in Christian thought that goes like this: all sin is ultimately absurd, and there are occasions when the absurdity of sin is disguised as seriousness, and on these occasions one of the best things steadfast believers can do is rip off the disguise.

Elijah mocked the prophets of Baal as they uselessly called out to their false god. Commenting on this passage, Matthew Henry writes, “The worship of idols is a most ridiculous thing, and it is but justice to represent it so and expose it to scorn.” The only biblical reference to God’s laughter occurs in Psalm 2, in which rebellion against the Lord and his anointed is met with a ridiculing mirth. Solemnity is occasionally an insufficient response to what is sinful and destructive. Sometimes the best response is to point out sin’s ridiculousness.

Read the whole thing here.

Kobe, Worship, and Us

Admiration that is misdirected is still better than a callous on the soul.

It didn’t take long in the aftermath of Kobe Bryant’s death, and the outpouring of eulogies and sorrow that quickly followed, for me to hear what has become a popular refrain among conservative evangelical Christians. “Can you believe this amount of sadness for an athlete? This just goes to show what an idolatrous culture we live in. People worship Kobe. They should be worshiping God!”

Yes, it’s all true. The level of society-wide grief for the death of an athlete does point to some degree to how sports is its own quasi-religion. We’ve seen already how the floodgates of disordered love can obscure a person’s full, fallen humanity, and result in hagiography that may or may not punish those this person sinned against. And yes, what you’re seeing is indeed a form of worship. There is only One who’s worthy of it, and we ought never be embarrassed to say so.

Yes. But…

Listening to some evangelicals respond this way makes me wonder whether we fully appreciate our cultural moment, and whether we understand what’s really happening in a public spectacle such as Kobe’s death. As overwhelming as the media coverage and hashtags were, I came away not primarily irked at American idolatry of sports heroes but instead conscious of something I think is important. Ours is an era of Western life in which not only is worship of the true God scarce; the very idea of worship is implicitly and explicitly ridiculed. The mechanisms of life in our modern, mobile, digitized, secular age work against the very elements of worship, including admiration. Just as Lewis wrote that nature did not teach him that God was glorious but instead gave the word “glory” meaning for him, admiration—of created things, including fallen people—trains human beings to be able to respond in worship to what is actually worthy of it.

Admiration, the emotional response hardwired into the soul when it encounters something that moves it, is often undermined nowadays. Consider the transformation the smartphone has brought to the art gallery, as visitors stand in the presence of true greatness, snap a quick pic or selfie, and then quickly move on to the next exceptional piece. Anyone who has visited a national landmark in the last 10 years can attest to how modern people now “consume” awe-inspiring landscapes or architecture via their phones, rather than sit in silent admiration of them.

Admiration is the seed of worship because it teaches a responsive attention. To admire a sheer, deluging waterfall is to stand in its presence and know that not only is it beautiful, but that its beauty is good for me. Is the modern culture we see before us one that helps us to admire in this way? Or is it one that rapidly evaluates how well a particular beauty can help us get Likes, or make us “cultured,” or affirm our own self-esteem?

It’s often said that Americans worship celebrities. That’s undoubtedly true. But as cancel culture now demonstrates, even the most dazzling stars fit in the palms of our hands or on our laptop screens. Admiration for actors, artists, performers, and even politicians is subject to how well they remain in the public favor, how well they say the right things at the right times and never run afoul of the “rules.” Besides, human admiration fades in parallel with memory. Records are broken. Beautiful people get old. This too is conditional admiration, worship that ultimately depends on how much the worshipers can get out of the ceremony.

That’s why I found the cultural lament for Kobe Bryant somewhat hopeful. Where some evangelicals see idolatry, I see a flickering ember of something that looks like true admiration, the responsive attention to greatness that must exist in every heart that would feel this toward its Maker. That even people who never wore his jersey or cheered his team would feel sadness and a sense of “there’s-something-wrong-with-this-world” at his death is a sign that our technology and our politics have not fully extinguished our souls’ ability to stand in the presence of something and say, “This is good.” I suppose my thinking is that even love that is misdirected is better than love that is never directed anywhere at all. A room with a poor view still reminds us that there’s such a thing as outside; a hall of mirrors cannot do that.

It’s been reported that the morning of the crash, Bryant and his daughter Gianna went to Mass. I very much hope that’s the case, and I very much hope that they were at Mass for this very reason: to sit in the presence of who is truly worthy of worship, to receive his beauty and grace and truth, and to say, “Yes, this is good, and good for me.” We should all pray that the morning of our deaths would find us like that—and our lives, too.

A Cancel Culture Nightmare

While the vast majority of social media was lamenting the shocking death of Kobe Bryant, something very different was happening to Washington Post reporter Felicia Sonmez. I’m writing about it only because of how it illustrates the radical effect that online culture has on our perception of everything, even the untimely death of an athlete.

A couple hours after Bryant and his daughter Gianna were confirmed dead in a helicopter crash in California, Sonmez posted an article, not written by her, about the 2003 sexual assault allegations against Bryant. This appears to be the first thing Sonmez posted on Twitter related to Bryant’s death (an important point that I’ll explain in a moment). Within minutes of having posted the link to the article—titled “Kobe Bryant’s Disturbing Rape Case: the DNA Evidence, the Accuser’s Story, and the Half-Confession”—Sonmez was besieged with hundreds of angry replies, criticizing her for bringing the allegations up while everyone was reeling from the news. The replies kept coming and escalated in tone and viciousness, and Sonmez was quickly at the bottom of a social media pile-on. Clearly taken aback by the reaction, Sonmez doubled down, explaining why it was legitimate to talk about the rape accusation, and shaming her online critics by sharing a screenshot of her email inbox, which was filled with some pretty vile sentiments.

The next morning news broke that the Post had suspended Sonmez. Reputedly the suspension was due to her posting a screenshot of her inbox, which revealed the full names of some of her critics. I’ve got no idea if that’s really why she was suspended. It seems more likely to me that the Post did what a lot of employers have done in the social media age: Panic in response to a mob.

But here’s what I’ve taken from all this. This episode is one of the most thorough and illuminating examples I’ve ever seen of just how dysfunctional discourse is when it’s conditioned by technology like Twitter. Every single player in this story looks bad.

First, there’s Sonmez. Of course Sonmez has every right to link to a piece about Bryant’s rape allegations. And those allegations are important and remain important even in the aftermath of tragedy. But Sonmez knew exactly what she was doing by posting the article when she did. Everyone who knows the culture of social media at all knows why someone who had been absolutely silent about a celebrity’s stunning death to that point would post an article like that: in order to reshape the narrative. In the world of Twitter, not even news of someone’s death exists as an objective, actual thing. In the world of Twitter, something only matters to the degree that it participates in the story you want to tell. You know that this is a conditioning effect of social media by imagining someone marching to the middle of a vigil for Kobe Bryant, standing on a soapbox, and yelling about his rape allegations. Such an action would be considered unspeakably crude and unfeeling, not to mention stunningly foolish. Yet this kind of thing is common on social media (not to mention applauded). That’s how disorienting the digital timeline is.

Second, there’s the mob that came after her. Sonmez was unquestionably the target of horrific attacks. These sorts of shame storms tend to only get worse as time goes on and the angry crowd pivots from expressing outrage to trying to accomplish something with it (a firing, a doxxing, etc.). How ironic is it that the vox populi of the internet sends death threats and slurs in defense of a celebrity’s reputation? But that’s the moral logic of the online jungle. It’s the same for conservatives and liberals alike, men and women alike, articulate and otherwise. There’s a gravitational pull to online nastiness that seems to cut through every kind of inhibition we have. It’s not enough to disagree. We must destroy. This sure sounds like the recipe for some kind of civilizational collapse.

Finally, there’s the Washington Post. The decision to suspend Sonmez is ridiculous. Sonmez was insensitive and unwise, but at the end of the day the only transgression the Post really cared about was her being the target of an outrage campaign. Her suspension, like many other online-reactive disciplinary actions before it, does two lamentable things. First, and most importantly, it sends encouragement and affirmation to online bullies, especially ones that know how to effectively troll. Second, it now gives Sonmez a credible victim narrative and distorts the extent to which her ordeal was merely a twist of fate for someone who in a moment of volatile emotions tried to cancel and ended up getting canceled herself. Nothing excuses the harassment that Sonmez experienced. Nonetheless, there’s a valuable parable in the spectacle of a journalist miscalculating her ability to reshape a public narrative. But that lesson is lost in the aftermath of another bad decision to threaten someone’s livelihood over an unwise social media moment.

This is the state of journalism and of public discourse in 2020. This is the state of our culture’s ability to grieve the loss of life. God help us.

Label Me!

Personality profiling is a way for moderns to receive an identity, rather than craft one.

Everyone who knows anything at all knows you must never attribute someone’s character or behavior to their identity. It is universally agreed in polite society that no person is ever good or bad at something because of their gender, or their race, their family, sexuality, etc. To indulge in this reasoning is at best a crude stereotype, at worst an expression of flagrant bigotry. A president of Harvard University was once forced to resign simply for observing that male students displayed more consistent interest in and aptitude for mathematics and science than female students (an observation which was backed up by all the relevant data, and still is). The unwritten law is clear: A person’s ethnic, genetic, or sexual identity must never explain anything about them.

This makes the cultural fascination with personality profiling all the more intriguing to me. To listen to people talk to one another about their Enneagram numbers is to listen to urbane, educated, and socially conscious people insist on being labeled. It’s not simply that the Enneagram is fun in the same way that all self-knowledge tools are fun. There will always be a market for figuring out the “secrets” about oneself. But the Enneagram fandom I’ve seen takes it quite a bit further. Your Enneagram number is not simply descriptive, it is explanatory and authoritative. Listen closely to enthusiasts talk about their experience with the test, and you will hear explicit appeals to one’s profile as an explanation for even the most trivial facts or behaviors. Their conversation is peppered with phrases like, “I’m such a 7,” or, “Yeah, that’s a very 4 thing to say.”

The same thing happens in the introvert/extrovert conversation. Depending on which you are, certain kinds of habits or tendencies can be expected from you, and it’s a matter of social decorum for others to recognize this. Introverts get nervous at invitations to gatherings; they’d rather watch Netflix at home. Thus, relating well to the introvert in your life means (among other things) not taking offense when they don’t show up. You should also learn how to work with introverts, date them, and recognize the dozens of signs you’re probably one of them.

It took me a long time to realize just how odd this kind of pathological self-categorization really is. For one thing, I’ve always believed myself to be an introvert, and I’ve claimed the label throughout most of my adult life whenever I was uncomfortable or wanted to protect my time. For me, introversion has often been permission: permission to not be like those around me, to make choices others didn’t understand, and to be my own person.

But then I started realizing that in no other aspect of life was I as ready to sort myself into a prefabricated category. Why did I so readily accept the logic of personality profiling when that same logic, if applied to my skin color, my childhood, or anything else about me, would likely deeply offend? More to the point, why did so many people around me — people who rejected all species of stereotypes and determinism — make an exception for their personality?

Here’s one guess: Personality profiling is the last politically-acceptable way of receiving an identity, rather than crafting one. And many people today are weary of crafting their own custom identity and would very much like to belong to something instead.

It’s not been that long since the most fundamental fact about you was considered to be your family. For most of human history an individual’s life was conditioned by their parents. You lived where your parents lived (likely until death). You worked at what your parents worked at. Your marriage was in large part downstream from your parents’ relationships and community. You were born into a religion, you were born into a value system, and you were born into a social fabric.

When most of us hear this description of the past we drop down on our knees and thank God that, unlike our pre-liberal ancestors, we are not consigned to a pre-written fate. Every Disney film ever made is at some point a story about a person remaking themselves in their own image, getting out from under the restrictive and unfeeling expectations of their family. That’s the kind of story that resonates with Western people who feel their individuality keenly.

You won’t find me arguing that upward mobility is a bad thing, or that people should have no option to improve their life. But something is definitely lost to our humanity when the only identity available to us is one we have to tirelessly craft. There’s something in most of us that tells us that to belong and to be received is better than self-determination. It’s not an accident that The Rise of Skywalker, in its pursuit of fan satisfaction, essentially rewrote the story of Rey to give her a family name after all. After spending two films arguing that Rey’s anonymity was immaterial and that she could build her own identity through her actions, the filmmakers end the ill-fated trilogy with a scene in which Rey assumes the last name Skywalker. To belong is better than to self-determine.

I wonder then if personality profiling is a kind of refuge for those of us who’ve been catechized in hyper-individuality. A finite number of Enneagram types means that you really can belong to a group. Who you are is not opaque; it is discoverable. Maybe there’s something deep within Western people that craves the kind of self-knowledge that comes from outside rather than inside. Weary of curating our own sense of self, sometimes we just need to be assigned a number and know who we are.

Yeah, But What if the ‘Elites’ Are Right?

Mark Galli and the editorial leadership of Christianity Today believe that Donald Trump should be removed from office. Carl Trueman writes that this is a perfectly defensible position, but takes serious issue with Galli’s notion that Christian faithfulness entails it. I too worry that Galli’s editorial took the wrong angle, emphasizing the constitutional case against Trump and implying that consistent, Bible-believing Christians can come to consensus on that issue. That seems to me to be a category error, as if we can know from Scripture whether the president of Ukraine was pressured into a political favor. If that’s genuinely what Galli meant, it’s a bad take.

Yet is it what he meant? I doubt it. The last sentence of Trueman’s response bothers me: “Lambasting populist evangelicals as hypocrites or dimwits will simply perpetuate the divide.” I certainly agree. But why does Donald Trump somehow stand in for all Christian populism? Must the demerits against his character, his behavior, and his qualifications trickle down and apply to any and all who are disaffected by America’s two-party system? I can’t see any reason why they should.

Galli writes: “That [the President] should be removed, we believe, is not a matter of partisan loyalties but loyalty to the Creator of the Ten Commandments.” This isn’t how I would have worded it. But Trueman’s complaint that this line accuses “every Trump voter of heinous sin, however reluctant or conflicted he may be,” both misses and undersells the point. It misses the context of this line, which is Galli’s citation of CT’s editorializing against Bill Clinton in the 1990s, at which time the magazine also declared the elected president of the US morally unfit for his office. This is a strange track record of consistency if Galli and CT are simply intellectual elites, unmoved by the plights of the Christian working class (more on that in a minute).

But I think this response (which has come from many more people) also undersells something Galli’s editorial understands. It’s not enough to say that there are understandable reasons to vote for Trump, and so no one can dogmatically claim that doing so is a sin. Trueman points out that many evangelical Trump voters despise infidelity and coarseness, yet felt as if their political alternative was worse. But is this reasoning not also subject to moral evaluation? Is the existence of Planned Parenthood and GLAAD really a biblically and ethically sufficient justification for endorsing—hesitantly or not, joyfully or not—this president? Galli has an answer to this question: No. Perhaps that’s the wrong answer, but it is an answer.

What’s not an answer is to double back on #NeverTrump evangelicals, label them elites, and declare the conversation pointless. I wish so much that evangelicals would fully resist the allure of identity politics, especially the versions that seem to be popular in our conservative theological circles. Substitute the word “white” for “elite” in much evangelical political discourse, and you would end up with lengthy essays that would be logically indistinguishable from those of the wokest SJWs.

Whether Galli and the staff at CT are elites has absolutely no bearing on whether they’re right about this president and the morality of supporting him. The argument fails for the same reason the common pro-choice canard about pro-lifers being “out of touch” with the physical and social trauma of unplanned pregnancy fails. I completely accept the fact that I, a white, middle-class, nuclear-family-raised male, cannot sufficiently empathize with a poor, disadvantaged, unwed mother, just as I accept that the editor in chief of a large Christian magazine cannot sufficiently empathize with my rural, pastoring, Trump-supporting relatives. A failure to empathize is not synonymous with a failure to speak truth. Babies are still babies, and low character is still low character—regardless of who’s elite and who’s not.

And in any case, are we so sure there’s not something to be said for being at least a little out of touch with populist conservatism? Just last night I was visiting my grandmother. The television was muted but tuned to Fox News, where the chyron read, “SOME ON THE LEFT SAY LITTLE WOMEN IS TOO WHITE.” From what I could gather, host Mark Levin had rounded up a couple of obnoxious Tweets from “the Left” and, wham, a segment was born. I found myself wondering what it would be like to consume this kind of “news” hour after hour, day after day. I think I’d be a rather angry person, though I suspect I’d be unable to name the people I was mad at. If you ask me, that’s the kind of thing that can perpetuate a divide, too.

Death of a Critic

On Scorsese, cinema, and parenting.

There was a time not very long ago at all when I would have enthusiastically agreed with Martin Scorsese’s comments about Marvel movies. For a handful of years in my early 20s I was in love with what Scorsese calls “cinema,” enraptured by artistry and moral ambiguity and disgusted by anything that smelled of kitsch. I once registered a blog domain called “The Astute Film Critic,” and let me assure you it was every bit as pretentious as it sounds. Even now, reading Scorsese’s comments pokes at a tender spot in my heart that conjures up joyful memories of discovery, optimism, and a feeling of genuinely falling in love with film.

And then last spring I attended a screening of Paul Schrader’s First Reformed. As the end credits began flickering I knew beyond any doubt that the film buff I had aspired to become in those years was gone beyond recall. I left the theater feeling little else but contempt and scorn for what I still believe is an utterly confused movie. Yet I knew I was supposed to love it. Critics, including several whose work I still respect, tripped over themselves to declare First Reformed a brooding masterpiece. There was absolutely no square inch of my soul that concurred or even comprehended that judgment. I hated that film. And I knew immediately what that fact meant: it wasn’t meant for people like me.

First Reformed was the climax, not the beginning, of my exit from cultured cinema. By the time I was parking the car at the Yorktown AMC for the movie, I had been sensing a transformation. Cinema had lost its charm. I was bored by the same critically lauded, morally ambiguous films I had devoured in a previous life. While critics lost their minds over Three Billboards Outside Ebbing, Missouri, I stopped watching after 90 minutes. I’m told by multiple columnists that The King’s Speech is one of the worst Best Picture winners ever. I really like it. See what I mean? There’s only so often you can be out of step with an aesthetic culture before you realize the differences are irreconcilable.

For the last few years my movie tastes have become what the cultured despisers call normie. I like Avengers and still love Spielberg. Noah Baumbach bores me to tears. Wes Anderson annoys me. Christopher Nolan thrills me. Don’t get me wrong, I like good movies (Phantom Thread was much better than I expected). But I agree with Scorsese…cinema is its own thing, and it’s not for me anymore.

As best I can tell, what happened to me is that I started having kids. I don’t even know why, but as soon as my inner film critic met my son, he hit the road. It’s not a logistical thing, like, “I can’t watch movies for grown-ups anymore.” Nothing prevents me from streaming cinema after the kids’ bedtimes. I would just rather re-watch Raiders of the Lost Ark. It wasn’t always like this, but it’s definitely been this way since I became a dad. Is it simply nostalgia or sentimentality? Probably somewhat! But I’ve got two other theories.

1) I think becoming a father had a transformative effect on why I put myself in the way of stories. My “astute film critic” days were mostly about being an astute critic, not the film. In other words, I think for me the operative desire was not to delight in and learn from and be shaped by story but to be the kind of person people thought of as intelligent, while using movies to satisfy that desire.

The more I reflect on this, the more I think bastions of “elite” opinion are pretty much all designed toward this end. It’s inner rings all the way down. Yes, if all you read is Harry Potter your imagination and your moral intuitions will be stunted, but all this means is that the Potter books are finite and limited. It doesn’t mean that the antidote is to subscribe to the New York Review of Books and farm out your soul to the coastal literati. Why not? Because—surprise!—if you do that, your imagination and your moral intuitions will likewise be stunted.

What did becoming a dad have to do with this? I’m not entirely sure, but it may be that children have a way of disabusing one of delusions of grandeur. Want to be The Atlantic’s film critic? First, change this diaper. I wonder if, even more than student loan debt, this is what keeps millennials from having kids. You can do anything in the world, until you have to do one thing.

2) When my son was born, my heart was flooded with the desire that he grow up to be a certain kind of person: strong, courageous, compassionate, confident, etc. When I looked at the kind of movies that Scorsese dislikes, I saw characters like this, however imperfect and idealized. When I looked at “cinema” I saw characters who were supposed to be “the real world.” The more closely I looked, though, the more I realized that the “real world” was actually not a real world at all, but a world created by people like Harvey Weinstein. Was this sex scene or that ideology really a reflection of the authentic world, or was it simply put in there to placate a powerful suit?

I don’t know. Not knowing is part of life, of course. And there’s something to be said for art that is personal instead of market-researched. But more than anything, becoming a father has made me want to unite truth, goodness, and beauty, to keep them all together and to resist selling out to the expert opinions of people who know how to win Oscars but apparently not how to spot a hero. I just think that’s a poor trade-off.


image credit: By hashi photo – hashi photo, CC BY 3.0, https://commons.wikimedia.org/w/index.php?curid=9976941

Josh Hawley and the Need for Tech Stigma

Josh Hawley, the junior US Senator from Missouri, is waging a small war against Silicon Valley. Twice this summer Hawley has introduced legislation that targets social media corporations’ outsized role in the lives of Americans. His latest bill is perhaps the most straightforward legal challenge yet to the biggest social media firms. The SMART Act would tightly regulate social media technology, forcing developers to make specific changes that dilute the addictive and omnipresent qualities of the apps.

In a May lecture that was published by First Things, Hawley lays out his case against Silicon Valley. He warns that Big Tech firms are pocketing obscene profits by maximizing addiction and carefully overseeing a monopoly on news and information. All the while, the American workforce is being populated by users diagnosed with elevated rates of depression, anxiety, and inability to focus. Hawley concludes by reflecting that the culture being shaped by social media technology is an “economy that does not value the things that matter.” Hawley: “That, I want to suggest to you, is something that we cannot afford. It is something that we cannot allow, and it is within our power to change it. And that is the great challenge and task of our time.”

David French, an evangelical columnist at National Review for whom I have great respect, dismisses Hawley’s legislative prescriptions as a misguided attempt to control consumer habits from Washington. French believes Hawley’s bills do address real problems, but establish a dangerous precedent for a “Republican Daddy State.” Writing in First Things, Jon Schweppe rebukes French and other conservative critics of Hawley’s proposals: “Historically, our politicians have determined that government should have a role when corporations exploit consumers by putting their physical or psychological health at risk,” he notes. “This is especially true when those consumers happen to be children.”

***

It’s hard to resist evaluating Hawley’s proposed laws and the debate over them in light of the larger, intra-conservative kerfuffle (also starring First Things and French!) that’s emerged in the Trump years. On the surface this looks like yet another installment in the “What is the proper role of government in the formation of virtuous citizens” question, an issue that takes on radically different shape depending not just on your politics but on your ecclesiology. Because I think David French is right about justification by faith and the mission of the church, and I think the editors of First Things are mostly wrong about them, I tend to gravitate toward a Frenchian perspective on statism.

But Jon Schweppe is right about something crucial: The question is not whether government will regulate the behavior of the citizenry, the question is how. If a legal minimum age to drink alcohol is an acceptable manifestation of a “daddy state” (and to Schweppe’s point, I don’t think any conservative columnists are arguing otherwise), why not proportionate regulations on a consumer product (social media) arguably even more omnipresent and accessible to children than alcohol?

French is right that overreaching regulation, even to fix a serious cultural malaise, could and probably would have long-term consequences. On the other hand, we’re almost certainly already signed up for long-term consequences from the overabundance of digital technology. Worse, functional monopolies held by Apple and Google make it almost impossible for creative solutions to supplant existing business models. “Digital literacy” programs come with the moral and legal authority of government to the benefit of manufacturers, all the while sites like YouTube, extolled as educational tools, oversee an algorithm-based disaster that targets children with disturbing content.

Though I share French’s view of federal intervention, “Daddy State” is an epithet that fails to reckon with how consumer habits are conditioned and even constrained by the complex relationship between Silicon Valley and the information age. The latter is an unchangeable revolution; there is no rewinding the clock on the internet, and nostalgia is not a synonym for virtue. The former, however, is nothing more than a corporate culture that should be viewed with no less skepticism than the pornography industry. What Hawley understands is that our experience of the information age has become cripplingly dependent on a fistful of companies that use jargon and confused lawmakers to exploit loopholes. Michael Brendan Dougherty (writing in National Review, no less!) was exactly right to say that Facebook is a media and publishing company, regardless of what its executives say or the exemptions and allowances they request.

***

But there is something missing from Hawley’s agenda. The senator is eager to handcuff developers with laws about “infinite scroll” and time limits. This is interesting, but it plays into Big Tech’s hands. The problem with targeting granular technologies is that such technologies are always on the cusp of changing anyway. What does infinite scroll look like in, say, an augmented reality channel? Unless you’re well versed in the psychology and coding of this tech, you probably have no idea, and if there’s one thing Mark Zuckerberg proved, it’s that befuddling aging Congressmen with terminology almost any 13-year-old would recognize isn’t that difficult.

What Hawley’s efforts lack is an element of stigma. Rather than trying to play the developer’s game, legal efforts to curb our tech addiction should try to put a social stigma on always looking at your phone, on spending hours on YouTube, or on the anonymous message-based sites that foster radicalization. There should be a social shame to digital addiction comparable to the stigma around pornography, which is mediated through age-gate laws, laws prohibiting the depiction of minors, and other statutes, as well as practices in the private sector (such as cordoned-off “adult” sections). While of course most of us would say that the social stigma around pornography is far too weak, since pornography is still too common and accessible, there is reason to think that promoting a stigma around tech sickness would be better and more effective than targeting the zeroes and ones of software.

In a brilliant essay almost twenty years ago, Roger Scruton pointed out that the contemporary West has introduced law and politics as a replacement for stigma and custom. This is decidedly not how societies past operated:

In almost all matters that touched upon the core requirements of social order, they [generations past] believed that the genial pressure of manners, morals, and customs—enforced by the various forms of disapproval, stigma, shame, and reproach—was a more powerful guarantor of civilized and lawful behavior than the laws themselves. Inner sanctions, they argued, more dependably maintain society than such external ones as policemen and courts.

Stigma is not effective at eliminating a social ill. But that’s precisely the point. There are some social ills that cannot be radically destroyed, and efforts to do so may seriously damage the underlying social fabric. Scruton uses sexual morality as an example of a communal virtue that protects the vulnerable when it is enforced from within, but tends to turn abusive and corrupt when such enforcement is outsourced to the governing authorities. Of course, there is no hard and fast dichotomy between social stigma and the law, because the law teaches as well as restricts; thus, men who abandon their families must pay child support under the threat of the law.

Might the same principle work for responding to the crisis of digital addiction? Restricting social media to legal adults, for example, would not eliminate its addictive qualities or even fully prevent children from using the services, but requiring a credit card number or some other age-verification tool would create a “mature audiences only” stigma that highlights social media’s addictiveness and tendencies toward vice. Another such measure would be requiring that smartphones or internet-capable tablets not be sold to anyone under 16, and requiring parents purchasing such equipment for minors to sign informational disclaimers about addiction, psychological development, distracted driving, etc. Without restricting speech, such laws would introduce moderate hurdles to using such tech, making it especially difficult for children to have their own private digital lives.

We need a digital stigma. Rather than assuming that mobile, interactive technology is inherently valuable, we should assume that Silicon Valley’s products are comparable to cigarettes and alcohol: Not for children, not for habitual use, and certainly not for tax exemptions and public school programs. This of course doesn’t solve the problem of distraction sickness, nor does it even guarantee that parents would have the will to protect their kids. But it would strike a blow in the cause of cognitive and emotional flourishing, and put Silicon Valley billionaires off the pedestal of philosopher kings and into the corner, where they belong.