
"What Rough Beast"

Written by Coleman Numbers
Published on May 21, 2024

“The destruction of experience no longer necessitates a catastrophe…humdrum daily life in any city will suffice. For modern man’s average day contains virtually nothing that can still be translated into experience.” -Giorgio Agamben, Infancy and History (1978)1

“Surely some revelation is at hand;

Surely the Second Coming is at hand.  

The Second Coming! Hardly are those words out  

When a vast image out of Spiritus Mundi

Troubles my sight: somewhere in sands of the desert  

A shape with lion body and the head of a man,  

A gaze blank and pitiless as the sun,  

Is moving its slow thighs, while all about it  

Reel shadows of the indignant desert birds.  

The darkness drops again; but now I know  

That twenty centuries of stony sleep

Were vexed to nightmare by a rocking cradle,  

And what rough beast, its hour come round at last,  

Slouches towards Bethlehem to be born?”

              -W.B. Yeats, “The Second Coming” (1920)2

In Provo, Utah, where I live, every Thursday evening when the weather is warm, a group of street preachers from a local nondenominational church posts up outside the grounds of the Latter-day Saint (Mormon) temple on Center Street. These women and men are normal-seeming and well-groomed; they sport the reasonable fashion of elder millennials and pass out nicely designed pamphlets about why the Mormon church (which, and this will be relevant in a moment, is my church) worships a false Christ.

I like to go running on Thursday nights, and I’ve gotten into the habit of running past them and stopping to talk. Despite the polemics, I enjoy conversations with the street preachers because they’re generally very nice and they get me to challenge my religious assumptions. And I’ll admit, it’s fun to push back on their own assumptions and argue for why I am, in fact, a “real” Christian.

As it happened, one Thursday night, before I reached the street preachers, I ran into a woman panhandling on the wide, pedestrian-friendly median of Center Street. Even though it was a balmy evening, she wore a big wool coat. Her face was worn, tan, and smudged with grime. She smelled like campfire smoke, and she asked me, frowning in the way people frown to maintain dignity when the world has granted them little or none of it, if I could spare any cash.

I realized I had left my wallet at home—I asked if she’d still be there in an hour or two, after I finished my run. She said she would. So, as I started to dart across the street towards the preachers for our weekly appointment of arguing over whose Jesus was the real Jesus, I promised I’d be back with something to give.

I spent the whole evening sparring with my nondenominational friends and got so absorbed in our pressing theological debate that I didn’t notice when the woman whom I’d promised to help left. After the last dregs of my sectarian zeal drained away, and the preachers were packing up, and everything was bathed in the pallid orange light of downtown streetlamps, I looked around sheepishly and ran home. I never saw that woman again.

Now, reader, you’re no doubt asking, if you’re still reading at all: “What does this have to do with AI or learning?” Admittedly, I was kind of thinking that, too, when I started typing. But the connection comes down to one word: experience.

My encounter that night unfolded like it did because, in a crucial way, I was undergoing the revolting interior death that contemporary philosopher Giorgio Agamben describes in the epigraph of this essay—my religion, my faith, was emptied of any real “experience.” That is, it was void of any content that would make it efficacious or meaningful in the world. As a consequence, I became the painfully hypocritical caricature of piety that Jesus of Nazareth himself repeatedly castigates in the New Testament, debating the law on the temple steps while ignoring the poor right under my nose.

This experience—or, rather, non-experience—has, in the ensuing months, made me think more deeply about what principles, beliefs, and emotions order my life. And herein, I think, lies an important question I’d like to meditate on vis-à-vis what Agamben calls “the expropriation of experience.”3

Now to that connection: this meditation will be relevant to AI and learning because, I think, so much of our learning technology and technique and philosophy hinges on the question of whether modernity (postmodernity?) even gives us access to genuine “experience” at all. Are we, as instructional designers or e-learning developers or instructors or learners, able to fulfill our roles with efficacy and meaning? Or are we only rehearsing empty, dead rituals of post-industrial commerce? Hopefully I can start to touch on those questions in this essay.

***

To do so, there are two fundamental questions I want to tackle: 1) What does Agamben’s “expropriation of experience” look like in the age of AI, and 2) how do we fight against it? To get the proverbial plane off the ground, I’m going to turn to one of my favorite writers of the past couple years—Paul Kingsnorth.

Kingsnorth is a former environmental activist and current post-secular thinker who writes about the task of living out faith (in his case, specifically Eastern Orthodox Christian faith) in a highly technologized, highly commodified, thoroughly secularized West. One of his most potent (and chilling) ideas is “the Machine,” an economic and technological construct that typifies our age. “The ultimate project of modernity,” Kingsnorth writes,

is to replace nature with technology, and to rebuild the world in purely human shape, the better to fulfill the most ancient human dream: to become gods. What I call the Machine is the nexus of power, wealth, ideology and technology that has emerged to make this happen.
We are increasingly unable to escape our total absorption by this thing, and we are reaching the point where its control over nature, both wild and human, is becoming unstoppable. It is developing its own theology, as it takes us at warp speed into a new way of being human. Its modus operandi is the abolition of all borders, boundaries, categories, essences and truths: the uprooting of all previous ways of living in the name of pure individualism and perfect subjectivity. We are not made by the world now; we make it. And we can make anything we want. Or so we want to believe.4

As a writer and a Christian believer myself, I can’t help but find Kingsnorth’s characterization of this post-industrial Yeatsian “rough beast” darkly dazzling. But even if you think his apocalyptic bent is a bit much, Kingsnorth’s portrait of the Machine is a helpful bridge for understanding Agamben’s expropriation in AI terms.

The Machine—the obscure protocols and routines driving corporate decision-making, the algorithms of social media platforms, the global networks of supply and demand that contort daily human life into rhythms that are simultaneously deeply regimented and wildly chaotic—is transforming us. We’re driven by deadlines, product launches, board meetings. Our days begin with buzzing, chiming digital alarms and they end in blue flashes of ghostly screen-light. Our communities consist of a vast cloud of abstracted face icons connected to text boxes; our work amounts to little more than the manipulation of squiggles and lines stored on silicon extensions of the Machine’s body. If our job does involve physical labor, this labor is too often divorced from the production of any obvious end product or service. More than ever, people today live in a way that would have been unrecognizable only two or three generations ago.

This isn’t to discount the radical advances in quality of life that the digital revolution has brought for millions around the world—nor is it to give in to a radical AI “doomerism” that’s seen a groundswell since the rise of the chatbots. I only want to observe the ways that digital technology, even before AI, has irrevocably changed the landscape of human life, perhaps at the expense of genuine experience.

To get more specific, let’s go back to Giorgio Agamben. In the same chapter of Infancy and History that I quoted from earlier, Agamben makes the weird claim that “experience has its necessary correlation not in knowledge but in authority—that is to say, the power of words and narration.”5 In other words, what makes real experience real isn’t that it’s backed up by specialized, objective knowledge of facts about the world. Instead, real experience is backed up by its communicability, its propensity to live in someone else’s mind and heart as fiercely and vividly as it lives in my mind.

What made my street preacher story so de-experientialized was that I was living in the kind of abstraction that sought “knowledge” rather than “authority”—rational argumentation rather than the fiat of human compassion. If my Christian identity was really bound up in the latter rather than the former, I probably wouldn’t have wasted my time arguing about definitions of Christianity. I would’ve been living out that definition. That definition would have been “real,” universal, able to be felt and given to anyone who met me.

We can all probably identify some of these “real” experiences that we’ve either shared or had shared with us—the story of a deliciously bad date, a fishing tale, or even a powerful anecdote from, dare I say it, a business meeting. We can still find them in many places in modern life, to be sure.

But if everything I do consists of staring at words on a screen, taking the occasional Zoom meeting, putting together slide decks, etc., what is there in a given day of mine to express with the sort of personal “authority” that Agamben describes? Very little, I think.

It isn’t hard to start observing the human cost of this increasing virtuality of culture, and this is doubtless not the first blog post to point it out. One need only look as far as the soulless content mills that have pervaded the online marketing ecosystem and are increasingly dominated by AI content—about as expropriated as you can get. Or consider the data labelers in the Global South who are paid to tag images of clothing, TikTok videos, and—in one reported case—bestiality.6

The death of experience, at the hands of the Machine, was already well underway long before AI came along. But when human voices7 and human bodies8 can be drawn into the simulacrum, what do we do? How do we engage with a technology that, at every turn, gorges itself on the mind and heart and form of humanity?

***

In a two-part essay titled “The Universal”/“The Neon God,” Kingsnorth ventures to answer that question. I think his insights can be illuminating for reckoning with the expropriation of experience when it comes to learning.

First, though, please go read the entirety of Kingsnorth’s essay.9 It’s weird, and spiritual, and I think he gets so much right about how the Machine is distorting our view of the world.

What I really want to focus on, though, is what Kingsnorth suggests we can do, as individuals, to reclaim authentic experience.

Kingsnorth identifies two archetypes—what he calls, respectively, the “cooked ascetic” and the “raw ascetic.”10 Each of these describes a type of negotiation with Machine-conquered society.

The cooked ascetic is a person who continues to live in the world built by the Machine but who seriously curbs their direct interaction with it. For our purposes, this might mean limiting Slack availability to defined hours, incorporating meaningful human-to-human contact in corporate training and onboarding protocols, or, heaven forbid, holding some meetings outdoors.11

It might mean even more extreme measures, like dispensing with a smartphone altogether. As this idea pertains to AI specifically, the cooked ascetic might be wary of becoming what Ethan Mollick calls an AI cyborg12 and instead apply AI only in strategic, regimented ways that are trackable and kept separate from the rest of a workflow.

The raw ascetic, on the other hand, is someone who forgoes interaction with technology altogether—someone who forsakes living in modern society.

For our purposes—instructional design, learning—being a raw ascetic is probably out of the question. But what might it mean to be a “cooked ascetic” learning and development professional when it comes to AI?

Before anything, I think it means acknowledging that AI-generated content is the beginning, not the sum, of what you offer a learner. If we’re to reclaim genuine experience in the learning environment, a good first step may be to interrogate the way our use of AI technologies interrupts or blocks learners from having meaningful human conversations, moments of silence and reflection, and interaction with a learning environment mediated by real rather than simulated problems.

For someone with Kingsnorth’s perspective, that straightforwardly means scaling back the use of AI. But I wonder if this isn’t too reductive. I’ll pose a question I don’t quite have space to answer here: is there a way to incorporate AI into the learning process that enhances genuine experience instead of diluting it?

I ask, not merely (I hope) because of this blog’s vested interest in a continued discussion about AI, but because machine intelligence is and will continue to be pervasive. If the future really is a zero-sum game between the Machine and humanity, then we will lose. Some would say we have already lost.

And part of me thinks that Kingsnorth wouldn’t disagree with the view that humanity as we know it is basically on its way out. I can understand why he, as a Christian with a pretty vivid eschatology, would be comfortable accepting that conclusion.

But I’m not sure if I want to accept that argument—not yet, anyway. I think human negotiation with technology is going to be more complex—and more beautiful—than reducing all of modern civilization to a “rough beast.”

But then, maybe that’s exactly what the Machine has conditioned me to believe.

Notes

1. Giorgio Agamben, Infancy and History: On the Destruction of Experience (New York: Verso, 1993), 15.

2. “The Second Coming by William Butler Yeats,” Poetry Foundation.

3. Agamben, Infancy, 19.

4. Paul Kingsnorth, “The Tale of the Machine,” The Abbey of Misrule, published June 29, 2023.

5. Agamben, Infancy, 16.

6. Billy Perrigo, “Exclusive: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic,” TIME, published January 18, 2023.

7. Cade Metz, “What Do You Do When A.I. Takes Your Voice?” New York Times, published May 16, 2024.

8. OpenAI, “Creating video from text,” published February 15, 2024.

9. Paul Kingsnorth, “The Universal,” The Abbey of Misrule, published April 13, 2023.

10. Paul Kingsnorth, “The Neon God,” The Abbey of Misrule, published April 26, 2023.

11. Gregoire Vigroux, “How Leaders Can Hold Walking Meetings Successfully,” Forbes, published February 15, 2023.

12. Ethan Mollick, “I, Cyborg: Using Co-Intelligence,” One Useful Thing, published March 14, 2024.
