Defining my AI alarm

Freddie DeBoer has a magisterially long piece about AI on his newsletter this week, which is really worth pouring a drink and sitting down to read. In it, after a compelling walk through the intellectual and cultural hubris of the 19th century, and the subsequent disillusionments of the 20th, he turns to a discussion of the limitations of AI, driven as they are by the inevitable limitations of humanity. AI, despite the promises of its proponents, is probably not going to be the most disruptive force since fire was discovered. And it's not going to be so, because biological life is infinitely complex, and the idea that we can somehow create processors and chips and servers that come even close to replicating it is on par with that Victorian hubris. Here is a really important paragraph from this part of the essay:

In Nicaragua, in the 1980s, a few hundred deaf children in government schools developed Nicaraguan sign language. Against the will of the adults who supervised them, they created a new language, despite the fact that they were all linguistically deprived, most came from poor backgrounds, and some had developmental and cognitive disabilities. A human grammar is an impossibly complex system, to the point that one could argue that we’ve never fully mapped any. And yet these children spontaneously generated a functioning human grammar. That is the power of the human brain, and it’s that power that AI advocates routinely dismiss – that they have to dismiss, are bent on dismissing. To acknowledge that power would make them seem less godlike, which appears to me to be the point of all of this.

The human desire to be like God, or even to be God. AI is just Babel, endlessly replaying down through history.

Anyways, read Freddie for more in that vein. I want to focus on something else. I’ve written here fairly recently on AI and my own pessimism and even alarm about this new use of technology. What I want to do is reiterate the nature of my AI alarm, in order that I not be misunderstood. This is important because of how Freddie describes many of the loudest voices of AI alarm that are out there. Here he is again:

Talk of AI has developed in two superficially-opposed but deeply complementary directions: utopianism and apocalypticism. AI will speed us to a world without hunger, want, and loneliness; AI will take control of the machines and (for some reason) order them to massacre its creators.

(…)

That, I am convinced, lies at the heart of the AI debate – the tacit but intense desire to escape now. What both those predicting utopia and those predicting apocalypse are absolutely certain of is that the arrival of these systems, what they take to be the dawn of the AI era, means now is over. They are, above and beyond all things, millenarians.

https://freddiedeboer.substack.com/p/ai-or-the-eternal-recurrence-of-hubris

I am not an AI millenarian. My brand of alarm is much more mundane than that, at least in the sense of Great Events. I dislike AI – in the form of the large language models and the like being developed and marketed right now – because I believe these tools are dehumanizing and destructive to cultural goods. I don’t worry about ChatGPT taking over the world and killing all humans. That’s far from anything I think possible, for the same reasons Freddie lays out. We don’t need AI in order for those fears to become real – haven’t any of us paid attention to history?

Everything that AI doomers say that artificial intelligence will do is something that human beings could attempt to do now. They say AI will launch the nukes, but the nukes have been sitting in siloes for decades, and no human has penetrated the walls of circuitry and humanity that guard them. They say AI will craft deadly viruses, despite the fact that gain-of-function research involves many processes that have never been automated, and that these viruses will devastate humanity, despite the fact that the immense Covid-19 pandemic has not killed even a single percentage point of the human population. They say that AI will take control of the robot army we will supposedly build, apparently and senselessly with no failsafes at all, despite the fact that even the most advanced robots extant will frequently be foiled by minor objects in their path and we can’t even build reliable self-driving cars. They say that we will see a rise of the machines, like in Stephen King’s Maximum Overdrive, so that perhaps you will one day be killed by an aggressive juicer, despite the fact that these are children’s stories, told for children.

No, I just worry that the growth of AI will perpetuate the neuroses and dangers of much of our modern technoculture. AI will perpetuate loneliness. It will continue the devaluation of the creative arts, of the humanities, of original ideas. It will become another tool of wealth inequality and economic destructiveness. As my recent essay flagged, it is another attempt by humanity to escape all frictions. It will be another technology that promises us the moon and leaves the vast majority of us holding the bag while a few get richer and more powerful. In the words of Freddie, it will further the modern propensity to seek “to avoid human.” It is an idol, in the same sense as the Golden Calf that Moses raged against. It promises what it cannot deliver, and we are so desperate to hear it that we forget how to be human.

I wrote this back in the spring:

All the while, people who are being promised a bright, AI-driven future will instead get more loneliness, more monetization of our attention, and less meaningful connection. It’s already well-acknowledged that Big Tech has used the levers of addiction to make the gains they have made in our lives; this knowledge will surely be put to use in figuring out how to addict us to AI in the hopes of extracting a few more pennies from the areas of our lives that have so far escaped their pocketbooks.

Freddie states it like this: “The bitter irony of the digital era has been that technologies that bring us communicatively closer have increased rather than decreased feelings of alienation and social breakdown.” He’s right. And this is what I fear from AI. That it will continue us down the path of despair and alienation and cynicism and apathy we are traveling. That’s a pretty destructive thing to unleash on ourselves. That’s what I fear.

a few brief thoughts on a complicated holiday

Today is the 4th of July, Independence Day here in the States, the day when we commemorate the signing of the Declaration of Independence and generally celebrate America. I don’t have a lot to say about it here, on a blog about theology and culture. Usually, I would write a post about the dangers of intermingling Christianity and patriotism, but I’ve not only done that here before, it is also something done in a million other places, to the point of being overdone (progressive Christian social media is almost unbearable today, in terms of the posturing and point scoring this day brings). Generally, I affirm those views, and I very stringently avoid patriotism in my own life, viewing it as ultimately incompatible with my Christian commitments, not to mention the gross fetishization of American flags, nationalist chauvinism, and a kind of blind hubris.

That said, it’s also true that America, while unavoidably resting on a history of acts that are despicable and terrible, is also, historically speaking, a uniquely liberal and open place, and those of us who live here are pretty lucky, all things considered. Furthermore, the Declaration of Independence (which, again, is the core of today’s celebration, or at least should be) is a pretty remarkable document, staking out a strong anti-authoritarian and anti-hierarchical claim. It really is amazing for its ability to provide ground to stand on for all future independence movements. It rightfully has a place as a crucial historical moment for all those committed to a better, more just and free world for all peoples, especially those laboring under the boot of authority and domination. It has its shortcomings for sure, as does its author, but all in all, there are worse things to reflect on today than the Declaration.

what is culture?

Alan Jacobs asked a question in April that’s really stuck with me ever since: “what is culture?” I’ve been turning this question over and over in my head since then, as culture is a term I use here quite often. But Alan is right to press the question: what is culture, really? Because what everyone always seems to be talking about (myself included) isn’t really culture. Here’s how he puts it:

Almost everyone who writes on this subject treats it as unproblematic, yet it is anything but. In the late 18th century Herder wrote of Cultur (the German spelling would only later become Kultur): “Nothing is more indeterminate than this word, and nothing more deceptive than its application to all nations and periods.”

I suspect that (a) when most people use the term they have only the haziest sense of what they mean by it, and (b) no two writers on this subject are likely to have a substantially similar understanding of it.

Alan Jacobs, “Christianity and …?”

I don’t really have a good answer to this, but I think Alan is right when he writes later that “If we can agree on some boundaries for this elusive concept we might be able to have a more profitable conversation.” As with any term we might use, really. It’s hard to have a coherent conversation if we can’t agree on a way to define our terms.

So, reading Wendell Berry as I’ve been doing recently, I ran across this quote, which I find very illuminating on this subject:

A healthy culture is a communal order of memory, insight, value, work, conviviality, reverence, aspiration. It reveals the human necessities and the human limits. It clarifies our inescapable bonds to the earth and to each other. It assures that the necessary restraints are observed, that the necessary work is done, and that it is done well.

Wendell Berry, “The Agricultural Crisis as a Crisis of Culture” in The Unsettling of America

Berry here gives us both a sort of composition of (healthy) culture and some of the effects such a culture would enact on society. He is, of course, writing in this essay about agricultural settings, but I think his ideas apply more broadly than that. And, of course, Berry would surely disclaim any authoritative attempt to “define” culture here, and I agree this shouldn’t be presented as some final word on Alan’s question above. Those qualifications aside, it’s a stab at understanding such a nebulous term, and if there is a list of voices I trust on the subject of culture, Wendell Berry is surely near the top.

I really want to focus on that first sentence from the Berry quote. In a later blog post, Alan does some “hand waving” (his term, not mine) towards defining what culture is, or maybe what it isn’t. It involves “spheres of symbolic activity”, politics, symbols and imagery, amongst other things. I think he is right when he concludes that any good definition of culture is inevitably going to require the complexity of an entire theology of culture, which “would combine an inquiry into the character of our power-knowledge regime — a study of powers and demons — with an iconology, an account of the deployment of the images and symbols meant to govern our perceptions and affections.” (links are from the original).

I like the direction Alan points us in here, and I think Wendell’s idea of culture being a “communal order” of things conforms nicely to that direction. If we are looking to define the character of our power and knowledge, as Alan says, then the values of “memory, insight, value, work, conviviality, reverence [and] aspiration” feel like good indicators of a healthy cultural character. Culture, then, is not necessarily one something among other somethings, but is instead a conglomeration of societal values, made possible by the human virtues that a society forms in its people.

I also like Wendell’s cultural order because it opens the space to define an unhealthy culture as well, which I think is really important in our fallen world. So, just to riff off his essay, an unhealthy culture would be one defined by forgetfulness, shallowness, insignificance, sloth, suspicion, cynicism, and despair.

I have more to say about a healthy and unhealthy culture – for instance, I want to think about the forms in which these healthy and unhealthy cultures are expressed – but I think I will leave those thoughts for the future, once I have given them more thought. But I do think Wendell’s writings can point us in a useful direction for answering Alan’s question.