The general advice I would give to my arts and humanities colleagues as they address AI is the clichéd advice of every martial arts fight scene:

Trust your training!

From a broadly-conceived humanities critical theoretical approach, what is artificial intelligence? If we want to talk about artificial intelligence as a term invented at a conference at Dartmouth in 1956, then it was always a marketing term. It was basically a sales pitch to the Department of Defense, and it worked. Keep in mind that the academic discipline of computer science didn’t even really exist until the 1960s. To a significant extent this whole house of cards rests on the vaporware 1956 promise of AI.

In the broad strokes of a few sentences, that’s an argument one could make. It is no more or less defensible or contestable than many of the other critical approaches that we take to neoliberal culture. Arguments along these lines were the kind of stuff that launched the first science wars: i.e., science is culturally and ideologically overdetermined. And I would argue that the increased neoliberalization of science and engineering over the last 30 years means that those critiques are more obvious now than ever.

But let’s set aside cross-disciplinary disagreement for a moment, because we don’t need to go to these disciplinary concerns when the more pressing ones are market-driven. In the US, all the frontier AI models are corporate-owned and require massive financialization to operate at all. In the caveat emptor of the global web, it is hard to imagine some “independent” AI emerging. Independent from what? Given our general skeptical positions as cultural critics, in what universe would any of us imagine that a corporate-owned or state-controlled AI could ever be “ethical”?

Clearly we should reject the notion that “AI literacy” somehow involves our teaching students how to operate the “machine that goes bing.” We also should remain deeply suspicious of the triumphalist discourse around AI. The arts and humanities are and should remain allergic to that absurdity. [I would note that my colleagues in the media arts have made a career of negotiating the difference between education and teaching how to use a product.] Likewise I would reject the notion that our job is to insulate students and others from AI’s consequences, especially as those consequences play out on our campuses.

To the contrary, we should lean into our identities as the disciplines that will not determine or assure your future for you. When we present art, literature, philosophy, rhetoric, etc. to our students, we take risks. There is no pre-determined response or right answer, nor should there be. That said, there are many responses that might be viewed negatively, and we study those responses. But we don’t preclude them and call it ethics.

I have taught in liminal media spaces for decades, as have many in media studies, digital rhetoric, STS, digital humanities, media arts, and other fields. AI is another one. Is it the one (whatever that means)? Who knows? But the answer to that question isn’t especially relevant from a pedagogical perspective, unless you want to study that discourse itself. What is relevant is that in these liminal media spaces we expose students to risks, as is integral to learning in any ethical sense. We know that AI is a lethal machine. It has killed people. Others have been driven to suicide and/or experienced psychosis. There are many dangerous aspects to AI, and when we introduce students to AI (or expand their use of AI) we are requiring them to interact with a machine that literally has designs on its users.

So there’s an element of danger. I don’t want to overplay it, but we shouldn’t pretend it doesn’t exist. In the arts and humanities we remain committed to the prospect that humans have positions in their culture from which they can speak truth. This process of veridiction, as Foucault terms it, requires ethical stakes. So yes, we must engage with AI and accept the consequences, as we engage with climate change, capitalism, racism, etc. but also as we engage with literature, art, and other cultural objects and practices.

The risk (and opportunity) is there, if you can tell, or if there even is, a difference. So trust your training.