To follow on Jay Bolter, we might say we find ourselves in the late age of literacy. What replaces literacy as the medial-rhetorical substrate of academia? AI-generated literacy? With its feedback loop, AI-generated literacy operates through the rectilinear, recursive attenuation of indeterminacy. Texts literally mean what the AI says they do, as does literacy itself.

Let me say that differently. In the emerging curricular nomenclature, AI precedes everything. “AI and [your field here].” Can I get an AI that’s half English and half Political Science please? Or maybe a humanities lover’s supreme: literature, a romance language, and gender studies, but hold the history. We all know that the history on that AI pizza isn’t real history. It’s “textured vegetable protein.” But we might ask this question of all these fields when we inquire as to the media effects of shifting from a (print) literate to a computational, time-critical medial substrate.

It’s a “matter of concern” (à la Latour) as there are many stakeholders, no simple solutions, and plenty of wicked problems. As I was suggesting in my previous post, when we look at these curricular structures we can start by asking cui bono. Universities go in search of three things: prestige, money, and prestigious money. How can AI serve those purposes? And then we can see when/how AI becomes the tail wagging the university mascot, as I imagine many of us have already seen.

Why is it that “AI” is a subject that cannot be digested by universities and instead induces this response? Before we can explore that question, we need to attend to the assumption in there that AI cannot be digested. An alternative is that it could be normalized but that universities see a marketing opportunity in their competition for students… or they act in advance to try to keep up with where they expect their peers to be headed. This is similar to the argument that we must continue heading full on toward AGI because that’s what the Chinese are doing.

Perhaps universities don’t want AI to be digested. The agency of the “wanting” will have to be black-boxed for this post. The result is a busted transdisciplinarity that demands a master disciplinary discourse. “AI,” whatever that is, is always first. And when it comes to universities, there probably is no choice but to insist on that normative, computer science master discourse of AI. It’s the one that is necessary to compete for research dollars; it’s the one that attracts media attention; it’s the one around which there is a race for prestige with our peers.

To return to Bolter. His point about the late age of print was that we had reached a time when we could imagine a world without print, or where print was secondary. My “late age of literacy” points not only to technologies but also to the cognitive conditions in which they participate. If you want to think of this as “Google is making us stupid,” fine, but that’s not what I mean. What I mean is that our cognitive media ecologies are transforming. Frontier AI products are participating in that transformation. Ulmer’s electracy has at least offered another name, a recognition that it is not enough to just slap “digital” or “AI” onto literacy (or rhetoric).

If we are to ask how time-critical computational media (AI, etc.) shape and are shaped by our medianaturecultures (or however you want to put it), then we cannot precede that question with an answer about what AI is. It’s not easy. Our institutions have other motives. And technologically, the AI will outpace us chronologically.

We need to operate elsewhen.
