Categories
artificial intelligence, digital rhetoric, media studies

the day AI saved students from the humanities and arts

Hey you, yes you. Stop trying to revive the humanities. You’re getting a mess all over the floor… and besides, you’re making me laugh so hard I’m losing my breath.

In today’s episode of moral panic at the Chronicle: no fewer than four separate pieces about AI on the front page. Here are some classic cuts. From “AI and the Death of Student Writing”:

I’ve been teaching community-college courses in California’s Central Valley for the past 12 years, and I’ve prided myself on my clever assignments, designed to prevent plagiarism — assignments such as comparing totalitarian regimes in The Handmaid’s Tale and 1984, or discussing the feminist undertones of Charlotte Perkins Gilman’s The Yellow Wallpaper and Kate Chopin’s The Story of an Hour. But it no longer matters how good my assignments are

LOL. Yes, very clever. I’d have never imagined asking students to write on feminist themes in “The Yellow Wallpaper.” There are probably only a million student essays on that topic roaming the internet.

And there’s this ditty. I know you’ve heard this one before.

Some teachers are taking this a step further. I read recently that some teachers are using AI to grade papers. How is that going to work? Students will write their papers with AI and teachers will grade them with AI? So it will be one computer grading another computer’s work.

I am reminded of a scene from the Rodney Dangerfield vehicle Back to School, hamfistedly recreated by AI.

Ok, ok. Just one other article, “Professors Ask: Are We Just Grading Robots?”

The explosion in AI use, the endless hours spent figuring out whether — as he put it — there was a person on the other side of that paper, and the concern that students who cheat could end up getting the same grades as those who did the work sent Wilson reeling.

Really? Did you try flipping the paper over? Did you see a person there? I mean, Foucault remains the most cited individual in the humanities and social sciences. You are familiar with the author function, right? What is the theory of agency/subjectivity that is animating you to keep turning a paper over and over looking for a person? I especially like this line: “I’m grading fake papers instead of playing with my own kids.” Here, I fixed it for you: “I’m choosing to grade fake papers instead of playing with my own kids.” Do you feel better now? Or are you still questioning your life choices? I know how I felt when my kids were young. Where’s the instructor’s agency in this?

So the answer to the question is “Yes, you are just a grading robot.”

Here’s the thing about college writing instruction, in case you’ve never done it. We spend years training graduate students to become reliable and predictable pedagogy machines, and then they in turn train students to produce predictable prose that fits a fairly narrow set of genre conventions. The students and instructors then enter a transactional relationship. The whole purpose of writing instruction is to instruct students to write efficiently while still producing texts with predefined characteristics at a certain level of quality.

Using AI is an obvious extension of that objective, as obvious as valuing the efficiency of composing on a word processor rather than a typewriter. Oh, and BTW, as that visual gag from a 1986 film might suggest, this is not news. The whole purpose of lecture halls and courses and educational institutions themselves is to create such efficiencies. If we really could download knowledge into brains Matrix-style, wouldn’t we do it? That Matrix image is horrifying because it completely misunderstands what learning is, even if it is just a reductio ad absurdum version of the infamous banking model of education. There is an analogy here with the misunderstandings of AI.

Let me backtrack here. Say you create an assignment like the one above, asking students to compare concepts of totalitarianism in two novels. That’s a prompt you have engineered to produce a specific range of outputs from students. The proverbial million typing monkeys will eventually produce those expected outputs (among many unexpected and unacceptable ones). Those unacceptable others would include answers that are otherwise acceptable but composed in languages not yet spoken by humans. You could say those outputs are produced without thought, but that’s not true: thought animates the monkeys’ fingers. They’re just not the thoughts you desire your prompt to engender.

It’s the same problem with GAI. Students are thinking when they use these technologies. They are just not having the kinds of thoughts you want them to have. So what kinds of thoughts do you want them to have? And what kinds of thoughts do you imagine they actually have while writing essays for your course? And what makes you think that you have the right to require students to think in particular ways while completing assignments for your course?

I’ve thought a lot of things while composing this post. Many of those thoughts are even more uncharitable than the ones I am sharing!

While this Chronicle clickbait is about student writing, it applies across all forms of knowledge-media production. The reason GAI works in college (and many other settings) is that the aim is to produce outputs that fit within a statistically predictable range.

This is the ruthless dogmatism of the humanities and arts. Not only must students produce predictable outputs; more importantly, they must think in prescribed ways while doing so. It’s not enough that students produce acceptable works of art according to disciplinary standards. They must also be able to articulate their thoughts, intentions, and process in a manner that meets disciplinary criteria.

In an abstract sense, if, as academics, we want to understand the training of generative AIs, we don’t need to look farther than ourselves. This is how we train humans. I’d suggest that when a student takes your prompt and hands you some AI-generated junk, they are definitely sharing some thoughts with you about the course and the assignment.

(n.b. the AI did a better job producing a version of the Back to School image for the “feature image” just working from the body of the text than I got from my prompt engineering.)

2 replies on “the day AI saved students from the humanities and arts”

The content seems to present a critical take on the use of AI in assessing and influencing student writing. It emphasizes the need to consider the kind of thinking students engage in while using technology and it challenges the notion of prescribed thinking in academic assignments. 

Actions to consider:

1. Consider elaborating on specific examples or studies that demonstrate the impact of AI on student writing and critical thinking.

2. Expand on potential alternative approaches or solutions to addressing the concerns raised about AI and student writing.

3. Evaluate the inclusion of any relevant data or expert opinions to further support the arguments presented in the content.


I tried to create an Alien-esque scene of an AI bursting out of a body, but ChatGPT protected you all from my prurient aesthetics. I guess GAI has better taste levels than I, which is why I stay away from the arts as much as I can.

