21 Comments
Enrique Diaz-Alvarez:

What do we do? Obviously, we remove all grading that isn't in-person proctored examinations. Or, we close down the joints, slowly and painfully. Looks like academia is going for the latter. Full pay till the last day!

Connor Wroe Southard:

I said years ago that the only answer to AI in writing-driven classes might indeed be Oxbridge-style written exams. That may be the best we can do. As for closing down the joints, well, I hope not. But the current trajectory leaves much to be desired

Chris Verrill:

I'll make the case that AI is in fact *the* problem.

I think humans as a species excel at rationalizing behavior that makes life easier in the short term and is detrimental over the long term. I also think humans generally tend to underestimate how habitual they are and form habits without really meaning to. I think AI is designed to capitalize on that. I think AI is unique in its ability to get us to outsource our critical thought. I also think that if you use AI for one edge case here and one edge case there, it's very easy to find yourself dependent on the technology in a year or so. Consider this r/nyu post that I saw in a Chronicle essay "Is AI Enhancing Education or Replacing It?" (itself worth a read):

"I literally can’t even go 10 seconds without using Chat when I am doing my assignments. I hate what I have become because I know I am learning NOTHING, but I am too far behind now to get by without using it. I need help, my motivation is gone. I am a senior and I am going to graduate with no retained knowledge from my major."

It is very difficult to admit this to oneself. For that reason, I suspect this individual's situation reflects that of the median American college student, many of whom are lying to themselves about how much they're learning. I graduated recently from a liberal arts school with an English degree, and many of my peers regularly cheated their way through their degrees. My (crank) opinion is that the incentives (get a good GPA to be successful later), real though they are, actually are not the primary causal factor here. Instead, if you let a person pick between (1) something that is easy in the moment but may lead to (nebulous) negative consequences later, or (2) something that is hard in the moment but may lead to (nebulous) positive consequences later, most people will pick the former. That's especially true for younger individuals. If you give a student an easy and (supposedly) risk-free way to cheat, they will usually take it. AI makes cheating exponentially easier. This feels obviously true to me but is tough to talk about in polite society because it is judgmental and somewhat nihilistic. And, in the professional workforce, AI is everywhere, so why shouldn't students use it? AI is writing emails, summarizing articles, and replacing not only human connection but human thought. That has no historical parallel, and I don't know how it will play out as every professional organization (many school districts included) races to be "AI-driven."

What depresses me is that I feel like I'm losing a fight and I can't rightfully articulate what I'm fighting for. You mention this as well -- I can say ChatGPT is limiting "critical thinking" but that's not a very compelling argument. And I don't know how to make that argument, especially because there isn't yet a major and readily apparent downside of AI dependency.

I've written far more than I should in the comments of a Substack newsletter and I do apologize for that, especially because I don't really disagree with any arguments you make. I also don't have a unique experience to share. (My fiancée, a high school English teacher, tells me "It's bad out there.") I just think that AI comes with a unique downside that will make our broader American society less interesting to live in.

Connor Wroe Southard:

Thanks for this! Don't apologize for length or anything else. I'm glad to get a dissenting take, for one. And you raise a lot of interesting points; too many for me to respond to in a single comment

I share some of your feeling of despair (if it's fair to call it that). It does seem like AI threatens to strip out/destroy huge parts of not only culture and education, but the experience of being human. And it's hard to articulate the scale of the potential problem. I err on the side of trying to be hopeful and find solutions, partly because there's never a shortage of dirges about the humanities, or about media, etc. But the fact that I try to find reasons to hope doesn't mean hope is the best lens through which to view a given situation. There's a deep part of me that has, well, deep fears

I guess the one--yes, hopeful--thing I'd say here is that it's the frenetic early days of AI. So our anxieties are likely outsized in proportion to the chaos of a rapidly emerging technology. As we learn to deal with it, we may well calm down and find some obvious answers. I remember back in 2010 or so when it was fashionable to predict the end of books and even reading itself. That didn't happen, even tho AI feels like yet another threat to the idea of high-level literacy

So I don't feel able to tell you that you're wrong, nor do I necessarily want to. I share these concerns. I guess I just try to remind myself to hope for the best and that the flipside of human ingenuity is that we're always in a problem-solving dialectic with whatever issues we create for ourselves. But AI does feel like potentially a different beast, I agree. And mostly not in a good way

NCMD:

My wife and I, high school English and 5th grade teachers, respectively, have spent our careers watching students—with the unfortunate encouragement of the education system—increasingly see education as purely transactional. So when you say AI is not *the* problem, I could not agree more. Students are being taught that the purpose of school is not learning, but to create an outcome (and, as has been mentioned here, generally a financial outcome). We’re being told as educators that our most important job is making sure that students are “college and career ready.” My district recently endured what was essentially a privatization scheme under the guise of creating magnet schools, in which students, starting in kindergarten, would have a specialized educational experience in areas such as hospitality or entrepreneurship. Add to all this the way standardized testing culture has crushed (especially) literacy education, which creates even more reasons for students to feel like they need tools like AI, and it’s all pretty grim!

Connor Wroe Southard:

Yeah, this all sadly confirms my suspicions. It's a grim state of affairs. Thank you for doing what you can; another thing I'm sentimental about is seeing good teachers as heroic figures. It's definitely a job I couldn't do, but damn if it's not crucial

Jeremy Morris:

Professor here. I have to say that aspects of the AI-is-destroying-humanities-education narrative are way overhyped. Here in Europe, sure, kids know about it and use it, but where there isn't that huge US pressure on grades and high-stakes paid education, it hasn't yet had any noticeable effect on the papers I grade.

Connor Wroe Southard:

Interesting! I wonder what the difference is. Grades pressure being different makes sense. There was a piece by an anthropology prof at my alma mater in which he notes that most of his students clearly want to write their own papers, and do so, even as he catches a number of them turning in shoddy AI work. So maybe it's confined to a great extent to the same kids who used to offer me money to do their papers (lol)

I could have mentioned this in the piece, but certainly headlines along the lines of "AI has already ruined education" (not far off from the headline of the piece in question) are necessarily going to be exaggerated. Glad to hear you're not seeing too many negative impacts as of yet. Gives me hope

Jeremy Morris:

I think I’m also lucky in that we don’t really do standardized coursework. Students have to decide for themselves on how to approach a topic which means they are less likely to rely on AI in the main.

Connor Wroe Southard:

Oh really? Do you mind sharing where this is?

Jeremy Morris:

Denmark. A lot of autonomy in terms of exam topics, while the number of assessments and their overall format is nationally decided. A million miles away from 'extra credit' in the US context, and so closer to the UK model, with more standardisation of formats nationally.

Connor Wroe Southard:

Interesting! I'm definitely hoping US higher ed can get better at learning from overseas models

Jeremy Morris:

Funny thing is, I'd love Denmark to be more student-oriented in the way the US is. There's v. little modularity in course choice (indeed there's almost no choice within programmes), and I'd welcome more flexibility in exam format!

Sam Smith:

As a College and Career Counselor and someone who helps students write a lot of admissions essays, I agree with a lot of your assessment, but would add a few things to the mix. Namely, part of what creates the mindset you are speaking to (how does the major help me get a job?) is the fact that, in our country especially, going to college requires risk in the form of debt. When that risk was minimal (e.g. when fewer people had degrees and the amount of debt required was lower), then you had a lot more leeway in being less "pragmatic" in your school decisions. Now students are a lot more worried about the return on their investment, and are more and more convinced that their choice of study will help them with this return. There's a cold logic in the thought process, but it really destroys the idea that you just might want to learn something for learning's sake.

The other thing that is often missed in this conversation is that writing is a tool for thought. It's useful for thinking through problems and finding solutions. It's weird to me that math has had this forever with things like Wolfram Alpha, but has seemingly made the successful argument that actually learning how to do the math is important, too.

Connor Wroe Southard:

Good points! The debt factor is very real. I understand why students and their families have these anxieties, but I do lament how misinformed people often are. There seems to be ever less understanding, for instance, that many humanities majors do very well for themselves, and that many business majors do not. A more textured understanding of why this is would make for a book unto itself, tho a few factors are obvious (prestige of school and social capital, for example)

The point about math and Wolfram Alpha is a very interesting one. Not even the most devout tech industry AI-heads I know (and I know a few) would argue against studying math. Why is that? Well, for one thing, math and adjacent fields have kept their rigor and their demands on undergrads. The humanities, I'm sad to say, have probably made things too easy on undergrads, at least in many cases. There doesn't seem to me to be much of a future for undergraduate humanities study if we don't up the rigor enough to make the challenge feel worthy. (I know this is a simplification, coming from someone who doesn't work in the field, but it at least seems worth arguing about)

Richard Jackson:

One of Matt Levine's many good bits is pointing out that a lot of concerns about AI's potential future 'misaligned' behavior are basically just descriptions of Sam Altman's present-day behavior; i.e. the worries we have about being ruled by AI are already borne out by our capitalist (or at least venture capitalist) rulers.

In a similar way, AI in the university is a kind of reification of what boards of trustees and professionalized presidents have been doing for decades now: streamlining the process of higher education to the point it becomes hollow. Somebody like Gordon Gee has already made it so that learning isn't really the point of college so much as obtaining a credential. It was only a matter of time before the students stopped pretending to learn and their instructors stopped pretending to teach them.

And this is kinda always the story of AI in various industries: it's bad, but it's bad in the ways the industry's already been deformed by the pressures of its ownership class, so it can feel like something brand new, which it is, but also like a culmination of existing trends, which it also is.

Anyway it doesn't feel good.

Connor Wroe Southard:

These are all good thoughts. I don't really see a way forward for undergrad humanities study that doesn't involve being unafraid to challenge students and ask that they be prepared to actually practice intellectual curiosity and some degree of discipline, and so on. That's a heavy lift in some ways, but down the path of least resistance lies doom

Richard Jackson:

Yeah, the fix is pretty clearly to go back to hiring a sufficient number of professors and giving them a more manageable courseload and the security of tenure track positions, so they can implement the more labor-intensive teaching techniques that can foil cheating and force students, if not to learn, at least to engage with their materials. But that's precisely the kind of approach that all of the people who run universities have deliberately been moving away from for decades; if they'd been willing to do that AI wouldn't pose such a threat now.

Bill:

connor, I would read 15 more posts from you on this topic, at least. (i understand that does not mean it'd be fun or worthwhile to write 15 posts. but i'd read 'em!)

Connor Wroe Southard:

I appreciate it! I'm sure I'll end up writing about it again in the future. Despite not being an academic, I can't let go of my schoolboy fixation on humanities education. And this is uh not a historical highpoint for the humanities, to say the least

Bill:

as an anthropology major who works in tech...yes
