
ChatGPT will tank your GPA

I don’t believe any Princetonian — or college student in general — will be tempted to cheat with ChatGPT once they get to know it. You only need to spend a little time reading Twitter posts from users who’ve road-tested it to see what it’s likely to do to your GPA. It’s built on GPT-3.5, but you’ll have a 2.0.

ChatGPT makes errors of fact, errors of analysis, and errors of organization. At the most basic level, it can’t really discern fact from fiction. More importantly, it has no standards of logic to double-check its own analysis. For instance, one user got it to explain (in its characteristic earnest legalese) the meaning of an utterly impossible genetic theory, just by giving it a made-up scientific name.


The software organizes thoughts like a nervous high schooler who hasn’t prepared much for the AP English Language and Composition exam. It spits out one statement after another — all on topic, for sure, but in the same way that a shopping list is all on topic.

If you happened to get ChatGPT to write an A-level essay for you once, it would write a C-level essay the next time — each of them in the same self-assured voice. The software also has the voice of a middle-aged compliance lawyer, so if you happen not to be middle-aged, or particularly officious in your prose, your professor with their flesh-and-blood brain will be able to tell within two lines that this was written by artificial neurons.

I teach SPI 365: Tech/Ethics, and there are theories of machine learning that will tell you this type of model will never truly understand, and therefore never be able to analyze. There are ethics of academic honesty, too, which compel you not to try it. But I think it will come down to your self-interest: you won’t risk your grade on a tool that will almost certainly fail.

Steven Kelts is a lecturer in SPIA and, frequently, the Center for Human Values. He also leads the GradFutures initiative on Ethics of AI. He teaches SPI 365: Tech/Ethics. Find him online or at kelts@princeton.edu.

In academic circles, some have warned that students may rely on ChatGPT to cheat and plagiarize, while others argue that it is a helpful tool for generating ideas and for modeling responsible use of technology.

With this in mind, we asked a few Princeton faculty members for their opinions on ChatGPT’s role and uses, if any, in the classroom.

ChatGPT isn’t at our level

Rather than representing the end of the college essay, ChatGPT offers an opportunity to reflect on exactly what is valuable about a liberal arts education.

ChatGPT is a fascinating tool, and I’ve been playing around with it in relation to the assignments for my Writing Seminar (WRI 106/7: Seeking Nature).

For some tasks, it’s potentially useful — it seems to be okay at generating summaries of sources, which could eventually help more advanced students speed up the research process, akin to reading abstracts before diving into a full article. But because Writing Seminar is about building skills, including how to understand sources and how to craft arguments, asking ChatGPT to summarize a source is only a useful shortcut for students who can summarize sources themselves. Without that skill, they won’t be able to take the next step. Being able to read a source and extract its main claim is the kind of analytical task that requires practice, a task that stands to benefit critical and creative thinking and problem-solving. This is the work of the Writing Seminar.

Beyond summarizing scholarly sources, the technology seems still fairly limited — let’s rein in the idea that all human writing is in danger of being made obsolete! After all, good writing reflects original thinking, and by its own admission, ChatGPT can only “generate text based on patterns in the data.” While noticing patterns is often the first step in producing interesting, important writing, it is only the first step. ChatGPT can’t produce original interpretations based on those patterns. So, once students have a handle on understanding sources, my goal is to introduce them to ChatGPT as a tool to help them track patterns on the road to insightful analysis and original argument — the kind of thinking that, at least for now, AI chatbots can’t manage.

Sarah Case is a lecturer for Princeton’s Writing Program. She can be reached at secase@princeton.edu.
