Artificial intelligence tools raise concerns

Continued from page 1

“… is considered academic dishonesty. So those are the policies, those are the guidelines, that’s the way it’s going to be treated. So tread lightly.”

Penalties for plagiarism could include a zero on the assignment, a trip to the assistant dean’s office or even a failing grade in the course.

English professor Suzanne Spoor said she initially worried about the potential use of AI tools in her class, but said most students will want to do their own work.

“Most of my students really want to find their own voices, learn, explore their own ideas,” Spoor said. “And so I don’t need to panic because they’re the same students, right? They’re still going to want to do all of those things.”

Photography professor Christiana Caro, who worked at Google from 2016-18 on AI camera technology, said her impression of ChatGPT “is that it’s really good, which is interesting because it’s super problematic. … Because the pursuit of knowledge is meant to teach us how to do a thing, not how to get a thing to do a thing for us.”

Caro, who also teaches photography courses at Johns Hopkins University, added: “But at the same time, it’s like if the thing is so good, and it writes the paper, you could, like, ostensibly learn from that as well by looking at the way that the AI translates, kind of, an idea or a synthesis of ideas.”

First-year transfer studies student Gabriel Henstrand said the use of AI for assignments “flat out … is just academic dishonesty.”

“I’m sure there are ways to go about, I guess, using the AI to help you, or help someone write a paper,” Henstrand said. “But at the end of the day … I think it should be the student’s own research that’s conducted in their own writing. It’s really the only way that, like, someone can actually turn in something that’s, you know, genuine.”

Second-year transfer studies student John Finn agreed.

“I think it is just cheating to some extent,” Finn said. “If it’s not writing the actual paper, you can get ideas and things from it, and that’s kind of fine. But you do have to do your own work after that.”

Spoor said she ran assignments through ChatGPT to see what the program could generate.

“I said, ‘Compare these two poems,’ and I tried to pick obscure poets that almost nobody writes about,” Spoor said. “And they did a good job with topic sentences. … But … none of the quotes were from the actual poems, for example, but it just acted like it did. … I think it’s really scary for misinformation.”

Professors report that students have already started using AI text generators for class assignments. Pixabay image

Henstrand said he does not “see it getting to a point where it’s, like, more students are using AI than not.”

And Spoor said professors could potentially use AI as a teaching aid.
