Wednesday, April 24th, 2024

AI Use and Academic Integrity

With AI technology on the rise and in the news, many questions have arisen regarding its place in the humanities and education. In response, Millersville University created an AI Academic Integrity Task Force to help answer these questions and examine how AI can help or hinder learning and education.

Dr. Chad Hogg, assistant professor of computer science, defines AI, or artificial intelligence, as “computer programs that solve problems that reasonable people might have claimed cannot be solved without intelligence.” Hogg notes that people interact with AI daily, whether they give verbal commands to Siri, use Google Maps to reach a new destination, or allow Outlook to organize their emails.

“The big advantage is that problems that would have required teams of many humans days to solve can now be solved instantaneously by computers, and small tasks that would not have been worth spending human time on can be accomplished automatically,” Hogg continues.

As AI becomes more accessible, potential problems with this technology depend on how and why it is used.

“One common problem is that AI systems are ‘trained’ by providing data, and they can misinterpret unintentional biases in that data as natural and meaningful,” explains Hogg. “For example, a system trained to decide bail amounts based on historical examples will perpetuate the same systemic prejudices that influenced the original examples.”

Additionally, AI programs like ChatGPT are now widely available to students, raising questions about whether submitting text generated from short prompts violates Millersville University’s academic integrity policies.

“The most significant reason large language models such as ChatGPT have been mentioned in the context of education is that they are very good at generating documents that look like they were written by a knowledgeable and competent human,” Hogg says. “If the purpose of an assignment is to practice writing and to assess how well students can write, the ability to bypass that work allows the student to easily submit a document that appears to satisfy its requirements without having done any of the practicing or allowing any of the assessment that was the entire point of the assignment.”

While there are ways in which this broadly defined technology can assist with assignments – using spell check and citation generators, for example – problems can arise over what is or is not defined as plagiarism.

“It is certainly possible for students to use this technology to complete assignments,” Hogg says. “Doing so in the most straightforward way of prompting the system with a version of the assignment instructions and submitting its output as the student’s own writing would violate the University’s Academic Honesty Policy.”

AI-generated art could pose problems for students in artistic fields as well. Generating images with AI removes the learning and creative process and raises the same questions of plagiarism.

“Artmaking is a very special experience, and using AI to generate art images takes this away from humans,” says Heidi Leitzke, associate professor of art and design. “The act of making is a creative and enriching experience that is more about the process, and not always the end result.”

“Additionally, it is my understanding that AI-generated images are essentially a multitude of images pulled from the internet that were made by humans. The artists that made the original images are not credited, which is very problematic,” she adds.

Overall, AI is by no means a new technology, and it can provide students with certain basic tools to help them succeed. Teachers may be able to incorporate AI as a learning tool or design assignments centered around its use.

However, as the technology becomes more accessible, relying on AI for assignments can detract from the learning process and the critical and creative thinking that higher education encourages. “These are exactly the issues that the task force is discussing,” promises Hogg.

The members of the AI Academic Integrity Task Force are:

  • Barry Atticks (Associate Professor, Music)
  • Eric Blazer (Associate Professor, Accounting and Finance)
  • Oliver Dreon (Professor, Educational Foundations)
  • Kerrie Farkas (Professor, English & World Languages)
  • Rachel Finley-Bowman (Associate Professor of Student Success and Dean of University College)
  • Chad Hogg (Assistant Professor, Computer Science)
  • Rob Spicer (Associate Professor, Communication & Theatre)
  • Greg Szczyrbak (Associate Professor, Library)
  • Marc Tomljanovich (Dean, Lombardo College of Business)