
Here’s how Carolina faculty use AI

They are making sure students are literate in artificial intelligence and embracing it in their own teaching and research.

Daniel Anderson, professor of English and comparative literature, developed training modules as part of the Carolina AI Literacy initiative to help students learn how to use generative AI platforms responsibly and effectively. His goal is to ensure that all Carolina students become AI literate. (Jon Gardiner/UNC-Chapel Hill)

The concept of artificial intelligence has been around since antiquity. Sacred statues in Egypt were believed to be imbued with real minds that could answer any question put to them.

These days, we have Google.

We also have streaming services, businesses and social media platforms that use AI to track user behavior and serve personalized content and advertisements. Health apps analyze biometric data to offer personalized fitness insights. Navigation systems combine real-time and historical traffic patterns with predictive analytics to recommend routes.

What is different today is the surge in the amount of data, computational power and the public release of generative AI platforms like ChatGPT or Microsoft Copilot. In the world of artificial intelligence, the more data, the more robust these models and algorithms become.

This has led to an exponential increase in the ways this technology can be used. For some, ethical issues loom just as large: data privacy, structural bias, copyright, plagiarism and disinformation are all hot topics in the debate over AI ethics.

While some view the rise of AI as a threat, others see its potential in learning. One of the latter is Daniel Anderson, professor in the College of Arts and Sciences’ English and comparative literature department. In his research, Anderson studies the intersection of computers and writing.

Last summer, Anderson allowed English composition students to use generative AI platforms throughout the course and found mixed results. When brainstorming topic ideas, identifying key words or summarizing verified documents, AI was helpful. But when asked to produce a literature review with the 10 best sources, the chatbot generated false references, a phenomenon known as an AI hallucination.

These hallucinations taught students a valuable lesson in verification.

“A lot of what I hoped to see, as a teacher, happened,” said Anderson. “By spending time with these activities, students were able to have the lightbulb go off for themselves.”

Based on this experience, Anderson built training modules as part of the Carolina AI Literacy initiative to help students learn how to use these platforms responsibly and effectively.

Anderson’s goal is to ensure that all Carolina students become AI literate.

“Not all students need a deep understanding of AI, but they should know it’s trained on data and that data can come with limitations and affordances,” he said. “Your bank is going to be using AI; your doctor is going to be using it. It’s useful to have a sense of how this technology is mediating your life.”

Anderson is not the only Carolina faculty member embracing AI in teaching and research:

  • An assistant professor with appointments in the computer science department and the School of Data Science and Society focuses on improving machine learning models, a type of AI that learns from data in order to make predictions.
  • A professor in the biology and genetics departments and in the UNC School of Medicine’s integrative program for biological and genome sciences investigates histones and their role in gene expression.
  • An assistant professor in the computer science department works on video understanding and computer vision technologies.
  • An assistant professor in the exercise and sport science department, a core faculty member in the Matthew Gfeller Center and co-director of the STAR Heel Performance Lab, uses AI in athlete performance prediction, injury prevention and recovery.