UNC-Chapel Hill’s Generative AI Committee, which comprises representatives from every academic unit, has released guidance on how to responsibly and ethically use tools like ChatGPT and Gemini. A subcommittee led by Eric Everett, director of research integrity, ethics, and education within OVCR, has crafted specific recommendations for the research community.

This guidance applies to faculty, staff (SHRA and EHRA non-faculty), students (undergraduate, graduate and professional), guest researchers (e.g., unpaid volunteers, interns, and visiting scholars), collaborators, and consultants involved in research occurring under the auspices of the University.

The guidance covers information on the following:

  • The limitations and risks of using generative AI in research
  • Principles on which to base the usage of the technology
  • Frequently asked questions and resources for citation

“Please review this guidance and integrate it into your research and scholarly practices, tailoring it as necessary to suit your specific discipline and accepted research and scholarly practices within your discipline,” says Everett. “Mentors and supervisors should have regular conversations with mentees and other research trainees about the intended use of generative AI in their research programs.”

Given the rapid pace of advancements in generative AI, Everett anticipates that this guidance will continue to evolve. If you have any questions or feedback, please do not hesitate to reach out.
