Meeting - May 10, 2024
Minutes
Attendees: Rachel Levy, Alyson Wilson, Mardecia Bell, Milad Abolhassani, Eli Cohen, Bill Rand, Jill Sexton, Cranos Williams, Yara Yingling, Helen Armstrong, Tiffany Barnes, Helen Chen, Huiling Ding, Krista Glazewski, James Lester
Charge: Provost Warwick Arden and Vice Chancellor for Research and Innovation Mladen Vouk shared the charge for the advisory group. The group is not a task force charged with producing a report but an advisory group with subgroups. It will identify emerging issues, with a focus on opportunities and risks for the university. Moving forward, the advisory group can define the most productive ways to discuss issues and interact with leadership. There are multiple areas of interest: research, education, and outreach.
Discussion: AI is also a key issue within university operations. With the expanding use of AI tools, there are immediate security and compliance issues that need to be addressed. There is substantial overlap between the issues arising from university business uses of AI and research, teaching, and outreach issues. The Provost and Vice Chancellor agreed to incorporate a third working subgroup into the advisory group focused on business uses of AI.
Introductions: The advisory group leads (Levy, Wilson), members, and business use representative (Bell) introduced themselves to the group.
Discussion
- There are immediate needs to provide policy and process for research and education. The advisory group's charge does include recommending areas where the university should develop policy and process.
- The advisory group’s website is: https://committees.provost.ncsu.edu/artificial-intelligence/
- The advisory group leads asked for volunteers to lead each of the subgroups.
- The Security and Compliance offices are receiving many requests to evaluate AI tools that will work with university data. The Campus IT Directors have formed a working group, which will be consulted to suggest potential members for the third advisory subgroup. Initial activities for the subgroup could include demonstrations of tools (e.g., Gemini, OtterAI) and providing guidance on best practices.
- Other areas of guidance that the subgroups could address include classroom use, NSF policies, publishing policies, and data security.
- The groups may wish to create a platform to share expertise, best practices, and references.
- Another area to explore includes Institutional Review Board guidance (including sample text for consent forms) for the use of AI in studies involving children and adults.
- Resources: It may be useful to have research assistants to support the subgroups in areas like benchmarking tools, auditing policies, and gathering information for the groups.
- The bulk of the initial discussion focused on generative AI. What are the best practices for using these tools? What policy and guidance do we need to generate?
- What is outside the generative AI space that is important for guidance and recommendations?
- The group recommended identifying an expert from the Office of General Counsel to support the advisory groups around legal issues with AI.
- The group would like to focus not only on challenges but also on opportunities.
- Meetings over the summer will be asynchronous.