George Mason University

AI Guidelines for Instructors

As generative AI tools become more integrated into educational spaces, instructors across disciplines are faced with new possibilities, challenges, and responsibilities. This guide provides practical, policy-aligned guidance for Mason instructors on how to thoughtfully and ethically incorporate AI into their teaching practices. It reflects the principles outlined by the George Mason AI Task Force, aligns with institutional values, and serves as a "living document" to be updated as tools and practices evolve.

This manual is not a set of mandates, but rather a framework for informed decision-making. Our goal is to support faculty autonomy while ensuring consistency with university academic standards and student success.

This document will continue to evolve as tools, student expectations, and pedagogical approaches change. Instructors are invited to offer feedback or suggest additions by contacting the AI Task Force or the Stearns Center.


 

Guiding Principles

These principles, adapted from the George Mason AI Task Force Guidelines, provide the foundation for instructor decision-making around AI use:

Human Oversight

Faculty remain accountable for all instructional decisions, even when supported by AI. AI outputs should be reviewed, verified, and never used as a substitute for professional judgment.

Transparency

Faculty should disclose when and how AI tools are used in course design, feedback, or grading. Similarly, students should be expected to disclose their use of AI in academic work.

Data Privacy and Compliance

Instructors must avoid inputting sensitive student information into public AI tools and must comply with all applicable data privacy regulations and university data governance policies.


Critical Thinking

AI should be a tool to enhance—not replace—independent thought. Instructors should model critical engagement with AI outputs and encourage students to do the same.

Accessibility

Instructors should ensure all students can engage with AI-related assignments regardless of their prior access or familiarity. Avoid assignments that assume or require paid or advanced AI tools.

Academic Standards

Instructors must clearly define acceptable and unacceptable uses of AI for each course. Promote understanding of integrity rather than relying solely on detection tools.


 

Ethical Considerations for Faculty Use of AI

Faculty are encouraged to reflect on the ethical dimensions of AI integration:

Disclosure

Instructors should be transparent with students when AI has been used to generate or support course materials, assessment tools, or feedback.

Avoid Overreliance

Instructors should ensure sufficient human oversight when using AI to create assessments, grade student work, or generate feedback. Human judgment is essential to preserve learning goals.

Be Cautious with AI Detection Tools 

Many of these tools are unreliable and raise privacy concerns. If used, they should not be the sole basis for academic standards decisions.

Respect Student Privacy 

Instructors should never submit identifiable student work into AI systems that store or reuse inputs. Use institutional platforms that meet George Mason's data governance standards.

Acknowledge Bias and Limitations

Instructors should discuss with students the limitations of AI tools, including hallucinations, embedded bias, and ethical risks.

Use Cases for AI in Teaching

When thoughtfully integrated, AI tools can support teaching in a variety of ways: 

  • Supplemental Learning 
  • Active Learning Assignments 
  • Discussion Prompts 
  • Customized Feedback 
  • Language Practice 
  • Brainstorming Support

Required Syllabus Language

To support clarity and consistency, instructors are required to include an AI policy in their course syllabi. 

Some example policies are provided below. Additional guidance on writing your own personalized statement is available from the Stearns Center.

Policy Examples 

Strict Use Policy: "The use of generative AI tools (e.g., ChatGPT, Claude, Gemini) is not permitted for any coursework in this class unless explicitly authorized. Unauthorized use will be treated as a violation of academic standards." 

Moderate Use Policy: "Generative AI tools may be used in this course with clear guidelines. Students must disclose when AI tools are used and how they contributed to the work. Misuse may be treated as a violation of academic standards." 

Open Use Policy: "Students are encouraged to explore generative AI tools in this course. Responsible use, transparency, and proper citation are required. Assignments will note when and how AI may be integrated." 

Instructors should also explain the reasoning behind their chosen policy, provide assignment-specific instructions, and link to university-wide academic standards.

Course and Assignment Design Considerations

Faculty are encouraged to seek out resources, whether at the unit or college/school level, through the Stearns Center for Teaching and Learning, or within their discipline, to ensure that their decisions regarding the integration of generative AI tools into curriculum and pedagogy are informed, intentional, and appropriate to their teaching context. 

When integrating AI into your course, consider the following reflective questions about the impacts of AI on your course and field: 

  • What do students need to learn in my discipline or course? 
  • Does using AI aid or limit students' ability to meet the student learning outcomes for the course or curriculum? 
  • What is the role of AI in my field? 
  • Do students need an introduction to AI to succeed in other courses (i.e., do I need to provide prerequisite information for an upper-level course)? 
  • Am I spending enough time teaching students how to use the tools (both technically and ethically)? 
  • Do I have alternative assignment or tool options for students who do not wish to use AI?

Practices to Avoid (Red Flags)

Avoid practices that may compromise integrity, inclusivity, or trust: 

  • Relying entirely on AI for grading or feedback. 
  • Using "AI detection" tools without transparency or due process. 
  • Requiring students to use AI tools, whether paid or free, that share data with third-party companies, without providing accessible, privacy-conscious alternatives. 
  • Feeding identifiable student work into public AI systems. 
  • Creating assignments that reward copying AI output over student engagement.

Insights from the February 2025 Student Survey reveal: 

  • Students use AI for writing, coding, studying, and time management. 
  • Concerns include unclear rules, privacy, overreliance, and fairness.

 

To address these concerns:

  • Offer clear, consistent AI policies in syllabi and class discussions. 
  • Ensure assignments don't penalize students without access to tools. 
  • Foster critical awareness of AI's capabilities and limitations. 
  • When possible, consider surveying students about their AI usage to better meet your students' needs.

Supporting Student Understanding

Communication Strategies

Provide Clear Expectations — Students should be directed to carefully read course syllabi to determine what uses of AI are allowable or expected. If the syllabus does not include a statement on AI use, students should know that they can ask the instructor for clarification. 

Be Transparent About Your Practice — Ideally, faculty should inform students about their approaches to AI use and whether they use AI detection services to check student work. Faculty should also be transparent about any AI used in evaluating student work.

Develop Critical Literacy — Students should be encouraged to develop a critical perspective on AI use. They should clearly understand what AI can and cannot do, and how it might affect their learning process. To make informed decisions about AI use, students first need instruction in how generative AI systems work and how to differentiate between models, along with guidance on the affordances and limitations of these tools and on the legal, ethical, and environmental consequences of AI use.

Learning-Centric Approach 

Help students understand that they should prioritize their learning outcomes when using AI. While AI can enhance efficiency, it should not replace critical thinking, problem-solving, or deep engagement with course material. Students remain responsible for any AI-produced content they choose to use, whether in coursework, extracurricular activities, internship sites, or other learning environments.

Supporting Student Development

Professional Preparation 

Students should have the opportunity to learn about employers' expectations regarding their facility with and use of generative AI tools. These opportunities may be offered by faculty in their classes, but may also be available from extracurricular sources.

Policy Input 

Students should be encouraged to provide input on institutional-level AI policies to ensure their relevance and effectiveness. Decisions about individual course-level policies should remain the sole responsibility of course instructors.

  • The Architecture Standards Review Board (ASRB) is responsible for reviewing, verifying compliance, and providing recommendations with regard to new and upgraded software and hardware procurement projects.

  • DDSS works with students, faculty, and staff by providing research support to facilitate data-focused research and teaching across the university in all disciplines.

  • The Information Technology Security Office (ITSO) provides computer users and system administrators with the tools and information they need to secure their systems.

  • The Institutional Review Board (IRB) provides review and oversight of all human subjects research at the university, and is charged with protecting the rights and welfare of human subjects recruited to participate in research in which the university is engaged.

  • The Office of Research Computing (ORC) coordinates and supports the computing-related needs of the university's research community.

  • The Office of Research Integrity and Assurance (ORIA) is responsible for compliance with federal and other relevant regulations pertaining to the ethical and responsible conduct of research (e.g., IRB, IACUC, Export Controls, Conflict of Interest).

Instructor Resources and Support

Faculty are encouraged to explore the following:

Stearns Center Resources

  • Sample syllabi, assignment templates, and AI pedagogy workshops.
  • "Generative AI in the Classroom": model policies and teaching tools. (TBA)

University Policies and Guidelines