These guidelines provide a framework for understanding when and how to use generative AI applications responsibly and ethically, especially within your courses and degree programs. They will help you make informed decisions about AI use while prioritizing your learning outcomes and maintaining academic integrity.
AI guideline documentation will continue to evolve as tools, student expectations, and pedagogical approaches change.
Critical Literacy Approach to AI
Students should be encouraged to develop a critical perspective on AI use. They should clearly understand what AI can and cannot do, as well as how it might affect their learning process. While AI can quickly produce content for coursework, students should be aware that relying on it may come at the expense of deep learning and skills development.
Human Agency and Learning-Centric Approach
Students should prioritize their learning outcomes when using AI. While AI can enhance efficiency, it should not replace critical thinking, problem-solving, or deep engagement with course material. Students are responsible for any AI-produced content they choose to use, whether in coursework, extracurricular activities, internship sites, or other learning environments.
Accessibility
Students should use AI to promote inclusive education while being mindful of potential biases. AI tools used in courses must comply with accessibility standards to ensure equal access for students with disabilities.
Safety, Ethics, and Privacy
Students should adhere to ethical standards, protect their privacy (as well as the privacy of peers, instructors, and third parties), and be aware of cybersecurity risks when using AI technologies.
Academic Integrity and Course Policies
How Do You Know What Is Allowed or Prohibited in Your Classes?
Check the Syllabus — Ideally, every course should include an explicit description of whether AI use is allowed in the course, and if so, what it can and cannot be used for. Faculty often have good reasons for curtailing AI use, and those reasons should be spelled out in the syllabus. Some courses may encourage or even require AI use; these requirements should also be detailed in the syllabus.
Ask Your Instructor — You are expected to seek clarification from your instructor if their course policies on AI use are not clearly stated in the syllabus. You should also confirm that your understanding of what counts as permitted use matches your instructor's expectations.
Transparency and Academic Integrity
In all cases, students should refer to the acceptable use policies in each course's syllabus and follow those policies, as well as any applicable policies in the relevant department or unit. Because learning outcomes differ depending on the focus of the course and the discipline, rules and guidelines about AI use may vary considerably from class to class.
Use of AI applications for completing any stage of an assignment should be acknowledged. Students should clearly acknowledge when AI tools have contributed significantly to their work, especially in academic writing and creative outputs. Claiming AI-produced content as your own work, or otherwise using it in ways that are not explicitly allowed by a given course's syllabus or departmental policies, is a violation of academic standards.
How Do Faculty See the Use of AI?
You Are Responsible for Your Learning — Faculty want you to prioritize learning over efficiency when it comes to AI. If you outsource your learning (and thinking) to AI, you won't be prepared to perform adequately in your future employment.
You Are Responsible for AI Outputs that You Use — When you do elect to use AI, you will be held responsible for content that it produces, whether used in coursework, extracurricular activities, internship sites, or other learning environments.
Faculty May Use AI Detectors — Some faculty may use AI detectors on work that is turned in for a grade, similar to plagiarism detection systems. These systems often flag writing as AI-generated even when the only AI involved is a tool such as Grammarly or Microsoft Copilot. If your instructor indicates they are using an AI detector, you may wish to document your writing process as evidence that you haven't used AI.
Privacy and Data Protection
You should avoid inputting sensitive, confidential, or personal information into AI tools unless you have obtained explicit, informed consent from the individuals and/or entities affected and the anticipated use is not otherwise prohibited by data privacy or other applicable laws. This includes the work of peers and instructors. Open AI systems (that is, those that are not George Âé¶¹ÊÓÆµ-specific and that don't include privacy protections) should not be used in the process of providing peer reviews.
Intellectual Property and Copyright
You should not upload copyrighted materials into AI systems without proper authorization, including course materials, peer- or instructor-produced content, and materials from Âé¶¹ÊÓÆµ Libraries. You should be aware that AI-generated content may draw from existing copyrighted materials.
Professional Preparation and Employer Expectations
How Do Employers See the Use of AI?
Students Should Know How to Use AI Effectively — In the current landscape of "efficiency," employers expect new graduates to know how to use AI to be more efficient and productive. Even if your inclination is to reject using AI for most of your classwork, you should nonetheless understand how to use these systems at least to the level expected by employers.
Students Need to Demonstrate Value Add — Effectively using AI includes evaluating, correcting, and enhancing AI outputs, and you need to be able to show that you have the ability to do so: that is the value that you bring to the job.
Applying for On-Campus Jobs — On-campus employers report receiving many AI-generated cover letters and responses to application questions, and because AI tools produce very similar content, these applications all begin to sound the same. Set yourself apart by personalizing and customizing your materials.
Participation in Policy Development
Students should be encouraged to provide input on institutional-level AI policies to ensure their relevance and effectiveness. Decisions about individual course-level policies should remain the sole responsibility of course instructors.
What Factors Can Help You Decide Whether to Use AI?
Guidance from Instructors — Consult the syllabus for your course to be sure you know which uses are allowed and which uses are prohibited. If you are using AI outside of class, consider the ethical and environmental costs of such use and the possible consequences of using AI for any given task.
Safety and Privacy — Will using AI expose you to risks to your safety or privacy? George Âé¶¹ÊÓÆµ provides access to AI tools that don't send your work back to the AI companies, but even so, you should avoid providing AI with personally identifiable information. It's important to also protect the privacy of your peers and instructors when providing inputs to AI applications.
Environmental Impact — Every prompt provided to an AI system uses electricity and water. If you are simply doing a search for information, using a search engine uses far less energy than using a chatbot and will almost certainly provide more accurate results.
Copyright and Intellectual Property — You should not upload copyrighted materials into AI systems without proper authorization, including course materials, peer-produced content, or materials from Âé¶¹ÊÓÆµ Libraries.
Beneficial Uses of AI for Learning
How Might You Use Generative AI Systems to Help You Learn?
Reading Assistance — AI applications can be used to summarize readings and to assist in reading difficult texts; however, keep in mind that AI-generated summaries often contain errors. You will benefit from trying to read through the text, even if it is difficult, before using AI to summarize or guide your reading.
Discussion and Language Practice — AI can be used to generate discussion prompts, to practice language skills, and to simulate conversations. While AI applications can point out errors and suggest corrections, asking AI to fully produce texts takes away your opportunity to learn through the process of writing.
Feedback and Self-Assessment — You can ask AI systems for feedback on your work, but you should make sure to understand what is allowed in any given course. One effective use of AI for learning is to prompt it to act as an instructor or expert who can explain ideas and concepts to you and also create questions that can help you test your knowledge.
Supplemental Learning and Ideation — You can use AI tools to provide additional explanations or examples on any topic, but be sure to check their answers to ensure they are correct. AI systems can also be useful for brainstorming or prototyping, but make sure that using them in this way is acceptable in each class.
Extracurricular and Supplemental Uses
AI for Tutoring
You should be allowed to use AI-powered tutoring tools to supplement your learning, with the following considerations:
- AI tutoring should not replace human interaction with instructors or teaching assistants.
- You must disclose your use of AI tutoring tools when submitting work if required by course policy.
- You are responsible for verifying the accuracy of information provided by AI tutors.
Course and Instructor Selection
Be aware of the following guidelines when using AI tools to gather information about courses and instructors:
- AI-generated recommendations should not be the sole basis for course or instructor selection.
- You should verify AI-provided information through official university sources.
- The use of AI to create or spread false or misleading information about courses or instructors violates university policies (and may be illegal in some cases).
Use in Scholarships
Students should avoid using AI tools for scholarship applications, both in essays and for completion of the application form, as many grantors will reject applications with AI-generated content.
When to Avoid Using AI
Don't Use It for Peer Review
The process of peer review is actually most helpful for the person doing the reviewing: if you outsource your review work to AI, you won't learn from doing it. And you should never submit the work of peers or instructors to AI systems without their explicit consent.
Don't Use It When Applying for Scholarships
Unless explicitly instructed otherwise, avoid using AI tools for scholarship applications, including essays and the completion of the application form. If the review committee suspects or can tell that you have used AI, your application is likely to be rejected.
Don't Use It if It Is Harmful or Illegal
- Don't use AI to impersonate instructors or other students, to disrupt classes, to threaten instructors or fellow students, or to create or spread false or misleading information about courses or instructors.
- Don't use AI to generate false or misleading academic records or recommendations.
- Don't use AI tools to circumvent academic integrity policies or gain unfair advantage in assessments.
Open Questions
Recording and Summarizing Classes
What if you want to use AI tools to assist in note-taking and summarizing lectures? Guidance may include:
- You should obtain explicit permission from the instructor before recording any class sessions.
- You should not share recordings or AI-generated summaries with individuals outside the class without the instructor's consent.
- You should respect the privacy and intellectual property rights of both instructors and fellow students.
Students should have the right to attend classes without being recorded by AI tools used by others. Instructors should establish clear guidelines for recording and using AI in their classrooms.
George Âé¶¹ÊÓÆµ Resources
- (ASRB) is responsible for reviewing, verifying compliance, and providing recommendations regarding new and upgraded software and hardware procurement projects.
- (DDSS) works with students, faculty, and staff by providing research support to facilitate data-focused research and teaching across the university in all disciplines.
- The Information Technology Security Office (ITSO) provides computer users and system administrators with the tools and information they need to secure their systems.
- The Institutional Review Board (IRB) provides review and oversight of all human subjects research at the university and is charged with protecting the rights and welfare of human subjects recruited to participate in research in which the university is engaged.
- The Office of Research Computing (ORC) coordinates and supports the computing-related needs of the university's research community.
- The Office of Research Integrity and Assurance (ORIA) is responsible for compliance with federal and other relevant regulations pertaining to the ethical and responsible conduct of research (e.g., IRB, IACUC, Export Controls, Conflict of Interest).