Implementation Guidance for AI-Restricted Courses

 

Brief Description

Establishing clear guidelines for “No-AI” or AI-restricted courses is essential not only for upholding academic integrity but also for empowering students to become thoughtful, responsible users of emerging technologies. When instructors clearly define where AI may be used (such as independent study) and where it may not (such as graded assignments), they help students build important skills in writing effective prompts, checking sources, and thinking critically about their own learning process. This approach ensures that students remain AI-literate: they understand the limits of automated outputs, know how to evaluate and attribute information correctly, and can leverage AI as a constructive learning coach rather than a shortcut to a finished product. In turn, it fosters a culture of honesty and accountability, reduces the risk of academic dishonesty, and ultimately strengthens students’ confidence and competence in making informed decisions about when and how to use AI.

 

Implementation Steps

The following five-step approach provides a comprehensive framework for implementing AI boundaries effectively:

Step 1: Begin by clearly stating in your syllabus that AI tools are not permitted in graded submissions but may be used for self-study purposes. This distinction is crucial for helping students understand the boundaries while still allowing them to benefit from AI as a learning tool. Emphasize the importance of academic integrity and foundational skill development in your course materials, and consider using the WMU AI in the Syllabus resource as a guide for outlining these expectations.

Support Materials:

Step 2: Present students with a simple AI literacy model that explains both why and how they can use AI to coach themselves through the learning process. Focus on teaching essential skills such as prompt crafting, reflection techniques, and information verification. This framing helps students understand AI as a learning coach rather than a replacement for their own thinking.

Support Materials:

Step 3: Demonstrate four or five model prompts during class time to show students effective ways to interact with AI for learning purposes. Examples might include prompts like "I don't understand this problem—what's the first step?" or "Can you help me identify what concepts I should review before tackling this topic?" Consider sharing a prompt bank that students can reference, or pointing them to custom GPTs or other tools designed specifically for course-related tasks such as writing feedback or subject-specific tutoring.

Support Materials:

Step 4: Provide students with a comprehensive list of reflection questions to consider each time they use an AI tool for learning. These questions should guide them to think critically about their AI interactions, for example:

- What did I learn from this AI interaction?
- How did AI help me move forward?
- What worked in my prompt, and what didn't?
- How will I check this content's correctness and verify the legitimacy of AI-provided sources?
- Am I preserving my own voice and original ideas, and attributing AI contributions properly?
- What will I do differently next time?

Support Materials:

Step 5: Create and post a clear "Yes/No" AI-use chart in your Elearning platform that explicitly lists acceptable uses (such as student independent self-study and concept clarification) versus prohibited actions (such as final solution generation or assignment completion); a minimal example appears below. This visual reference helps eliminate confusion about appropriate AI use. Send regular email reminders to reinforce these boundaries and maintain consistent communication about expectations throughout the course.
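A minimal version of such a chart, built only from the examples named above, might read as follows; adapt and expand the entries to fit your own course:

Yes (acceptable uses):
- Independent self-study and review of course material
- Asking AI to clarify a concept from class

No (prohibited uses):
- Generating final solutions to graded problems
- Completing or writing any part of a submitted assignment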

Support Materials:

By following this structured approach, instructors can successfully implement AI-restricted course policies that maintain academic integrity while still preparing students to be thoughtful, ethical users of AI technology in their future academic and professional endeavors.

Enas Aref, a former AI Graduate Fellow with the WMU Office of Faculty Development and doctoral instructor in the Industrial & Entrepreneurial Engineering & Engineering Management Department at WMU, is currently an Assistant Teaching Professor of Management & Technology in the College of Engineering and Innovation at Bowling Green State University. Aref's research interests include Ergonomics and Human Factors, STEM Education, Artificial Intelligence, User Experience (UX), and Engineering Management.