Faculty Guidelines
As educators and subject matter experts, faculty should play a central role in shaping
how students understand and ethically use generative AI, equipping them for success
in both coursework and the workforce. IVC encourages faculty to set clear expectations,
model responsible AI use, and design assignments that foster critical thinking and
digital literacy.
Key Recommendations/Best Practices
- Use the ASCCC continuum guidelines, "Academic Integrity Policies in the Age of Artificial Intelligence (AI)," to craft a specific, detailed syllabus statement regarding the use of AI in your course
- Consider using the ASCCC's system of levels (Open, Conditional, Restricted, Closed), which reflect varying degrees of AI acceptability
- Be transparent about your own use of AI and any AI-detection tools
- Remember that all AI “detectors” are fallible and should not be relied upon exclusively
- Stay informed through ongoing professional development on AI tools, pedagogy, and
digital ethics
- Clearly articulate AI expectations and guidelines on assignments
- Any AI usage must align with IVC’s Academic Honesty Policy
- Provide accommodations for students who may have concerns about accessibility, privacy,
and/or the environmental impact of AI
- Provide guidance and context to help students understand AI’s benefits, limitations,
and risks
- Make space for students to ask questions about AI use in your class
- Including language like, “If you’re unsure whether a specific tool or use is allowed,
please ask me,” helps foster a supportive environment and reduces unintentional misuse.
- Acknowledge the dual nature of AI: it can both reduce equity gaps and reinforce bias;
streamline learning and spread misinformation; foster engagement and lead to disengagement
- Align AI use with institutional, program, and learning outcomes
- Support inclusive access while maintaining high standards for academic integrity
Adding an AI Statement in Your Syllabus
In addition to the general AI statement provided in the IVC syllabus template, faculty
should include a course-specific AI use policy. This statement should address how
AI relates to the course’s learning objectives, and should include the following:
- What AI tools are allowed (if any)
- Acceptable and unacceptable use
- Provide assignment-specific examples to reduce ambiguity. For instance:
- Acceptable: Using Grammarly to check grammar before submitting a final paper
- Unacceptable: Using ChatGPT to write a discussion board post or essay without permission
- When and how students must cite AI use
- For example, you might ask students to:
- Cite the tool (e.g., ChatGPT, DALL·E)
- Include the prompt they used
- Provide a screenshot or transcript of the AI-generated output
- Whether AI detection tools will be used
- How the instructor may use AI for instruction, grading, or feedback
What if a Student Misuses AI?
If you suspect a student has misused AI:
- Refer to your syllabus policy and IVC’s Academic Honesty Policy – BP/AP 5500.
- Gather documentation (e.g., AI-generated text, prompts used).
- Have a conversation with the student when appropriate.
- Report academic dishonesty through the usual channels if needed.
Tip: Faculty are encouraged to build transparency into assignments (e.g., requiring
screenshots of AI chats or prompts) to help prevent confusion.
Resources for Faculty
- Sample AI Syllabus Language – from ASCCC and other CCC faculty
- ASCCC AI Integrity Resource Document
- Stanford AI Teaching Guide
- CCC AI Resource Hub
Use of AI Tools and Detectors
- All use of technology (including educational technology tools, such as AI detectors) and related practices must comply with IVC Board and Administrative Policies.
- If you use an AI detection tool, be transparent with your students about it and make sure the tool is secure.