AI plagiarism
Define unacceptable copying, transparent assistance, teacher-approved uses, and the evidence students must provide.
Build trust with clear guidance on AI plagiarism, academic integrity, student safety, ethical teaching practices, and school-ready policies.
Design assignments that value process, drafts, oral explanation, citations, and teacher-student conferences.
Teach students to question outputs, verify claims, cite assistance, and revise with their own reasoning.
Minimize personal data, use school-approved tools, respect age limits, and keep human review in the loop.
Use AI to expand access and feedback without lowering expectations or outsourcing relationships.
Make expectations visible to students, families, teachers, and administrators before conflicts arise.
Students may use approved AI tools for brainstorming, feedback, explanation, and revision when the teacher allows it and when assistance is disclosed.
Students may not submit AI-generated work as their own independent thinking, use AI to avoid required practice, or enter private or personal information into AI tools.
When disclosing AI assistance, students name the tool, its purpose, and the changes they made.
Drafts, notes, reflections, conferences, and oral checks provide evidence of academic integrity.
AI-assisted work remains subject to teacher judgment and school policy.
Policies should be easy to explain in parent communication.
A full ban is hard to enforce and misses an opportunity to teach responsible use. Many schools benefit instead from clearly defined permitted and prohibited uses.
AI detection tools produce uncertain results and can flag honest work. Teachers should avoid relying on detection alone and should weigh process evidence, conferences, drafts, and school policy.
Teachers can ask students to identify the tool, the date, the purpose, the type of prompt used, what output was used, and what the student changed or verified.