AI tools such as chatbots (e.g., ChatGPT), text generators (e.g., Rytr), and image creators (e.g., DALL·E) are becoming more accessible in education. While these tools can support learning and creativity, they also raise important questions about honesty, originality, and skill development, especially in vocational training where demonstrating personal understanding and hands-on competence is essential.

As a teacher, your role is to help students use AI tools in a way that supports learning without compromising integrity. This starts by understanding your institution’s policies on AI use and guiding students to make informed decisions about when and how it is appropriate to use these tools.

To promote responsible AI use in your classroom, consider these three key areas:

  • Understand Your Institution’s Policy: Policies about AI vary. Some institutions prohibit the use of AI tools in assessments, while others allow limited use (e.g., for idea generation or research) as long as it’s acknowledged. Check your faculty handbook, assessment guidelines, or consult your head of department to understand the current rules.
  • Model Ethical Use: If AI is permitted, explain what responsible use looks like in your classroom. For example, a student may use an AI tool to generate a list of kitchen safety tips but must adapt the content, understand it, and cite the tool used (e.g., “Safety ideas generated using ChatGPT, adapted by student”). Helping students understand not just the rules but the reasons behind them builds a culture of trust and professionalism. It also reinforces that AI is a support tool, not a replacement for developing essential trade knowledge.
  • Weigh the Implications: Used ethically, AI tools can support learning, for example, by helping students explore formats for writing a cleaning checklist or planning a safety procedure. However, overuse or uncritical reliance on AI can result in students skipping the very thinking and decision-making they need to build trade competence. Encourage open conversations about the role of AI in your classroom. When students are involved in defining what ethical use looks like, they are more likely to take ownership of their learning. For example, you could ask: “When is it helpful to use AI, and when does it stop being your own work?” These conversations help students build ethical awareness, a skill they will need in the workplace.

By guiding students in how, when, and why to use AI appropriately, you are not just enforcing rules. You are helping them become responsible digital citizens and skilled trade professionals. Clear boundaries, practical examples, and honest conversations will prepare your students to navigate AI confidently and ethically, both in the classroom and on the job.

Teaching Scenario: AI-Generated Answers. You discover that several students submitted AI-generated answers for a take-home exam, even though you had clearly stated they should “explain in your own words.” They claim they did not realise this was a problem because the tool was freely available and easy to use.

Self-Reflection: What policies would you establish around AI use in assessments? How can you educate students about appropriate and inappropriate uses of AI? What steps can you take to ensure students understand the ethical boundaries as well as the potential benefits of using AI tools in their trade learning?

Share your reflection here. We would love to hear how you plan to raise awareness and promote accountability in your students’ use of AI.