About These Guidelines
Background, purpose, and how schools can use and adapt the VINE School GenAI Guidelines.
Overview
Generative Artificial Intelligence (GenAI) is now embedded in the tools and platforms used across education. Since the VINE School first published its GenAI Guidelines in 2023, the technology has moved from a novelty to a standard part of the educational environment. AI capabilities are built into productivity suites, learning platforms, creative tools, and administrative systems. Students and staff may use AI daily — and even where they do not use it deliberately, AI features are increasingly operating in the background of tools they already rely on.
These updated guidelines reflect three years of rapid change. They are informed by feedback from VINE member schools, consultation with educators and students, and alignment with national and international frameworks including:
- Australian Framework for Generative AI in Schools
- OECD/European Commission AI Literacy Framework
- UNESCO Guidance on Generative AI in Education
- UNICEF Policy Guidance on AI for Children
A whole-of-school document
These guidelines are not addressed solely to ICT managers or to teachers — they require engagement from school leadership, curriculum, wellbeing, IT, operations, and governance. AI touches every function of a school, and effective AI governance is a shared responsibility.
How to use these guidelines
Each section follows a consistent structure:
- Part A: Guiding Statements — the VINE School's position on key issues, intended as a starting point for school-level discussion, not as prescriptive rules
- Part B: Practical Tools — checklists, frameworks, and templates that schools can adapt and use immediately
- Part C: Key Roles, Key Questions — acknowledging that these guidelines are read by people in very different positions, with different concerns
Designed to be adapted
The guidelines are published under a Creative Commons licence (CC BY-NC-SA 4.0) and written for a fictional "VINE School" so that schools can customise them to their own context. Schools are at different stages of their AI journey, and the guidelines are designed so that schools can enter them at any point — there is no need to implement every section at once.
An initial audit of existing school policies (see Tool 3.5: AI-Adjacent Policy Audit) is recommended as a starting point, as AI intersects with policies schools already have in place.
A living document
These guidelines are a living document. Schools should review their AI practices at least annually — with a lighter-touch check each semester — and adapt them as the technology, the evidence base, and the needs of their community evolve. The pace of change in AI means that any policy written today will require updating, and schools should build this expectation into their governance from the outset.
Stakeholder Map
These guidelines are read by people in very different roles, with different concerns and sometimes competing priorities.
| Role | Primary Concern | Typical Tension |
|---|---|---|
| Board Directors | Governance, risk, reputation | Want assurance without getting into operational detail |
| Principal | Whole-school strategy, culture | Balancing innovation against risk; answering to both board and parents |
| Head of Curriculum | Teaching quality, assessment integrity | Wants pedagogical freedom; needs clear frameworks |
| Head of Student Wellbeing | Student safety, mental health | Concerned about harms; resists tools that increase risk |
| Business Manager | Efficiency, compliance, cost | Wants to use AI for operations; needs to manage vendor risk |
| ICT Manager | Security, infrastructure, control | Wants secure systems; frustrated by shadow IT and scope creep |
| Teachers | Day-to-day classroom reality | Need clarity and practical support, not abstract policy |
| Parents | Their child's safety and fairness | Worried about cheating, privacy, screen time, and cognitive development |
| Students | Access, fairness, clarity | Want clear, consistent rules and a voice in the conversation |
References and Alignment
National and International Frameworks
- Australian Framework for Generative AI in Schools — the foundational Australian framework for GenAI in K–12 education
- OECD/European Commission Draft AI Literacy Framework (2025) — structures AI literacy into four domains
- UNESCO Guidance on Generative AI in Education and Research — international guidance on ethical and pedagogical use
- UNICEF Policy Guidance on AI for Children (2021) — a rights-based framework for AI impacts on children
- EU Ethical Guidelines on the Use of AI and Data in Teaching and Learning (2022) — seven ethical requirements for trustworthy AI in education
- UK DfE Generative AI Product Safety Standards (2025) — safety and compliance standards for AI products in schools
Additional Resources
- Leon Furze's blog — free articles and resources on AI in education
- Practical AI Strategies — online courses and digital downloads for educators
- Teaching AI Ethics — open-access lesson plans, discussion frameworks, and case studies (CC BY-NC-SA 4.0)
- eSafety Commissioner — Australian Government agency for online safety
- Office of the Australian Information Commissioner — guidance on Australian Privacy Principles