Privacy & Security
Data governance, shadow IT, risk zoning, vendor transparency, age-appropriate access, and the AI Lead role.
Privacy and security governance must be proportional — managing risk without stifling innovation. These statements establish the VINE School's approach to AI data governance.
Guiding Statements
3.1 Data Governance
No personal or identifying information about students or staff is to be entered into any AI application or service. This includes names, student numbers, learning data, behavioural records, and any information that could be used to identify an individual.
When AI features are activated within existing platforms (e.g. Microsoft 365 Copilot, Google Gemini), schools must understand where data is processed, whether it leaves Australia, whether it is used to train AI models, and what retention policies apply.
Staff must not enter student work into AI platforms without transparency and consent.
3.2 Shadow IT and Shadow AI
Shadow AI is the use of AI technologies outside of sanctioned channels. The VINE School recognises that shadow AI is most often a rational response to unmet needs.
The VINE School does not aim for zero AI tool use outside the approved list. It aims for zero covert AI tool use.
The primary risk is not that staff or students use AI, but that they use it without anyone knowing — making risks invisible and unmanageable. Unapproved AI use that is disclosed is treated as a governance opportunity, not a disciplinary matter.
3.3 Risk Zoning
The VINE School applies proportional governance through a three-zone model:

Zone 1: Personal Productivity
Staff use AI for brainstorming, drafting, organising notes, and administrative tasks. No student data is involved.
Governance: general awareness and transparency. No formal approval required.

Zone 2: Classroom-Facing
AI is used in teaching, generating resources, providing feedback, or any context where AI-generated content reaches students.
Governance: tools must be from the approved set or vetted through the fast-track process. Professional learning is required.

Zone 3: Student Data Involved
AI processes student work, learning analytics, or reports. Personal or identifying information must not be entered.
Governance: formal approval, a privacy impact assessment, and IT and leadership sign-off are required.
The answer to "can I use this?" is not yes or no — it is "which zone does this fall into?"
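As an informal illustration only, the triage question above can be sketched as a small decision helper. The function and field names here are hypothetical, not part of any school system; Zone 3 takes precedence over Zone 2 because student data carries the strictest requirements:

```python
from dataclasses import dataclass

@dataclass
class ProposedUse:
    """Hypothetical description of a proposed AI tool use."""
    involves_student_data: bool  # AI processes student work, analytics, or reports
    reaches_students: bool       # AI-generated content reaches students

def risk_zone(use: ProposedUse) -> int:
    """Return the governance zone (1-3) for a proposed use."""
    if use.involves_student_data:
        return 3  # formal approval, privacy impact assessment, sign-off
    if use.reaches_students:
        return 2  # approved or fast-track-vetted tools, professional learning
    return 1      # personal productivity: awareness and transparency only

# Example: a teacher drafting a newsletter with AI, no student data involved
print(risk_zone(ProposedUse(involves_student_data=False, reaches_students=False)))  # 1
```

The point of the precedence order is that a single use can touch more than one zone; governance always follows the highest-risk zone it touches.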
3.4 Approved Tools and the "Paved Road"
The VINE School provides a self-service catalogue of pre-vetted AI technologies and services. The goal is to make the official path the path of least resistance. If the approved toolset is harder to access than the shadow alternative, shadow behaviour is a predictable and rational outcome.
Shadow AI is treated as a signal, not a failure. When patterns of unsanctioned use are identified, the school treats this as feedback that the approved toolset has a gap.
3.5 Vendor Transparency
Before any AI tool is approved for Zone 2 or Zone 3 use, the VINE School requires clear answers from the vendor on where data is stored and processed, whether data is used to train AI models, data retention policies, compliance with the Australian Privacy Principles, content filtering, and safety measures.
The VINE School notes that vendor terms and conditions change frequently and without notice. The AI Lead monitors changes to the terms of approved tools and flags material changes to school leadership.
3.6 Age-Appropriate Access
Different year levels require different levels of AI access and different safeguards. The VINE School maintains an up-to-date register of age requirements for approved tools. Where a platform's terms of service specify a minimum age, the school enforces this.
3.7 The AI Lead
The VINE School designates an AI Lead — a staff member with responsibility for coordinating the school's approach to AI across all three pillars. Responsibilities include coordinating professional learning, maintaining the approved tools catalogue, monitoring vendor terms, liaising with ICT on governance, supporting staff with AI integration, and serving as the first point of contact for AI-related queries.
3.8 Transparency in School Operations
The VINE School discloses the use of AI in school operations, including administration, reporting, communications, and marketing. Any use of AI in generating student reports, parent communications, or public-facing content is reviewed by a human before distribution.
3.9 AI in Meetings and Recordings
AI-powered meeting recording, transcription, and summarisation tools must not be used in any meeting unless all participants have given their consent. Consent must be obtained at or before the start of each meeting and must not be treated as a standing arrangement.
Key Roles, Key Questions
| Role | Key Questions | Guidance |
|---|---|---|
| ICT Manager | How do we secure our systems when AI is embedded in everything? How do we manage shadow AI without being the "no" department? | 3.1, 3.2, 3.3 |
| Business Manager | What is the risk profile of AI technologies in use? What's our liability if student data ends up in an unapproved tool? | 3.1, 3.3, 3.5 |
| AI Lead | How do I keep track of vendor terms as they change? How do I support staff while maintaining governance? | 3.2, 3.4, 3.5 |
| Teachers | What can't I do? If I've been using a tool that isn't approved, what happens when I tell someone? | 3.2, 3.3 |
| Principal | Do I have a clear governance structure I can explain to the board and the community? | 3.1, 3.2, 3.4, 3.7 |
| Board Directors | Are we compliant with Australian Privacy Principles? Is our governance proportional? | 3.1, 3.2, 3.3, 3.7 |
| Parents | Is the school keeping track of what AI technologies are being used with my child's data? | 3.1, 3.4, 3.5, 3.7 |