Education

Your staff use AI every day. Do you know what student data they're sharing?

Vireo Sentinel shows education leaders what's happening and catches sensitive student data before it leaves.

What's actually happening

Staff aren't being malicious. They're being efficient. But each of the scenarios below is a potential privacy breach.

Report cards

A teacher pastes observation notes into ChatGPT. Student names, learning difficulties, family circumstances. All on OpenAI's servers.

Counselling notes

A counsellor types up meeting notes about a student's mental health into Claude. Sensitive health information, and it just left the building.

Suspension letters

An admin officer pastes an incident report into an AI tool. Names, dates, what happened. Since June 2025, parents can sue for serious privacy invasions.

Blocking doesn't fix it. Policies alone don't either.

Staff work from home, use personal devices, and access AI through phones. Bans push usage underground. Only 7.5% of employees have received extensive AI training (WalkMe, 2025).

78% use AI tools not provided by their employer (WalkMe, 2025)
8.5% of prompts to AI tools include sensitive data (Harmonic Security, 2025)
60% say unsanctioned AI is worth the risk to meet deadlines (BlackFog, 2026)

See what's happening across your department

Most organisations discover AI usage they didn't know about within the first week.

Start free

The data at risk

This isn't generic corporate data. It's information about children.

Student records

Names, dates of birth, addresses, enrolment details, attendance records, student IDs.

Learning and development

IEPs, learning difficulties, gifted and talented assessments, academic performance data.

Behavioural and wellbeing

Incident reports, counselling notes, mental health observations, family circumstances, child protection information.

Staff and administrative

HR records, performance reviews, payroll details, internal communications, board papers.

Parent communications

Custody arrangements, financial hardship disclosures, meeting notes about medical needs.

Child protection

Mandatory reporting notes, at-risk assessments, welfare referrals, inter-agency correspondence.

Regulatory drivers

Existing privacy laws already create obligations around how student data is handled with AI tools.

The Privacy Act 1988, as amended by the Privacy and Other Legislation Amendment (POLA) Act 2024

Australia's privacy laws changed substantially in December 2024. For education departments, four changes matter most.

Statutory tort for serious invasion of privacy

In effect from 10 June 2025

Parents can sue on behalf of their children for up to $478,550 in damages. They don't need to prove financial loss. Emotional distress is enough. A staff member pasting counselling notes into ChatGPT could trigger this.

Children's Online Privacy Code

Due 10 December 2026

The OAIC is developing a mandatory code for online services likely to be accessed by children. AI tools used in educational settings are expected to fall within its scope.

Enhanced penalties

In effect now

The greater of $50 million, three times the benefit obtained, or 30% of adjusted turnover for serious breaches. The OAIC's enforcement priorities for 2025-26 specifically include scrutinising AI and biometric technologies.

Automated decision-making transparency

Due 10 December 2026

Organisations using automated processes for decisions affecting individuals must disclose this in their privacy policy. Relevant for any school using AI for grading, assessment, or student support.

The Australian Framework for Generative AI in Schools (endorsed June 2025) explicitly requires schools to restrict uploading personally identifiable information into generative AI tools.

UK GDPR and Data Protection Act 2018

Schools are data controllers with specific obligations around children's data under the UK GDPR and Data Protection Act 2018.

Data Protection Impact Assessments (DPIAs)

Required now

Schools must conduct DPIAs before deploying new technology that processes children's data. Using AI tools without one is a compliance gap the ICO actively investigates.

ICO Children's Code (Age Appropriate Design Code)

In effect now

15 standards for online services likely to be accessed by children. Data minimisation, default privacy settings, and transparency are all directly relevant when staff use AI tools with student data.

Lawful basis for processing

Required now

Sharing student data with third-party AI providers requires a lawful basis. Most school privacy notices don't cover sending personal data to ChatGPT or Claude. That's a compliance gap.

ICO enforcement and fines

In effect now

The ICO can issue fines of up to £17.5 million or 4% of global annual turnover, whichever is higher. Education sector enforcement actions have increased, with the ICO specifically highlighting AI as a priority area.

EU AI Act and GDPR

The EU AI Act creates specific obligations for AI systems used in education, classifying several educational uses as high-risk.

High-risk classification for education AI

August 2026

AI systems used for admissions, evaluating learning outcomes, assessing the appropriate level of education, and monitoring students during tests are classified as high-risk under Annex III. This triggers requirements for documentation, logging, human oversight, and risk assessments for any institution deploying such tools.

GDPR data minimisation

In effect now

Personal data sent to AI platforms must be limited to what is strictly necessary. Staff pasting full student records into AI tools is a clear data minimisation violation.

AI Literacy requirements

February 2025

Organisations must ensure staff have sufficient AI literacy. Education departments need to demonstrate their people understand the risks of sharing student data with AI tools.

Penalties

In effect now

GDPR fines reach up to 20 million EUR or 4% of global turnover, whichever is higher. The EU AI Act adds penalties up to 15 million EUR or 3% of global turnover for non-compliance with high-risk requirements.

FERPA, COPPA, and state laws

US education institutions face a layered compliance landscape across federal and state legislation.

FERPA (Family Educational Rights and Privacy Act)

In effect now

Protects student education records. When staff paste student information into commercial AI tools, that data may be shared with a third party without consent, creating a potential FERPA violation. Penalties include loss of federal funding.

COPPA (Children's Online Privacy Protection Act)

Updated rules effective 2025

Applies to children under 13 with significant per-violation penalties. The FTC finalised updated COPPA rules in January 2025 with tighter restrictions on how companies can handle children's data.

State privacy laws

Varies by state

California's SOPIPA, New York's Education Law 2-d, Colorado's Student Data Transparency and Security Act, and similar legislation in 40+ states create additional compliance requirements specific to student data.

FTC enforcement

Active enforcement

The FTC has actively pursued cases against companies mishandling children's data, with settlements in the hundreds of millions. EdTech companies are a specific focus area.

How Vireo Sentinel helps

See what's happening

Which AI tools staff use, how often, and what categories of work go in. Spot patterns you didn't know existed.

Catch data before it leaves

Real-time detection of student data. Warns the teacher and gives them options: cancel, redact, edit, or override with justification (see the sketch below).

Prove governance works

Compliance reports mapped to relevant frameworks. Evidence that your controls are working, not just that you have a policy.
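
To make that detection flow concrete, here is a minimal TypeScript sketch of warn-before-submit checking as it might run inside a browser extension. Everything in it is an illustrative assumption: the patterns, the STU- student-ID format, and the detect and redact helpers are invented for this example, not Vireo Sentinel's actual rules or API.

```typescript
// Illustrative only: invented patterns and helpers, not Vireo Sentinel's code.

type Detection = { label: string; match: string };

// Deterministic patterns for common student identifiers (assumed formats).
const PATTERNS: { label: string; regex: RegExp }[] = [
  { label: "Student ID", regex: /\bSTU-\d{6}\b/g },
  { label: "Date of birth", regex: /\b\d{1,2}\/\d{1,2}\/\d{4}\b/g },
  { label: "Email", regex: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g },
];

// Scan a prompt before it is submitted to an AI tool.
function detect(text: string): Detection[] {
  const hits: Detection[] = [];
  for (const { label, regex } of PATTERNS) {
    for (const m of text.matchAll(regex)) {
      hits.push({ label, match: m[0] });
    }
  }
  return hits;
}

// Replace each detected span with a labelled placeholder, entirely client-side.
function redact(text: string): string {
  return PATTERNS.reduce(
    (out, { label, regex }) => out.replaceAll(regex, `[${label.toUpperCase()}]`),
    text,
  );
}

// The warn-and-choose flow: surface what was found, offer a redacted version.
const prompt = "Summarise progress notes for STU-482913, DOB 04/09/2014.";
const hits = detect(prompt);
if (hits.length > 0) {
  console.warn(`${hits.length} identifier(s) found:`, hits.map((h) => h.label));
  console.log("Redacted:", redact(prompt));
  // A real extension would now let the user cancel, redact, edit, or
  // override with a written justification before the prompt is sent.
}
```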

What this looks like in practice

Report card season

Teacher pastes observation notes with student names into ChatGPT. Vireo detects the identifiers, warns the teacher, offers to redact before submission.

Welfare meeting summary

Counsellor types up notes about a student's home situation. Vireo catches health-related language and family details. They remove the identifying information and get a clean summary.

Policy drafting

Department head pastes real incident reports into AI to update behaviour policy. Vireo flags the real names and dates before they reach the platform.

IEPs and medical data

Coordinator pastes assessment results and diagnoses into AI. Vireo detects health information patterns and prompts them to redact before submission.

Built for education

Warns, doesn't block

Choices, not roadblocks. Teachers keep moving.

Deploys in minutes

Browser extension. No agents, no proxies, no IT project.

Privacy by design

Data detected and redacted in the browser, before it reaches our servers.

Affordable

Enterprise governance without the enterprise price tag.

Explainable detection

Deterministic pattern matching. No black boxes. Predictable and auditable.
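
As a rough illustration of what deterministic and auditable means in code: each finding records which rule fired and where it matched, and the same input always produces the same findings, so any decision can be replayed and explained. The rule names and formats below are assumptions for the example, not Vireo Sentinel's real rule set.

```typescript
// Illustrative only: assumed rule names and formats, not Vireo's rule set.

interface Finding {
  rule: string;   // the named rule that fired
  start: number;  // offset of the match within the input
  length: number; // length of the matched text
}

const RULES: ReadonlyArray<{ name: string; regex: RegExp }> = [
  { name: "student-id-9digit", regex: /\b\d{9}\b/g },
  { name: "au-medicare-number", regex: /\b\d{4} \d{5} \d\b/g },
];

// A pure function of its input: audit(text) always returns the same
// findings for the same text, so results are replayable in an audit.
function audit(text: string): Finding[] {
  const findings: Finding[] = [];
  for (const { name, regex } of RULES) {
    for (const m of text.matchAll(regex)) {
      findings.push({ rule: name, start: m.index ?? 0, length: m[0].length });
    }
  }
  return findings;
}

console.log(audit("Student 123456789, Medicare 2345 67890 1"));
// Each finding names its rule and position; nothing is probabilistic.
```

Because the rules are plain patterns rather than a model, a flagged prompt can always be traced back to the exact rule and text that triggered it.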

See how your department uses AI

Start free