In September 2023, a secondary school in Essex accidentally exposed 1,200 students’ personal data—including special educational needs information and home addresses—to an unauthorised third-party application. The Information Commissioner’s Office (ICO) investigation revealed the school lacked basic vendor vetting procedures, resulting in an £18,500 fine and lasting reputational damage that affected subsequent enrolment figures.
This incident represents a growing pattern across UK educational institutions. According to ICO data, schools and colleges reported 483 data breaches in 2023 alone, a 34% increase from the previous year. Meanwhile, social media platforms face mounting regulatory pressure to comply with the Age Appropriate Design Code, with TikTok receiving a record £12.7 million fine for failures affecting 1.4 million British children.
For students, parents, and educators, understanding privacy laws has evolved from being optional to being essential knowledge for safeguarding. Young people’s data encompasses far more than names and addresses—it includes behavioural patterns tracked by learning management systems, biometric identifiers used for school access, location data from educational apps, and psychological profiles generated by AI-powered tutoring platforms.
This comprehensive guide examines the privacy laws that protect students and minors across international jurisdictions, with particular emphasis on UK regulatory frameworks. We cover the General Data Protection Regulation (GDPR) and its application to children, the US Family Educational Rights and Privacy Act (FERPA), and the Children’s Online Privacy Protection Act (COPPA), and compare these frameworks with their counterparts in Canada and Australia.
The Evolving Digital Landscape for Minors and Students
The transformation of education from primarily physical to substantially digital environments has accelerated dramatically over the past five years. This section examines the pervasive nature of data collection affecting young people and explores why minors constitute a uniquely vulnerable category requiring enhanced legal protections.
From Classrooms to Social Feeds: The Pervasive Data Trail
Modern students generate extensive digital footprints that begin remarkably early in life. Pre-school children interact with educational apps that record learning progress, response times, and engagement patterns. As children progress through primary and secondary education, their personal information flows into increasingly complex systems—school information databases, learning management platforms such as Google Classroom or Microsoft Teams, online assessment tools, behaviour tracking systems, and communication portals connecting schools with families.
Consider a typical Year 10 student’s daily digital interactions: using a school-issued tablet with biometric login, participating in an AI-powered mathematics tutoring session that tracks problem-solving approaches, accessing a mental health support app recommended by the school counsellor, completing online homework through a third-party platform, and engaging with social media during breaks. Each of these interactions deposits personal data with a different provider, under different terms and retention policies.
The breadth of data categories captured includes basic identity information, academic records, behavioural data, sensitive information such as special educational needs, biometric identifiers, location data, communication content, and psychological indicators. This data fragmentation across numerous disconnected systems creates significant privacy vulnerabilities.
Understanding the Vulnerabilities of Young Data Subjects
Minors possess inherent characteristics that necessitate enhanced privacy protections beyond those afforded to adults. Developmental psychology research indicates that children and adolescents continue to develop cognitive capacities essential for informed decision-making, particularly regarding abstract future consequences. Young people exhibit greater susceptibility to persuasive design tactics employed by online services—techniques deliberately crafted to encourage data sharing and extended engagement.
The power imbalance between young people and institutions further compounds vulnerability. Students cannot meaningfully refuse school-mandated technology systems; refusal would exclude them from educational participation. Children lack the life experience to recognise manipulative practices, understand surveillance capitalism business models, or anticipate how today’s data collection might affect tomorrow’s opportunities.
These inherent vulnerabilities justify dedicated legal protections specifically designed for young data subjects. Such laws typically mandate meaningful parental consent mechanisms for younger children, require clear age-appropriate information provision, impose strict data minimisation principles, establish “best interests of the child” as a primary consideration, create enhanced security requirements, and provide accessible mechanisms for minors or parents to access, correct, or delete personal information.
Key International Frameworks Protecting Young People’s Data
Privacy laws protecting minors and students vary significantly across jurisdictions, reflecting different legal traditions, cultural values, and regulatory philosophies. This section examines the primary legal frameworks governing the personal data of young people internationally.
GDPR: Europe’s Gold Standard for Children’s Data
The General Data Protection Regulation, implemented in May 2018 and retained in UK law following Brexit, represents the most comprehensive privacy framework specifically addressing the protection of children’s data. Article 8 GDPR establishes special protections for “information society services” directly offered to children—broadly encompassing apps, games, social media, and online platforms.
The regulation creates a fundamental age threshold for digital consent: children under 16 cannot lawfully consent to data processing for information society services without parental authorisation. However, the UK exercised flexibility to lower this threshold to 13 years through the Data Protection Act 2018. Consequently, online services offered directly to UK children under 13 must obtain verifiable parental consent, whilst 13-17 year-olds may consent independently to standard data processing.
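To make the threshold concrete, the brief Python sketch below illustrates how a UK-facing service might route consent by age. The function and constant names are hypothetical, and this is an illustration of the rule rather than a compliance implementation.

```python
# Illustrative sketch only: hypothetical names showing how a UK-facing
# information society service might route consent under UK GDPR / DPA 2018.
UK_DIGITAL_CONSENT_AGE = 13  # DPA 2018 lowers the GDPR Article 8 default of 16

def consent_route(age: int) -> str:
    """Return who must provide consent for standard data processing."""
    if age < UK_DIGITAL_CONSENT_AGE:
        return "verifiable parental consent required"
    if age < 18:
        return "child may consent independently (enhanced protections still apply)"
    return "adult consent"

for age in (9, 13, 16, 19):
    print(age, "->", consent_route(age))
```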
Beyond consent requirements, the GDPR’s transparency obligations (Article 12, elaborated by Recital 58) mandate that privacy information directed at children be presented in clear and plain language that a child can easily understand. The UK’s Age Appropriate Design Code (Children’s Code), which came into force in September 2020, extends these protections substantially. This ICO-issued statutory code comprises 15 standards that online services likely to be accessed by children must implement, including default high privacy settings, data minimisation, transparent parental controls, and no profiling unless demonstrably in the child’s best interests.
Enforcement demonstrates regulatory seriousness. TikTok received a £12.7 million ICO fine in April 2023 for processing data of children under 13 without parental consent, using children’s data for targeted advertising, and failing to provide accessible privacy information. The ICO estimated 1.4 million British children under 13 used TikTok in 2020.
The GDPR framework extends beyond commercial services to educational institutions. Schools processing student data must identify appropriate lawful bases—typically public task (delivering statutory education) or legitimate interests. Schools cannot rely on consent as their primary lawful basis for core educational functions, as the institutional power imbalance prevents truly free consent.
FERPA: The Cornerstone of US Student Privacy
The Family Educational Rights and Privacy Act, enacted in 1974, governs how American educational institutions receiving federal funding handle student educational records. Whilst FERPA applies primarily within US jurisdiction, its provisions affect UK students attending US universities, participating in transatlantic exchange programmes, or using educational technology platforms developed by US companies.
FERPA grants parents two fundamental rights regarding their children’s educational records: the right to inspect and review records maintained by the educational institution, and the right to consent before the institution discloses personally identifiable information, subject to specific exceptions. These parental rights transfer to the students themselves upon reaching 18 years of age or attending post-secondary institutions.
The Act defines “education records” broadly as any records directly related to a student and maintained by an educational agency. This includes academic transcripts, disciplinary records, special education documentation, health records maintained by schools, and communications related to the student.
FERPA permits disclosure without consent in specific circumstances: to school officials with legitimate educational interests, to other schools where the student seeks enrolment, to specified officials for audit purposes, in connection with financial aid, to organisations conducting studies for educational institutions, to accrediting bodies, to comply with judicial orders, in health or safety emergencies, and in response to lawfully issued subpoenas.
The “school official” exception has generated considerable controversy as educational technology continues to proliferate. Recent guidance clarifies that outside contractors may qualify as school officials if they perform institutional services under the school’s direct control and are subject to FERPA’s use and redisclosure requirements.
COPPA: Regulating Online Services for US Children
The Children’s Online Privacy Protection Act, in force since April 2000 and substantially amended in 2013, regulates commercial websites and online services directed to children under 13, together with operators that have actual knowledge they are collecting personal information from under-13s. Although COPPA constitutes US federal law, its extraterritorial reach affects UK-based services accessible to American children.
COPPA imposes several core obligations on covered operators. They must post clear privacy policies; provide direct notice to parents and obtain verifiable parental consent before collecting children’s personal information; offer parents access to their child’s information; maintain reasonable data security procedures; and retain children’s information only as long as reasonably necessary.
The Act defines “personal information” expansively, including names, addresses, email addresses, screen names, telephone numbers, geolocation information, photographs containing the child’s image, and persistent identifiers that enable recognition across websites for tracking purposes.
FTC-approved methods for obtaining verifiable parental consent include a monetary transaction using a credit or debit card, a call to a toll-free number staffed by trained personnel, a check of government-issued identification, a video conference with trained personnel, or a signed consent form submitted by post or electronic scan.
COPPA’s school exception permits educational institutions to obtain consent on behalf of parents for online services used exclusively for educational purposes. This provision enables schools to provide learning management systems and educational apps without individual parental consent for each service. However, educational technology companies cannot collect student information under this exception and subsequently monetise that data through advertising or data brokerage.
Beyond the US & UK: Canada’s PIPEDA and Australia’s Privacy Act
Canada’s PIPEDA Framework
Canada’s Personal Information Protection and Electronic Documents Act governs how private sector organisations collect, use, and disclose personal information during commercial activities. Unlike GDPR or COPPA, PIPEDA does not establish specific age thresholds for consent. Instead, Canadian law applies capacity-based approaches—individuals capable of understanding the consequences of collection may provide valid consent, regardless of age.
The Office of the Privacy Commissioner of Canada has interpreted this framework to require organisations to consider whether young people have sufficient understanding to provide meaningful consent. For very young children, parental consent remains necessary. For adolescents, organisations must assess capacity contextually.
Australia’s Privacy Act Framework
Australia’s Privacy Act 1988 establishes the Australian Privacy Principles, which govern the handling of personal information. The Act does not establish specific child privacy provisions or age-based consent thresholds comparable to GDPR or COPPA. Instead, Australian law recognises that children may lack the capacity to consent, requiring organisations to consider whether individuals can reasonably understand the terms.
Recent Australian developments indicate increasing regulatory attention to children’s privacy. The 2020 OAIC investigation into TikTok’s data practices concluded that the platform had likely interfered with the privacy of Australian children by collecting personal information without their consent. The Australian government initiated a review of the Privacy Act to examine whether specific protections for children’s privacy should be introduced.
Navigating the New Frontier: AI, Biometrics, and Educational Data

Emerging technologies present profound privacy challenges that existing regulatory frameworks often struggle to address adequately. This section examines three technology categories increasingly prevalent in educational settings.
Artificial Intelligence in Education: Privacy Implications of Personalised Learning
Artificial intelligence systems have permeated educational environments rapidly, powering adaptive learning platforms, automated essay scoring, behaviour prediction algorithms, and administrative decision-making tools. These systems promise educational personalisation at scale but simultaneously create novel privacy risks.
AI-powered educational platforms typically function through continuous data collection and analysis. As students interact with learning software, the system records response times, error patterns, navigation behaviours, engagement indicators, and correctness rates. Machine learning algorithms process this data to construct student models—computational representations of knowledge states, learning styles, and likely future performance.
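The sketch below illustrates, in simplified form, the kind of per-student behavioural record such a platform might accumulate; all class and field names are hypothetical.

```python
# Illustrative sketch of a toy "student model": every attempt is retained,
# which is precisely what raises the data-permanence concerns discussed below.
from dataclasses import dataclass, field

@dataclass
class StudentModel:
    student_id: str
    attempts: int = 0
    errors: int = 0
    response_times_ms: list[float] = field(default_factory=list)

    def record_attempt(self, correct: bool, response_time_ms: float) -> None:
        self.attempts += 1
        self.errors += 0 if correct else 1
        self.response_times_ms.append(response_time_ms)

    @property
    def error_rate(self) -> float:
        return self.errors / self.attempts if self.attempts else 0.0

model = StudentModel("pupil-042")
model.record_attempt(correct=False, response_time_ms=8200.0)
model.record_attempt(correct=True, response_time_ms=5100.0)
print(f"error rate: {model.error_rate:.0%}")
```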
The privacy implications multiply when considering data permanence and inference. Unlike traditional assessments producing discrete grades, AI systems accumulate granular behavioural data over extended periods. Furthermore, machine learning systems can infer sensitive attributes from seemingly innocuous behavioural data—predicting special educational needs, mental health challenges, or family circumstances without explicit disclosure.
Algorithmic bias introduces additional concerns. Machine learning systems trained on historical data perpetuate patterns present in training sets. Research has documented AI tutoring systems that allocate more attention to male students, college admission algorithms that disadvantage applicants from under-resourced schools, and behaviour prediction models that disproportionately flag minority students.
Current UK data protection law addresses some AI privacy concerns through general GDPR provisions. Article 22 gives individuals the right not to be subject to solely automated decisions that produce legal or similarly significant effects, although this right is subject to exceptions. The GDPR’s transparency obligations also apply squarely to AI systems: data controllers must provide clear and understandable information about the logic involved in automated processing.
Biometric Data Collection: Consent, Security, and Ethical Concerns
Biometric systems utilising unique physiological or behavioural characteristics for identification have expanded rapidly in UK schools. Common applications include fingerprint scanning for library book borrowing, cashless catering payments, and attendance registration; facial recognition systems for building access; and voice recognition for literacy applications.
GDPR Article 9 categorises biometric data processed for unique identification as “special category data” requiring enhanced protections. The Protection of Freedoms Act 2012 establishes specific requirements for the processing of biometric data in English and Welsh schools. Schools must obtain written consent from at least one parent before processing pupils’ biometric information. Crucially, if the pupil objects to processing—regardless of parental consent—schools must provide reasonable alternative means of accessing the same services.
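As an illustration of how this rule combines consent and objection, the sketch below encodes the logic in Python; the function and parameter names are hypothetical, and the sketch omits the Act’s parental notification requirements.

```python
# Illustrative sketch of the Protection of Freedoms Act 2012 rule for
# biometrics in English and Welsh schools: written consent from at least one
# parent is needed, and an objection from the pupil or any parent blocks
# processing regardless of consent given.
def may_process_biometrics(parent_consents: list[bool],
                           pupil_objects: bool,
                           any_parent_objects: bool) -> bool:
    if pupil_objects or any_parent_objects:
        return False  # an objection overrides any consent given
    return any(parent_consents)  # at least one parent consented in writing

# A pupil's own objection is decisive even where a parent has consented:
print(may_process_biometrics([True], pupil_objects=True, any_parent_objects=False))  # False
```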
Data security concerns intensify with the use of biometric systems. Unlike passwords or identification cards, biometric characteristics cannot be changed if compromised. Schools collecting biometric data must implement rigorous security measures, including encryption of biometric templates, strict access controls, regular security audits, and comprehensive staff training.
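As one example of protecting templates at rest, the minimal sketch below uses the third-party Python `cryptography` package; a real deployment would also need managed key storage, access logging, and auditing.

```python
# Minimal sketch of encrypting a biometric template at rest, assuming the
# third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, hold this in a managed key store
cipher = Fernet(key)

template = b"fingerprint-template-bytes"  # placeholder for a real template
encrypted = cipher.encrypt(template)      # only the ciphertext is persisted
assert cipher.decrypt(encrypted) == template
```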
Several UK schools have faced controversy over biometric deployments. In 2021, schools in North Ayrshire installed facial recognition for cashless canteen payments, generating substantial media criticism regarding proportionality and necessity; the systems were subsequently suspended pending further consultation.
Wearables, IoT, and Student Monitoring: Balancing Safety with Privacy
Internet-connected devices and wearable technologies increasingly feature in educational settings. Applications include GPS trackers on school transportation vehicles, fitness trackers distributed through school health initiatives, smartwatches with messaging capabilities marketed as child safety devices, and school-issued tablets equipped with monitoring software that tracks browsing activity.
Location tracking technologies exemplify the tension between privacy and safety. GPS trackers on school buses allow parents to verify their child’s safe arrival and enable schools to optimise routes. However, they also create continuous location histories revealing detailed movement patterns.
School-issued device monitoring software presents one of the most pervasive surveillance mechanisms. Learning management systems and device management platforms can track browsing histories, application usage, keystrokes, screen captures, and location data. Schools justify this monitoring through safeguarding obligations—identifying potential harm indicators such as concerning online searches or cyberbullying communications.
The proportionality of device monitoring depends heavily on implementation. Keyword filtering, which triggers alerts for defined terms, differs fundamentally from comprehensive browsing history collection. A 2022 Human Rights Watch review of EdTech products found that 89% of those examined engaged in data practices that put children’s privacy at risk.
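To illustrate the keyword-filtering distinction, here is a minimal Python sketch of alert-only monitoring, with a hypothetical watchlist and notification hook. The point is data minimisation: an alert is raised, but the query text and browsing history are never retained.

```python
# Illustrative sketch of alert-only keyword filtering (all names hypothetical).
WATCHLIST = {"self-harm", "buy a knife"}

def notify_safeguarding_lead(matched_term: str) -> None:
    # A real deployment would alert a named member of staff through an
    # auditable channel; printing stands in for that here.
    print(f"safeguarding alert: matched term '{matched_term}'")

def check_query(query: str) -> None:
    """Alert on matched terms without retaining the query text itself."""
    for term in WATCHLIST:
        if term in query.lower():
            notify_safeguarding_lead(term)
            return
    # Non-matching queries leave no trace at all.

check_query("homework on the tudors")  # no alert, nothing stored
```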
Empowering Stakeholders: What Everyone Needs to Know

Effective privacy protection requires active engagement from all stakeholders. This section provides targeted, actionable guidance for each stakeholder group.
For Parents: Advocating for Your Child’s Digital Rights
Parents serve as primary advocates for children’s privacy, particularly for younger students. This role requires an understanding of available privacy rights and the ability to engage effectively with schools and online services.
Understanding Your Rights Under UK GDPR
Parents of children under 13 generally exercise their children’s privacy rights on their behalf for information society services that require consent. These rights include the right to be informed through clear privacy notices; the right of access enabling you to obtain confirmation whether your child’s data is being processed; the right to rectification allowing correction of inaccurate information; the right to erasure in specific circumstances; the right to restrict processing; the right to data portability; and the right to object to processing based on legitimate interests.
Engaging With Schools: Practical Steps
When schools introduce new technologies requiring parental consent, ask specific questions: What personal data will be collected? How will it be used? Who will have access? Where will data be stored? What security measures protect the data? How long will data be retained? Can I access my child’s data? What happens if we withdraw consent?
If concerns arise about school data practices, follow appropriate escalation channels. Initially, raise concerns with the class teacher. If unsuccessful, contact the school’s Data Protection Officer. If school-level engagement does not resolve concerns, you may escalate to governors or lodge formal complaints with the ICO.
Reporting Privacy Violations
Several reporting mechanisms address different privacy concerns. For suspected data breaches, report to the organisation concerned first and escalate to the ICO if you are dissatisfied with its response. For online safety concerns involving illegal content or exploitation, use the Child Exploitation and Online Protection command (CEOP) reporting system. For identity theft affecting children, report to Action Fraud.
For Educators and Schools: Implementing Robust Data Protection Policies
Educational institutions carry substantial data protection responsibilities. Compliance requires embracing data protection principles throughout school culture and operations.
Core Legal Obligations
Schools must identify appropriate lawful bases for processing student data. For core educational functions, the lawful basis is typically public task or legitimate interests. Schools cannot rely on consent as the primary lawful basis for essential educational processing because the power imbalance prevents genuinely free consent.
As public authorities, schools must designate a Data Protection Officer, who advises on obligations, monitors compliance, conducts training, and serves as the primary ICO contact point. Schools must maintain comprehensive documentation through Records of Processing Activities.
Conducting Data Protection Impact Assessments
Data Protection Impact Assessments (DPIAs) are mandatory when processing is likely to result in a high risk to individuals’ rights. This includes implementing biometric systems, AI-powered learning platforms, or comprehensive monitoring software. DPIAs involve systematically describing processing operations, assessing the necessity and proportionality of these operations, identifying associated risks, and determining appropriate mitigation measures.
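A DPIA is a document rather than a program, but the sketch below models those four steps as a simple Python structure to show how unmitigated risks surface; all names and the worked example are hypothetical.

```python
# Hypothetical structure mirroring the four DPIA steps named above.
from dataclasses import dataclass, field

@dataclass
class DPIA:
    processing_description: str         # what data, about whom, for what purpose
    necessity_and_proportionality: str  # why a less intrusive option will not do
    risks: list[str] = field(default_factory=list)
    mitigations: dict[str, str] = field(default_factory=dict)  # risk -> measure

    def unmitigated_risks(self) -> list[str]:
        return [r for r in self.risks if r not in self.mitigations]

dpia = DPIA(
    processing_description="Facial recognition for canteen payments, ages 11-16",
    necessity_and_proportionality="Contactless cards would achieve the same aim",
    risks=["special category data breach", "function creep"],
    mitigations={"special category data breach": "encrypted templates, on-site storage"},
)
print(dpia.unmitigated_risks())  # ['function creep'] -> resolve before proceeding
```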
Vetting EdTech Vendors
Educational technology procurement requires careful due diligence regarding privacy. Schools should establish standard vendor questionnaires that request information about data collection, lawful bases, data sharing, international transfers, security measures, data breach procedures, data subject rights provisions, and compliance with the Children’s Code.
Vendor contracts must include Data Processing Agreements specifying processing purposes, data types, controller and processor obligations, sub-processor arrangements, breach notification requirements, and data deletion upon contract termination.
Establishing Data Breach Response Procedures
Upon discovering a potential breach, schools should immediately contain the incident, assess its severity, determine their reporting obligations, and implement remedial actions. Under the UK GDPR, organisations must report breaches to the ICO within 72 hours of becoming aware of them, unless the breach is unlikely to pose a risk to individuals’ rights. Schools must notify affected individuals without undue delay if breaches pose high risks.
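The 72-hour window runs from awareness, not from the breach itself; the small Python sketch below computes the deadline for a hypothetical discovery time.

```python
# Minimal sketch computing the UK GDPR reporting deadline from the moment a
# school becomes aware of a reportable breach.
from datetime import datetime, timedelta, timezone

became_aware = datetime(2024, 3, 4, 9, 30, tzinfo=timezone.utc)
ico_deadline = became_aware + timedelta(hours=72)
print(ico_deadline.isoformat())  # 2024-03-07T09:30:00+00:00
```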
For Minors and Students: Understanding Your Privacy Rights
Young people possess their own rights that they can increasingly exercise as they mature. If you’re 13 or older in the UK, you generally have privacy rights you can exercise yourself.
Your Privacy Rights Explained
Your rights include the right to know what information organisations hold about you; the right to see your information through Subject Access Requests; the right to correct mistakes in your information; the right to delete your information in some situations; the right to limit how information is used; the right to move your information between services; and the right to object to processing for direct marketing.
Making Privacy Choices for Yourself
Consider these principles: think before you share, as information online may never be fully controlled again; read privacy settings and adjust them to protect your privacy; understand what you’re consenting to when agreeing to terms of service; and question unnecessary data collection.
Protecting Your Digital Footprint
Your digital footprint significantly impacts your reputation, relationships, and career opportunities. Manage it thoughtfully through regular privacy check-ups, careful posting consideration, and being alert to your digital reputation by occasionally searching your own name.
For EdTech Companies: Ethical Design and Legal Compliance
Educational technology companies bear significant responsibility for implementing privacy-respecting practices when developing services likely to be accessed by children.
Children’s Code Compliance
The ICO’s Age Appropriate Design Code establishes 15 standards for online services likely to be accessed by children. Compliance involves implementing default high privacy settings, providing prominent and accessible privacy information, ensuring parental controls are usable, minimising data collection, and avoiding nudge techniques that encourage children to weaken privacy protections.
Privacy by Design Principles
Privacy by Design requires that privacy be built into the system architecture from inception. Implement default high privacy settings, data minimisation, purpose limitation, transparent data flows, user control, security by default, and regular privacy audits.
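A minimal Python sketch of privacy-by-default follows, assuming hypothetical setting names: every optional data flow starts disabled, so a new child account is private without any action on the child’s part.

```python
# Illustrative sketch of privacy-by-default for a child-facing service.
from dataclasses import dataclass

@dataclass
class AccountSettings:
    profile_public: bool = False            # high privacy by default
    geolocation_enabled: bool = False
    personalised_ads: bool = False
    data_shared_with_partners: bool = False

settings = AccountSettings()   # a new child account opts into nothing
assert not any(vars(settings).values())
```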
Transparent Privacy Policies
Children’s privacy notices must be genuinely understandable. The ICO recommends presenting layered privacy information, including essential details in simple language, in prominent locations. Use short sentences, simple vocabulary, active voice, and examples. Avoid legal jargon.
When Privacy Laws Are Breached: Consequences and Accountability
Understanding enforcement mechanisms helps illustrate the practical significance of privacy laws beyond abstract legal obligations.
Regulatory Actions: Fines and Reputational Damage
The Information Commissioner’s Office possesses substantial enforcement powers under UK GDPR. These include information notices, assessment notices permitting ICO premises inspections, enforcement notices requiring compliance actions, and monetary penalties of up to £17.5 million or 4% of global annual turnover, whichever is higher.
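A worked example of the penalty cap, sketched in Python: the applicable maximum is whichever figure is higher.

```python
# Worked example of the UK GDPR maximum-penalty rule: the cap is the higher
# of £17.5 million and 4% of global annual turnover.
def max_penalty_gbp(global_turnover_gbp: float) -> float:
    return max(17_500_000, 0.04 * global_turnover_gbp)

print(f"£{max_penalty_gbp(2_000_000_000):,.0f}")  # £80,000,000 for £2bn turnover
```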
Significant UK educational technology penalties include TikTok’s £12.7 million fine in April 2023 for processing data of children under 13 without parental consent. British Airways received a £20 million ICO fine in October 2020 for a data breach affecting approximately 400,000 customers, demonstrating the ICO’s willingness to impose substantial penalties for security failures.
Beyond monetary penalties, regulatory enforcement creates substantial reputational damage. Media coverage of ICO investigations and penalties damages consumer trust, potentially affecting user acquisition and revenue. Educational institutions receiving enforcement notices often face negative publicity, which can affect parental confidence and negatively impact Ofsted inspection outcomes.
The Future of Young Digital Privacy: Trends and Advocacy
Privacy protection for children and students continues evolving as technologies develop. The UK’s Online Safety Act, which received Royal Assent in October 2023, imposes duties on online services regarding child safety. International harmonisation efforts aim for greater consistency across jurisdictions, although significant divergence persists.
The advancement of artificial intelligence creates ongoing privacy challenges. The metaverse and immersive technologies present distinctive privacy issues, with virtual reality environments potentially collecting unprecedented behavioural data. Current privacy frameworks may inadequately address the risks associated with immersive technology.
Non-governmental organisations play crucial roles through research, policy advocacy, public education, complaints support, and technology accountability. Organisations active in UK children’s digital privacy include the Children’s Commissioner for England, the 5Rights Foundation, the NSPCC, Internet Matters, and Childnet International.
Student and minor privacy protection requires sustained attention from all stakeholders. Parents must actively engage with schools and online services, asking critical questions and exercising children’s privacy rights. Educators must embed data protection throughout institutional culture, treating compliance as a fundamental safeguarding responsibility. Students should understand their own privacy rights and make informed choices about their digital engagement. Technology providers must prioritise child safety throughout design processes.
Privacy law continues to evolve as technologies develop and societies reassess their priorities. Continued vigilance, informed advocacy, and willingness to adapt regulatory frameworks will determine whether young people navigate digital environments with appropriate privacy protection. Privacy constitutes a fundamental human right essential for dignity, autonomy, and flourishing. When we effectively protect children’s privacy, we create digital environments where young people can learn, explore, and grow without compromising their rights or futures.