05 Feb 2026

How to protect children’s personal data, according to the Data (Use and Access) Act 2025

After a few years in which data protection laws in the United Kingdom were settled, the Data (Use and Access) Act 2025 (DUAA) is ushering in a period of change.

For those of you who have only just recovered from the GDPR, the news that data protection laws are changing again may not be entirely welcome. The good news is that many of the changes DUAA is bringing in are intended to make your life easier rather than to introduce complex new obligations, and – while some things are changing – much remains the same.

This series of articles from the IT & data team at Mills & Reeve aims to help you understand the impact that DUAA will have on established ways of doing things. This blog looks at how DUAA changes the rules around the protection of children’s personal data. If you need a reminder about the meaning of some of the key data protection terminology used (eg personal data, data subject, data controller, processing), please refer to our data protection glossary.

The UK position: the Children’s Code and the Data (Use and Access) Act
The UK’s approach to children’s data protection prioritises design standards and transparency.

The Age Appropriate Design Code (Children’s Code), published by the Information Commissioner’s Office under the auspices of the Data Protection Act 2018, sets out standards for any online service likely to be accessed by under-18s, such as apps, games, educational platforms, retail sites, and IoT devices. These standards include privacy by default, clear and age-appropriate privacy notices, and refraining from using ‘nudge techniques’ to encourage children to provide more personal data.

Section 81 of DUAA adds to Article 25 of the UK GDPR (data protection by design and default), explicitly requiring online services which are likely to be accessed by children to take into account the fact that: “children merit special protection with regard to their personal data because they may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing, and have different needs at different ages and at different stages of development.”

The ICO’s DUAA guidance for organisations advises that you will satisfy this requirement “if you conform to our Age Appropriate Design Code.” The code has not taken on legally binding status, but it does represent best practice. If you have not previously considered the needs of children of different ages who use your online service, now would be a good time to revisit this.

What does this mean for businesses handling children’s data?

Firstly, if your organisation already conforms to the Children’s Code then you’re likely to be compliant with the DUAA. However, if the Children’s Code is news to you and your organisation offers digital products or services that children may use, you should ensure that:

  • Services are designed with privacy-friendly defaults (see the sketch after this list). For example, a gaming app should not automatically share a child’s location with other users.
  • Privacy notices are written in clear, age-appropriate language so that a ten-year-old can understand what happens to their data.
  • Restrictions apply to using children’s data for behavioural advertising. An online toy store should avoid profiling under-18s to push targeted ads.
  • Data minimisation is essential. A homework app should not collect a child’s exact address when only school login details are necessary.
  • Where appropriate, services provide parental controls to allow parents to oversee or limit data use.
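
To make these points concrete, here is a minimal TypeScript sketch of what privacy-friendly defaults might look like in practice. Everything in it (the AgeBand and PrivacySettings types, the defaultSettings function) is a hypothetical illustration of the principles above, not code from any real framework or a requirement of the ICO.

    // Hypothetical sketch only: privacy-by-default settings for an online
    // service likely to be accessed by children. All names are illustrative.
    type AgeBand = "under13" | "13to15" | "16to17" | "adult";

    interface PrivacySettings {
      shareLocation: boolean;      // location sharing, e.g. in a gaming app
      behaviouralAds: boolean;     // profiling for targeted advertising
      profileVisibility: "private" | "friends" | "public";
      parentalOversight: boolean;  // parental controls where appropriate
    }

    // Defaults get stricter for younger users: a child account never starts
    // with location sharing, behavioural advertising or a public profile.
    function defaultSettings(age: AgeBand): PrivacySettings {
      const isChild = age !== "adult";
      return {
        shareLocation: false,                 // never on by default
        behaviouralAds: !isChild,             // no profiling of under-18s
        profileVisibility: isChild ? "private" : "friends",
        parentalOversight: age === "under13" || age === "13to15",
      };
    }

    // Data minimisation: collect only what the service actually needs.
    // A homework app needs a school login, not a home address.
    interface SignupData {
      displayName: string;
      schoolLoginId: string;
      // deliberately no home address, phone number or precise location
    }

Deriving every default from the age band, rather than setting each flag individually, means a child’s account cannot start in a more permissive state than intended.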

The consequences of failing to protect children online

The laissez-faire attitude that some online services appear to have towards the privacy and safety of their younger users has, in some jurisdictions, led to more drastic action. On 10 December 2025, Australia’s unassumingly named Privacy and Other Legislation Amendment Act 2024 started to apply. From that date, children under 16 are no longer permitted to access social media platforms, and providers of such platforms must implement strict age verification or face significant penalties. Australia’s approach has sparked debate in other countries considering something similar. In the UK, the Government has just announced a consultation on children’s social media use and a ban on phones in schools. Conservative leader Kemi Badenoch has suggested that a social media ban for under-16s is now Conservative Party policy, and more than 60 Labour MPs have recently written to the Prime Minister to express their support for the concept.

Conclusion

Parents and young users are increasingly aware of privacy issues. A single misstep, such as exposing a child’s location or using manipulative design to encourage data sharing, can quickly escalate into a reputational crisis. In today’s digital landscape, trust is currency. Businesses that demonstrate a genuine commitment to safeguarding children’s data not only avoid regulatory risk but also position themselves as responsible brands. This can be a powerful differentiator in competitive markets, where families actively seek services that prioritise safety and transparency.

Ultimately, treating children’s data responsibly is not just about ticking legal boxes; it’s about building long-term credibility, fostering user confidence, and aligning with societal expectations for digital wellbeing.

Our content explained

Every piece of content we create is correct on the date it’s published, but please don’t rely on it as legal advice. If you’d like to speak to us about your own legal requirements, please contact one of our expert lawyers.