What’s in and what’s out?
The Children’s code applies to information society services likely to be accessed by children (under 18s). Recognising the increasing role of the digital world in young people’s lives, it puts their protection first. It was developed under the Data Protection Act 2018, which requires the production of a series of codes of practice to help organisations apply privacy law in specific situations.
The kinds of services likely to be covered by the Children’s code include apps; many kinds of websites, including search engines; social media platforms; online messaging or internet-based voice telephony services; online marketplaces; content streaming services (video, music or gaming); online games; and news or educational websites. Providers need not be UK-based, provided their services are directed to UK users or monitor their behaviour.
The code does not apply to preventive or counselling services (although general health and fitness apps would be caught), broadcasting, or most online public services.
The boundaries are not always obvious, and where there is doubt it is safer to assume that a service is covered.
The fifteen standards
Fifteen flexible standards sit at the heart of the Children’s code.
- The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by a child. This concept derives from international law, and includes wide-ranging considerations around the safety and wellbeing of children. The Children’s code recognises the possibility for conflicting interests and complexity, noting that advice from external experts may be needed.
- Data protection impact assessments. The Children’s code requires organisations to embed a DPIA into the design of any new online service that is likely to be accessed by children.
- Age appropriate application. The Children’s code divides children into five age groups and advises organisations to analyse the likely impact on these groups separately. A variety of different methods to establish age are suggested, ranging from self-declaration for lower risk processing, to the use of “hard identifiers” such as a passport.
- Transparency. Privacy and other information provided to users must be concise, prominent, and in clear language suited to the age of the child.
- Detrimental use of data. Organisations are advised not to use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions, or Government advice. This can include encouraging use of drugs or alcohol, or using strategies to extend user engagement like auto-play features.
- Organisations should uphold their terms, policies and community standards such as privacy policies, age restriction, behaviour rules and content policies.
- Settings must be ‘high privacy’ by default, unless an organisation can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child.
- Data minimisation. Organisations should collect and retain only the minimum amount of personal data needed to provide the elements of a service in which a child is actively and knowingly engaged. For example, searching for music tracks to download is one element of the service while recommendations based on earlier searches is another. Separate choices for each element should be given.
- Data sharing. Disclosure of a child’s data should be avoided unless an organisation can demonstrate a compelling reason to do so.
- Geolocation options should be turned off by default, unless the organisation can demonstrate a compelling reason for geolocation to be switched on by default. Children should be able to see when location tracking is active.
- Where parental controls are provided, organisations must give the child age appropriate information about this. If parents or carers can monitor a child’s online activity the child should be able to see when they are being monitored.
- Options which use profiling should be switched ‘off’ by default, unless there is a compelling reason for profiling to be on by default. Profiling should only be used where appropriate measures are in place to protect the child – to prevent them from being fed harmful content, for example.
- Organisations should avoid nudge techniques to lead or encourage children to provide unnecessary personal data or turn off privacy protections.
- Organisations that provide connected toys or devices must ensure that they include effective tools to enable compliance with the Children’s code. Where the connectivity aspect of the product is outsourced, both the toy supplier and their commercial partners will need to comply.
- Children should be able to use prominent and accessible online tools to help them to exercise their data protection rights and report concerns. As with standard 3, this is broken down into age bands to help providers design for different user groups.
A bigger beast is coming – the draft Online Safety Bill
Those interested in this area will no doubt be aware of the draft Online Safety Bill currently under consideration by a Joint Committee of Parliament. It would extend well beyond data privacy issues, imposing far-reaching duties of care on providers of online content-sharing platforms and search services, both in relation to children and to the public more generally. The impact of the Children’s code on online services may help to inform the ongoing debate around how effective the Online Safety Bill could be in achieving its goals.
Our content explained
Every piece of content we create is correct on the date it’s published but please don’t rely on it as legal advice. If you’d like to speak to us about your own legal requirements, please contact one of our expert lawyers.