The Children’s Code

The Information Commissioner’s Office continues to push for greater protections online for children. Its Children’s Code sets out key requirements for all Information Society Services, which providers will need to be active in implementing.

20 February 2024


The protection of children from online harms continues to be a major focus for governments and regulators alike. We have seen the recent enactment of the Online Safety Act and, in parallel, the Information Commissioner’s Office (ICO) continues to push for greater protection for children.

The ICO’s major focus is on the Children’s Code – formally titled Age-Appropriate Design: a Code of Practice for Online Services – which came into full effect in 2021. The UK Code sets out 15 flexible standards designed to ensure that the best interests of children are the main consideration for those designing and developing online services.

Initially, the Children’s Code was recognised as leading the way in addressing concerns about how under 18s interact with Information Society Services. Since then, we have seen similar developments in other countries, notably the California Age-Appropriate Design Code Act, potentially coming into force in 2024 (with other US states expected to follow), as well as further developments in Europe on this issue. In the meantime, the ICO continues its work in this area, reviewing and updating its guidance to reflect changes in technology and practice.

Below we explain some of the key elements of the Children’s Code that Information Society Services providers will need to consider. 

Likelihood of access by children

The Children’s Code applies to Information Society Services which are likely to be accessed by children. The question of when online services are likely to be accessed by children is therefore key for developers designing for compliance with the Children’s Code. The definition of “Information Society Services” captures most online services, so long as they provide a requested service, normally for remuneration, at a distance and by electronic means. This includes apps, online games, social media platforms, and any websites offering goods or services to users over the internet.

Where under 18s are the target market, there is no question that the Children’s Code will apply. While this might suggest that services marketed towards over 18s are exempt, news stories constantly remind us that under 18s often access inappropriate material. To address this, in September 2022 the ICO clarified that the Children’s Code also applies to "adults-only" services if those services are "likely to be accessed by a significant number of children".

The ICO was initially reluctant to clarify what might be considered "a significant number of children", and therefore where the Children’s Code might apply. How that is determined, however, is key for online service providers. The ICO addressed this in guidance issued last year, which includes a list of factors for organisations to consider when assessing whether the Children's Code applies to them, alongside a number of FAQs and case studies.

The examples given in the guidance are drawn from a few areas, including pornography, gaming, and social media. These case studies largely present examples of companies responding only after they had received complaints of children accessing their sites. Hopefully, future case studies will show companies applying the ICO’s recommendations in their internal assessments and dealing with issues before complaints arise.

Age assurance

For services such as pornography, the material is clearly targeted at the over-18 market, so the key questions are how to implement age verification and how to make sure those measures are robust.

Other areas, such as gaming and social media platforms, may also choose to adopt robust age verification, but in all cases it is key to be able to demonstrate that the Children’s Code has been addressed rather than ignored.

In order to establish or estimate the age of those using a service and, where necessary, implement protections or access restrictions for content, service providers should employ appropriate age assurance methods in accordance with the Children’s Code.

The ICO has recently published its updated expectations on age assurance for the Children's Code, which offer further guidance on the implementation of such methods, including a set of "must, should and could" standards which set out both the minimum requirements for compliance and what the ICO considers to be best practice.

Children's Code Risk Assessment 

Businesses offering online services now need to build a Children’s Code Risk Assessment into their accountability processes, taking account of the factors raised in the ICO's guidance. Sites providing lower-risk content should consider who their market is and whether their content appeals to children. Crucially, the assessment should also be documented where it is decided that the Children’s Code does not apply. This conclusion can be backed up by evidence such as:

  • market research;
  • current evidence on user behaviour;
  • the user base of similar or existing services; and 
  • service types and testing of access restriction measures.

Real life consequences

There’s no doubt that the Children’s Code remains a priority, and the consequences for non-compliance can be severe. Last year the ICO issued a £12.7 million fine to TikTok for misusing children’s data.

This fine related to TikTok's failure to monitor the use of its platform by under 13s, despite TikTok's own rules prohibiting children below that age from using its services. TikTok did not have robust age verification measures in place, nor did it review its customer base to remove under 13s once identified. Towards the end of 2023, the ICO also issued a preliminary enforcement notice against Snap, focussing on its failure to assess the risks posed by its generative AI chatbot “My AI”, and in particular its use by children aged 13 to 17.

While these enforcement actions were targeted at large organisations, they serve as a stern reminder to all businesses offering online services to consider the Children's Code and, where it is decided that the Code does apply, to achieve compliance.

 

If you have any questions regarding any of the above or would like to discuss how the Children’s Code impacts your business, please do not hesitate to contact Joanna Boag-Thomson.