The Children’s Code in Context
Concerns over children's data privacy have generated a recent wave of regulatory activity, not only in the UK but around the world. In Ireland, the Data Protection Commission (DPC) recently fined Instagram €405m for violating children's privacy. In the US, the California Age-Appropriate Design Code Act was passed in August 2022. It extends children's data protections beyond the services targeting children covered by the federal Children's Online Privacy Protection Act (COPPA): all online services likely to be accessed by children - including those with significant numbers of child users, or with features known to be of interest to children - are now mandated to safeguard and process children's data appropriately.
In the UK, the Age Appropriate Design Code (AADC) - also known as the Children's Code - came fully into force in September 2021, enforced by the UK's data protection watchdog, the Information Commissioner's Office (ICO). This marked an important step forward in protecting young people online. The AADC sets out 15 standards of age appropriate design for online services, covering areas such as transparency, data minimisation and privacy-protective default settings.
A year into implementation, the ICO is beginning to take enforcement action. Whilst four Big Tech firms may face prosecution for breaching the Children's Code, small and medium-sized enterprises (SMEs) face their own challenges in complying with it. Many fear a significant budgetary overhead and a growing Governance, Risk & Compliance (GRC) burden. These concerns are exacerbated by a lack of clarity about what in-scope online services must actually do in practice.
At PUBLIC, we have accumulated deep expertise in child safety and 'Safety by Design', advising the UK government, regulators and civil society organisations on recent projects. In summer 2022, PUBLIC worked with Privately - a leading privacy-preserving age assurance technology provider and member of our GovStart 2022 cohort - to understand pain points in complying with the AADC and to explore a more seamless solution. This blog reflects our latest insights on the compliance challenges that SME service providers face, and the solutions that might resolve them.
Through a combination of desk research and user interviews with a range of online gaming and video sharing platform providers over eight weeks, we identified three key areas of challenge for AADC compliance:
Lack of clarity around in-scope online services
Identifying which online services must comply with the AADC is not straightforward. Despite the ICO's guidance, online service providers interpret 'likely to be accessed by children' differently in practice. The perceived administrative burden and limited budgets mean that small and medium-sized online service providers are hesitant to take on compliance duties. This prompts the question of whether self-declaration - currently the most common approach - is sufficient for age-gating. Given the ICO's recent statement that 'adult-only services are in scope of the Children's code if they are likely to be accessed by children', it is worth exploring how to integrate age assurance measures into online services in a scalable way.
Technical and Business Challenges to SME compliance
Although they recognise the need for a compliance solution, small and medium-sized companies with child users are adopting a 'wait-and-see' attitude rather than taking active measures to comply with the AADC.
We heard from gaming and video sharing platform providers that:
Challenges for regulators in ensuring compliance
The ICO faces the challenge of ensuring that AADC enforcement is both appropriate and scalable. Regulatory enforcement action must strike a balance between putting sufficient pressure on regulated providers to comply and ensuring the approach is fit for purpose for SMEs. In addition, the ICO will likely need to consider its regulatory capacity, given the large number of online services that would be considered 'likely to be accessed by children'.
Age assurance is growing in importance and is central to working through these challenges. Some industry players expressed interest in a proportionate solution that combines Know Your Customer (KYC)-style checks, delivered through age assurance technology, with a simple, standard compliance template. With age assurance technologies maturing and improving in accuracy whilst collecting minimal personal data, online services could lean on third-party solutions to fulfil their obligations to protect young people online.
International regulatory collaboration
Our research highlighted an overlapping regulatory landscape for child data protection across the UK (AADC), the US (COPPA, the California AADC) and Europe (GDPR). At the same time, online gaming and video sharing services can be accessed by child users across the globe with relative ease. Businesses operating in the UK with an international user base are therefore now looking to move from COPPA compliance to AADC compliance, and will need support to do so. We believe it is critical to the long-term success of regulation that international regulators explore collaboration on multilateral enforcement and information sharing.
Get in Touch
If you are an online service provider exploring AADC compliance and how to build a safe, age-appropriate online environment for children, get in touch with email@example.com and firstname.lastname@example.org to find out how we can help.