
March 24, 2022

From Livestreaming to Education: Finding Priority Areas to Tackle Illegal Online Harms

Content warning: The following article contains references to child sexual abuse and exploitation, eating disorders and suicide and self-harm.

PUBLIC has partnered with Nominet to research and tackle the key illegal online harms that young people can face when using the internet. This research will help to inform Nominet’s grantmaking to counter illegal online harms.

We’re sharing our thinking and key outputs as we go – in the hope that working in the open will encourage key stakeholders to challenge us, give feedback and add depth to our analysis.

Our first step in this research has been to identify trends and developments in the illegal online harms that young people face. We compiled these trends from reports by academics, government and the third sector, before testing them in interviews with key individuals and organisations in the space (we’re still open to further discussions, so please reach out to us).

Because this subject is both complex and wide-ranging, we have narrowed our scope to focus on specific problems within Child Sexual Exploitation and Abuse (CSEA) and grooming.

Having generated a ‘long-list’ of trends, we developed three criteria to create a shortlist of ten trends that we hypothesise may be a priority for funders and civil society to address. The criteria we applied were Development, Saturation, and Popular Coverage and Awareness.

  1. Development. The trajectory and growth of the trend over time. We prioritise trends that are getting worse more quickly because they are likely to increase in negative impact in the future.
  2. Saturation. The number of initiatives and stakeholders addressing the trend. We prioritise trends that are less saturated because that suggests a blind spot or a particularly tricky area to address.
  3. Popular Coverage and Awareness. The coverage of the trends in the popular media and public awareness of them. We prioritise trends with a lower level of popular coverage and awareness because there is often a delay between the worst developments and their coverage. It also allows us to focus on areas that may not be attention-grabbing but are still impactful.

Once we had developed the shortlist of trends, we distilled each trend into a succinct problem statement. We have shared these below and actively invite your feedback – we recognise both that this is a work in progress and that shortlisting is more art than science. So if you think we have made critical omissions, or if you have perspectives, experiences or knowledge that might shape our research, please do get in touch.

Our ten problem statements are shared below, grouped under five broad themes that we identified in our research:

Internet Infrastructure. The technical workings of the internet, such as how websites are hosted, can be exploited to evade detection or threaten the safety and wellbeing of young people.

Top-level ‘domain hopping’ allows Child Sexual Abuse Material (CSAM) websites to evade long-term takedown.

The Internet Watch Foundation (IWF) explains in its reporting that top-level domain hopping is ‘when a site (e.g. ‘badsite.uk’) keeps its second-level domain name (‘badsite’) but changes its top-level domain (‘.uk’), creating a whole new website with different hosting details but retaining its ‘name brand’. So from ‘badsite.uk’, the additional sites ‘badsite.ga’, ‘badsite.ml’ or ‘badsite.tk’ could be created.’ This allows the harmful website to persist online and undermines the effective takedown work of a national domain registry, as websites move to top-level domains that are known to be less proactively regulated.
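To make the mechanism concrete, here is a minimal Python sketch of how a monitoring team might check whether a known second-level domain has ‘hopped’ to other top-level domains. This is our own illustration rather than IWF tooling; the candidate TLD list is an assumption drawn from the example above, and ‘badsite’ is the IWF’s hypothetical name, not a real site.

```python
# Minimal sketch (an illustration, not IWF tooling): check whether a known
# second-level domain currently resolves under other top-level domains.
import socket

# Assumed, illustrative candidate TLDs based on the IWF example above.
CANDIDATE_TLDS = ["uk", "ga", "ml", "tk"]

def find_live_variants(second_level: str) -> list[str]:
    """Return candidate 'hopped' domains that currently resolve in DNS."""
    live = []
    for tld in CANDIDATE_TLDS:
        domain = f"{second_level}.{tld}"
        try:
            socket.gethostbyname(domain)  # raises gaierror if the name does not resolve
            live.append(domain)
        except socket.gaierror:
            pass  # not registered, or not resolving, under this TLD
    return live

# 'badsite' is the IWF's hypothetical example, not a real target.
print(find_live_variants("badsite"))
```

In practice, organisations like the IWF work from curated URL and hash lists rather than live probing, but the sketch shows why keeping the same second-level name makes hopped sites easy to rediscover and re-list.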

New Technology. New technologies are always being developed and they are becoming more accessible to everyday users of the internet. However, these new technologies can be exploited to create new methods of causing harm to young people.

End-to-end encryption (E2EE), anonymity and child safety have become polarising subject areas. 

There is a tendency to see privacy and child safety as in tension with one another and therefore mutually exclusive. In reality, E2EE and online anonymity can protect everyday users of the internet (including children) while also presenting challenges to certain child safety measures.

To move beyond this impasse, one often-overlooked area is the potential benefit of creating and maintaining privacy for children online in order to enhance their safety, particularly in how their data is collected and used. Our research highlighted that platform and algorithm design commonly uses children’s online behaviour, messages and preferences to actively recommend and guide them towards harmful content. This can include sexualising content or content that promotes other online harms, such as eating disorders, suicide and self-harm. All sides of the debate could potentially come together around the aim of simultaneously enhancing both the privacy and safety of children.

Livestreaming is becoming a more common means of distributing and sharing CSAM.

Livestreaming is now accessible to everyday users of the internet, both to view and to broadcast. When used as intended, it allows young people to share experiences in new ways. However, offenders can also exploit it to create new methods of causing harm to young people, and this exploitation often happens before robust safeguards have been introduced.

Livestreaming CSAM allows offenders to interact with one another, engage in abuse live and raise funds. According to ECPAT, the temporary nature of livestreamed video makes it harder for law enforcement to attribute media and files to offenders.

Legislation and Enforcement. New legislation and guidance are being introduced by governments around the world to address illegal online harms.

Law enforcement recognises that arrests can only be one part of preventing CSEA, and that other measures and earlier-stage interventions are also necessary.

There are a large number of actual and potential offenders in the UK. The Home Office has published an estimate of ‘80,000 people who present a sexual threat to children online.’ The National Police Chiefs’ Council (NPCC) estimates that there could be ‘as many as 500,000 men in the UK who have or are viewing indecent images of children.’

Meanwhile, law enforcement agencies in the UK are arresting slightly over 5,000 men per year. This challenge has prompted the NPCC Lead for Child Protection to call for ‘alternatives, such as rehabilitation and treatment rather than prosecution for those who view low-level indecent images but are assessed not to pose a threat of physical rape or sexual abuse.’

A significant number of offenders recognise their problem and seek help, which can prevent offending and safeguard young people. Many stakeholders recognise the important and effective role that offender prevention and treatment can play in stemming the progression from viewing material to grooming and abusive direct contact with young people; even so, this area understandably elicits polarised responses in society.

Stakeholder Collaboration. Collaboration is essential across stakeholder groups to ensure a more unified approach to tackling illegal online harms.

Third sector organisations do not have good access to – or representation in – technical forums. 

Many third sector organisations feel that their insights and expertise around child safety are omitted from technical forums where new standards and regulations may be developed and where key industry players convene. These organisations either are not invited to attend or may not even be aware that relevant discussions are taking place. Moreover, real or perceived gaps in technical skills and language may prevent some third sector organisations from presenting their work on the same footing as technology experts and stakeholders.

Adult websites are at risk of hosting CSAM and causing other harm to young people, but stakeholder collaboration between this group and other organisations is currently limited.

Almost every stakeholder we spoke to stated that adult websites are a ‘piece of the puzzle’ of the harm young people face online and must be engaged with proactively. However, the appropriate method of collaboration is unclear. These websites are at risk of hosting CSAM, of allowing minors to access their inappropriate content, and of promoting unhealthy sexual norms and practices among young people during key years of their psychological development.

Despite recognising the important role adult websites play in securing online safety for young people, funders and third sector organisations are generally reluctant to work with this stakeholder group due to concerns about the negative impact on their reputation.

How young people use the internet. The internet was not originally designed with children in mind, but today young people make up a significant portion of online users and many online services are directly targeted at them. This presents challenges, as young people use technology in new and sometimes harmful ways.

Much of the existing digital skills training and online safety education in schools (where offered) is uncomfortable for both students and teachers.

The Department for Education (DfE) publishes ‘Keeping children safe in education’ (KCSIE), which includes online safety material that must be taught to children in schools. We have learned from organisations that work with both schools and children that these in-school sessions may not be as effective as intended. Young people do not necessarily feel the material speaks to them or reflects how they actually behave online. Teachers lack a frame of reference for how young people use the internet, which is often completely different from the experience of even the youngest teachers. Moreover, some of the subjects are intimidating to teach, and teachers may not feel well prepared or comfortable teaching them.

Young people are increasingly participating in online eating disorder communities and suicide and self-harm communities, where they are also at risk of grooming.

Both formal groups and informal ones organised around hashtags and specialist accounts are widespread on major social media platforms. While young people may in certain circumstances use these groups as a safe space to discuss their emotions healthily or to share their recovery experiences, there is also extensive problematic content that celebrates and promotes eating disorders, suicide and self-harm. Adults can infiltrate these groups of vulnerable young people to target and groom them, exacerbating the harm young people face. For example, some adults become ‘anorexia coaches’ to young people, encouraging the groomed young person to push themselves further.

It is becoming harder to safeguard young people as they switch between platforms, some of which may be less safe than others.

Young people use platforms in a multitasking, hybrid manner. It is common to play a game in one application, voice chat on a second platform and exchange messages and images on a third. This is particularly risky where platforms have weaker safeguards than others: an adult can encourage a child to switch to whichever platform allows the most anonymity and the weakest protections.

Young people are creating and sharing more self-generated CSAM. 

Under UK law, sexually explicit images or videos that children create of themselves are considered CSAM. Young people may share these images or videos with a close peer, and they can then be shared more widely without consent. Through this process, the material may end up in the hands of an offending adult, or the child may be directly groomed by an offending adult to take and provide these images.

Young people increasingly regard taking and exchanging these images as normal behaviour. Thorn reported in 2020 that 1 in 5 children aged 9-12 ‘agreed that it’s normal for kids their age to share nudes’, up from 1 in 8 in 2019. The average age of the children affected is falling, and the problem disproportionately affects young girls, as well as LGBTQ+ youth. Among 9-12 year olds in 2020, Thorn reported that 1 in 7 said they had shared their own nudes, up from 1 in 20 in 2019. Of the children sharing nude images of themselves, 2 in 5 believed they were sending the images to an adult.

***

Next steps

Our next step is to map the stakeholders currently operating in these ten problem areas and to brainstorm different approaches to tackling each one. We’ll share this work in our next blog post. We welcome your feedback on these ten problem statements. If you would be interested in sharing your experiences and knowledge, or have any comments on the wider project, please get in touch at marco@public.io.


Partners


Nominet


Marco Iovino

Former Team Member

