
ICO Updates On Children’s Digital Privacy



Whilst many may lament the days when children walked to school by themselves, played in the streets and weren’t seen until they trudged back home for dinner, the modern world brings new challenges for parents. Technology has enabled more contact with our children when they're outdoors, though with the advent of smartphones and tablets, children are less inclined to play outside anyway. But are they any safer in their rooms with their devices locked firmly in their palms?


Despite parental controls and existing legislation, recent research from the ICO (Information Commissioner’s Office) revealed that 42% of parents in Britain feel they have little control over the information social media and video-sharing platforms collect about their children. 


The speed at which we expect to access information online has increased exponentially in the last decade. Although attempts have been made to slow us down with cookie consent pop-ups, age confirmation and tick-box privacy policies, few adults even take time to properly consider their digital privacy, so how can we expect children to?


The ICO has been working to shift responsibility for data protection from consumers back to the platforms for some time now, and it is making progress. Here is an update on recent developments in children’s privacy in the UK, as led by the ICO.


How Is Children’s Digital Privacy Being Protected?


Although we can put restrictions on our children's digital access, we live in a digitalised world and we cannot hope to keep them from it. The role of the ICO, along with the government and other organisations, is to relieve parents of at least some of their concerns over children’s digital privacy online. Part of this has been addressed through the publication of a set of standards known as The Children’s Code, which reinforces existing data protections and adds specific provisions covering children's data privacy.


The Children's Code was introduced by the ICO in 2020. It aims to ensure that online services protect children's privacy and data in compliance with the UK GDPR and the Data Protection Act 2018.


The Children’s Code applies to online services that are likely to be accessed by children in the UK, including:


  • Apps

  • Social media platforms

  • Online games

  • Streaming services

  • Websites offering goods/services

  • EdTech and connected toys


Even if a service is not directly aimed at children, it must comply if children are likely to use it.


The Children's Code doesn’t introduce new laws, but it clarifies how the UK GDPR applies to children's data, and organisations that fail to comply risk enforcement action from the ICO, including fines.


Read more about The Children's Code and what it consists of in our Guide to Children and Data Protection.


Working With Social Media And Video Sharing Platforms


As an organisation that wants to see businesses thrive in the online space, the ICO is committed to balancing companies' ability to do so with the need to operate in a way that respects privacy rights, especially those of child users.


In 2024, the ICO contacted 11 businesses and spoke to platforms about keeping children safe online. In working with these organisations, the ICO reports progress. Social media platform X has ceased advertising to under-18s, and automated geolocation information has been removed from children's profiles on the Sendit app.


However, the ICO has also launched investigations into TikTok, Reddit and Imgur, examining how these platforms assess the age of their users and how their personal information is used. This is an opportunity to scrutinise whether the platforms are doing all that is necessary to prevent age-inappropriate and/or harmful content from being served to under-18s. The aim of this endeavour is primarily to pinpoint areas for improvement and provide recommendations and advice. However, if companies are found to be infringing data protection legislation, further action may be taken.


How Are Platforms Using Children’s Data?


In the past, much of the concern over data privacy has centred on corporations sharing data with third parties or allowing too much personal information to be accessible to other users, which is worrying for everyone but particularly so when that data belongs to a child. Remember, a child has limited ability to consent under the law, due to their age. Advancements in data protection law, though, have lessened such issues. More recently, the focus has shifted from the protection of personal data to how that data is applied.


We know that some platforms and websites use people's data to personalise content. Whilst this can be beneficial, it must be done appropriately. Many platforms and apps remain free to use, which is appealing to children, but they make their money through advertising. This means it pays to keep user engagement high, and some companies have been accused of doing so through negative or potentially harmful means. Parents, online safety advocates and bodies such as the ICO are concerned that children may have their data used to keep them engaged on platforms in ways that could put them at risk.


The ICO is working closely with Ofcom, the Online Safety Act regulator, in addition to other organisations, to enable children to have a safe and positive experience online. They work with platforms to ensure compliance and, where safety provisions are lacking, will advise and, if necessary, enforce.


What Can We Expect For Children’s Online Safety in the Future?


Due to the speed at which social media content is published, it appears almost impossible for some platforms to regulate content. Whilst automated technology may go some way towards screening content for material unsuitable for under-18s, systems don't always get it right, as has been evidenced many times. Announcements such as Mark Zuckerberg's decision to end independent fact-checking at Meta have sparked serious concerns, not only over the ease of spreading misinformation but also over the human role in keeping children safe online.


Whilst the ICO has an essential role to play - and I will continue to keep you up to date on its progress - there are other invested parties in the space of digital privacy rights for children, including parents, educators and advocates. If information sharing is part of the problem, it has also become part of the solution, with organisations and individuals seeking to understand and share guidance on keeping children safe online. Understanding children's rights is key, but children also need to be made aware of what they should and should not share online. Many schools and community services are making it their job to provide this information and guidance.


Educating young people about sharing information online, the risks involved and their rights regarding consent is essential. There is already much focus on teaching children about online safety and how to spot red flags. Online businesses and platforms will therefore have to make their data privacy policies more transparent and age-appropriate where their users might be children - not only because it is their legal and moral obligation, but also because we are raising a generation to be more savvy about their privacy rights online. Whilst legislation and enforcement have a part to play, consumer awareness, even amongst children, will be an important driving force in persuading online organisations to take seriously their responsibility to protect the privacy of their users.


Do you run an online business where users/customers may be under 18? Aubergine Legal can help ensure your documents and policies are appropriate for children and that you’re compliant with online safety regulations. Get in touch for a consultation.

 
