Firms Must Work Harder to Guard Children’s Privacy, Says UK ICO

The UK’s privacy regulator has warned social media and video sharing platforms that they must improve data protection practices to safeguard children using their services.

The Information Commissioner’s Office (ICO) indicated that many of these providers are still not meeting the requirements set out in its children’s code of practice, which came into force in 2021.

Its newly published priorities for 2024-25 are therefore to urge social media and video sharing providers to take action in the following areas:

  • Ensure children’s profiles are private by default and geolocation settings are turned off by default, to ensure their information can’t be misused to compromise “physical safety or mental wellbeing”
  • Switch off targeted advertising for children by default, unless there’s a “compelling reason” for user profiling
  • Require parental consent before the personal information of children under 13 can be used by an online service. Age assurance checks will also be scrutinized
  • Take action on “recommender” systems which use behavioral profiles, search results and personal information to potentially encourage children to stay longer on a platform and/or create “pathways to harmful content”

The ICO cited Ofcom research to support its campaign. The research found that 96% of UK children aged 3-17 watch videos on video sharing platforms, and that 30% of 5-7-year-olds use a social media site – rising to 63% of children aged 8-11 and 97% of 16-17-year-olds.

“Children’s privacy must not be traded in the chase for profit. How companies design their online services and use children’s personal information have a significant impact on what young people see and experience in the digital world,” said information commissioner, John Edwards.

“I’m calling on social media and video-sharing platforms to assess and understand the potential data harms to children on their platforms, and to take steps to mitigate them.”

The regulator said it is still investigating several companies for compliance with the children’s code of practice, reminding them that it retains significant enforcement powers under data protection law.

Last year it fined TikTok £12.7m ($16m) for failing to protect children’s personal data in line with the law, although the Chinese social media giant has appealed.

Read more on children’s privacy: Meta Fined $400m in Ireland For Children’s Privacy Breach
