The Information Commissioner’s Office (ICO) has introduced a new Children’s Code this month to protect young people from being exploited by internet services. The code sets out standards for any apps, games or websites that are likely to be used by children, even if children aren’t the intended audience.

The 15 Standards

The Children’s Code provides 15 standards for information society services (online services provided for remuneration, which includes “free” services funded by advertising). One of these standards is to consider the best interests of the child, a right adopted from the UN Convention on the Rights of the Child (UNCRC). Another is transparency, advocating age-appropriate language in policies so that children understand the services they’ve signed up for. Data sharing has been hitting headlines in recent years – think of the Mark Zuckerberg Facebook hearings – and the Children’s Code requires companies to avoid sharing children’s data unless they can demonstrate a compelling reason to do so.

Parental controls may be imposed with the intention of protecting children from harm. However, they can also be used in ways that undermine children’s independence and right to privacy. The ICO believes services should make it obvious to children when they are being tracked or monitored by their parents, a standard now written into the Children’s Code.

The Internet’s Child Population

Most internet services are tailored to an adult’s understanding and ignore the many children accessing the same websites. The Children’s Code treats young people as social agents and beings in their own right, and requires platforms to acknowledge the sheer number of children using their services.

According to a series of telephone interviews reported by Statista, the majority of 5-15-year-olds had social media profiles in 2020, the first time the figure had passed 50% since the study began in 2009. Meanwhile, 18% of 3-4-year-olds had social media accounts in 2020, the highest figure seen in recent years. Although this may seem alarming, it’s a reality that internet services must adapt to.

How The UK is Making Social Media Safer for Children

Social media companies can no longer feign ignorance of their young users. An article in the Wall Street Journal revealed that Facebook’s own research showed how harmful its Instagram app can be to teenage girls.

According to that research, Instagram makes body image issues worse for one in three teenage girls, yet its carefully crafted algorithm is designed to keep young users scrolling so the platform can serve the maximum number of adverts for revenue. The Children’s Code, however, already appears to be making waves in social media policy.

How Companies are Changing

Instagram has decided to disable targeted ads for under-18s, no longer exploiting their data for advertising revenue. YouTube has turned off auto-play by default for teenage users, a feature designed to keep people on the platform by automatically queuing tailored content. TikTok, meanwhile, has stopped sending push notifications to younger teenagers late in the evening, a tactic that nudged them to maximise their time on the app.
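In practice, age-gating like this often comes down to a simple policy check at the point where a feature is served. The sketch below is a purely hypothetical illustration, not any platform’s real code: the function names, the age threshold and the 9pm cut-off are all invented here to show the general shape of such a policy.

```python
from dataclasses import dataclass
from datetime import time

ADULT_AGE = 18               # hypothetical threshold; real services vary by feature and region
QUIET_HOURS_START = time(21, 0)  # assumed cut-off for evening notifications to minors


@dataclass
class User:
    age: int


def targeted_ads_enabled(user: User) -> bool:
    # Under-18 accounts receive only non-targeted (contextual) ads
    return user.age >= ADULT_AGE


def autoplay_on_by_default(user: User) -> bool:
    # Auto-play defaults off for teenage accounts
    return user.age >= ADULT_AGE


def may_send_notification(user: User, now: time) -> bool:
    # Suppress late-evening push notifications for minors
    if user.age >= ADULT_AGE:
        return True
    return now < QUIET_HOURS_START


# Example: a 15-year-old at 9:30pm receives no push notification
teen = User(age=15)
print(may_send_notification(teen, time(21, 30)))  # False
```

The design point is that the check happens server-side at delivery time, so the protection applies regardless of the child’s device settings.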

TikTok came under pressure to change after being sued over its use of children’s data in April this year. A claim was filed on behalf of millions of children in the UK on the grounds that TikTok wasn’t transparent about the data it gathered on them: phone numbers and locations were being collected from children without their informed consent.

The chair of the BPS’ Cyberpsychology Section, Dr Linda Kaye, says: “It is great to see that children’s rights online are being taken seriously by the UK Children’s Code. One of the main challenges of such initiatives is the global nature of the Internet and the fact that regulation usually operates only at a national level. However, it is encouraging to see that some technology companies are already pledging changes in the interests of children’s privacy and protection.”

The Children’s Code has already had a positive impact on internet safety and children’s online experiences. The number of children online is only growing, and it’s about time corporations treated them as more than just sources of data.
