How Europe is tackling online age verification

Lessons to be learned from Spanish, French experiences

12 July 2024

The passage of the Protection of Children (Online Age Verification) Bill looks to be a certainty at this point, especially with the government already promising to have it in place ahead of the next general election. In the meantime, lawmakers across Europe are putting social media platforms under a magnifying glass. Platforms such as Instagram, TikTok and Snapchat attract many children, despite policies that require users to be at least 13 years old. The ease with which children circumvent these age restrictions is a major challenge for both regulators and platforms. Research from Eurostat shows that more than 40% of children aged 8-12 use social media, underscoring the need for more robust age verification measures.

Age verification is crucial to protect minors from harmful content. Platforms such as Instagram, TikTok and Snapchat have minimum age requirements, but enforcement is a challenge. The loophole is in the registration process, where manipulating the year of birth allows children to easily create accounts. This leads to an ongoing cat-and-mouse game between platforms and underage users. European regulators, educators and tech companies are trying to protect children, but enforcing the minimum age remains a difficult task.

Why it matters

More and more children are accessing the Internet and social media platforms without being of age to do so, highlighting the critical need for robust age verification to protect children from harmful content.

This loophole has led to widespread underage use of social media, creating the need for more robust verification methods. However, progress on age verification seems to be driven more by legislation than by the companies themselves. In fact, the EU has been known to fine large companies that it believes are not doing enough to ensure the safety of children.

Most recently, TikTok was penalised €345 million for making child accounts public by default and failing to verify the relationship between linked adult and child accounts.

The European Union is working on a comprehensive code for age-appropriate design and digital identity solutions. The EU’s eID proposal focuses on improving age verification through certification and interoperability. In addition, the euConsent project is developing a browser-based age verification method. These initiatives seek to standardise age verification across Europe and create a safer digital environment for children.

Last week, Spain announced the introduction of Cartera Digital Beta, an app that takes an innovative approach to preventing minors from visiting pornographic websites. Like a digital passport, the app aims to restrict access to porn by enforcing age verification: users must verify their age using electronic IDs or qualified certificates. The app also includes a dual authentication system to ensure that minors cannot access adult content via adults’ devices.

In France, a new law requires social media platforms to verify users’ age and seek parental consent for users under 15. This legislation aims to limit children’s screen time and protect them from cyberbullying and other online risks. Platforms that fail to comply with the law could face fines of up to 1% of their global revenue. This legislative move highlights the growing emphasis on protecting minors in the digital space.

AI rises to the challenge

Technology companies are exploring AI-powered solutions to address the problem of age verification. AI algorithms can estimate a user’s age based on their activity patterns and interactions with content. For example, facial recognition technology can be used during the registration process on websites that require a certain minimum age, such as gambling websites. However, these technologies raise privacy and data security concerns. A key challenge for tech companies is to ensure that these AI-driven methods are effective and privacy-conscious. Other options include behavioural analytics, already used by TikTok, which deletes accounts suspected of belonging to underage users and had deleted more than 76 million such accounts worldwide by 2023.

While robust age verification methods are necessary to protect children, they should not invade users’ privacy or make it more difficult to use the platform. Achieving this balance requires careful consideration of both technological capabilities and user experience.

As regulators tighten rules, social media platforms must adapt to protect young users while maintaining usability. Collaboration between governments, tech companies and educators is essential to developing effective age verification systems. By integrating innovative technologies and legal measures, it is possible to create a safer online environment for children.


TechCentral.ie