
EU Commission Initiates Investigation Into TikTok Over Child Safety Concerns

Published February 20, 2024

The European Union escalated its regulatory oversight this week by opening a formal investigation into TikTok's compliance with rules designed to protect minors online. Rooted in the Digital Services Act (DSA), the cornerstone of digital content regulation in the EU, the inquiry reflects the bloc's growing assertiveness in maintaining an online environment that is safe for young users.


This move follows close on the heels of a similar action taken against one of tech billionaire Elon Musk's platforms in December and underscores the EU's commitment to enforcing the DSA. TikTok, owned by Chinese tech giant ByteDance, has come under scrutiny for potentially failing to shield its young audience from the platform's harmful effects, particularly through recommendation algorithms that can steer users toward increasingly harmful content, a phenomenon known as the "rabbit hole" effect.


One central concern raised by the European Commission pertains to TikTok's measures for verifying user age, which, as stated by the Commission, "may not be reasonable, proportionate, and effective." There are also questions related to TikTok's transparency in advertising and the accessibility of its data for research purposes.


The Commission's examination was prompted by its analysis of risk assessment reports submitted by TikTok, which detailed the platform's existing countermeasures against unlawful content, its measures to protect minors, and its provision of data access to researchers. As evidence collection continues, the Commission stands ready to take additional enforcement actions if deemed necessary.


Thierry Breton, the EU's internal market commissioner, emphasized the duty TikTok has in safeguarding its users, especially younger demographics. With the platform's user base in the EU surging from 125 million to over 142 million monthly users within a year, TikTok's influence and the risks associated with usage, particularly by children and teenagers, have become more pressing than ever.


Echoing that sentiment, Margrethe Vestager, Commission executive vice president, stated that the probe will carefully examine TikTok's approach to mitigating systemic risks and its compliance with requirements on underage user safety, adequate advertising disclosures, and overall transparency.


In response to the Commission's actions, TikTok has expressed its commitment to safeguarding minors on its platform. The company has cited the introduction of various features and settings aimed at protecting teenagers and preventing under-13s from accessing its services.


The investigation extends across multiple facets of TikTok's operations, scrutinizing how the platform confronts and mitigates systemic risks, the efficacy of its privacy and safety protocols for minors, the reliability of its advertising, and its steps toward greater transparency.


The ramifications of breaching the DSA are substantial: financial penalties can reach as much as six percent of a company's global revenues, and for serious and repeated violations, the Commission has the authority to block platforms within the 27-nation bloc.


The DSA, in force since last year for the largest online entities including TikTok, Facebook, and Instagram, requires companies to exercise greater diligence in content moderation and to act swiftly to protect online consumers. The law has applied to all platforms since February 17, reflecting the EU's broader mandate to regulate digital marketplaces and social platforms with a firmer grip.

