
TikTok investigated by European Union over possible online content breaches


The European Commission is investigating whether social media giant TikTok has broken online laws that protect children and ensure transparent advertising.

TikTok, a video-sharing platform hugely popular with young people, could face a large fine if it is found to have breached the Digital Services Act (DSA), which requires online platforms and search engines to do everything possible to tackle illegal content and protect public security.

Thierry Breton, the European Commissioner for the Internal Market, announced the commission’s intention to open formal proceedings against TikTok, which is owned by ByteDance Ltd, the China-based company founded in 2012.

Posting on X (formerly Twitter), Breton said the investigation would look at a ‘suspected breach of transparency & obligations to protect minors’, covering addictive design and screen time limits, the ‘rabbit hole’ effect, age verification and default privacy settings.

The investigation, the commission says, follows a risk assessment report submitted by TikTok last September, along with TikTok’s replies to the commission’s formal requests for information on illegal content, protection of minors, and data access.

It says the proceedings will focus on compliance with DSA obligations to:

  • Assess and mitigate systemic risks, in terms of actual or foreseeable negative effects stemming from the design of TikTok’s system, including algorithmic systems, that may stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’. Such an assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being, the respect of the rights of the child, as well as its impact on radicalisation processes. Furthermore, the mitigation measures in place in this respect, notably age verification tools used by TikTok to prevent access by minors to inappropriate content, may not be reasonable, proportionate and effective;
  • Put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems;
  • Provide a searchable and reliable repository for advertisements presented on TikTok.

The opening of these proceedings empowers the commission to take further enforcement steps, such as interim measures and non-compliance decisions. No deadline has been set for how long the investigation could last, as this depends on several factors, including the complexity of the case and the extent to which the company concerned cooperates with the commission.

The Digital Services Act has applied to all online providers in the EU since February 17, 2024.

Author: Simon Weedy
