Is TikTok not protecting children and young people enough?
TikTok must answer to the EU Commission over alleged violations of transparency requirements and of its obligations to protect minors.
Big news for TikTok. The EU Commission has accused the entertainment platform of failing to adequately protect children and young people. EU Commissioner Thierry Breton, who is responsible for the matter, published a post on the short messaging service X (formerly Twitter) on 19 February 2024 in which he cited the following points as grounds for an investigation:
- Addictive design and screen time limits
- Rabbit-hole effect
- Age verification
- Standard privacy settings
According to a document obtained by the Reuters news agency, TikTok, as a platform reaching millions of children and young people, bears a special responsibility for their protection. The impact of TikTok content on radicalisation processes is also part of the proceedings. The investigation is to examine whether, and in what form, the platform ensures that minors are effectively protected from problematic content (e.g. videos glorifying violence, content on eating disorders or suicide, or xenophobic videos).
Clear and proportionate regulations required
The proceedings are based on the EU Digital Services Act, under which the Commission had already opened formal investigations into the short messaging service X in autumn 2023. X was accused of not acting consistently enough against misinformation; Meta, the parent company of Facebook and Instagram, faced the same accusation.
The law aims to prevent illegal or harmful online activities and the spread of disinformation. How TikTok and the other platforms on the EU Commission's radar will respond to the allegations remains to be seen.
How do you protect your children from problematic content on social media? Tell us in the comments.