On Monday, the European Union announced an investigation into whether TikTok has violated the bloc’s stringent new digital regulations aimed at enhancing safety on social media and the internet.
The European Commission, the executive arm of the EU, disclosed that it has initiated formal proceedings to evaluate whether TikTok has contravened the Digital Services Act (DSA), which became effective last year.
Focus on TikTok’s Compliance and Concerns
The DSA contains comprehensive rules intended to safeguard internet users, including requirements to make it easy to report harmful or illegal content such as hate speech, to give users alternatives to algorithm-driven recommendations, and to prohibit targeted ads aimed at children.

The commission’s inquiry focuses on whether TikTok has adequately addressed “systemic risks” inherent in its design, including the potential for “behavioral addictions” resulting from its algorithmic systems. The commission expressed concern that measures such as age verification tools meant to keep minors away from “inappropriate content” may not be sufficiently “reasonable, proportionate, and effective.”
Emphasis on Safeguarding Minors
Thierry Breton, the EU’s internal market commissioner, stressed that protecting minors is a priority under the DSA, stating, “TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online.” He said the commission was opening formal infringement proceedings to ensure that appropriate measures are taken to protect the physical and emotional well-being of young Europeans.
TikTok’s Response and Commitment
TikTok responded that it is committed to features and settings that protect teenagers and keep users under 13 off the platform, acknowledging that these are challenges the whole industry faces. The company said it stands ready to work with experts and the industry to keep young users safe, and welcomed the opportunity to explain its work to the commission in detail.
In addition to scrutinizing TikTok’s measures for protecting minors’ privacy, the commission’s investigation will assess the platform’s transparency regarding advertisements and its provision of access to data for researchers.
Broader Scope of DSA Enforcement
The EU has designated nearly two dozen major online and social media platforms, including TikTok, for the highest level of scrutiny under the DSA, with substantial fines possible for non-compliance. The bloc is already investigating X, the platform formerly known as Twitter and owned by Elon Musk, for suspected breaches, including failure to curb the spread of illegal content.
What is the DSA?
The Digital Services Act (DSA) is a legislative framework applicable within the European Union that governs digital services functioning as “intermediaries,” facilitating connections between consumers and content, goods, or services. This encompasses platforms like Facebook, Google, Amazon, and app stores.
Key provisions of the DSA include safeguarding children from being targeted for advertising purposes on social media platforms, establishing mechanisms for users to appeal content removals, ensuring the authenticity of products sold on online marketplaces like Amazon, and addressing issues such as disinformation and gender-based online harassment.
Violations of the DSA carry penalties, including fines of up to 6% of the violating company’s global turnover and, in severe cases, temporary suspension of the service. The EU can also compel platforms to promptly address identified problems, and the act gives users the right to seek compensation for harm caused by breaches of its provisions.
How will it protect children?
The legislation aims to safeguard children on social media platforms through several measures. First, it prohibits platforms from building profiles of child users for use in targeted advertising. In addition, platforms accessible to minors, which in practice means most social media platforms, must put measures in place to ensure the privacy and safety of child users.
Moreover, major platforms are required to conduct risk assessments to identify and prevent harmful content from reaching users under the age of 18. Separate proposed EU legislation addresses the removal of online child sexual abuse material, with the aim of further strengthening child protection online.