TikTok urged to ‘accelerate’ compliance with new EU rules
European Commissioner Thierry Breton on Wednesday asked the Chinese-owned social network TikTok to “accelerate” its work to comply with new European Union rules against online disinformation and hate speech, which take effect at the end of August.
Like Twitter and other companies in the industry, TikTok agreed to undergo a “stress test” with European Commission services to help it adapt to the new Digital Services Act (DSA).
This historic regulation came into force in mid-November, but companies have until August 25 to comply.
Among these firms, nineteen very large online platforms, including Twitter, TikTok and the main services of Amazon, Apple, Google, Meta and Microsoft, will be subject to stricter oversight.
The results of the TikTok test, conducted Monday at the group’s European headquarters in Dublin, show that “additional efforts are necessary in order to be fully ready” for August 25, commented Thierry Breton, in charge of digital regulation within the European executive.
“Now is the time to accelerate compliance. At the end of August, we will assess whether real and tangible changes have been made,” he added in a statement, welcoming the fact that the subsidiary of the Chinese group ByteDance accepted this voluntary exercise.
The test focused in particular on the protection of children, content moderation and the fight against illegal content online, data access, and transparency.
“In view of the DSA, TikTok is implementing organizational improvements (…) and devoting significant resources to its compliance”, noted Thierry Breton.
“TikTok’s user base represents more than a quarter of the EU population (125 million people), and most of them are teenagers. In the event of failure, the consequences could be dramatic,” warned the former French minister.
At the end of June, Mr. Breton also called on Twitter to strengthen its resources to be able to comply with the new legislation, after a similar test in San Francisco.
Among the constraints the DSA imposes on companies is the obligation to analyse the risks their services pose in terms of illegal content, invasion of privacy or freedom of expression, as well as public security.
Adequate measures, particularly in content moderation, must be put in place to mitigate these risks, and experts in Brussels must be granted access to the platforms’ algorithms.
The DSA also includes prohibitions, such as a ban on using “sensitive” user data (gender, political leaning, religious affiliation) for targeted advertising, as well as transparency obligations, such as publishing the main parameters used by recommendation systems.