What will change for Facebook and TikTok ahead of the European elections?

Italy now also has a digital services coordinator, the new figure who, under European rules, must support the fight against disinformation and the spread of illegal content on social media, from Facebook and TikTok to Twitter and Instagram. Under the agreement signed today with the EU Commission, this role will be filled by the communications regulator Agcom.

The purpose of the agreement is to implement the new European regulation on digital services, the DSA, which came into force last August. The DSA sets out a range of measures requiring web giants (platforms with more than 45 million users) to be more effective in combating the spread of harmful and illegal content, including fake news and hate speech. Under the law, the companies concerned must remove content that is illegal under national or European law “immediately”, as soon as the platform becomes aware of it, and suspend users who “frequently” violate the law.

The platforms will be audited annually by independent bodies and by coordinators such as Agcom, and monitored by the European Commission, which can impose fines of up to 6 percent of their annual turnover for repeated violations. In addition, from February 24 the regulation will require social media companies to sign partnerships with fact-checking organizations that can provide useful content to counter the fake news circulating on their platforms.

For the system to work, controls need to be stepped up, starting with the platforms’ own internal checks. Companies such as Facebook, YouTube and TikTok have been working for several months to adapt their internal control systems to the new rules. But those systems often show different “sensitivities” in deciding which content to target, as seen during the Hamas attack on Israel and the elections in Slovakia. In the Hamas case, Brussels intervened by sending requests for clarification to Meta, Twitter and TikTok, but for now there is no sign of sanctions.

For now, the European Commission seems inclined to use the carrot. But it may soon change its approach: the terror alert triggered by the conflict in the Middle East has brought renewed attention to online content spread by religious fundamentalists. EU elections will then be held in June 2024, and Brussels wants to keep the impact of disinformation (and possible foreign interference) on voters to a minimum. The governments of the 27 member states are therefore being asked to step up their monitoring of social media: a recommendation dated 18 October states that “States are invited to collect and share evidence of the dissemination of illegal content in their territory through very large online platforms and very large online search engines”, and that this evidence should be forwarded to the Commission so that it can respond quickly and appropriately to such content. So far, only three countries have answered Brussels’ call: France, Ireland and now Italy.


Source: Today IT
