While Meta is eliminating fact-checkers in the United States, this does not initially apply to the European Union. In Germany, fact-checkers from Correctiv and dpa continue to work for the group. When Heise Online asked whether a change was planned here as well, Meta said there were no such plans for the time being. In theory, Meta could also "get rid of" the checkers here, as Zuckerberg puts it. It would then, however, have to find other ways to remain compliant with the DSA.
Elon Musk is already feuding with the EU to some extent, and it is threatening to penalize X. However, this is less about questions of freedom of expression and moderation and more about breaches of transparency obligations, data access, and dark patterns. The DSA regulates such matters as well.
Illegal content on Meta, X and other platforms
The DSA does not itself define which content is illegal; that is laid down in other laws at national or EU level. What the DSA does determine is how such illegal content must be dealt with once it becomes known: "This includes an EU-wide framework for detecting, flagging and removing illegal content, as well as risk assessment obligations."
This also means that very large platforms such as Meta and X must systematically combat such illegal content, disinformation, and risks to minors. How a provider does this is up to it; fact-checkers are just one option. Even automated filters may be sufficient to reduce risks within the meaning of the DSA. To prove this, proper reports are required, which must then be submitted to the EU. In theory, Meta could therefore also end its cooperation with fact-checkers in Germany.
The Federal Network Agency and the EU Commission monitor the DSA
Implementation of the DSA is monitored by the EU Commission, but there are also national coordinators – in Germany this is the Federal Network Agency. Incidentally, proceedings are already underway against almost all the major platforms, from LinkedIn and Google to Apple, Alibaba, Amazon, and Booking. So far, however, these have been limited to inquiries and exchanges. Before fines are imposed, providers can defend themselves and make changes to their services.
The obligations that providers of large platforms must meet also include giving users fair and transparent channels for complaints. Everyone has the right to formally challenge a decision if, for example, their account has been blocked; accounts must not be blocked arbitrarily.
Content such as posts also cannot simply be deleted. The DSA requires defined criteria for removing posts, which must be communicated transparently. A so-called shadow ban is only permitted if the affected user is informed about it – and can lodge an objection. This means that people whose posts receive less visibility, i.e. who are downranked by the algorithm, must be able to find out why this is happening to them.
Automatic filters and internal policies
Incidentally, it is not fact-checkers who are behind blocked accounts or rejected posts; other mechanisms are responsible for that, including reports from other users and automated filter systems. One form of user feedback on X is Community Notes: users can add context, assessments, or corrections to other people's posts. Meta is also planning to launch such a function.
It is also known that the internal rules for automatic moderation in the United States have been changed: many statements that were previously prohibited or undesirable are now permitted, for example on topics relating to LGBTQA+ people, Jews, and minorities. In the United States, freedom of speech is interpreted more broadly than in Germany, where statements more readily constitute offenses such as defamation. Within the legal framework, the platform operator determines what is additionally permitted or prohibited.
The DSA also contains several provisions affecting election integrity: large platforms must adapt their recommendation systems, label political ads, and, here too, assess and address systemic risks.
(EMW)