Ref. Ares(2022)5170881 - 15/07/2022



DSA and DMA

[Redacted: ongoing decision-making process, Art 4(3) first subparagraph]

Code of Practice 
A centrepiece of the EU’s efforts to fight disinformation has been the Code of Practice on 
Disinformation, which has been in place since 2018. It is the first instrument of its kind 
worldwide, and the result of self-regulatory efforts by platforms and industry, who have 
agreed to standards to counter disinformation and its harmful impacts in the EU. 
Existing signatories of the Code include major online platforms – Facebook, Google,
Twitter, Microsoft and TikTok – as well as trade associations from the online advertising
sector.
In line with our 2020 European Democracy Action Plan, the Commission published its
Guidance in May 2021, setting out its views on how the Code should be strengthened.
The Code’s signatories and potential new signatories are currently carrying out a
thorough revision and strengthening of the Code. More than two dozen new
organisations from industry and civil society are participating in the revision and intend
to become signatories to the revised Code.
 
[Redacted: ongoing decision-making process, Art 4(3) first subparagraph]

Facebook’s position on the DSA and DMA 
DSA 
• In an interview in Le Monde1 in Q1 2021, [redacted] noted that Facebook is largely in
agreement with the objectives pursued by the Digital Services Act, which seem
sensible.
o Facebook noted that the text of the DSA seems to be well reflected and
supported the transparency-related obligations, in particular in relation to
content moderation.
• In response to the Irish Government’s call for views on the Digital Services Act in
January 2021, Facebook further added on the DSA that:2
o It welcomed that harmful content was not equated to illegal content and that
harmful content was addressed in the DSA through co-regulation, risk assessments
and mitigation measures. At the same time, it highlighted that these provisions
were quite broad and that there should be more clarity about the trigger for
sanctions.
o It noted that the compliance burden for very large online platforms was very
significant.
o It highlighted that the restrictions on personalized algorithms needed to be
carefully considered, as such restrictions would directly impact the quality of the
service and negatively impact the user experience.
o It noted its concern about the role of the European Commission in the enforcement
of the DSA. According to Facebook, the Commission has disproportionate
powers and these powers do not have appropriate safeguards. In addition,
Facebook highlighted that there was uncertainty around the role of the Digital
Services Coordinator and other competent authorities.
DMA

[Redacted: ongoing decision-making process, Art 4(3) first subparagraph, and commercial interests, Art 4(2) first indent]

1 https://www.lemonde.fr/pixels/article/2021/01/23/facebook-sur-la-moderation-on-nous-reproche-une-chose-et-son-contraire_6067357_4408996.html.
2 Facebook-DSA-Submission.pdf (enterprise.gov.ie)