New Features and Additional Transparency Measures as the Digital Services Act Comes Into Effect

By Nick Clegg, President, Global Affairs

Takeaways

  • Meta welcomes the principles of transparency, accountability and user empowerment at the heart of the DSA.
  • We’ve introduced additional transparency measures and user options as part of our ongoing commitment to meeting our regulatory obligations.
  • The DSA is in keeping with our long-standing record of openness around our policies, processes and enforcement.

Later this month, the European Union’s Digital Services Act (DSA), one of the most comprehensive pieces of internet regulation, will begin to fully apply to Facebook, Instagram and a number of other tech platforms and services. It is a big deal not just for European tech companies but for all tech companies that operate in the EU, and it will have a significant impact on the experiences Europeans have when they open their phones or fire up their laptops. 

Meta has long advocated for a harmonised regulatory regime that effectively protects people’s rights online, while continuing to enable innovation. For this reason, we welcome the ambition for greater transparency, accountability and user empowerment that sits at the heart of regulations like the DSA, GDPR, and the ePrivacy Directive. The DSA in particular provides greater clarity on the roles and responsibilities of online platforms and it is right to seek to hold large platforms like ours to account through things like reporting and auditing, rather than attempting to micromanage individual pieces of content.

We’ve been working hard since the DSA came into force last November to respond to these new rules and adapt the existing safety and integrity systems and processes we have in place in many of the areas regulated by the DSA. We assembled one of the largest cross-functional teams in our history, with over 1,000 people currently working on the DSA, to develop solutions to the DSA’s requirements. These include measures to increase transparency about how our systems work, and to give people more options to tailor their experiences on Facebook and Instagram. We have also established a new, independent compliance function to help us meet our regulatory obligations on an ongoing basis.

Building on Our Industry-leading Ads Transparency and Protections

We were the first platform to put in place ads transparency tools and, for many years, we’ve provided industry-leading transparency for social issue, electoral and political ads. We are now building on that by expanding our Ad Library to display and archive all ads that target people in the EU, along with the dates each ad ran, the parameters used for targeting (e.g., age, gender, location), who was served the ad, and more. These ads will be stored in our public Ad Library for a year, so anyone, anywhere, can better understand every ad that’s run in the EU.

As part of our continued work to keep our apps age-appropriate for teens, we’ve also made changes to their ads experience on our platforms. Since February, teens aged 13-17 globally no longer see advertising based on their activity on our apps — like following certain Instagram posts or Facebook pages. Age and location are now the only information about teens that advertisers can use to show them ads.

Giving People More Information About How Our Platforms Work

We’re providing an unprecedented level of insight into how our AI systems rank content by releasing 22 system cards for Facebook and Instagram. These cards provide information about how our AI systems rank content for Feed, Reels, Stories, and other surfaces; some of the predictions each system makes to determine what content might be most relevant to people; and the options available to help people customise their experience on Facebook and Instagram. These build on our long-standing “Why Am I Seeing This” feature, which allows people to see details directly in our apps about why our systems predicted that specific content would be relevant to them, and the types of activity and inputs that may have led to that prediction.

We’re also rolling out two new tools for researchers – the Meta Content Library and API. The library includes publicly available content from Pages, Posts, Groups and Events on Facebook, as well as publicly available content from creator and business accounts on Instagram. Researchers will be able to search, explore, and filter that content through a graphical user interface (UI) or through a programmatic API. These tools will provide the most comprehensive access to publicly available content across Facebook and Instagram of any research tool we have built to date.

Giving People More Control Over Their Experiences on Facebook and Instagram

In addition to greater transparency, we’re providing people with more options to help tailor what they see on Facebook and Instagram. I’ve written previously about our AI ranking and recommendation processes, which help you see content we think you’ll find most meaningful and reduce the distribution of problematic content, so you’re less likely to come across it. We’re now giving our European community the option to view and discover content on Reels, Stories, Search and other parts of Facebook and Instagram that is not ranked by Meta using these systems. For example, on Facebook and Instagram, users will have the option to view Stories and Reels only from people they follow, ranked in chronological order, newest to oldest. They will also be able to view Search results based only on the words they enter, rather than results personalised specifically to them based on their previous activity and personal interests.

For a number of years, in addition to reporting options for content that might violate our Community Standards and Guidelines, we’ve also had dedicated reporting tools for illegal content. We’ve now made those tools even easier for people to access. And while we already notify people when we remove a piece of their content, and typically give them the chance to appeal, we’ll now provide this information to people in the EU for a broader range of content moderation decisions. This includes when we apply feature limits to people’s accounts and when we restrict content for violating local law. 

The Future of Harmonised Regulation

From early on, we’ve been supportive of the objectives of the DSA and the creation of a regulatory regime in Europe that minimises harm effectively, protects and empowers people, and upholds their fundamental rights. The hard work of creating these pioneering new rules has come to an end, and the process of implementing them has begun. In this new regulatory environment, it is critical that the DSA now maintains its primacy over existing and new national laws, to protect the clarity it has created for services, maintain consistency in the way tech companies are held to account, and preserve the harmonious way people experience our platforms across the region. A strong and open digital single market is of vital importance to the competitiveness of Europe as a whole, and we will continue to work closely with European policymakers and regulators in support of this shared vision.