STF Establishes New Parameters for the Civil Liability of Digital Platforms for Third-Party Content

July 23, 2025

In a landmark decision rendered on June 26, 2025, the Federal Supreme Court (STF) established new criteria for the civil liability of social media platforms for content generated by third parties, partially reinterpreting Article 19 of the Brazilian Internet Bill of Rights (Marco Civil da Internet – Law No. 12,965/2014). The decision was issued in Extraordinary Appeals No. 1,037,396 (Theme 987) and No. 1,057,258 (Theme 533), both recognized as having general repercussion.

Summary of the Decision

By majority vote, the Court declared partially unconstitutional the requirement of a specific court order as a precondition for holding platforms liable for damages resulting from user-generated content. The new interpretation marks a jurisprudential shift and will remain in force until the National Congress enacts specific legislation on the matter.

New Parameters for Liability

The thesis established by the STF sets forth three main categories of liability:

  1. Crimes Against Honor
    • Platforms shall only be held liable if they fail to comply with a specific court order to remove the content.
    • However, platforms remain free to remove content preventively upon receipt of an extrajudicial notice.
    • Reposts of content already deemed offensive by a judicial decision must be removed immediately upon mere notification, without the need for a new court order.
  2. Serious Crimes
    (e.g., terrorism, racism, homophobia, crimes against children, coup attempts, among others)
    • Platforms may be held liable regardless of a court order if they fail to act promptly and effectively to remove such content.
    • Liability arises in cases of systemic failure, understood as the absence of adequate prevention and control measures.
  3. General Crimes and Unlawful Acts
    • Platforms may be held civilly liable if, after being notified by the affected party, they fail to remove the reported content.
    • This rule also applies to fake profiles or accounts used to commit unlawful acts.

Self-Regulation Guidelines

The STF further ruled that application providers must implement mechanisms for:

• Effective extrajudicial notifications;
• Due process in content moderation;
• Transparency through the publication of annual reports;
• Permanent and accessible user service channels.

These measures must be incorporated into each platform’s internal self-regulation policy and must observe the principles of proportionality, freedom of expression, and the protection of honor and dignity.

The decision has binding effect on all lower courts and will serve as the legal framework for platform operations until the National Congress enacts new legislation on the subject.
