Rapid Fire
Safe Harbour Provision for Social Media
- 03 Sep 2024
- 2 min read
French police recently arrested Telegram CEO Pavel Durov near Paris (he was later released on conditional bail), marking a significant shift in tech accountability.
- This action highlights growing scrutiny over tech executives regarding their platforms' role in illicit activities.
- Charges Against Durov: Telegram is alleged to have enabled the distribution of content related to drug trafficking, child pornography, violent propaganda, and organised crime.
- Authorities also accused Telegram of failing to cooperate with law enforcement efforts to moderate and control objectionable content on the platform.
- Safe Harbour Rules: Under safe harbour provisions, social media platforms are not held legally liable for user-generated content, provided they act to remove or address flagged objectionable content. This supports free speech while ensuring platforms are not burdened with preemptive content control.
- United States: Safe harbour protection is provided under Section 230 of the Communications Decency Act, which shields platforms from being held liable for user content.
- India: Section 79 of the Information Technology Act, 2000 offers similar protection.
- The Information Technology Rules, 2021, require social media companies with over 5 million users to appoint a Chief Compliance Officer, who can be held criminally liable for non-compliance with takedown requests or other regulations.
Read More: Information Technology Act’s Section 69A, Digital Personal Data Protection Bill 2022, New IT Rules 2021