Social Network
Section 230
Section 230 provides limited federal immunity to providers and users of interactive computer services. It shields online platforms from being sued over content posted by their users, including claims of defamation, misrepresentation, or emotional distress. Platforms may therefore remove content they deem objectionable, or leave it up, without fear of being held responsible for that content. The law also protects platforms from liability for blocking or filtering user content, which allows greater freedom of speech to remain on their platforms. In summary: without Section 230, platforms that did not moderate would risk being sued over harmful posts, while platforms that DID moderate would risk being treated as publishers liable for whatever they left up; Section 230 resolves this dilemma by protecting platforms in both cases.
Section 230 thus gives platforms considerable leeway over user content. However, it also shelters recommendation algorithms that elevate problematic content, promoted not because of genuine user interest but because it drives more clicks and revenue.
Section 230(c)(2) further provides "Good Samaritan" protection from civil liability for operators of interactive computer services in the voluntary good faith removal or moderation of third-party material the operator "considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." The reform proposed here would withdraw this protection specifically from content a platform's algorithm actively pushes, since in that case the platform itself is promoting the material for extra appeal and revenue.
These concerns could be addressed by excluding any algorithmically elevated content from the protection of Section 230.