Social media content moderation by technology platforms was one of the “hot” legal topics in 2023-2024. Three states, California, Texas, and Florida, passed statutes that either required more content moderation (California) or limited such moderation (Texas and Florida). All three statutes, in one way or another, demanded more transparency and information from the technology platforms about their content moderation methods and results. Politically, left-leaning California favored more moderation to curb the spread of mis- and disinformation, while the right-leaning states opposed content moderation as a form of censorship of free speech.
Legally, the Texas and Florida statutes reached the U.S. Supreme Court first. In Moody v. NetChoice, LLC, the Supreme Court did not reach the specific legal merits of the cases, but it DID hold that technology platforms are engaged in “speech” when they engage in social media content moderation. Shortly after the NetChoice opinion was issued, the California statute reached the Ninth Circuit Court of Appeals in X Corp. v. Bonta, Case No. 24-271 (9th Cir. 2024). In that case, the panel held that the California statute imposed improper burdens and restrictions on free speech activity. As a result, most of the California statute was deemed unconstitutional.
The combination of these two court decisions has clearly dampened lawmakers’ enthusiasm for passing statutes that require or limit social media content moderation. As far as we can tell, there are no current efforts to enact similar laws. The Texas and Florida cases are still pending, but they are receiving little notice or attention. The Texas case is scheduled for trial in the summer of 2025.
In addition to the legal actions, several other developments have dampened enthusiasm. The first is, of course, the 2024 election results. Among other things, the election has given Elon Musk and his X platform (formerly Twitter) heightened prominence, making a successful challenge to X’s content moderation policies more difficult. Interestingly, the legal challenges to X’s reduced moderation are now coming from abroad, from Brazil and the European Union, for example. The election results have also paused the sale and/or banning of TikTok, which has been the focus of ire from the political left for insufficient content moderation. Further, another large social media platform, Facebook, has pulled back on its content moderation in the wake of the election. Together, these developments have created a ripple effect in which there is significantly less content moderation than even a couple of years ago.
That being said, the issue remains important. However, the content moderation debate seems to have moved out of the legal space and back into the “persuasion space,” where the arguments are being directed at the technology platforms themselves.
Contact the Internet Attorneys at Revision Legal
For more information, contact the experienced Internet Lawyers at Revision Legal. You can contact us through the form on this page or call (855) 473-8474.