California’s new law governing social media content moderation policies has gone into effect, and covered social media platforms should be producing their first reports in compliance with the legislation. The legislation is commonly called “Assembly Bill 587” (“AB 587”) and was added to California’s Business and Professions Code. AB 587 was not given a stand-alone statute name but was described by lawmakers as legislation imposing content moderation requirements for internet terms of service agreements. The gist of AB 587 is that internet and social media platforms with more than $100 million in gross annual revenue are required to publicly report various information about how the platform moderates content uploaded by users. Covered companies must submit semiannual “terms of service reports” to the California Attorney General.
Note that New York, Florida, and Texas have also enacted legislation similar to California’s AB 587. Unlike those statutes, however, California’s social media content moderation statute does not ban any practices or provisions in a terms of service agreement. Rather, the purpose of AB 587 is to mandate notices and reports. The stated reason for the legislation was to create transparency with respect to how social media companies moderate user comments and content. One issue, for example, is the use of automated systems to moderate comments and content. AB 587 requires companies to disclose information about such systems and about how and when human intervention or review is used.
What AB 587 Requires
Section 22676 of the Business and Professions Code, added by AB 587, mandates that social media companies post terms of service “in a manner reasonably designed to inform all users of the social media platform of the existence and contents of the terms of service.” Further, subsection (b) requires that such terms of service include the following:
- Contact information for the purpose of allowing users to ask the social media company questions about the terms of service
- A description of the process that users must follow to flag content, groups, or other users that they believe violate the terms of service
- A description of the social media company’s commitments on response and resolution time
- A list of potential actions the social media company may take against an item of content or a user, including, but not limited to, removal, demonetization, deprioritization, or banning
What Must Be in the “Terms of Service Reports”
Each semiannual terms of service report must include the social media company’s current terms of service agreement and a detailed statement of any changes made since the previous report was filed.
Further, each report must state whether the current version of the terms of service defines each of the following types of content and, if so, provide those definitions, including any subcategories. The types of content are:
- Hate speech or racism
- Extremism or radicalization
- Disinformation or misinformation
- Harassment
- Foreign political interference
Further, AB 587 requires that the terms of service report provide a detailed description of content moderation practices used by the social media company. Descriptions should include the following, at a minimum:
- Any existing policies intended to address the categories of content listed above
- How automated content moderation systems enforce the terms of service of the social media platform, and when these systems involve human review
- Method of responding to user reports of violations
- How the social media company would remove individual pieces of content, users, or groups that violate the terms of service, or would take broader action against individual users or groups of users that violate the terms of service
Finally, AB 587 requires a long list of statistical information related to content moderation. For example, the terms of service report must provide information on content that was flagged by the social media company, including:
- The total number of flagged and actioned items of content
- The total number of actioned items of content that were removed, demonetized, or deprioritized by the social media company
- The number of times actioned items of content were viewed and shared by users
- The number of users who viewed the content before it was actioned
- The number of times users appealed social media company actions
- The number of reversals of social media company actions on appeal
- How the content was flagged, such as by company employees, artificial intelligence software, or community moderators
- How the content was actioned, such as by company employees, artificial intelligence software, or community moderators
Violations of AB 587 are to be enforced by the California Attorney General’s Office, which may seek civil penalties of up to $15,000 per violation per day.
Contact the Internet Law and Social Media Attorneys at Revision Legal
For more information, contact the experienced Internet Law and Social Media Lawyers at Revision Legal. You can contact us through the form on this page or call (855) 473-8474.