Missouri AG Demands Big Tech Give Social Media Users Their Choice of Algorithm
Missouri Attorney General Andrew Bailey announced on Tuesday a new regulation aimed at granting social media users greater control over the algorithms that curate their online experiences. The initiative would make Missouri the first state in the United States to adopt such a sweeping rule targeting Big Tech’s control over social media content.
Bailey took to X to unveil his office’s plan, which centers on a “rule requiring Big Tech to guarantee algorithmic choice for social media users.”
“Americans should control the content they consume on social media, not Big Tech oligarchs. Let the best algorithm win,” Bailey wrote on X.
Bailey’s regulation mandates that social media platforms operating within Missouri provide users with the ability to select from various content moderation algorithms, rather than relying solely on proprietary systems controlled by the platforms themselves.
Fighting The Algorithm
In a detailed press release, Bailey elaborated on the necessity and mechanics of the new rule, saying that “social media companies are supposed to provide a space where users can share views, content and ideas.”
“Instead, Big Tech oligarchs have manipulated consumers’ social media feeds for their own purposes and exercised monopoly control over content moderation,” Bailey added. “I am invoking my authority under consumer protection law to ensure Missourians get to control the content they consume on social media.”
Under the Missouri Merchandising Practices Act, the regulation stipulates several key requirements for compliance. Social media platforms must implement a choice screen that appears during account activation and at least every six months thereafter, allowing users to select from competing content moderators.
Importantly, no option may be preselected by default, ensuring that users make an active choice. Additionally, the interface presenting these options must treat third-party moderators equally, without favoring the platform’s own moderation services.
When a user opts for a third-party moderator, the platform is required to grant interoperable access to necessary data, facilitating effective content moderation based on the user’s preference. Furthermore, except where explicitly authorized, platforms are prohibited from censoring or suppressing content if the chosen moderator permits its viewing. This ensures that users retain control over their content consumption without undue interference from the platform’s default moderation policies.
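The rule describes legal obligations rather than any technical interface, but the mechanics are easier to picture with a concrete example. The TypeScript sketch below is purely illustrative: every name in it (ModeratorProvider, ChoiceScreenState, applyModeration, and so on) is an assumption made for this article, not anything specified by the regulation or by any platform’s actual API. It shows a choice screen with no preselected option and a feed filter that removes only the posts a user’s chosen moderator disallows.

```typescript
// Illustrative sketch only: the Missouri rule imposes obligations, not APIs.
// All interfaces and functions below are hypothetical stand-ins.

// A third-party content moderator the user can pick on the choice screen.
interface ModeratorProvider {
  id: string;
  name: string;
  // Decide whether this moderator permits a given post.
  permits(post: Post): boolean;
}

interface Post {
  id: string;
  author: string;
  text: string;
}

// The choice screen shown at account activation and at least every six months.
// Note: selectedModeratorId starts undefined — the rule forbids a default.
interface ChoiceScreenState {
  shownAt: Date;
  options: ModeratorProvider[]; // platform and third-party options listed on equal terms
  selectedModeratorId?: string; // must be an active user decision
}

// Apply the user's chosen moderator to a feed. Per the rule's intent, the
// platform would not suppress content the chosen moderator permits
// (outside explicitly authorized exceptions).
function applyModeration(feed: Post[], moderator: ModeratorProvider): Post[] {
  return feed.filter((post) => moderator.permits(post));
}

// Example: a hypothetical third-party moderator that hides only posts
// matching its own blocklist, leaving everything else visible.
const exampleModerator: ModeratorProvider = {
  id: "example-mod",
  name: "Example Moderation Service",
  permits: (post) => !post.text.toLowerCase().includes("spam"),
};

const feed: Post[] = [
  { id: "1", author: "a", text: "Hello Missouri" },
  { id: "2", author: "b", text: "Buy spam now" },
];

console.log(applyModeration(feed, exampleModerator)); // keeps only post "1"
```

How the interoperable data access required for third-party moderators would actually be delivered is left open by the rule as described, so the sketch stops at the filtering step.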
This regulatory framework follows the Supreme Court’s decision last year in the NetChoice case, which examined the extent to which states may regulate platforms’ content moderation. By leveraging the Missouri Merchandising Practices Act, Bailey’s regulation seeks to institutionalize algorithmic transparency and user autonomy to a degree unprecedented in the United States.
Policy or Political Points
The announcement has drawn a mix of support and skepticism from legal experts and industry stakeholders. Rob Freund, a prominent attorney and commentator, mocked the rule’s premise on social media.
“Hey platforms, you don’t have the right to control what content you display to users. You are required by law to let an outside party control the content that you show,” he wrote.
He further criticized the approach, suggesting it is more about garnering publicity than enacting feasible policy:
“It’s frustrating to see lawmakers waste taxpayer money announcing legislation that has no chance of succeeding or ever being implemented, just so they can score cheap publicity wins via press releases. Then when the law doesn’t pass or is struck down, they can blame their opponents,” he added.
Should Missouri’s regulation prove successful, it could set a precedent for other states and potentially federal legislation aimed at curbing Big Tech’s influence over social media. Major platforms like Meta Platforms and X may need to adapt their operational frameworks to comply with the new requirements or face legal repercussions.
As part of the rule promulgation process, Bailey has announced that public comments will be solicited, and forums will be held to gather additional evidence concerning deceptive practices by social media companies. This participatory approach aims to refine the regulation and address any unforeseen challenges in its implementation.
“This is the first prong of a comprehensive offensive to protect free speech in 2025,” Bailey remarked. “Now that we have a presidential administration coming into office that will not silence disfavored speech, we’re turning our focus to corporate censorship,” he added, referring to the incoming administration of U.S. President-elect Donald Trump.
Information for this briefing was found via the sources mentioned. The author has no securities or affiliations related to this organization. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.