Arkansas tried, once again, to reshape how social media works for young people. And once again, a federal court stepped in.
In NetChoice LLC v. Griffin, a judge in the Western District of Arkansas issued a preliminary injunction against Arkansas Act 900 of 2025, concluding that major parts of the law are likely unconstitutional. The case is part of a growing national fight over whether states can force online platforms to redesign their services in the name of child protection, even when those redesigns affect speech, editorial judgment, and access to lawful information.
What Act 900 tried to do
Act 900 was Arkansas’s attempt to revive an earlier social media law that had already been blocked in court. Rather than starting from scratch, lawmakers amended the old framework and added new requirements aimed at three broad goals:
- Stop platforms from using features that could trigger “addiction or compulsive behaviors” in minors.
- Impose default settings that reduce notifications overnight and push minors into more restrictive safety and privacy settings.
- Require a parent-facing “dashboard” to monitor a child’s “use habits” and restrict access, including to “logical portions” of a platform.
That all sounds straightforward until you look at how the law defines the people it claims to protect and what compliance would actually require of covered services.
A basic drafting problem: Who is a “user”?
One of the court’s most practical concerns was also one of the most revealing: Act 900 used multiple definitions for a person’s relationship to a platform, and those definitions did not line up cleanly with the obligations the law created.
In plain terms, some provisions applied to minors who have accounts, while other provisions could apply to minors who simply visit a site. That matters because compliance looks very different depending on whether a platform is interacting with a logged-in account holder or a passing visitor.
This definitional mess set the stage for the court’s bigger constitutional concerns: vagueness, overbreadth, and lack of tailoring.
The “addictive practices” rule: too vague and too strict
Act 900 required platforms to “ensure” they do not engage in practices that “evoke any addiction or compulsive behaviors” in Arkansas minors. The law listed examples such as notifications, recommended content, an “artificial sense of accomplishment,” and engagement with bots that appear human.
The court found this provision likely unenforceable as written because it does not give clear notice of what is prohibited and invites arbitrary enforcement. The problem was not just that the language was broad. It was also that liability could attach even if a platform could not reasonably predict that a feature would affect a particular child.
Compounding that, the law paired the restriction with recurring audit requirements. Those audits were not limited to diagnosing “addiction” itself. They extended to “addiction-driven behavior,” potentially including behavior that happens off-platform. The court treated that expansion as another sign the law was not carefully bounded.
Nighttime notification defaults: a state interest, but a poor fit
Act 900 also attempted to curb late-night notifications for minors by requiring platforms to suppress non-safety notifications by default between 10:00 p.m. CST and 6:00 a.m. CST, while allowing a parent or guardian to modify the setting.
The judge accepted that Arkansas has a legitimate interest in ensuring minors get sufficient sleep and treated the rule as akin to a time, place, and manner restriction rather than an outright content ban. But the court still found serious tailoring problems.
Because the default could apply not only to account holders but also to visitors, the court suggested compliance might effectively pressure platforms to silence notifications for everyone in Arkansas during those hours unless the platform had verified the user as an adult account holder.
And the court questioned whether the law would actually advance the sleep interest, pointing out that parents already have a no-tech option available: taking devices away at night. In one of the opinion’s most quotable lines, the judge concluded that this “is not how one addresses a serious social problem.”
Privacy and safety defaults: speech burdens without a clear payoff
Act 900 also required that minors’ privacy and safety settings default to the “most protective level” the platform offers.
Here, the court’s First Amendment lens sharpened. The judge reasoned that many privacy and safety settings change how speech flows: who can speak to a minor, who can see a minor’s posts, and how a platform can distribute that speech. Even when the burden seems “small,” the Constitution still asks whether the state is imposing that burden in a way that is meaningfully connected to the asserted goal.
A particularly awkward feature of the law was that it did not clearly assign control of the default settings to parents. The court read the law as allowing anyone, including the child, to opt out of the most protective settings. That undercut Arkansas’s justification because it turned a purported child-protection mandate into something closer to a gentle suggestion.
The court also stressed the breadth of the law’s definition of “social media platform,” which could sweep in services not commonly associated with the most serious exploitation concerns. A wide net can be constitutionally dangerous because it burdens a great deal of lawful speech to address a narrower set of harms.
That concern culminated in the opinion’s sharp warning: Arkansas cannot sentence speech on the internet to death by a thousand cuts.
The parent “dashboard” requirement: burdensome by design
Act 900 required platforms to build an online dashboard so parents could view and understand a child’s use habits and restrict access, including to “logical portions” of the platform.
The court did not need to settle every doctrinal debate about compelled disclosures to see the practical problem: the provision would require enormous data collection and identity-verification work, especially because it applied in a confusing way to minors who were not even account holders.
To make the dashboard functional for mere visitors, a platform would need to gather and organize detailed behavioral information about a child, determine that the visitor is a minor, identify and authenticate that child’s parent, and potentially track the child across devices. The court concluded that this kind of buildout is unduly burdensome and likely to chill lawful speech by discouraging platforms from offering features to users who are not logged in.
Why this fight keeps coming back
If you have been following these disputes across the country, the pattern is familiar. States describe real problems: compulsive use, late-night screen time, privacy risks, and online exploitation. Then they write laws that require platforms to treat minors as a separate class of users and redesign speech-related features accordingly.
That is why some observers have started describing these statutes as “segregate-and-suppress” laws. The “segregate” part is the push to identify minors and treat them differently. The “suppress” part is the reality that many of the demanded design changes reduce speech, reshape recommendations, limit sharing, and restrict access to lawful content.
This injunction does not mean states are powerless to protect children online. It does mean that when a law operates by pressuring platforms to silence, limit, or restructure speech, it must be written with clarity and narrowly tailored to a demonstrated benefit. Broad mandates with fuzzy standards, strict liability triggers, or sweeping definitions are especially vulnerable under the First Amendment.
What happens next
A preliminary injunction is not the final word, but it is an important signal: the court believes NetChoice is likely to succeed on the merits, and that enforcing Act 900 while the case proceeds would cause constitutional harm.
Arkansas could appeal, revise the statute again, or both. But the deeper lesson from the opinion is that repeated iterations do not automatically solve structural problems like vague “addiction” standards, hard-to-administer minor versus adult distinctions, and parental-control schemes that require extensive identity verification.
For readers trying to understand the constitutional stakes, this case is a reminder of something basic but easy to overlook: when the government regulates the architecture of communication, it is often regulating speech itself, even if the statute is marketed as a consumer-safety measure.