A Cure Worse Than the Scroll

Satya Marar, A Cure Worse Than the Scroll, Truth on the Market (April 06, 2026), https://truthonthemarket.com/2026/04/06/a-cure-worse-than-the-scroll/

The App Store Accountability Act (ASAA) promises to protect children online—but it would do so by imposing sweeping mandates on everyone else.

Panic over doomscrolling, brainrot, gambling, pornography, online predators, and minors’ interactions with AI chatbots has fueled a familiar policy response: calls to age-gate the internet, social media, and apps.

The ASAA fits squarely in that trend. The bill has cleared the U.S. House Energy and Commerce Committee and now heads to the full House for consideration. It mirrors several state-level proposals. The ASAA would require Google Play and Apple’s App Store to verify every user’s age through “commercially reasonable methods,” and would bar minors unless they have parental consent.

Some of this will sound uncontroversial. Lawmakers have long required age verification to buy cigarettes or alcohol, or to enter casinos and strip clubs. More than 20 states now impose similar requirements on pornographic websites. Few object, in principle, to reasonable safeguards that protect minors from harmful or inappropriate content.

The ASAA goes much further.

It would expose all app-store users to new privacy and security risks, while saddling even developers of age-appropriate content with compliance burdens. The likely result: less competition and less innovation in the app economy. At the same time, the bill would do little to empower parents or meaningfully improve protections for children.

Put simply, the costs outweigh the benefits.

From Parental Choice to One-Size-Fits-All Mandates

As a baseline, society assigns parents primary responsibility for shielding children from harmful or inappropriate content. That principle hasn’t changed since earlier debates about excessive television use. Monitoring a child’s online activity can be difficult, and kids often push back—arguing that limits on messaging or social media will cut them off from their peers.

Even so, parents already have tools that offer more control than television ever did.

Apple, for example, gives parents a robust set of controls at the device level. On an iPhone, they can limit screen time, restrict website access, require approval for app downloads, block explicit content, and confine messaging to approved contacts. Parents can set these controls during device setup and adjust them at any time. Google offers similar functionality. Beyond that, a growing ecosystem of standalone parental-control tools exists, alongside controls built into apps, social media platforms, video games, internet service providers, and websites.

These tools already deliver as much—or more—control than the ASAA. The key difference lies in who decides. Existing tools rely on parental opt-in. The ASAA would replace that flexible, choice-based system with a one-size-fits-all regime that requires parental approval at every stage.

It would also come with real costs. The ASAA would force app stores and developers to collect and process sensitive data on both minors and adults. That shift would raise compliance costs and expand the attack surface for hackers and identity thieves—putting all users at greater risk.

The Hidden Costs of ‘Just Verify Age’

The ASAA would require app stores to sort every user into one of four age brackets—“young child” (under 13), “child” (13-15), “teenager” (16-17), or “adult” (18+)—using a method “reasonably designed to ensure accuracy.” Get it wrong by even a year, and firms face enforcement by the Federal Trade Commission (FTC), along with steep financial penalties.

Proponents argue that tools like biometrics, blockchain, and AI make age verification both accurate and privacy-preserving. Experts dispute that claim. It also overlooks the litigation risk firms face when those systems fail. Others suggest payment systems like Apple Pay as a workaround. That, too, falls short. Not everyone uses these services. Even those who do must still submit sensitive documents to verify their age.

The result: millions of Americans would face a choice between handing over personal data or losing access to apps altogether. That includes the roughly 19% of Americans who lack a credit card.

Compliance would force app stores to collect, process, and store large volumes of sensitive data. That shift would divert resources away from product development and meaningful security, privacy, and anti-fraud measures. It would also create centralized troves of personal data—prime targets for hackers and foreign adversaries.

These risks are not hypothetical. As Sarah Forland and Prem Trivedi of the Open Technology Institute note:

In July 2025, hackers exposed 13,000 selfies and photo IDs used to verify account holders from the Tea Dating Advice app. In October, Discord found that 70,000 users may have had their government-ID photos exposed; they were submitted as part of the platform’s age-gating process.

App developers would feel the impact even more acutely. Large firms can spread fixed compliance costs across massive user bases. Smaller developers cannot. More than 90% of apps on Apple’s App Store came from small businesses as of 2022.

The ASAA does not require developers to collect age-verification data directly. But it does require app stores to pass along age-category “flags” for developers to store and process—whether they want them or not. Many developers lack the infrastructure to handle that data securely, especially if their apps serve general audiences and were never designed with age segmentation in mind.

The burden grows for apps with younger users. If a user falls under 13, developers must comply with the Children’s Online Privacy Protection Act (COPPA). That means building compliance systems, modifying algorithms, and collecting identifying information linking children to consenting parents—or excluding younger users altogether.

That outcome would sweep in even low-risk apps—calculators, fitness trackers, weather tools. It would not matter whether parents approve of their children using them.

Noncompliance carries legal, financial, and reputational risk. Compliance carries its own costs. Trusted Future (2025) estimates that a similar Texas law could impose up to $80,000 in compliance costs per small business. Evidence from the European Union’s General Data Protection Regulation (GDPR) points in the same direction: disproportionate burdens on small firms, coupled with high demand for scarce—and expensive—compliance professionals.

The ASAA would also shrink the app economy’s user base. Friction from age verification—time, paperwork, and data disclosure—will deter users. Fewer users mean less data. Less data means weaker products. Courts in antitrust cases have recognized that limited access to user data can prevent firms from reaching the “minimum efficient scale” needed to compete, even with superior algorithms.

Some compliance burdens may make sense for apps aimed at children or those hosting age-inappropriate content. The ASAA does not draw those lines. It applies across the board—even to low-risk apps.

That breadth raises constitutional concerns. A federal judge, reviewing a similar Texas law, likened it to requiring “every bookstore to verify the age of every customer at the door and, for minors, [requiring] parental consent before the child or teen could enter and again when they try to purchase a book.” The court blocked the law as an “exceedingly overbroad” restriction on protected speech that failed the “least restrictive means” test.

The ASAA would likely face similar First Amendment challenges. In the meantime, businesses would still bear the costs—possibly for years—while litigation runs its course.

All of this, despite existing tools that already give parents meaningful control. A narrower, better-targeted law could address genuine risks to children without imposing sweeping costs or raising serious constitutional problems.

A Bad Bargain for Everyone

Every law involves tradeoffs. Minimum-wage laws, for example, can reduce employment, business formation, and overall economic activity, while raising prices for goods and services. Many jurisdictions accept those costs in exchange for a guaranteed wage floor.

The ASAA presents a far less compelling bargain.

Its costs—heightened data-security risks, reduced innovation, and heavier compliance burdens—would fall on parents, children, app stores, and developers alike. The promised benefits, by contrast, remain thin. The bill would not meaningfully strengthen parental control or materially improve protections for children.

That is not a tradeoff worth making.