
A recent controversy over pornographic apps being downloaded to iPhones in the European Union illustrates a fundamental tension in the EU’s Digital Markets Act (DMA): the conflict between mandated openness and established user-safety expectations.
While the DMA aims to promote competition and user choice, the case of the pornographic-video app Hot Tub, distributed to iPhone users through the third-party app store AltStore PAL, demonstrates how improper application of the regulation can compromise the user-safety mechanisms upon which consumers have come to rely.
Beyond a Narrow View of Security
Under the DMA, Apple was legally required to allow the app’s distribution, a scenario that the company noted undermines “consumer trust and confidence in our ecosystem that we have worked for more than a decade to make the best in the world.”
In other words, this isn’t just about narrowly understood technical security measures; the DMA does allow Apple to check apps distributed through third parties for malware and device compatibility. But in this case, the alternative app store’s failure to employ age verification may present other kinds of risks, especially for younger users.
Moreover, the porn-app developers are exploiting the fact that Apple was legally forced by the DMA to technically “approve” (notarize) the app in question. They have marketed the app as “Apple approved,” as if it met the same standards to which iOS users had become accustomed prior to the DMA.
The deeper issue goes beyond any single app or specific safety feature. The fundamental problem is that the DMA is being interpreted by its enforcers in a way that disrupts established user-safety expectations and habits.
Changing User Habits Is Hard
When users develop the habit of relying on certain safety features within an ecosystem, whether it’s Apple’s iOS or Google’s Play Store, these habits become deeply ingrained in their interactions with technology. Users learn to trust that apps from these sources meet certain safety and content standards. This trust isn’t just about malware protection; it encompasses expectations about content filtering, age restrictions, and privacy protections.
Opening these ecosystems in the ways the DMA requires means users must actively change their safety-related habits. This is particularly challenging because:
- Habits are resistant to change, even when users are made aware of new risks;
- The cognitive load of constantly evaluating the safety of new app sources is substantial;
- Users may not fully appreciate the safety implications of using alternative app stores; and
- Children and less tech-savvy users may be particularly vulnerable to these changes.
As I’ve long argued, to genuinely avoid diminishing user safety, the DMA’s legal mandates must take into account not just narrowly construed technical security standards, but—first and foremost—the actual experiences of regular users. Without taking this seriously, regulators risk violating the rules they purport to apply.
The Trust Ecosystem
Both Apple and Google have invested heavily in building user trust in their respective app-store ecosystems. Apple’s more restrictive iOS environment and Google’s comparatively open Play Store represent different approaches to the same goal: creating spaces where users can confidently download and use apps without constantly worrying about safety and privacy risks.
This trust wasn’t built overnight. It resulted from years of consistent policies, user education, and careful curation. The DMA’s requirements threaten to erode this foundation of trust, not through any single dramatic change, but through the cumulative effect of multiple small compromises to the safety architecture that these platforms have built.
A Path Forward: Interpreting the DMA with User Safety in Mind
As I have highlighted in previous publications (Stanford, Bocconi), the DMA includes provisions that acknowledge the importance of privacy and security. These considerations, however, need to be interpreted broadly enough to encompass the full spectrum of user-safety concerns, including:
- the preservation of established safety mechanisms;
- the protection of user habits that contribute to safe technology use;
- the recognition that technical security measures alone are insufficient; and
- the acknowledgment that different user groups have varying abilities to adapt to new safety requirements.
The DMA should not be interpreted in a way that forces gatekeepers to significantly diminish user safety. This means allowing platforms to maintain reasonable safety standards and content restrictions, even as they open their ecosystems to alternative distribution channels.
The Hot Tub app controversy serves as an early warning of the challenges ahead in balancing the DMA’s competition goals with user safety. As implementation continues, regulators must recognize that true user safety encompasses more than just technical security measures. It also includes the preservation of trusted environments where users can confidently navigate digital spaces. Any interpretation of the DMA that undermines these environments risks doing more harm than good to the very users it aims to protect.