Italy’s Google and Apple Decisions: Regulatory Paternalism and Overenforcement

The Autorità Garante della Concorrenza e del Mercato (AGCM), Italy’s competition and consumer-protection watchdog, on Nov. 25 handed down fines against Google and Apple of €10 million each—the maximum penalty contemplated by the law—for alleged unfair commercial practices. Ultimately, the two decisions stand as textbook examples of why regulators should, wherever possible, strongly defer to consumer preferences rather than substitute their own.

The Alleged Infringements

The AGCM has brought two practically identical cases built around two interrelated claims. The first is that the companies have not properly informed users that the data they consent to share will be used for commercial purposes. The second is that, by requiring users to opt out if they do not want to consent to data sharing, the companies unduly restrict users’ freedom of choice and effectively constrain them to accept terms they would not otherwise have accepted.

According to the AGCM, Apple and Google’s behavior infringes Articles 20, 21, 22, 24, and 25 of the Italian Consumer Code. The first three provisions prohibit misleading business practices and are typically applied to conduct such as lying, fraud, the sale of unsafe products, or the deliberate omission or misrepresentation of facts in ways that would deceive the average user. The conduct described in the first claim would allegedly fall into this category.

The last two provisions, by contrast, refer to aggressive business practices such as coercion, blackmail, verbal threats, and even physical harassment capable of “limiting the freedom of choice of users.” The conduct described in the second claim would fall here.

The First Claim

The AGCM’s first claim does not dispute that the companies informed users about the commercial use of their data. Instead, the authority argues that the companies are not sufficiently transparent in how they inform users.

Let’s start with Google. Upon creating a Google ID, users can click to view the “Privacy and Terms” disclosure, which details the types of data that Google processes and the reasons that it does so. As Figure 1 below demonstrates, the company explains that it processes data: “to publish personalized ads, based on your account settings, on Google services as well as on other partner sites and apps” (translation of the Italian text highlighted in the first red rectangle). Below, under the “data combination” heading, the user is further informed that: “in accordance with the settings of your account, we show you personalized ads based on the information gathered from your combined activity on Google and YouTube” (the section in the second red rectangle).

Figure 1: AGCM Google decision, p. 7

After creating a Google ID, a pop-up once again reminds the user that “this Google account is configured to include the personalization function, which provides tips and personalized ads based on the information saved on your account. [And that] you can select ‘other options’ to change the personalization settings as well as the information saved in your account.”

The AGCM sees two problems with this. First, the user must click on “Privacy and Terms” to be told what Google does with their data and why; viewing this information is optional, not an unavoidable step in the registration process. Second, the AGCM finds it unacceptable that the commercial use of data is listed together with other, non-commercial uses, such as improved quality, security, etc. (the other items listed in Figure 1). The allegation is that this leads to confusion and makes it less likely that users will notice the commercial aspects of data usage.

A similar argument is made in the Apple decision, where the AGCM similarly contends that users are not properly informed that their data may be used for commercial purposes. As shown in Figure 2, upon creating an Apple ID, users are asked to consent to receive “communications” (notifications, tips, and updates on Apple products, services, and software) and “Apps, music, TV, and other” (latest releases, exclusive content, special offers, tips on apps, music, films, TV programs, books, podcasts, Apple Pay and others).

Figure 2: AGCM Apple decision, p. 8

If users click on “see how your data is managed”—located just above the “Continue” button, as shown in Figure 2—they are taken to another page, where they are given more detailed information about what data Apple collects and how it is used. Apple discloses that it may employ user data to send communications and marketing e-mails about new products and services. Categories are clearly delineated and users are reminded that, if they wish to change their marketing email preferences, they can do so by going to appleid.apple.com. The word “data” is used 40 times and the taxonomy of the kind of data gathered by Apple is truly comprehensive. See for yourself.

The App Store, Apple Book Store, and iTunes Store have similar clickable options (“see how your data is managed”) that lead to pages with detailed information about how Apple uses data. This includes unambiguous references to so-called “commercial use” (e.g., “Apple uses information on your purchases, downloads, and other activities to send you tailored ads and notifications related to Apple marketing campaigns.”)

But these disclosures failed to convince the AGCM that users are sufficiently aware that their data may be used for commercial purposes. The two reasons cited in the opinion mirror those in the Google decision. First, the authority claims that the design of the “see how your data is managed” option does not “induce the user to click on it” (see the marked area in Figure 2). Further, it notes that accessing the “Apple ID Privacy” page requires a “voluntary and eventual [i.e., hypothetical]” action by the user. According to the AGCM, this leads to a situation in which “the average user” is not “directly and intuitively” aware of the magnitude of data used for commercial purposes, and is instead led to believe that data is shared to improve the functionality of the Apple product and the Apple ecosystem.

The Second Claim

The AGCM’s second claim contends that the opt-out mechanism used by both Apple and Google “limits and conditions” users’ freedom of choice by nudging them toward the companies’ preferred option—i.e., granting the widest possible consent to process data for commercial use.

In Google’s case, the AGCM first notes that, when creating a Google ID, a user must take an additional discretionary step before they can opt out of data sharing. This refers to the mechanism by which a user must click the words “OTHER OPTIONS,” in bright blue capitalized font, as shown in Figure 3 below (first blue rectangle, upper right corner).

Figure 3: AGCM Google decision, p. 22

The AGCM’s complaint here is that it is insufficient to grant users merely the possibility of opting out, as Google does. Rather, the authority contends, users must be explicitly asked whether they wish to share their data. As in the first claim, the AGCM holds that questions relating to the commercial use of data must be woven in as unavoidable steps in the registration process.

The AGCM also posits that the opt-out mechanism itself (in the lower left corner of Figure 3) “restricts and conditions” users’ freedom of choice by preventing them from “expressly and preventively” manifesting their real preferences. The contention is that, if presented with an opt-in checkbox, users would choose differently—and thus, from the authority’s point of view, choose correctly. Indeed, from the fact that the vast majority of users (80-100%, according to the authority) have not opted out of data sharing, the AGCM concludes that “a significant number of subscribers have been induced to make a commercial decision without being aware of it.”

A similar argument is made in the Apple decision. Here, the issue is the supposed difficulty of the opt-out mechanism, which the AGCM describes as “intricate and non-immediate.” If a user wishes to opt out of data sharing, he or she would not only have to “uncheck” the checkboxes displayed in Figure 2, but also do the same in the Apple Store with respect to their preferences for other individual Apple products. This “intricate” process generally involves two to three steps. For instance, to opt out of “personalized tips,” a user must first go to Settings, then select their name, then multimedia files, and then “deactivate personalized tips.”

According to the AGCM, the registration process is set up in such a way that users’ consent is not informed, free, or specific. It concludes:

The consumer, entangled in this system of which he is not aware, is conditioned in his choices, undergoing the transfer of his data, which the professional can then use for his own promotional purposes.

The AGCM’s decisions fail on three fronts: they are speculative, paternalistic, and premised on the Nirvana fallacy. They are also underpinned by an extremely uncharitable conception of what the “average user” knows and understands.

Epistemic Modesty Under Uncertainty

The AGCM makes far-reaching and speculative assumptions about user behavior based on incomplete knowledge. For instance, both Google and Apple’s registration processes make clear that they gather users’ data for advertising purposes—which, especially in the relevant context, cannot be interpreted by a user as anything but “commercial” (even under the AGCM’s pessimistic assumptions about the “average user”). It’s true that the disclosure requires the user to click “see how your data is managed” (Apple) or “Privacy and Terms” (Google). But it’s not at all clear that this is less transparent than, say, the obligatory scroll-text that most users will ignore before blindly clicking to accept.

For example, in registering for a Blizzard account (a gaming service), users are forced to scroll through the company’s lengthy terms and conditions, with information on the “commercial use” of data buried somewhere in a seven-page document of legalese. Does it really follow that Blizzard users are better informed about the commercial use of their data? I don’t think so.

Rather than the obligatory scroll-text, the AGCM may have in mind some sort of pop-up screen. But would this mean that companies should also include separate, obligatory pop-ups for every other relevant aspect of their terms and conditions? This would presumably take us back to square one, as the AGCM’s complaint was that Google amalgamated commercial and non-commercial uses of data under the same title. Perhaps the pop-up for the commercial use of data would have to be made more conspicuous. This would presumably require a normative hierarchy of the companies’ terms and conditions, listed in order of relevance for users. That would raise other thorny questions. For instance, should information about the commercial use of data be more prominently displayed than information about safety and security?

A reasonable alternative—especially under conditions of uncertainty—would be to leave Google and Apple alone to determine the best way to inform consumers, because nobody reads the terms and conditions anyway, no matter how they are presented. Moreover, the AGCM offers no evidence to support its contention that companies’ opt-out mechanisms lead more users to share their data than would freely choose to do so.

Whose Preferences?

The AGCM also replaces revealed user preferences with its own view of what those preferences should be. For instance, the AGCM doesn’t explain why opting to share data for commercial purposes would be, in principle, a bad thing. There are a number of plausible and legitimate explanations for why a user would opt for more generous data-sharing arrangements: they may believe that data sharing will improve their experience; may wish to receive tailored ads rather than generic ones; or may simply value a company’s product and see data sharing as a fair exchange. None of these explanations—or, indeed, any others—are ever contemplated in the AGCM decision.

Assuming that opt-outs, facultative terms-and-conditions screens, and two-to-three-step procedures to change one’s preferences truncate users’ “freedom of choice” is paternalistic and divorced from the reality of the average person, including the average Italian.

Ideal or Illegal?

At the heart of the AGCM decisions is the notion that it is proper to punish market actors wherever the real doesn’t match a regulator’s vision of the ideal—commonly known as “the Nirvana fallacy.” When the AGCM claims that Apple and Google do not properly disclose the commercial use of user data, or that the offered opt-out mechanism is opaque or manipulative, the question is: compared to what? There will always be theoretically “better” ways of granting users the choice to opt out of sharing their data. The test should not be whether a company falls short of some ideal imagined practice, but whether the existing mechanism actually deceives users.

There is nothing in the AGCM’s decisions to suggest that it does. Depending on how precipitously one lowers the bar for what the “average user” would understand, just about any intervention might be justified, in principle. But to justify the AGCM’s intervention in this case requires stretching the plausible ignorance of the average user to its absolute theoretical limits.

Conclusion

Even if a court were to buy the AGCM’s impossibly low view of the “average user” and grant the first claim—which would be unfortunate, but plausible—not even the most liberal reading of Articles 24 and 25 can support the view that “overly complex, non-immediate” opt-outs, as interpreted by the AGCM, limit users’ freedom of choice in any way comparable to the type of conduct described in those provisions (coercion, blackmail, verbal threats, etc.).

The AGCM decisions are shot through with unsubstantiated assumptions about users’ habits and preferences, and risk imposing undue burdens not only on the companies, but on users themselves. With some luck, they will be struck down by a sensible judge. In the meantime, however, the trend of regulatory paternalism and overenforcement continues. Much like in the United States, where the Federal Trade Commission (FTC) has occasionally engaged in product-design decisions that substitute the commission’s own preferences for those of consumers, regulators around the world continue to think they know better than consumers what is in their best interests.