Truth on the Market

Who Owns “Your” Data? It’s Not So Simple

What kind of regulation? Treating digital platforms like public utilities won’t work, Petit argues, because the product is multidimensional and competition takes place on multiple margins (the larger theme of the book): “there is a plausible chance that increased competition in digital markets will lead to a race to the bottom, in which price competition (e.g., on ad markets) will be the winner, and non-price competition (e.g., on privacy) will be the loser.” Utilities regulation also provides incentives for rent-seeking by less efficient rivals. Retail regulation, aimed at protecting small firms, may end up helping incumbents instead by raising rivals’ costs.

Petit concludes that consumer protection regulation (such as Europe’s GDPR) is a better tool for guarding privacy and truth, though it poses challenges as well. More generally, he highlights the vast gulf between the economic analysis of privacy and speech and the increasingly loud calls for breaking up the big tech platforms, which would do little to alleviate these problems.

As in the rest of the book, Petit’s treatment of these complex issues is thoughtful, careful, and systematic. I have more fundamental problems with conventional antitrust remedies and think that consumer protection is problematic when applied to data services (even more so than in other cases). Inspired by this chapter, let me offer some additional thoughts on privacy and the nature of data which speak to regulation of digital platforms and services.

First, privacy, like information, is not an economic good. Just as we don’t buy and sell information per se but information goods (books, movies, communications infrastructure, consultants, training programs, etc.), we likewise don’t produce and consume privacy but what we might call privacy goods: sunglasses, disguises, locks, window shades, land, fences and, in the digital realm, encryption software, cookie blockers, data scramblers, and so on.

Privacy goods and services can be analyzed just like other economic goods. Entrepreneurs offer bundled services that come with varying degrees of privacy protection: encrypted or regular emails, chats, voice and video calls; browsers that block cookies or don’t; social media sites, search engines, etc. that store information or not; and so on. Most consumers seem unwilling to sacrifice other functionality for increased privacy, as the small market shares held by DuckDuckGo, Telegram, Tor, and the like suggest. Moreover, while privacy per se is appealing, there are huge efficiency gains from matching on buyer and seller characteristics on sharing platforms, digital marketplaces, and dating sites. There are also substantial cost savings from electronic storage and sharing of private information such as medical records and credit histories. And there is little evidence of sellers exploiting such information to engage in price discrimination. (Acquisti, Taylor, and Wagman, 2016 provide a detailed discussion of many of these issues.)

Regulating markets for privacy goods via bans on third-party access to customer data, mandatory data portability, and stiff penalties for data breaches is tricky. Such policies could make digital services more valuable, but it is not obvious why the market cannot figure this out. If consumers are willing to pay for additional privacy, entrepreneurs will be eager to supply it. Of course, bans on third-party access and other forms of sharing would require a fundamental change in the ad-based revenue model that makes free or low-cost access possible, so platforms would have to devise other means of monetizing their services. (Again, many platforms already offer ad-free subscriptions, so it’s unclear why those who prefer ad-based, free usage should be prevented from doing so.)

What about the idea that I own “my” data and that, therefore, I should have full control over how it is used? Some of the utilities-based regulatory models treat platforms as neutral storage places or conduits for information belonging to users. Proposals for data portability suggest that users of technology platforms should be able to move their data from platform to platform, downloading all their personal information from one platform, uploading it to another, and then enjoying the same functionality on the new platform as longtime users.

Of course, there are substantial technical obstacles to such proposals. Data would have to be stored in a universal format – not just the text or media users upload to platforms, but also records of all interactions (likes, shares, comments), the search and usage patterns of users, and any other data generated as a result of the user’s actions and interactions with other users, advertisers, and the platform itself. It is unlikely that any universal format could capture this information in a form that could be transferred from one platform to another without a substantial loss of functionality, particularly for platforms that use algorithms to determine how information is presented to users based on past use. (The extreme case is a platform like TikTok, which uses usage patterns as a substitute for follows, likes, and shares to construct a “feed.”)

Moreover, as each platform sets its own rules for what information is allowed, the import functionality would have to screen the data for information allowed on the original platform but not the new (and the reverse would be impossible – a user switching from Twitter to Gab, for instance, would have no way to add the content that would have been permitted on Gab but was never created in the first place because it would have violated Twitter rules).
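The two obstacles above – platform-specific metadata that doesn’t survive export, and content rules that differ between source and destination – can be sketched in a few lines of code. Everything here is a hypothetical illustration (the record schema, the “feed_score” field, and the screening rule are all invented for the example), not any real platform’s export format or API.

```python
# Hypothetical sketch: why "universal" data portability loses information.
# The schema, field names, and rules below are illustrative assumptions only.

from dataclasses import dataclass, field


@dataclass
class ExportedRecord:
    kind: str                                     # e.g. "post", "like", "share"
    content: str                                  # user-visible text, if any
    context: dict = field(default_factory=dict)   # platform-specific metadata


def import_records(records, destination_allows):
    """Screen exported records against the destination's rules.

    The importer can only drop records the destination disallows; it
    cannot recreate content a stricter source platform prevented from
    ever being created (the asymmetry noted in the text).
    """
    kept, dropped = [], []
    for r in records:
        (kept if destination_allows(r) else dropped).append(r)
    return kept, dropped


# Usage: a destination that rejects records depending on the source's
# opaque ranking metadata (a hypothetical "feed_score").
records = [
    ExportedRecord("post", "hello world"),
    ExportedRecord("like", "", {"feed_score": 0.93}),  # algorithmic context
]
kept, dropped = import_records(records, lambda r: "feed_score" not in r.context)
print(len(kept), len(dropped))  # 1 1
```

The point of the sketch is the one-way loss: the `dropped` list is recoverable information discarded at the border, while content that Twitter’s rules prevented from being written in the first place never appears in `records` at all.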

There is a deeper, philosophical issue at stake, however. Portability and neutrality proposals take for granted that users own “their” data. Users create data, either by themselves or with their friends and contacts, and the platform stores and displays the data, just as a safe deposit box holds documents or jewelry and a display case shows off an art collection. I should be able to remove my items from the safe deposit box and take them home or to another bank, and a “neutral” display case operator should not prevent me from showing off my preferred art (perhaps subject to some general rules about obscenity or harmful personal information).

These analogies do not hold for user-generated information on internet platforms, however. “My data” is a record of all my interactions with platforms, with other users on those platforms, with contractual partners of those platforms, and so on. It is co-created by these interactions. I don’t own these records any more than I “own” the fact that someone saw me in the grocery store yesterday buying apples. Of course, if I have a contract with the grocer that says he will keep my purchase records private, and he shares them with someone else, then I can sue him for breach of contract. But this isn’t theft. He hasn’t “stolen” anything; there is nothing for him to steal. If a grocer — or an owner of a tech platform — wants to attract my business by monetizing the records of our interactions and giving me a cut, he should go for it. I still might prefer another store. In any case, I don’t have the legal right to demand this revenue stream.

Likewise, “privacy” refers to what other people know about me – it is knowledge in their heads, not mine. Information isn’t property. If I know something about you, that knowledge is in my head; it’s not something I took from you. Of course, if I obtained or used that info in violation of a prior agreement, then I’m guilty of breach, and if I use that information to threaten or harass you, I may be guilty of other crimes. But the popular idea that tech companies are stealing and profiting from something that’s “ours” isn’t right.

The concept of co-creation is important, because these digital records, like other co-created assets, can be more or less relationship specific. The late Oliver Williamson devoted his career to exploring the rich variety of contractual relationships devised by market participants to solve complex contracting problems, particularly in the face of asset specificity. Relationship-specific investments can be difficult for trading parties to manage, but they typically create more value. A legal regime in which only general-purpose, easily redeployable technologies were permitted would alleviate the holdup problem, but at the cost of a huge loss in efficiency. Likewise, a world in which all digital records must be fully portable reduces switching costs, but results in technologies for creating, storing, and sharing information that are less valuable. Why would platform operators invest in efficiency improvements if they cannot capture some of that value by means of proprietary formats, interfaces, sharing rules, and other arrangements?  

In short, we should not be quick to assume “market failure” in the market for privacy goods (or “true” news, whatever that is). Entrepreneurs operating in a competitive environment – not the static, partial-equilibrium notion of competition from intermediate micro texts but the rich, dynamic, complex, and multimarket kind of competition described in Petit’s book – can provide the levels of privacy and truthiness that consumers prefer.
