Recent years have seen an increasing interest in incorporating privacy into antitrust analysis. The FTC and regulators in Europe have rejected these calls so far, but certain scholars and activists continue their attempts to breathe life into this novel concept. Elsewhere we have written at length on the scholarship addressing the issue and found the case for incorporation wanting. Among the errors proponents make is a persistent (and woefully unsubstantiated) assertion that online data can amount to a barrier to entry, insulating incumbent services from competition and ensuring that only the largest providers thrive. This data barrier to entry, it is alleged, can then allow firms with monopoly power to harm consumers, either directly through “bad acts” like price discrimination, or indirectly by raising the costs of advertising, which then get passed on to consumers.

A case in point was on display at last week’s George Mason Law & Economics Center Briefing on Big Data, Privacy, and Antitrust. Building on their growing body of advocacy work, Nathan Newman and Allen Grunes argued that this hypothesized data barrier to entry actually exists, and that it prevents effective competition from search engines and social networks that are interested in offering services with heightened privacy protections.

According to Newman and Grunes, network effects and economies of scale ensure that dominant companies in search and social networking (they specifically named Google and Facebook — implying that they are in separate markets) operate without effective competition. This results in antitrust harm, they assert, because it precludes competition on the non-price factor of privacy protection.

In other words, according to Newman and Grunes, even though Google and Facebook offer their services for a price of $0 and constantly innovate and upgrade their products, consumers are nevertheless harmed because the business models of less-privacy-invasive alternatives are foreclosed by insufficient access to data (an almost self-contradicting and silly narrative for many reasons, including the big question of whether consumers prefer greater privacy protection to free stuff). Without access to, and use of, copious amounts of data, Newman and Grunes argue, the algorithms underlying search and targeted advertising are necessarily less effective and thus the search product without such access is less useful to consumers. And even more importantly to Newman, the value to advertisers of the resulting consumer profiles is diminished.

Newman has put forth a number of other possible antitrust harms that purportedly result from this alleged data barrier to entry, as well. Among these is the increased cost of advertising to those who wish to reach consumers. Presumably this would harm end users who have to pay more for goods and services because the costs of advertising are passed on to them. On top of that, Newman argues that ad networks inherently facilitate price discrimination, an outcome that he asserts amounts to antitrust harm.

FTC Commissioner Maureen Ohlhausen (who also spoke at the George Mason event) recently made the case that antitrust law is not well-suited to handling privacy problems. She argues — convincingly — that competition policy and consumer protection should be kept separate to preserve doctrinal stability. Antitrust law deals with harms to competition through the lens of economic analysis. Consumer protection law is tailored to deal with broader societal harms and aims at protecting the “sanctity” of consumer transactions. Antitrust law can, in theory, deal with privacy as a non-price factor of competition, but this is an uneasy fit because of the difficulties of balancing quality over two dimensions: Privacy may be something some consumers want, but others would prefer a better algorithm for search and social networks, and targeted ads with free content, for instance.

In fact, there is general agreement with Commissioner Ohlhausen on her basic points, even among critics like Newman and Grunes. But, as mentioned above, views diverge over whether there are some privacy harms that should nevertheless factor into competition analysis, and on whether there is in fact  a data barrier to entry that makes these harms possible.

As we explain below, however, the notion of data as an antitrust-relevant barrier to entry is simply a myth. And, because all of the theories of “privacy as an antitrust harm” are essentially predicated on this, they are meritless.

First, data is useful to all industries — this is not some new phenomenon particular to online companies

It bears repeating (because critics seem to forget it in their rush to embrace “online exceptionalism”) that offline retailers also receive substantial benefit from, and greatly benefit consumers by, knowing more about what consumers want and when they want it. Through devices like coupons and loyalty cards (to say nothing of targeted mailing lists and the age-old practice of data mining check-out receipts), brick-and-mortar retailers can track purchase data and better serve consumers. Not only do consumers receive better deals for using them, but retailers know what products to stock and advertise and when and on what products to run sales. For instance:

  • Macy’s analyzes tens of millions of terabytes of data every day to gain insights from social media and store transactions. Over the past three years, the use of big data analytics alone has helped Macy’s boost its revenue growth by 4 percent annually.
  • Following its acquisition of Kosmix in 2011, Walmart established @WalmartLabs, which created its own product search engine for online shoppers. In the first year of its use alone, the number of customers buying a product on Walmart.com after researching a purchase increased by 20 percent. According to Ron Bensen, the vice president of engineering at @WalmartLabs, the combination of in-store and online data could give brick-and-mortar retailers like Walmart an advantage over strictly online stores.
  • Panera and a whole host of restaurants, grocery stores, drug stores and retailers use loyalty cards to advertise and learn about consumer preferences.

And of course there is a host of other uses for data, as well, including security, fraud prevention, product optimization, risk reduction to the insured, knowing what content is most interesting to readers, etc. The importance of data stretches far beyond the online world, and far beyond mere retail uses more generally. To describe even online giants like Amazon, Apple, Microsoft, Facebook and Google as having a monopoly on data is silly.

Second, it’s not the amount of data that leads to success but building a better mousetrap

The value of knowing someone’s birthday, for example, is not in that tidbit itself, but in the fact that you know this is a good day to give that person a present. Most of the data that supports the advertising networks underlying the Internet ecosphere is of this sort: Information is important to companies because of the value that can be drawn from it, not for the inherent value of the data itself. Companies don’t collect information about you to stalk you, but to better provide goods and services to you.

Moreover, data itself is not only less important than what can be drawn from it, but also less important than the underlying product it informs. For instance, Snapchat created a challenger to Facebook so successfully (and in such a short time) that Facebook attempted to buy it for $3 billion (Google offered $4 billion). But Facebook’s interest in Snapchat wasn’t about its data. Instead, Snapchat was valuable — and a competitive challenge to Facebook — because it cleverly incorporated the (apparently novel) insight that many people wanted to share information in a more private way.

Relatedly, Twitter, Instagram, LinkedIn, Yelp, Pinterest (and Facebook itself) all started with little (or no) data, and they have had a lot of success. Meanwhile, despite its supposed data advantages, Google’s attempts at social networking — Google+ — have never caught up to Facebook in popularity with users (and thus not with advertisers either). And scrappy social network Ello is starting to build a significant base without any data collection for advertising at all.

At the same time, it’s simply not the case that the alleged data giants — the ones supposedly insulating themselves behind data barriers to entry — actually have the type of data most relevant to startups anyway. As Andres Lerner has argued, if you wanted to start a travel business, the data from Kayak or Priceline would be far more relevant. Or if you wanted to start a ride-sharing business, data from cab companies would be more useful than the broad, market-cross-cutting profiles Google and Facebook have. Consider companies like Uber, Lyft and Sidecar, which had no customer data when they began to challenge established cab companies that did possess such data. If data were really so significant, they could never have competed successfully. But Uber, Lyft and Sidecar have been able to compete effectively because they built products that users wanted to use — they came up with an idea for a better mousetrap. The data they have accrued came after they innovated, entered the market and mounted their successful challenges — not before.

In reality, those who complain about data facilitating unassailable competitive advantages have it exactly backwards. Companies need to innovate to attract consumer data; otherwise, consumers will switch to competitors (including both new entrants and established incumbents). As a result, the desire to make use of more and better data drives competitive innovation, with manifestly impressive results: The continued explosion of new products, services and other apps is evidence that data is not a bottleneck to competition but a spur to drive it.

Third, competition online is one click or thumb swipe away; that is, barriers to entry and switching costs are low

Somehow, in the face of alleged data barriers to entry, competition online continues to soar, with newcomers constantly emerging and triumphing. This suggests that the barriers to entry are not so high as to prevent robust competition.

Again, despite the supposed data-based monopolies of Facebook, Google, Amazon, Apple and others, powerful competitors exist in the markets in which they operate:

  • If consumers want to make a purchase, they are more likely to do their research on Amazon than Google.
  • Google flight search has failed to seriously challenge — let alone displace —  its competitors, as critics feared. Kayak, Expedia and the like remain the most prominent travel search sites — despite Google having literally purchased ITA’s trove of flight data and data-processing acumen.
  • People looking for local reviews go to Yelp and TripAdvisor (and, increasingly, Facebook) as often as Google.
  • Pinterest, one of the most highly valued startups today, is now a serious challenger to traditional search engines when people want to discover new products.
  • With its recent acquisition of the shopping search engine TheFind, and a test-run of a “buy” button, Facebook is also gearing up to become a major competitor in the realm of e-commerce, challenging Amazon.
  • Likewise, Amazon recently launched its own ad network, “Amazon Sponsored Links,” to challenge other advertising players.

Even assuming for the sake of argument that data creates a barrier to entry, there is little evidence that consumers cannot easily switch to a competitor. While there are sometimes network effects online, like with social networking, history still shows that people will switch. MySpace was considered a dominant network until it made a series of bad business decisions and everyone ended up on Facebook instead. Similarly, Internet users can and do use Bing, DuckDuckGo, Yahoo, and a plethora of more specialized search engines on top of and instead of Google. And don’t forget that Google itself was once an upstart new entrant that replaced once-household names like Yahoo and AltaVista.

Fourth, access to data is not exclusive

Critics like Newman have compared Google to Standard Oil and argued that government authorities need to step in to limit Google’s control over data. But to say data is like oil is a complete misnomer. If Exxon drills and extracts oil from the ground, that oil is no longer available to BP. Data is not finite in the same way. To use an earlier example, Google knowing my birthday doesn’t limit the ability of Facebook to know my birthday, as well. While databases may be proprietary, the underlying data is not. And what matters more than the data itself is how well it is analyzed.

This is especially important when discussing data online, where multi-homing is ubiquitous, meaning many competitors end up voluntarily sharing access to data. For instance, I can use the friend-finder feature on WordPress to find Facebook friends, Google connections, and people I’m following on Twitter who also use the site for blogging. Using this feature gives WordPress access to my contact lists on these major online platforms.

[Image: WordPress friend-finder screenshot]

Further, it is not apparent that Google’s competitors have less data available to them. Microsoft, for instance, has admitted that it may actually have more data. And, importantly for this discussion, Microsoft may have actually garnered some of its data for Bing from Google.

If Google has a high cost per click, then perhaps it’s because it is worth it to advertisers: There are more eyes on Google because of its superior search product. Contra Newman and Grunes, Google may just be more popular for consumers and advertisers alike because the algorithm makes it more useful, not because it has more data than everyone else.

Fifth, the data barrier to entry argument does not have workable antitrust remedies

The misguided logic of data barrier to entry arguments leaves a lot of questions unanswered. Perhaps most important among these is the question of remedies. What remedy would apply to a company found guilty of leveraging its market power with data?

It’s actually quite difficult to conceive of a practical means for a competition authority to craft remedies that would address the stated concerns without imposing enormous social costs. In the unilateral conduct context, the most obvious remedy would involve the forced sharing of data.

On the one hand, as we’ve noted, it’s not clear this would actually accomplish much. If competitors can’t actually make good use of data, simply having more of it isn’t going to change things. At the same time, such a result would reduce the incentive to build data networks to begin with. In their startup stage, companies like Uber and Facebook required several months and hundreds of thousands, if not millions, of dollars to design and develop just the first iteration of the products consumers love. Would any of them have done it if they had to share their insights? In fact, it may well be that access to these free insights is what competitors actually want; it’s not the data they’re lacking, but the vision or engineering acumen to use it.

Other remedies limiting collection and use of data are not only outside the normal scope of antitrust remedies; they would also involve extremely costly court supervision and may entail problematic “collisions between new technologies and privacy rights,” as last year’s White House Report on Big Data and Privacy put it.

It is equally unclear what an antitrust enforcer could do in the merger context. As Commissioner Ohlhausen has argued, blocking specific transactions does not necessarily stop data transfer or promote privacy interests. Parties could simply house data in a standalone entity and enter into licensing arrangements. And conditioning transactions with forced data sharing requirements would lead to the same problems described above.

If antitrust doesn’t provide a remedy, then it is not clear why it should apply at all. The absence of workable remedies is in fact a strong indication that data and privacy issues are not suitable for antitrust. Instead, such concerns would be better dealt with under consumer protection law or by targeted legislation.

As I explained in a recent Heritage Foundation Legal Memorandum, the Institute of Electrical and Electronics Engineers’ (IEEE) New Patent Policy (NPP) threatens to devalue patents that cover standards; discourage involvement by innovative companies in IEEE standard setting; and undermine support for strong patents, which are critical to economic growth and innovation.  The Legal Memorandum focused on how the NPP undermines patentees’ rights and reduces returns to patents that “read on” standards (“standard essential patents” or “SEPs”).  It did not, however, address the merits of the Justice Department Antitrust Division’s (DOJ) February 2 Business Review Letter (BRL), which found no antitrust problems with the NPP.

Unfortunately, the BRL does little more than opine on patent policy questions, such as the risk of patent “hold-up” that the NPP allegedly is designed to counteract.  The BRL is virtually bereft of antitrust analysis.  It states in conclusory fashion that the NPP is on the whole procompetitive, without coming to grips with the serious risks of monopsony and collusion, and reduced investment in standards-related innovation, inherent in the behavior that it analyzes.  (FTC Commissioner Wright and prominent economic consultant Greg Sidak expressed similar concerns about the BRL in a March 12 program on standard setting and patents hosted by the Heritage Foundation.)

Let’s examine the BRL in a bit more detail, drawing from a recent scholarly commentary by Stuart Chemtob.  The BRL eschews analyzing the risk that by sharply constraining expected returns to SEPs, the NPP’s requirements may disincentivize technology contributions to standards, harming innovation.  The BRL focuses on how the NPP may reduce patentee “hold-up” by effectively banning injunctions and highlighting three factors that limit royalties – basing royalties on the value of the smallest saleable unit, the value contributed to that unit in light of all the SEPs practiced in the unit, and existing licenses covering the unit that were not obtained under threat of injunction.  The BRL essentially ignores, however, the very real problem of licensee “hold-out” by technology implementers who may gain artificial bargaining leverage over patentees.  Thus, there is no weighing of the NPP’s anticompetitive risks against its purported procompetitive benefits.  This is particularly unfortunate, given the absence of hard evidence of hold-up.  (Very recently, the Federal Circuit in Ericsson v. D-Link denied jury instructions citing the possibility of hold-up, given D-Link’s failure to provide any evidence of hold-up.)  Also, by forbidding injunctive actions prior to first-level appellate review, the NPP effectively precludes SEP holders from seeking exclusion orders against imports that infringe their patents under Section 337 of the Tariff Act.  This eliminates a core statutory protection that helps shield American patentees from foreign anticompetitive harm, further debasing SEPs.  Furthermore, the BRL fails to assess the possible competitive harm firms may face if they fail to accede to the IEEE’s NPP.

Finally, and most disturbingly, the BRL totally ignores the overall thrust of the NPP – which is to encourage potential licensees to insist on anticompetitive terms that reduce returns to SEP holders below the competitive level.  Such terms, if jointly agreed to by potential licensees, could well be deemed a monopsony buyers’ cartel (with the potential licensees buying license rights), subject to summary antitrust condemnation in line with such precedents as Mandeville Island Farms and Todd v. Exxon.

In sum, the BRL is an embarrassingly one-sided document that would merit a failing grade as an antitrust exam essay.  DOJ would be wise to withdraw the letter or, at the very least, rewrite it from scratch, explaining that the NPP raises serious antitrust questions that merit close examination.  If it fails to do so, one can only conclude that DOJ has decided that it is suitable to use business review letters as vehicles for unsupported statements of patent policy preferences, rather than as serious, meticulously crafted memoranda of guidance on difficult antitrust questions.

The Wall Street Journal reported yesterday that the FTC Bureau of Competition staff report to the commissioners in the Google antitrust investigation recommended that the Commission approve an antitrust suit against the company.

While this is excellent fodder for a few hours of Twitter hysteria, it takes more than 140 characters to delve into the nuances of a 20-month federal investigation. And the bottom line is, frankly, pretty ho-hum.

As I said recently,

One of life’s unfortunate certainties, as predictable as death and taxes, is this: regulators regulate.

The Bureau of Competition staff is made up of professional lawyers — many of them litigators, whose existence is predicated on there being actual, you know, litigation. If you believe in human fallibility at all, you have to expect that, when they err, FTC staff errs on the side of too much, rather than too little, enforcement.

So is it shocking that the FTC staff might recommend that the Commission undertake what would undoubtedly have been one of the agency’s most significant antitrust cases? Hardly.

Nor is it surprising that the commissioners might not always agree with staff. In fact, staff recommendations are ignored all the time, for better or worse. Here are just a few examples: the R.J. Reynolds/Brown & Williamson merger, POM Wonderful, the Home Shopping Network/QVC merger, cigarette advertising. No doubt there are many, many more.

Regardless, it also bears pointing out that the staff did not recommend the FTC bring suit on the central issue of search bias “because of the strong procompetitive justifications Google has set forth”:

Complainants allege that Google’s conduct is anticompetitive because it forecloses alternative search platforms that might operate to constrain Google’s dominance in search and search advertising. Although it is a close call, we do not recommend that the Commission issue a complaint against Google for this conduct.

But this caveat is enormous. To report this as the FTC staff recommending a case is seriously misleading. Here they are forbearing from bringing 99% of the case against Google, and recommending suit on the marginal 1% issues. It would be more accurate to say, “FTC staff recommends no case against Google, except on a couple of minor issues which will be immediately settled.”

And in fact it was on just these minor issues that Google agreed to voluntary commitments to curtail some conduct when the FTC announced it was not bringing suit against the company.

The Wall Street Journal quotes some other language from the staff report bolstering the conclusion that this is a complex market, the conduct at issue was ambiguous (at worst), and supporting the central recommendation not to sue:

We are faced with a set of facts that can most plausibly be accounted for by a narrative of mixed motives: one in which Google’s course of conduct was premised on its desire to innovate and to produce a high quality search product in the face of competition, blended with the desire to direct users to its own vertical offerings (instead of those of rivals) so as to increase its own revenues. Indeed, the evidence paints a complex portrait of a company working toward an overall goal of maintaining its market share by providing the best user experience, while simultaneously engaging in tactics that resulted in harm to many vertical competitors, and likely helped to entrench Google’s monopoly power over search and search advertising.

On a global level, the record will permit Google to show substantial innovation, intense competition from Microsoft and others, and speculative long-run harm.

This is exactly when you want antitrust enforcers to forbear. Predicting anticompetitive effects is difficult, and conduct that could be problematic may simultaneously be vigorous competition.

That the staff concluded that some of what Google was doing “harmed competitors” isn’t surprising — there were lots of competitors parading through the FTC on a daily basis claiming Google harmed them. But antitrust is about protecting consumers, not competitors. Far more important is the staff finding of “substantial innovation, intense competition from Microsoft and others, and speculative long-run harm.”

Indeed, the combination of “substantial innovation,” “intense competition from Microsoft and others,” and “Google’s strong procompetitive justifications” suggests a well-functioning market. It similarly suggests an antitrust case that the FTC would likely have lost. The FTC’s litigators should probably be grateful that the commissioners had the good sense to vote to close the investigation.

Meanwhile, the Wall Street Journal also reports that the FTC’s Bureau of Economics simultaneously recommended that the Commission not bring suit at all against Google. It is not uncommon for the lawyers and the economists at the Commission to disagree. And as a general (though not inviolable) rule, we should be happy when the Commissioners side with the economists.

While the press, professional Google critics, and the company’s competitors may want to make this sound like a big deal, the actual facts of the case and a pretty simple error-cost analysis suggest that not bringing a case was the correct course.

In short, all of this hand-wringing over privacy is largely a tempest in a teapot — especially when one considers the extent to which the White House and other government bodies have studiously ignored the real threat: government misuse of data à la the NSA. It’s almost as if the White House is deliberately shifting the public’s gaze from the reality of extensive government spying by directing it toward a fantasy world of nefarious corporations abusing private information….

The White House’s proposed bill is emblematic of many government “fixes” to largely non-existent privacy issues, and it exhibits the same core defects that undermine both its claims and its proposed solutions. As a result, the proposed bill vastly overemphasizes regulation to the dangerous detriment of the innovative benefits of Big Data for consumers and society at large.


In a recent post, I explained how the U.S. Supreme Court’s February 25 opinion in North Carolina Dental Board v. FTC (holding that a state regulatory board controlled by market participants must be “actively supervised” by the state to receive antitrust immunity) struck a significant blow against protectionist rent-seeking and for economic liberty.  Maureen Ohlhausen, who has spoken out against special interest government regulation as an FTC Commissioner (and formerly as Director of the FTC’s Office of Policy Planning), will discuss the ramifications of the Court’s North Carolina Dental decision in a March 31 luncheon speech at the Heritage Foundation.  Senior Attorney Clark Neily of the Institute for Justice and Misha Tseytlin, General Counsel in the West Virginia Attorney General’s Office, will provide expert commentary on the Commissioner’s speech.  You can register for this event here.

Today, the International Center for Law & Economics released a white paper, co-authored by Executive Director Geoffrey Manne and Senior Fellow Julian Morris, entitled Dangerous Exception: The detrimental effects of including “fair use” copyright exceptions in free trade agreements.

Dangerous Exception explores the relationship between copyright, creativity and economic development in a networked global marketplace. In particular, it examines the evidence for and against mandating a U.S.-style fair use exception to copyright via free trade agreements like the Trans-Pacific Partnership (TPP), and through “fast-track” trade promotion authority (TPA).

In the context of these ongoing trade negotiations, some organizations have been advocating for the inclusion of dramatically expanded copyright exceptions in place of more limited language requiring that such exceptions conform to the “three-step test” implemented by the 1994 TRIPs Agreement.

The paper argues that if broad fair use exceptions are infused into trade agreements, they could increase piracy and discourage artistic creation and innovation — especially in nations without a strong legal tradition implementing such provisions.

The expansion of digital networks across borders, combined with historically weak copyright enforcement in many nations, poses a major challenge to a broadened fair use exception. The modern digital economy calls for appropriate, but limited, copyright exceptions — not their expansion.

The white paper is available here.

On Wednesday, March 18, our fellow law-and-economics-focused brethren at George Mason’s Law and Economics Center will host a very interesting morning briefing on the intersection of privacy, big data, consumer protection, and antitrust. FTC Commissioner Maureen Ohlhausen will keynote, and she will be followed by what promises to be a lively panel discussion. If you are in DC you can join in person, but you can also watch online. More details below.
Please join the LEC in person or online for a morning of lively discussion on this topic. FTC Commissioner Maureen K. Ohlhausen will set the stage by discussing her Antitrust Law Journal article, “Competition, Consumer Protection and The Right [Approach] To Privacy.” A panel discussion on big data and antitrust, which includes some of the leading thinkers on the subject, will follow.
Other featured speakers include:

Allen P. Grunes
Founder, The Konkurrenz Group and Data Competition Institute

Andres Lerner
Executive Vice President, Compass Lexecon

Darren S. Tucker
Partner, Morgan Lewis

Nathan Newman
Director, Economic and Technology Strategies LLC

Moderator: James C. Cooper
Director, Research and Policy, Law & Economics Center

A full agenda is available here.

Anybody who has spent much time with children knows how squishy a concept “unfairness” can be.  One can hear the exchange, “He’s not being fair!” “No, she’s not!,” only so many times before coming to understand that unfairness is largely in the eye of the beholder.

Perhaps it’s unfortunate, then, that Congress chose a century ago to cast the Federal Trade Commission’s authority in terms of preventing “unfair methods of competition.”  But that’s what it did, and the question now is whether there is some way to mitigate this “eye of the beholder” problem.

There is.

We know that any business practice that violates the substantive antitrust laws (the Sherman and Clayton Acts) is an unfair method of competition, so we can look to Sherman and Clayton Act precedents to assess the “unfairness” of business practices that those laws reach.  But what about the Commission’s so-called “standalone” UMC authority—its power to prevent business practices that seem to impact competition unfairly but are not technically violations of the substantive antitrust laws?

Almost two years ago, Commissioner Josh Wright recognized that if the FTC’s standalone UMC authority is to play a meaningful role in assuring market competition, the Commission should issue guidelines on what constitutes an unfair method of competition. He was right.  The Commission, you see, really has only four options with respect to standalone Section 5 claims:

  1. It could bring standalone actions based on current commissioners’ considered judgments about what constitutes unfairness. Such an approach, though, is really inconsistent with the rule of law. Past commissioners, for example, have gone so far as to suggest that practices causing “resource depletion, energy waste, environmental contamination, worker alienation, [and] the psychological and social consequences of producer-stimulated demands” could be unfair methods of competition. Maybe our current commissioners wouldn’t cast so wide a net, but they’re not always going to be in power. A government of laws and not of men simply can’t mete out state power on the basis of whim.
  2. It could bring standalone actions based on unfairness principles appearing in Section 5’s “common law.” The problem here is that there is no such common law. As Commissioner Wright has observed and I have previously explained, a common law doesn’t just happen. Development of a common law requires vigorously litigated disputes and reasoned, published opinions that resolve those disputes and serve as precedent. Section 5 “litigation,” such as it is, doesn’t involve any of that.
    • First, standalone Section 5 disputes tend not to be vigorously litigated. Because the FTC acts as both prosecutor and judge in such actions, their outcome is nearly a foregone conclusion. When FTC staff win before the administrative law judge, the ALJ’s decision is always affirmed by the full commission; when staff loses with the ALJ, the full Commission always reverses. Couple this stacked deck with the fact that unfairness exists in the eye of the beholder and will therefore change with the composition of the Commission, and we end up with a situation in which accused parties routinely settle. As Commissioner Wright observes, “parties will typically prefer to settle a Section 5 claim rather than go through lengthy and costly litigation in which they are both shooting at a moving target and have the chips stacked against them.”
    • The consent decrees that memorialize settlements, then, offer little prospective guidance. They usually don’t include any detailed explanation of why the practice at issue was an unfair method of competition. Even if they did, it wouldn’t matter much; the Commission doesn’t treat its own enforcement decisions as precedent. In light of the realities of Section 5 litigation, there really is no Section 5 common law.
  3. It could refrain from bringing standalone Section 5 actions and pursue only business practices that violate the substantive antitrust laws. Substantive antitrust violations constitute unfair methods of competition, and the federal courts have established fairly workable principles for determining when business practices violate the Sherman and Clayton Acts. The FTC could therefore avoid the “eye of the beholder” problem by limiting its UMC authority to business conduct that violates the antitrust laws. Such an approach, though, would prevent the FTC from policing conduct that, while not technically an antitrust violation, is anticompetitive and injurious to consumers.
  4. It could bring standalone Section 5 actions based on articulated guidelines establishing what constitutes an unfair method of competition. This is really the only way to use Section 5 to pursue business practices that are not otherwise antitrust violations, without offending the rule of law.

Now, if the FTC is to take this fourth approach—the only one that both allows for standalone Section 5 actions and honors rule of law commitments—it obviously has to settle on a set of guidelines.  Fortunately, it has almost done so!

Since Commissioner Wright called for Section 5 guidelines almost two years ago, much ink has been spilled outlining and critiquing proposed guidelines.  Commissioner Wright got the ball rolling by issuing his own proposal along with his call for the adoption of guidelines.  Commissioner Ohlhausen soon followed suit, proposing a slightly broader set of principles.  Numerous commentators then joined the conversation (a number doing so in a TOTM symposium), and each of the other commissioners has now stated her own views.

A good deal of consensus has emerged.  Each commissioner agrees that Section 5 should be used to prosecute only conduct that is actually anticompetitive (as defined by the federal courts).  There is also apparent consensus on the view that standalone Section 5 authority should not be used to challenge conduct governed by well-forged liability principles under the Sherman and Clayton Acts.  (For example, a practice routinely evaluated under Section 2 of the Sherman Act should not be pursued using standalone Section 5 authority.)  The commissioners, and the vast majority of commentators, also agree that there should be some efficiencies screen in prosecution decisions.  The remaining disagreement centers on the scope of the efficiencies screen—i.e., how much of an efficiency benefit must a business practice confer in order to be insulated from standalone Section 5 liability?

On that narrow issue—the only legitimate point of dispute remaining among the commissioners—three views have emerged:  Commissioner Wright would refrain from prosecuting if the conduct at issue creates any cognizable efficiencies; Commissioner Ohlhausen would do so as long as the efficiencies are not disproportionately outweighed by anticompetitive harms; Chairwoman Ramirez would engage in straightforward balancing (not a “disproportionality” inquiry) and would refrain from prosecution only where efficiencies outweigh anticompetitive harms.

That leaves three potential sets of guidelines.  In each, it would be necessary that a behavior subject to any standalone Section 5 action (1) create actual or likely anticompetitive harm, and (2) not be subject to well-forged case law under the traditional antitrust laws (since pursuing a standalone action against conduct already governed by such case law could blur the distinction between lawful and unlawful commercial behavior).  Each of the three sets of guidelines would also include an efficiencies screen—either (3a) the conduct lacks cognizable efficiencies, (3b) the harms created by the conduct are disproportionate to the conduct’s cognizable efficiencies, or (3c) the harms created by the conduct are not outweighed by cognizable efficiencies.

As Commissioner Wright has observed, any one of these sets of guidelines would be superior to the status quo.  Accordingly, if the commissioners could agree on the acceptability of any of them, they could improve the state of U.S. competition law.

Recognizing as much, Commissioner Wright is wisely calling on the commissioners to vote on the acceptability of each set of guidelines.  If any set is deemed acceptable by a majority of commissioners, it should be promulgated as official FTC Guidance.  (Presumably, if more than one set commands majority support, the set that most restrains FTC enforcement authority would be the one promulgated as FTC Guidance.)

Of course, individual commissioners might just choose not to vote.  That would represent a sad abdication of authority.  Given that there isn’t (and under current practice, there can’t be) a common law of Section 5, failure to vote on a set of guidelines would effectively cast a vote for either option 1 stated above (ignore rule of law values) or option 3 (limit Section 5’s potential to enhance consumer welfare).  Let’s hope our commissioners don’t relegate us to those options.

The debate has occurred.  It’s time to vote.

On February 13 an administrative law judge (ALJ) at the California Public Utility Commission (CPUC) issued a proposed decision regarding the Comcast/Time Warner Cable (TWC) merger. The proposed decision recommends that the CPUC approve the merger with conditions.

It’s laudable that the ALJ acknowledges at least some of the competitive merits of the proposed deal. But the set of conditions that the proposed decision would impose on the combined company in order to complete the merger represents a remarkable set of unauthorized regulations that are both inappropriate for the deal and at odds with California’s legislated approach to regulation of the Internet.

According to the proposed decision, every condition it imposes is aimed at mitigating a presumed harm arising from the merger:

The Applicants must meet the conditions adopted herein in order to provide reasonable assurance that the proposed transaction will be in the public interest in accordance with Pub. Util. Code § 854(a) and (c).… We only adopt conditions which mitigate an effect of the merger in order to satisfy the public interest requirements of § 854.

By any reasonable interpretation, this would mean that the CPUC can adopt only those conditions that address specific public interest concerns arising from the deal itself. But most of the conditions in the proposed decision fail this basic test and seem designed to address broader social policy issues that have nothing to do with the alleged competitive effects of the deal.

Instead, without undertaking an analysis of the merger’s competitive effects, the proposed decision effectively accepts that the merger serves the public interest, while also simply accepting the assertions of the merger’s opponents that it doesn’t. In the name of squaring that circle, the proposed decision seeks to permit the merger to proceed, but then seeks to force the post-merger company to conform to the merger’s critics’ rather arbitrary view of their preferred market structure for the provision of cable broadband services in California.

For something — say, a merger — to be in the public interest, it need not further every conceivable public interest goal. The proposed decision’s contrary approach is a perversion of the standard, and it turns “public interest” into an unconstrained license to impose a regulatory wish-list on particular actors, outside the scope of usual regulatory processes.

While a few people may have no problem with the proposed decision’s expansive vision of Internet access regulation, California governor Jerry Brown and the overwhelming majority of the California state legislature cannot be counted among the supporters of this approach.

In 2012 the state legislature passed by an overwhelming margin — and Governor Brown signed — SB 1161 (codified as Section 710 of the California Public Utilities Code), which expressly prohibits the CPUC from regulating broadband:

The commission shall not exercise regulatory jurisdiction or control over Voice over Internet Protocol and Internet Protocol enabled services except as required or expressly delegated by federal law or expressly directed to do so by statute or as set forth in [certain enumerated exceptions].

The message is clear: The CPUC should not try to bypass clear state law and all institutional safeguards by misusing the merger clearance process.

While bipartisan majorities in the state house, supported by a Democratic governor, have stopped the CPUC from imposing new regulations on Internet and VoIP services through SB 1161, the proposed decision seeks to impose regulations through merger conditions that go far beyond anything permitted by this state law.

For instance, the proposed decision seeks to impose arbitrary retail price controls on broadband access:

Comcast shall offer to all customers of the merged companies, for a period of five years following the effective date of the parent company merger, the opportunity to purchase stand-alone broadband Internet service at a price not to exceed the price charged by Time Warner for providing that service to its customers, and at speeds, prices, and terms, at least comparable to that offered by Time Warner prior to the merger’s closing.

And the proposed decision seeks to mandate market structure in other insidious ways, as well, mandating specific broadband speeds, requiring a break-neck geographic expansion of Comcast’s service area, and dictating installation and service times, among other things — all without regard to the actual plausibility (or cost) of implementing such requirements.

But the problem is even more acute. Not only does the proposed decision seek to regulate Internet access issues irrelevant to the merger, it also proposes to impose conditions that would actually undermine competition.

The proposed decision would impose the following conditions on Comcast’s business VoIP and business Internet services:

Comcast shall offer Time Warner’s Business Calling Plan with Stand Alone Internet Access to interested CLECs throughout the combined service territories of the merging companies for a period of five years from the effective date of the parent company merger at existing prices, terms and conditions.

Comcast shall offer Time Warner’s Carrier Ethernet Last Mile Access product to interested CLECs throughout the combined service territories of the merging companies for a period of five years from the effective date of the parent company merger at the same prices, terms and conditions as offered by Time Warner prior to the merger.

But the proposed decision fails to recognize that Comcast is an also-ran in the business service market. Last year it served a small fraction of the business customers served by AT&T and Verizon, who have long dominated the business services market:

According to a Sept. 2011 ComScore survey, AT&T and Verizon had the largest market shares of all business services ISPs. AT&T held 20% of market share and Verizon held 12%. Comcast ranked 6th, with 5% of market share.

The proposed conditions would hamstring the upstart challenger Comcast by removing both product and pricing flexibility for five years – an eternity in rapidly evolving technology markets. That’s a sure-fire way to minimize competition, not promote it.

The proposed decision reiterates several times its concern that the combined Comcast/Time Warner Cable will serve more than 80% of California households, and “reduce[] the possibilities for content providers to reach the California broadband market.” The alleged concern is that the combined company could exercise anticompetitive market power — imposing artificially high fees for carrying content or degrading service of unaffiliated content and services.

The problem is Comcast and TWC don’t compete anywhere in California today, and they face competition from other providers everywhere they operate. As the decision matter-of-factly states:

Comcast and Time Warner do not compete with one another… [and] Comcast and Time Warner compete with other providers of Internet access services in their respective service territories.

As a result, the merger will actually have no effect on the number of competitive choices in the state; the increase in the statewide market share as a result of the deal is irrelevant. And so these purported competition concerns can’t be the basis for any conditions, let alone the sweeping ones set out in the proposed decision.

The stated concern about content providers finding it difficult to reach Californians is a red herring: the post-merger Comcast geographic footprint will be exactly the same as the combined, pre-merger Comcast/TWC/Charter footprint. Content providers will be able to access just as many Californians (and with greater speeds) as before the merger.

True, content providers that just want to reach some number of random Californians may have to reach more of them through Comcast than they would have before the merger. But what content provider just wants to reach some number of Californians in the first place? Moreover, this fundamentally misstates the way the Internet works: it is users who reach the content they prefer; not the other way around. And, once again, for literally every consumer in the state, the number of available options for doing so won’t change one iota following the merger.

Nothing shows more clearly how the proposed decision has strayed from responding to merger concerns to addressing broader social policy issues than the conditions aimed at expanding low-price broadband offerings for underserved households. Among other things, the proposed conditions dramatically increase the size and scope of Comcast’s Internet Essentials program, converting this laudable effort from a targeted program (that uses a host of tools to connect families where a child is eligible for the National School Lunch Program to the Internet) into one that must serve all low-income adults.

Putting aside the damage this would do to Internet Essentials’ core mission of connecting school-age children by diverting resources from the program’s central purpose, it is manifestly outside the scope of the CPUC’s review. Nothing in the deal affects the number of adults (or children, for that matter) in California without broadband.

It’s possible, of course, that Comcast might implement something like an expanded Internet Essentials program without any prodding; after all, companies implement (and expand) such programs all the time. But why on earth should regulators be able to define such an obligation arbitrarily, and to impose it on whatever ISP happens to be asking for a license transfer? That arbitrariness creates precisely the sort of business uncertainty that SB 1161 was meant to prevent.

The same thing applies to the proposed decision’s requirement regarding school and library broadband connectivity:

Comcast shall connect and/or upgrade Internet infrastructure for K-12 schools and public libraries in unserved and underserved areas in Comcast’s combined California service territory so that it is providing high speed Internet to at least the same proportion of K-12 schools and public libraries in such unserved and underserved areas as it provides to the households in its service territory.

No doubt improving school and library infrastructure is a noble goal — and there’s even a large federal subsidy program (E-Rate) devoted to it. But insisting that Comcast do so — and do so to an extent unsupported by the underlying federal subsidy program already connecting such institutions, and in contravention of existing provider contracts with schools — as a condition of the merger is simple extortion.

The CPUC is treating the proposed merger like a free-for-all, imposing in the name of the “public interest” a set of conditions that it would never be permitted to impose absent the gun-to-the-head of merger approval. Moreover, it seeks to remake California’s broadband access landscape in a fashion that would likely never materialize in the natural course of competition: If the merger doesn’t go through, none of the conditions in the proposed decision and alleged to be necessary to protect the public interest will exist.

Far from trying to ensure that Comcast’s merger with TWC doesn’t erode competitive forces to the detriment of the public, the proposed decision is trying to micromanage the market, simply asserting that the public interest demands imposition of its subjective and arbitrary laundry list of preferred items. This isn’t sensible regulation, it isn’t compliant with state law, and it doesn’t serve the people of California.

In its February 25 North Carolina Dental decision, the U.S. Supreme Court, per Justice Anthony Kennedy, held that a state regulatory board that is controlled by market participants in the industry being regulated cannot invoke “state action” antitrust immunity unless it is “actively supervised” by the state.  In so ruling, the Court struck a significant blow against protectionist rent-seeking and for economic liberty.  (As I stated in a recent Heritage Foundation legal memorandum, “[a] Supreme Court decision accepting this [active supervision] principle might help to curb special-interest favoritism conferred through state law.  At the very least, it could complicate the efforts of special interests to protect themselves from competition through regulation.”)

A North Carolina law subjects the licensing of dentistry to a North Carolina State Board of Dental Examiners (Board), six of whose eight members must be licensed dentists.  After dentists complained to the Board that non-dentists were charging lower prices than dentists for teeth whitening, the Board sent cease-and-desist letters to non-dentist teeth whitening providers, warning that the unlicensed practice of dentistry is a crime.  This led non-dentists to cease offering teeth whitening services in North Carolina.  The Federal Trade Commission (FTC) held that the Board’s actions violated Section 5 of the FTC Act, which prohibits unfair methods of competition; the Fourth Circuit agreed; and the Court affirmed the Fourth Circuit’s decision.

In its decision, the Court rejected the claim that state action immunity, which confers immunity on the anticompetitive conduct of states acting in their sovereign capacity, applied to the Board’s actions.  The Court stressed that where a state delegates control over a market to a non-sovereign actor, immunity applies only if the state accepts political accountability by actively supervising that actor’s decisions.  The Court applied its Midcal test, which requires (1) clear state articulation and (2) active state supervision of decisions by non-sovereign actors for immunity to attach.  The Court held that entities designated as state agencies are not exempt from active supervision when they are controlled by market participants, because allowing an exemption in such circumstances would pose the risk of self-dealing that the second prong of Midcal was created to address.

Here, the Board did not contend that the state exercised any (let alone active) supervision over its anticompetitive conduct.  The Court closed by summarizing “a few constant requirements of active supervision,” namely, (1) the supervisor must review the substance of the anticompetitive decision, (2) the supervisor must have the power to veto or modify particular decisions for consistency with state policy, (3) “the mere potential for state supervision is not an adequate substitute for a decision by the State,” and (4) “the state supervisor may not itself be an active market participant.”  The Court cautioned, however, that “the adequacy of supervision otherwise will depend on all the circumstances of a case.”

Justice Samuel Alito, joined by Justices Antonin Scalia and Clarence Thomas, dissented, arguing that the Court ignored precedent that state agencies created by the state legislature (“[t]he Board is not a private or ‘nonsovereign’ entity”) are shielded by the state action doctrine.  “By straying from this simple path” and assessing instead whether individual agencies are subject to regulatory capture, the Court spawned confusion, according to the dissenters.  Midcal was inapposite, because it involved a private trade association.  The dissenters feared that the majority’s decision may require states “to change the composition of medical, dental, and other boards, but it is not clear what sort of changes are needed to satisfy the test that the Court now adopts.”  The dissenters concluded “that determining when regulatory capture has occurred is no simple task.  That answer provides a reason for relieving courts from the obligation to make such determinations at all.  It does not explain why it is appropriate for the Court to adopt the rather crude test for capture that constitutes the holding of today’s decision.”

The Court’s holding in North Carolina Dental helpfully limits the scope of the Court’s infamous Parker v. Brown decision (which shielded from federal antitrust attack a California raisin producers’ cartel overseen by a state body), without excessively interfering in sovereign state prerogatives.  State legislatures may still choose to create self-interested professional regulatory bodies – their sovereignty is not compromised.  Now, however, they will have to (1) make it clearer up front that they intend to allow those bodies to displace competition, and (2) subject those bodies to disinterested third party review.  These changes should make it far easier for competition advocates (including competition agencies) to spot and publicize welfare-inimical regulatory schemes, and weaken the incentive and ability of rent-seekers to undermine competition through state regulatory processes.  All told, the burden these new judicially-imposed constraints will impose on the states appears relatively modest, and should be far outweighed by the substantial welfare benefits they are likely to generate.