
Just in time for tomorrow’s FCC vote on repeal of its order classifying Internet Service Providers as common carriers, the St. Louis Post-Dispatch has published my op-ed entitled The FCC Should Abandon Title II and Return to Antitrust.

Here’s the full text:

The Federal Communications Commission (FCC) will soon vote on whether to repeal an Obama-era rule classifying Internet Service Providers (ISPs) as “common carriers.” That rule was put in place to achieve net neutrality, an attractive-sounding goal that many Americans—millennials especially—reflexively support.

In Missouri, voices as diverse as the St. Louis Post-Dispatch, the Joplin Globe, and the Archdiocese of St. Louis have opposed repeal of the Obama-era rule.

Unfortunately, few people who express support for net neutrality understand all it entails. Even fewer recognize the significant dangers of pursuing net neutrality using the means the Obama-era FCC selected. All many know is that they like neutrality generally and that smart-sounding celebrities like John Oliver support the Obama-era rule. They really need to know more.

First, it’s important to understand what a policy of net neutrality entails. In essence, it prevents ISPs from providing faster or better transmission of some Internet content, even where the favored content provider is willing to pay for prioritization.

That sounds benign—laudable, even—until one considers all that such a policy prevents. Under strict net neutrality, an ISP couldn’t prioritize content transmission in which congestion delays ruin the user experience (say, an Internet videoconference between a telemedicine system operated by the University of Missouri hospital and a rural resident of Dent County) over transmissions in which delays are less detrimental (say, downloads from a photo-sharing site).

Strict net neutrality would also preclude a mobile broadband provider from exempting popular content providers from data caps. Indeed, T-Mobile was hauled before the FCC to justify its popular “Binge On” service, which offered cost-conscious subscribers unlimited access to Netflix, ESPN, and HBO.

The fact is, ISPs have an incentive to manage their traffic in whatever way most pleases subscribers. The vast majority of Americans have a choice of ISPs, so managing content in any manner that adversely affects the consumer experience would hurt business. ISPs are also motivated to design subscription packages that consumers most desire. They shouldn’t have to seek government approval of innovative offerings.

For evidence that competition protects consumers from harmful instances of non-neutral network management, consider the record. The commercial Internet was born, thrived, and became the brightest spot in the American economy without formal net neutrality rules. History provides little reason to believe that the parade of horribles net neutrality advocates imagine will ever materialize.

Indeed, in seeking to justify its net neutrality policies, the Obama-era FCC could come up with only four instances of harmful non-neutral network management over the entire history of the commercial Internet. That should come as no surprise. Background antitrust rules, in place long before the Internet was born, forbid the speculative harms net neutrality advocates envision.

Even if net neutrality regulation were desirable as a policy matter, the means by which the FCC secured it was entirely inappropriate. Before it adopted the current approach, which reclassified ISPs as common carriers subject to Title II of the 1934 Communications Act, the FCC was crafting a narrower approach using authority granted by the 1996 Telecommunications Act.

It abruptly changed course after President Obama, reeling from a shellacking in the 2014 midterm elections, sought to shore up his base by posting a video calling for “the strongest possible rules” on net neutrality, including Title II reclassification. Prodded by the President, the supposedly independent commissioners abandoned their consensus that Title II was too extreme and voted along party lines to treat the Internet as a utility.

Title II reclassification has resulted in the sort of “Mother, may I?” regulatory approach that impedes innovation and investment. In the first half of 2015, as the Commission was formulating its new Title II approach, spending by ISPs on capital equipment fell by an average of 8%. That was only the third time in the history of the commercial Internet that infrastructure investment fell from the previous year. The other two times were in 2001, following the dot-com bust, and 2009, after the 2008 financial crash and ensuing recession. For those remote communities in Missouri still looking for broadband to reach their doorsteps, government policies need to incentivize more investment, not restrict it.

To enhance innovation and encourage broadband deployment, the FCC should reverse its damaging Title II order and leave concerns about non-neutral network management to antitrust law. It was doing just fine.

As the Federal Communications Commission (FCC) prepares to revoke its economically harmful “net neutrality” order and replace it with a free market-oriented “Restoring Internet Freedom Order,” the FCC and the Federal Trade Commission (FTC) commendably have announced a joint policy for cooperation on online consumer protection.  According to a December 11 FTC press release:

The Federal Trade Commission and Federal Communications Commission (FCC) announced their intent to enter into a Memorandum of Understanding (MOU) under which the two agencies would coordinate online consumer protection efforts following the adoption of the Restoring Internet Freedom Order.

“The Memorandum of Understanding will be a critical benefit for online consumers because it outlines the robust process by which the FCC and FTC will safeguard the public interest,” said FCC Chairman Ajit Pai. “Instead of saddling the Internet with heavy-handed regulations, we will work together to take targeted action against bad actors. This approach protected a free and open Internet for many years prior to the FCC’s 2015 Title II Order and it will once again following the adoption of the Restoring Internet Freedom Order.”

“The FTC is committed to ensuring that Internet service providers live up to the promises they make to consumers,” said Acting FTC Chairman Maureen K. Ohlhausen. “The MOU we are developing with the FCC, in addition to the decades of FTC law enforcement experience in this area, will help us carry out this important work.”

The draft MOU, which is being released today, outlines a number of ways in which the FCC and FTC will work together to protect consumers, including:

The FCC will review informal complaints concerning the compliance of Internet service providers (ISPs) with the disclosure obligations set forth in the new transparency rule. Those obligations include publicly providing information concerning an ISP’s practices with respect to blocking, throttling, paid prioritization, and congestion management. Should an ISP fail to make the required disclosures—either in whole or in part—the FCC will take enforcement action.

The FTC will investigate and take enforcement action as appropriate against ISPs concerning the accuracy of those disclosures, as well as other deceptive or unfair acts or practices involving their broadband services.

The FCC and the FTC will broadly share legal and technical expertise, including the secure sharing of informal complaints regarding the subject matter of the Restoring Internet Freedom Order. The two agencies also will collaborate on consumer and industry outreach and education.

The FCC’s proposed Restoring Internet Freedom Order, which the agency is expected to vote on at its December 14 meeting, would reverse a 2015 agency decision to reclassify broadband Internet access service as a Title II common carrier service. This previous decision stripped the FTC of its authority to protect consumers and promote competition with respect to Internet service providers because the FTC does not have jurisdiction over common carrier activities.

The FCC’s Restoring Internet Freedom Order would return jurisdiction to the FTC to police the conduct of ISPs, including with respect to their privacy practices. Once adopted, the order will also require broadband Internet access service providers to disclose their network management practices, performance, and commercial terms of service. As the nation’s top consumer protection agency, the FTC will be responsible for holding these providers to the promises they make to consumers.

Particularly noteworthy is the suggestion that the FCC and FTC will work to curb regulatory duplication and competitive empire building – a boon to Internet-related businesses that would be harmed by regulatory excess and uncertainty.  Stay tuned for future developments.

As I explain in my new book, How to Regulate, sound regulation requires thinking like a doctor.  When addressing some “disease” that reduces social welfare, policymakers should catalog the available “remedies” for the problem, consider the implementation difficulties and “side effects” of each, and select the remedy that offers the greatest net benefit.

If we followed that approach in deciding what to do about the way Internet Service Providers (ISPs) manage traffic on their networks, we would conclude that FCC Chairman Ajit Pai is exactly right:  The FCC should reverse its order classifying ISPs as common carriers (Title II classification) and leave matters of non-neutral network management to antitrust, the residual regulator of practices that may injure competition.

Let’s walk through the analysis.

Diagnose the Disease.  The primary concern of net neutrality advocates is that ISPs will block some Internet content or will slow or degrade transmission from content providers who do not pay for a “fast lane.”  Of course, if an ISP’s non-neutral network management impairs the user experience, it will lose business; the vast majority of Americans have access to multiple ISPs, and competition is growing by the day, particularly as mobile broadband expands.

But an ISP might still play favorites, despite the threat of losing some subscribers, if it has a financial stake in certain content providers.  Comcast, for example, could opt to speed up content from Hulu, which streams programming of Comcast’s NBC subsidiary, or might slow down content from Netflix, whose streaming video competes with Comcast’s own cable programming.  Comcast’s losses in the distribution market (from angry consumers switching ISPs) might be less than its gains in the content market (from reducing competition there).

It seems, then, that the “disease” that might warrant a regulatory fix is an anticompetitive vertical restraint of trade: a business practice in one market (distribution) that could restrain trade in another market (content production) and thereby reduce overall output in that market.

Catalog the Available Remedies.  The statutory landscape provides at least three potential remedies for this disease.

The simplest approach would be to leave the matter to antitrust, which applies in the absence of more focused regulation.  In recent decades, courts have revised the standards governing vertical restraints of trade so that antitrust, which used to treat such restraints in a ham-fisted fashion, now does a pretty good job separating pro-consumer restraints from anti-consumer ones.

A second legally available approach would be to craft narrowly tailored rules precluding ISPs from blocking, degrading, or favoring particular Internet content.  The U.S. Court of Appeals for the D.C. Circuit held that Section 706 of the 1996 Telecommunications Act empowered the FCC to adopt targeted net neutrality rules, even if ISPs are not classified as common carriers.  The court insisted that the rules not treat ISPs as common carriers (if they are not officially classified as such), but it provided a road map for tailored net neutrality rules. The FCC pursued this targeted, rules-based approach until President Obama pushed for a third approach.

In November 2014, reeling from a shellacking in the midterm elections and hoping to shore up his base, President Obama posted a video calling on the Commission to assure net neutrality by reclassifying ISPs as common carriers.  Such reclassification would subject ISPs to Title II of the 1934 Communications Act, giving the FCC broad power to assure that their business practices are “just and reasonable.”  Prodded by the President, the nominally independent commissioners abandoned their targeted, rules-based approach and voted to regulate ISPs like utilities.  They then used their enhanced regulatory authority to impose rules forbidding the blocking, throttling, or paid prioritization of Internet content.

Assess the Remedies’ Limitations, Implementation Difficulties, and Side Effects.   The three legally available remedies — antitrust, tailored rules under Section 706, and broad oversight under Title II — offer different pros and cons, as I explained in How to Regulate:

The choice between antitrust and direct regulation generally (under either Section 706 or Title II) involves a tradeoff between flexibility and determinacy. Antitrust is flexible but somewhat indeterminate; it would condemn non-neutral network management practices that are likely to injure consumers, but it would permit such practices if they would lower costs, improve quality, or otherwise enhance consumer welfare. The direct regulatory approaches are rigid but clearer; they declare all instances of non-neutral network management to be illegal per se.

Determinacy and flexibility influence decision and error costs.  Because they are more determinate, ex ante rules should impose lower decision costs than would antitrust. But direct regulation’s inflexibility—automatic condemnation, no questions asked—will generate higher error costs. That’s because non-neutral network management is often good for end users. For example, speeding up the transmission of content for which delivery lags are particularly detrimental to the end-user experience (e.g., an Internet telephone call, streaming video) at the expense of content that is less lag-sensitive (e.g., digital photographs downloaded from a photo-sharing website) can create a net consumer benefit and should probably be allowed. A per se rule against non-neutral network management would therefore err fairly frequently. Antitrust’s flexible approach, informed by a century of economic learning on the output effects of contractual restraints between vertically related firms (like content producers and distributors), would probably generate lower error costs.

Although both antitrust and direct regulation offer advantages vis-à-vis each other, this isn’t simply a wash. The error cost advantage antitrust holds over direct regulation likely swamps direct regulation’s decision cost advantage. Extensive experience with vertical restraints on distribution has shown that they are usually good for consumers. For that reason, antitrust courts in recent decades have discarded their old per se rules against such practices—rules that resemble the FCC’s direct regulatory approach—in favor of structured rules of reason that assess liability based on specific features of the market and restraint at issue. While these rules of reason (standards, really) may be less determinate than the old, error-prone per se rules, they are not indeterminate. By relying on past precedents and the overarching principle that legality turns on consumer welfare effects, business planners and adjudicators ought to be able to determine fairly easily whether a non-neutral network management practice passes muster. Indeed, the fact that the FCC has uncovered only four instances of anticompetitive network management over the commercial Internet’s entire history—a period in which antitrust, but not direct regulation, has governed ISPs—suggests that business planners are capable of determining what behavior is off-limits. Direct regulation’s per se rule against non-neutral network management is thus likely to add error costs that exceed any reduction in decision costs. It is probably not the remedy that would be selected under this book’s recommended approach.

In any event, direct regulation under Title II, the currently prevailing approach, is certainly not the optimal way to address potentially anticompetitive instances of non-neutral network management by ISPs. Whereas any ex ante regulation of network management will confront the familiar knowledge problem, opting for direct regulation under Title II, rather than the more cabined approach under Section 706, adds adverse public choice concerns to the mix.

As explained earlier, reclassifying ISPs to bring them under Title II empowers the FCC to scrutinize the “justice” and “reasonableness” of nearly every aspect of every arrangement between content providers, ISPs, and consumers. Granted, the current commissioners have pledged not to exercise their Title II authority beyond mandating network neutrality, but public choice insights would suggest that this promised forbearance is unlikely to endure. FCC officials, who remain self-interest maximizers even when acting in their official capacities, benefit from expanding their regulatory turf; they gain increased power and prestige, larger budgets to manage, a greater ability to “make or break” businesses, and thus more opportunity to take actions that may enhance their future career opportunities. They will therefore face constant temptation to exercise the Title II authority that they have committed, as of now, to leave fallow. Regulated businesses, knowing that FCC decisions are key to their success, will expend significant resources lobbying for outcomes that benefit them or impair their rivals. If they don’t get what they want because of the commissioners’ voluntary forbearance, they may bring legal challenges asserting that the Commission has failed to assure just and reasonable practices as Title II demands. Many of the decisions at issue will involve the familiar “concentrated benefits/diffused costs” dynamic that tends to result in underrepresentation by those who are adversely affected by a contemplated decision. Taken together, these considerations make it unlikely that the current commissioners’ promised restraint will endure. Reclassification of ISPs so that they are subject to Title II regulation will probably lead to additional constraints on edge providers and ISPs.

It seems, then, that mandating net neutrality under Title II of the 1934 Communications Act is the least desirable of the three statutorily available approaches to addressing anticompetitive network management practices. The Title II approach combines the inflexibility and ensuing error costs of the Section 706 direct regulation approach with the indeterminacy and higher decision costs of an antitrust approach. Indeed, the indeterminacy under Title II is significantly greater than that under antitrust because the “just and reasonable” requirements of the Communications Act, unlike antitrust’s reasonableness requirements (no unreasonable restraint of trade, no unreasonably exclusionary conduct), are not constrained by the consumer welfare principle. Whereas antitrust always protects consumers, not competitors, the FCC may well decide that business practices in the Internet space are unjust or unreasonable solely because they make things harder for the perpetrator’s rivals. Business planners are thus really “at sea” when it comes to assessing the legality of novel practices.

All this implies that Internet businesses regulated by Title II need to court the FCC’s favor, that FCC officials have more ability than ever to manipulate government power to private ends, that organized interest groups are well-poised to secure their preferences when the costs are great but widely dispersed, and that the regulators’ dictated outcomes—immune from market pressures reflecting consumers’ preferences—are less likely to maximize net social welfare. In opting for a Title II solution to what is essentially a market power problem, the powers that be gave short shrift to an antitrust approach, even though there was no natural monopoly justification for direct regulation. They paid little heed to the adverse consequences likely to result from rigid per se rules adopted under a highly discretionary (and politically manipulable) standard. They should have gone back to basics, assessing the disease to be remedied (market power), the full range of available remedies (including antitrust), and the potential side effects of each. In other words, they could’ve used this book.

How to Regulate‘s full discussion of net neutrality and Title II is here:  Net Neutrality Discussion in How to Regulate.

Last week the editorial board of the Washington Post penned an excellent editorial responding to the European Commission’s announcement of its decision in its Google Shopping investigation. Here’s the key language from the editorial:

Whether the demise of any of [the complaining comparison shopping sites] is specifically traceable to Google, however, is not so clear. Also unclear is the aggregate harm from Google’s practices to consumers, as opposed to the unlucky companies. Birkenstock-seekers may well prefer to see a Google-generated list of vendors first, instead of clicking around to other sites…. Those who aren’t happy anyway have other options. Indeed, the rise of comparison shopping on giants such as Amazon and eBay makes concerns that Google might exercise untrammeled power over e-commerce seem, well, a bit dated…. Who knows? In a few years we might be talking about how Facebook leveraged its 2 billion users to disrupt the whole space.

That’s actually a pretty thorough, if succinct, summary of the basic problems with the Commission’s case (based on its PR and Factsheet, at least; it hasn’t released the full decision yet).

I’ll have more to say on the decision in due course, but for now I want to elaborate on two of the points raised by the WaPo editorial board, both in service of its crucial rejoinder to the Commission that “Also unclear is the aggregate harm from Google’s practices to consumers, as opposed to the unlucky companies.”

First, the WaPo editorial board points out that:

Birkenstock-seekers may well prefer to see a Google-generated list of vendors first, instead of clicking around to other sites.

It is undoubtedly true that users “may well prefer to see a Google-generated list of vendors first.” That preference is also crucial to understanding the changes in Google’s search results page that have given rise to the current raft of complaints.

As I noted in a Wall Street Journal op-ed two years ago:

It’s a mistake to consider “general search” and “comparison shopping” or “product search” to be distinct markets.

From the moment it was technologically feasible to do so, Google has been adapting its traditional search results—that familiar but long since vanished page of 10 blue links—to offer more specialized answers to users’ queries. Product search, which is what is at issue in the EU complaint, is the next iteration in this trend.

Internet users today seek information from myriad sources: Informational sites (Wikipedia and the Internet Movie Database); review sites (Yelp and TripAdvisor); retail sites (Amazon and eBay); and social-media sites (Facebook and Twitter). What do these sites have in common? They prioritize certain types of data over others to improve the relevance of the information they provide.

“Prioritization” of Google’s own shopping results, however, is the core problem for the Commission:

Google has systematically given prominent placement to its own comparison shopping service: when a consumer enters a query into the Google search engine in relation to which Google’s comparison shopping service wants to show results, these are displayed at or near the top of the search results. (Emphasis in original).

But this sort of prioritization is the norm for all search, social media, e-commerce and similar platforms. And this shouldn’t be a surprise: The value of these platforms to the user is dependent upon their ability to sort the wheat from the chaff of the now immense amount of information coursing about the Web.

As my colleagues and I noted in a paper responding to a methodologically questionable report by Tim Wu and Yelp leveling analogous “search bias” charges in the context of local search results:

Google is a vertically integrated company that offers general search, but also a host of other products…. With its well-developed algorithm and wide range of products, it is hardly surprising that Google can provide not only direct answers to factual questions, but also a wide range of its own products and services that meet users’ needs. If consumers choose Google not randomly, but precisely because they seek to take advantage of the direct answers and other options that Google can provide, then removing the sort of “bias” alleged by [complainants] would affirmatively hurt, not help, these users. (Emphasis added).

And as Josh Wright noted in an earlier paper responding to yet another set of such “search bias” charges (in that case leveled in a similarly methodologically questionable report by Benjamin Edelman and Benjamin Lockwood):

[I]t is critical to recognize that bias alone is not evidence of competitive harm and it must be evaluated in the appropriate antitrust economic context of competition and consumers, rather than individual competitors and websites. Edelman & Lockwood’s analysis provides a useful starting point for describing how search engines differ in their referrals to their own content. However, it is not useful from an antitrust policy perspective because it erroneously—and contrary to economic theory and evidence—presumes natural and procompetitive product differentiation in search rankings to be inherently harmful. (Emphasis added).

We’ll have to see what kind of analysis the Commission relies upon in its decision to reach its conclusion that prioritization is an antitrust problem, but there is reason to be skeptical that it will turn out to be compelling. The Commission states in its PR that:

The evidence shows that consumers click far more often on results that are more visible, i.e. the results appearing higher up in Google’s search results. Even on a desktop, the ten highest-ranking generic search results on page 1 together generally receive approximately 95% of all clicks on generic search results (with the top result receiving about 35% of all the clicks). The first result on page 2 of Google’s generic search results receives only about 1% of all clicks. This cannot just be explained by the fact that the first result is more relevant, because evidence also shows that moving the first result to the third rank leads to a reduction in the number of clicks by about 50%. The effects on mobile devices are even more pronounced given the much smaller screen size.

This means that by giving prominent placement only to its own comparison shopping service and by demoting competitors, Google has given its own comparison shopping service a significant advantage compared to rivals. (Emphasis added).

Whatever truth there is to the claim that placement matters more than relevance in influencing user behavior, the evidence the Commission cites to demonstrate it doesn’t seem applicable to what’s happening on Google’s search results page now.

Most crucially, the evidence offered by the Commission refers only to how placement affects clicks on “generic search results” and glosses over the fact that the “prominent placement” of Google’s “results” is not only a difference in position but also in the type of result offered.

Google Shopping results (like many of its other “vertical results” and direct answers) are very different from the 10 blue links of old. These “universal search” results are, for one thing, actual answers rather than merely links to other sites. They are also more visually rich and more attractively and clearly displayed.

Ironically, Tim Wu and Yelp use the claim that users click less often on Google’s universal search results to support their contention that increased relevance doesn’t explain Google’s prioritization of its own content. Yet, as we note in our response to their study:

[I]f a consumer is using a search engine in order to find a direct answer to a query rather than a link to another site to answer it, click-through would actually represent a decrease in consumer welfare, not an increase.

In fact, the study fails to incorporate this dynamic even though it is precisely what the authors claim the study is measuring.

Further, as the WaPo editorial intimates, these universal search results (including Google Shopping results) are quite plausibly more valuable to users. As even Tim Wu and Yelp note:

No one truly disagrees that universal search, in concept, can be an important innovation that can serve consumers.

Google sees it exactly this way, of course. Here’s Tim Wu and Yelp again:

According to Google, a principal difference between the earlier cases and its current conduct is that universal search represents a pro-competitive, user-serving innovation. By deploying universal search, Google argues, it has made search better. As Eric Schmidt argues, “if we know the answer it is better for us to answer that question so [the user] doesn’t have to click anywhere, and in that sense we… use data sources that are our own because we can’t engineer it any other way.”

Of course, in this case, one would expect fewer clicks to correlate with higher value to users — precisely the opposite of the claim made by Tim Wu and Yelp, which is the surest sign that their study is faulty.

But the Commission, at least according to the evidence cited in its PR, doesn’t even seem to measure the relative value of the very different presentations of information at all, instead resting on assertions rooted in the irrelevant difference in user propensity to click on generic (10 blue links) search results depending on placement.

Add to this Pinar Akman’s important point that Google Shopping “results” aren’t necessarily search results at all, but paid advertising:

[O]nce one appreciates the fact that Google’s shopping results are simply ads for products and Google treats all ads with the same ad-relevant algorithm and all organic results with the same organic-relevant algorithm, the Commission’s order becomes impossible to comprehend. Is the Commission imposing on Google a duty to treat non-sponsored results in the same way that it treats sponsored results? If so, does this not provide an unfair advantage to comparison shopping sites over, for example, Google’s advertising partners as well as over Amazon, eBay, various retailers, etc…?

Randy Picker also picks up on this point:

But those Google shopping boxes are ads, Picker told me. “I can’t imagine what they’re thinking,” he said. “Google is in the advertising business. That’s how it makes its money. It has no obligation to put other people’s ads on its website.”

The bottom line here is that the WaPo editorial board does a better job characterizing the actual, relevant market dynamics in a single sentence than the Commission seems to have done in its lengthy releases summarizing its decision following seven full years of investigation.

The second point made by the WaPo editorial board to which I want to draw attention is equally important:

Those who aren’t happy anyway have other options. Indeed, the rise of comparison shopping on giants such as Amazon and eBay makes concerns that Google might exercise untrammeled power over e-commerce seem, well, a bit dated…. Who knows? In a few years we might be talking about how Facebook leveraged its 2 billion users to disrupt the whole space.

The Commission dismisses this argument in its Factsheet:

The Commission Decision concerns the effect of Google’s practices on comparison shopping markets. These offer a different service to merchant platforms, such as Amazon and eBay. Comparison shopping services offer a tool for consumers to compare products and prices online and find deals from online retailers of all types. By contrast, they do not offer the possibility for products to be bought on their site, which is precisely the aim of merchant platforms. Google’s own commercial behaviour reflects these differences – merchant platforms are eligible to appear in Google Shopping whereas rival comparison shopping services are not.

But the reality is that “comparison shopping,” just like “general search,” is just one technology among many for serving information and ads to consumers online. Defining the relevant market or limiting the definition of competition in terms of the particular mechanism that Google (or Foundem, or Amazon, or Facebook…) happens to use doesn’t reflect the extent of substitutability between these different mechanisms.

Properly defined, the market in which Google competes online is not search, but something more like online “matchmaking” between advertisers, retailers and consumers. And this market is enormously competitive. The same goes for comparison shopping.

And the fact that Amazon and eBay “offer the possibility for products to be bought on their site” doesn’t take away from the fact that they also “offer a tool for consumers to compare products and prices online and find deals from online retailers of all types.” Not only do these sites contain enormous amounts of valuable (and well-presented) information about products, including product comparisons and consumer reviews, but they also actually offer comparisons among retailers. In fact, fifty percent of the items sold through Amazon’s platform, for example, are sold by third-party retailers — the same sort of retailers that might also show up on a comparison shopping site.

More importantly, though, as the WaPo editorial rightly notes, “[t]hose who aren’t happy anyway have other options.” Google just isn’t the indispensable gateway to the Internet (and definitely not to shopping on the Internet) that the Commission seems to think.

Today over half of product searches in the US start on Amazon. The majority of web page referrals come from Facebook. Yelp’s most engaged users now access it via its app (which has seen more than 3x growth in the past five years). And a staggering 40 percent of mobile browsing on both Android and iOS now takes place inside the Facebook app.

Then there are “closed” platforms like the iTunes store and innumerable other apps that handle copious search traffic (including shopping-related traffic) but also don’t figure in the Commission’s analysis, apparently.

In fact, billions of users reach millions of companies every day through direct browser navigation, social media, apps, email links, review sites, blogs, and countless other means — all without once touching Google.com. So-called “dark social” interactions (email, text messages, and IMs) drive huge amounts of some of the most valuable traffic on the Internet, in fact.

All of this, in turn, has led to a competitive scramble to roll out completely new technologies to meet consumers’ informational (and merchants’ advertising) needs. The already-arriving swarm of VR, chatbots, digital assistants, smart-home devices, and more will offer even more interfaces besides Google through which consumers can reach their favorite online destinations.

The point is this: Google’s competitors complaining that the world is evolving around them don’t need to rely on Google. That they may choose to do so does not saddle Google with an obligation to ensure that they can always do so.

Antitrust laws — in Europe, no less than in the US — don’t require Google or any other firm to make life easier for competitors. That’s especially true when doing so would come at the cost of consumer-welfare-enhancing innovations. The Commission doesn’t seem to have grasped this fundamental point, however.

The WaPo editorial board gets it, though:

The immense size and power of all Internet giants are a legitimate focus for the antitrust authorities on both sides of the Atlantic. Brussels vs. Google, however, seems to be a case of punishment without crime.

I recently published a piece in the Hill welcoming the Canadian Supreme Court’s decision in Google v. Equustek. In this post I expand (at length) upon my assessment of the case.

In its decision, the Court upheld injunctive relief against Google, directing the company to avoid indexing websites offering the infringing goods in question, regardless of the location of the sites (and even though Google itself was not a party in the case nor in any way held liable for the infringement). As a result, the Court’s ruling would affect Google’s conduct outside of Canada as well as within it.

The case raises some fascinating and thorny issues, but, in the end, the Court navigated them admirably.

Some others, however, were not so… welcoming of the decision (see, e.g., here and here).

The primary objection to the ruling seems to be, in essence, that it is the top of a slippery slope: “If Canada can do this, what’s to stop Iran or China from doing it? Free expression as we know it on the Internet will cease to exist.”

This is a valid concern, of course — in the abstract. But for reasons I explain below, we should see this case — and, more importantly, the approach adopted by the Canadian Supreme Court — as reassuring, not foreboding.

Some quick background on the exercise of extraterritorial jurisdiction in international law

The salient facts in, and the fundamental issue raised by, the case were neatly summarized by Hugh Stephens:

[The lower Court] issued an interim injunction requiring Google to de-index or delist (i.e. not return search results for) the website of a firm (Datalink Gateways) that was marketing goods online based on the theft of trade secrets from Equustek, a Vancouver, B.C., based hi-tech firm that makes sophisticated industrial equipment. Google wants to quash a decision by the lower courts on several grounds, primarily that the basis of the injunction is extra-territorial in nature and that if Google were to be subject to Canadian law in this case, this could open a Pandora’s box of rulings from other jurisdictions that would require global delisting of websites thus interfering with freedom of expression online, and in effect “break the Internet”.

The question of jurisdiction with regard to cross-border conduct is clearly complicated and evolving. But, in important ways, it isn’t anything new just because the Internet is involved. As Jack Goldsmith and Tim Wu (yes, Tim Wu) wrote (way back in 2006) in Who Controls the Internet?: Illusions of a Borderless World:

A government’s responsibility for redressing local harms caused by a foreign source does not change because the harms are caused by an Internet communication. Cross-border harms that occur via the Internet are not any different than those outside the Net. Both demand a response from governmental authorities charged with protecting public values.

As I have written elsewhere, “[g]lobal businesses have always had to comply with the rules of the territories in which they do business.”

Traditionally, courts have dealt with the extraterritoriality problem by applying a rule of comity. As my colleague, Geoffrey Manne (Founder and Executive Director of ICLE), reminds me, the principle of comity largely originated in the work of the 17th Century Dutch legal scholar, Ulrich Huber. Huber wrote that comitas gentium (“courtesy of nations”) required the application of foreign law in certain cases:

[Sovereigns will] so act by way of comity that rights acquired within the limits of a government retain their force everywhere so far as they do not cause prejudice to the powers or rights of such government or of their subjects.

And, notably, Huber wrote that:

Although the laws of one nation can have no force directly with another, yet nothing could be more inconvenient to commerce and to international usage than that transactions valid by the law of one place should be rendered of no effect elsewhere on account of a difference in the law.

The basic principle has been recognized and applied in international law for centuries. Of course, the flip side of the principle is that sovereign nations also get to decide for themselves whether to enforce foreign law within their jurisdictions. To summarize Huber (as well as Lord Mansfield, who brought the concept to England, and Justice Story, who brought it to the US):

All three jurists were concerned with deeply polarizing public issues — nationalism, religious factionalism, and slavery. For each, comity empowered courts to decide whether to defer to foreign law out of respect for a foreign sovereign or whether domestic public policy should triumph over mere courtesy. For each, the court was the agent of the sovereign’s own public law.

The Canadian Supreme Court’s well-reasoned and admirably restrained approach in Equustek

Reconciling the potential conflict between the laws of Canada and those of other jurisdictions was, of course, a central subject of consideration for the Canadian Court in Equustek. The Supreme Court, as described below, weighed a variety of factors in determining the appropriateness of the remedy. In analyzing the competing equities, the Supreme Court set out the following framework:

[I]s there a serious issue to be tried; would the person applying for the injunction suffer irreparable harm if the injunction were not granted; and is the balance of convenience in favour of granting the interlocutory injunction or denying it. The fundamental question is whether the granting of an injunction is just and equitable in all of the circumstances of the case. This will necessarily be context-specific. [Here, as throughout this post, bolded text represents my own, added emphasis.]

Applying that standard, the Court held that because ordering an interlocutory injunction against Google was the only practical way to prevent Datalink from flouting the court’s several orders, and because there were no sufficient, countervailing comity or freedom of expression concerns in this case that would counsel against such an order being granted, the interlocutory injunction was appropriate.

I draw particular attention to the following from the Court’s opinion:

Google’s argument that a global injunction violates international comity because it is possible that the order could not have been obtained in a foreign jurisdiction, or that to comply with it would result in Google violating the laws of that jurisdiction is, with respect, theoretical. As Fenlon J. noted, “Google acknowledges that most countries will likely recognize intellectual property rights and view the selling of pirated products as a legal wrong”.

And while it is always important to pay respectful attention to freedom of expression concerns, particularly when dealing with the core values of another country, I do not see freedom of expression issues being engaged in any way that tips the balance of convenience towards Google in this case. As Groberman J.A. concluded:

In the case before us, there is no realistic assertion that the judge’s order will offend the sensibilities of any other nation. It has not been suggested that the order prohibiting the defendants from advertising wares that violate the intellectual property rights of the plaintiffs offends the core values of any nation. The order made against Google is a very limited ancillary order designed to ensure that the plaintiffs’ core rights are respected.

In fact, as Andrew Keane Woods writes at Lawfare:

Under longstanding conflicts of laws principles, a court would need to weigh the conflicting and legitimate governments’ interests at stake. The Canadian court was eager to undertake that comity analysis, but it couldn’t do so because the necessary ingredient was missing: there was no conflict of laws.

In short, the Canadian Supreme Court, while acknowledging the importance of comity and appropriate restraint in matters with extraterritorial effect, carefully weighed the equities in this case and found that they favored the grant of extraterritorial injunctive relief. As the Court explained:

Datalink [the direct infringer] and its representatives have ignored all previous court orders made against them, have left British Columbia, and continue to operate their business from unknown locations outside Canada. Equustek has made efforts to locate Datalink with limited success. Datalink is only able to survive — at the expense of Equustek’s survival — on Google’s search engine which directs potential customers to Datalink’s websites. This makes Google the determinative player in allowing the harm to occur. On balance, since the world‑wide injunction is the only effective way to mitigate the harm to Equustek pending the trial, the only way, in fact, to preserve Equustek itself pending the resolution of the underlying litigation, and since any countervailing harm to Google is minimal to non‑existent, the interlocutory injunction should be upheld.

As I have stressed, key to the Court’s reasoning was its close consideration of possible countervailing concerns and its entirely fact-specific analysis. By the very terms of the decision, the Court made clear that its balancing would not necessarily lead to the same result where sensibilities or core values of other nations would be offended. In this particular case, they were not.

How critics of the decision (and there are many) completely miss the true import of the Court’s reasoning

In other words, the holding in this case was a function of how, given the facts of the case, the ruling would affect the particular core concerns at issue: protection and harmonization of global intellectual property rights on the one hand, and concern for the “sensibilities of other nations,” including their concern for free expression, on the other.

This should be deeply reassuring to those now criticizing the decision. And yet… it’s not.

Whether because they haven’t actually read or properly understood the decision, or because they are merely grandstanding, some commenters are proclaiming that the decision marks the End Of The Internet As We Know It — you know, it’s going to break the Internet. Or something.

Human Rights Watch, an organization I generally admire, issued a statement including the following:

The court presumed no one could object to delisting someone it considered an intellectual property violator. But other countries may soon follow this example, in ways that more obviously force Google to become the world’s censor. If every country tries to enforce its own idea of what is proper to put on the Internet globally, we will soon have a race to the bottom where human rights will be the loser.

The British Columbia Civil Liberties Association added:

Here it was technical details of a product, but you could easily imagine future cases where we might be talking about copyright infringement, or other things where people in private lawsuits are wanting things to be taken down off the internet that are more closely connected to freedom of expression.

From the other side of the traditional (if insufficiently nuanced) “political spectrum,” AEI’s Ariel Rabkin asserted that

[O]nce we concede that Canadian courts can regulate search engine results in Turkey, it is hard to explain why a Turkish court shouldn’t have the reciprocal right. And this is no hypothetical — a Turkish court has indeed ordered Twitter to remove a user (AEI scholar Michael Rubin) within the United States for his criticism of Erdogan. Once the jurisdictional question is decided, it is no use raising free speech as an issue. Other countries do not have our free speech norms, nor Canada’s. Once Canada concedes that foreign courts have the right to regulate Canadian search results, they are on the internet censorship train, and there is no egress before the end of the line.

In this instance, in particular, it is worth noting not only the complete lack of acknowledgment of the Court’s articulated constraints on taking action with extraterritorial effect, but also the fact that Turkey (among others) has hardly been waiting for approval from Canada before taking action.   

And then there’s EFF (of course). EFF, fairly predictably, suggests first — with unrestrained hyperbole — that the Supreme Court held that:

A country has the right to prevent the world’s Internet users from accessing information.

Dramatic hyperbole aside, that’s also a stilted way to characterize the content at issue in the case. But it is important to EFF’s misleading narrative to begin with the assertion that offering infringing products for sale is “information” to which access by the public is crucial. But, of course, the distribution of infringing products is hardly “expression,” as most of us would understand that term. To claim otherwise is to denigrate the truly important forms of expression that EFF claims to want to protect.

And, it must be noted, even if there were expressive elements at issue, infringing “expression” is always subject to restriction under the copyright laws of virtually every country in the world (and free speech laws, where they exist).

Nevertheless, EFF writes that the decision:

[W]ould cut off access to information for U.S. users [and] would set a dangerous precedent for online speech. In essence, it would expand the power of any court in the world to edit the entire Internet, whether or not the targeted material or site is lawful in another country. That, we warned, is likely to result in a race to the bottom, as well-resourced individuals engage in international forum-shopping to impose one country’s restrictive laws regarding free expression on the rest of the world.

Beyond the flaws of the ruling itself, the court’s decision will likely embolden other countries to try to enforce their own speech-restricting laws on the Internet, to the detriment of all users. As others have pointed out, it’s not difficult to see repressive regimes such as China or Iran use the ruling to order Google to de-index sites they object to, creating a worldwide heckler’s veto.

As always with EFF missives, caveat lector applies: None of this is fair or accurate. EFF (like the other critics quoted above) is looking only at the result — the specific contours of the global order related to the Internet — and not to the reasoning of the decision itself.

Quite tellingly, EFF urges its readers to ignore the case in front of them in favor of a theoretical one. That is unfortunate. Were EFF, et al. to pay closer attention, they would be celebrating this decision as a thoughtful, restrained, respectful, and useful standard to be employed as a foundational decision in the development of global Internet governance.

The Canadian decision is (as I have noted, but perhaps still not with enough repetition…) predicated on achieving equity upon close examination of the facts, and giving due deference to the sensibilities and core values of other nations in making decisions with extraterritorial effect.

Properly understood, the ruling is a shield against intrusions that undermine freedom of expression, and not an attack on expression.

EFF subverts the reasoning of the decision and thus camouflages its true import, all for the sake of furthering its apparently limitless crusade against all forms of intellectual property. The ruling can be read as an attack on expression only if one ascribes to the distribution of infringing products the status of protected expression — so that’s what EFF does. But distribution of infringing products is not protected expression.

Extraterritoriality on the Internet is complicated — but that undermines, rather than justifies, critics’ opposition to the Court’s analysis

There will undoubtedly be other cases that present more difficult challenges than this one in defining the jurisdictional boundaries of courts’ abilities to address Internet-based conduct with multi-territorial effects. But the guideposts employed by the Supreme Court of Canada will be useful in informing such decisions.

Of course, some states don’t (or won’t, when it suits them) adhere to principles of comity. But that was true long before the Equustek decision. And, frankly, the notion that this decision gives nations like China or Iran political cover for global censorship is ridiculous. Nations that wish to censor the Internet will do so regardless. If anything, reference to this decision (which, let me spell it out again, highlights the importance of avoiding relief that would interfere with core values or sensibilities of other nations) would undermine their efforts.

Rather, the decision will be far more helpful in combating censorship and advancing global freedom of expression. Indeed, as noted by Hugh Stephens in a recent blog post:

While the EFF, echoed by its Canadian proxy OpenMedia, went into hyperventilation mode with the headline, “Top Canadian Court permits Worldwide Internet Censorship”, respected organizations like the Canadian Civil Liberties Association (CCLA) welcomed the decision as having achieved the dual objectives of recognizing the importance of freedom of expression and limiting any order that might violate that fundamental right. As the CCLA put it,

While today’s decision upholds the worldwide order against Google, it nevertheless reflects many of the freedom of expression concerns CCLA had voiced in our interventions in this case.

As I noted in my piece in the Hill, this decision doesn’t answer all of the difficult questions related to identifying proper jurisdiction and remedies with respect to conduct that has global reach; indeed, that process will surely be perpetually unfolding. But, as reflected in the comments of the Canadian Civil Liberties Association, it is a deliberate and well-considered step toward a fair and balanced way of addressing Internet harms.

With apologies for quoting myself, I noted the following in an earlier piece:

I’m not unsympathetic to Google’s concerns. As a player with a global footprint, Google is legitimately concerned that it could be forced to comply with the sometimes-oppressive and often contradictory laws of countries around the world. But that doesn’t make it — or any other Internet company — unique. Global businesses have always had to comply with the rules of the territories in which they do business… There will be (and have been) cases in which taking action to comply with the laws of one country would place a company in violation of the laws of another. But principles of comity exist to address the problem of competing demands from sovereign governments.

And as Andrew Keane Woods noted:

Global takedown orders with no limiting principle are indeed scary. But Canada’s order has a limiting principle. As long as there is room for Google to say to Canada (or France), “Your order will put us in direct and significant violation of U.S. law,” the order is not a limitless assertion of extraterritorial jurisdiction. In the instance that a service provider identifies a conflict of laws, the state should listen.

That is precisely what the Canadian Supreme Court’s decision contemplates.

No one wants an Internet based on the lowest common denominator of acceptable speech. Yet some appear to want an Internet based on the lowest common denominator for the protection of original expression. These advocates thus endorse theories of jurisdiction that would deny societies the ability to enforce their own laws, just because sometimes those laws protect intellectual property.

And yet that reflects little more than an arbitrary prioritization of those critics’ personal preferences. In the real world (including the real online world), protection of property is an important value, deserving reciprocity and courtesy (comity) as much as does speech. Indeed, the G20 Digital Economy Ministerial Declaration adopted in April of this year recognizes the importance to the digital economy of promoting security and trust, including through the provision of adequate and effective intellectual property protection. Thus the Declaration expresses the recognition of the G20 that:

[A]pplicable frameworks for privacy and personal data protection, as well as intellectual property rights, have to be respected as they are essential to strengthening confidence and trust in the digital economy.

Moving forward in an interconnected digital universe will require societies to make a series of difficult choices balancing both competing values and competing claims from different jurisdictions. Just as it does in the offline world, navigating this path will require flexibility and skepticism (if not rejection) of absolutism — including with respect to the application of fundamental values. Even things like freedom of expression, which naturally require a balancing of competing interests, will need to be reexamined. We should endeavor to find that fine line between allowing individual countries to enforce their own national judgments and tolerating countries that have made different choices. This will not be easy, as is well illustrated by something that Alice Marwick wrote earlier this year:

But a commitment to freedom of speech above all else presumes an idealistic version of the internet that no longer exists. And as long as we consider any content moderation to be censorship, minority voices will continue to be drowned out by their aggressive majority counterparts.

* * *

We need to move beyond this simplistic binary of free speech/censorship online. That is just as true for libertarian-leaning technologists as it is neo-Nazi provocateurs…. Aggressive online speech, whether practiced in the profanity and pornography-laced environment of 4Chan or the loftier venues of newspaper comments sections, positions sexism, racism, and anti-Semitism (and so forth) as issues of freedom of expression rather than structural oppression.

Perhaps we might want to look at countries like Canada and the United Kingdom, which take a different approach to free speech than does the United States. These countries recognize that unlimited free speech can lead to aggression and other tactics which end up silencing the speech of minorities — in other words, the tyranny of the majority. Creating online communities where all groups can speak may mean scaling back on some of the idealism of the early internet in favor of pragmatism. But recognizing this complexity is an absolutely necessary first step.

While I (and the Canadian Supreme Court, for that matter) share EFF’s unease over the scope of extraterritorial judgments, I fundamentally disagree with EFF that the Equustek decision “largely sidesteps the question of whether such a global order would violate foreign law or intrude on Internet users’ free speech rights.”

In fact, it is EFF’s position that comes much closer to indifference to the laws and values of other countries; EFF would essentially always prioritize the particular speech values adopted in the US, regardless of whether the countries affected in a dispute had adopted them. That position is therefore inconsistent with the true nature of comity.

Absolutism and exceptionalism will not be a sound foundation for achieving global consensus and the effective operation of law. As stated by the Canadian Supreme Court in Equustek, courts should enforce the law — whatever the law is — to the extent that such enforcement does not substantially undermine the core sensitivities or values of nations where the order will have effect.

EFF ignores the process in which the Court engaged precisely because EFF — not another country, but EFF — doesn’t find the enforcement of intellectual property rights to be compelling. But that unprincipled approach would naturally cut the other way had the court sought to protect a value that EFF does care about. Such a position arbitrarily elevates EFF’s idiosyncratic preferences. That is simply not a viable basis for constructing good global Internet governance.

If the Internet is both everywhere and nowhere, our responses must reflect that reality, and be based on the technology-neutral application of laws, not the abdication of responsibility premised upon an outdated theory of tech exceptionalism under which cyberspace is free from the application of the laws of sovereign nations. That is not the path to either freedom or prosperity.

To realize the economic and social potential of the Internet, we must be guided by both a determination to meaningfully address harms, and a sober reservation about interfering in the affairs of other states. The Supreme Court of Canada’s decision in Google v. Equustek has planted a flag in this space. It serves no one to pretend that the Court decided that a country has the unfettered right to censor the Internet. That’s not what it held — and we should be grateful for that. To suggest otherwise may indeed be self-fulfilling.

On March 14, the U.S. Chamber of Commerce released a report “by an independent group of experts it commissioned to consider U.S. responses to the inappropriate use of antitrust enforcement actions worldwide to achieve industrial policy outcomes.”  (See here and here.)  I served as rapporteur for the report, which represents the views of the experts (leading academics, practitioners, and former senior officials who specialize in antitrust and international trade), not the position of the Chamber.  In particular, the report calls for the formation of a new White House-led working group.  The working group would oversee development of a strategy for dealing with the misuse of competition policy by other nations that impedes international trade and competition and harms U.S. companies.  The denial of fundamental due process rights and the inappropriate extraterritorial application of competition remedies by foreign governments also would be within the purview of the working group.

The Chamber will hold a program on April 10 with members of the experts group to discuss the report and its conclusions.  The letter transmitting the report to the President and congressional leadership states as follows:

Today, nearly every nation in the world has some form of antitrust or competition law regulating business activities occurring within or substantially affecting its territory. The United States has long championed the promotion of global competition as the best way to ensure that businesses have a strong incentive to operate efficiently and innovate, and this approach has helped to fuel a strong and vibrant U.S. economy. But competition laws are not always applied in a transparent, accurate and impartial manner, and they can have significant adverse impacts far outside a country’s own borders. Certain of our major trading partners appear to have used their laws to actually harm competition by U.S. companies, protecting their own markets from foreign competition, promoting national champions, forcing technology transfers and, in some cases, denying U.S. companies fundamental due process.

Up to now, the United States has had some, but limited, success in addressing this problem. For that reason, in August of 2016, the U.S. Chamber of Commerce convened an independent, bi-partisan group of experts in trade and competition law and economics to take a fresh look and develop recommendations for a potentially more effective and better-integrated international trade and competition law strategy.

As explained by the U.S. Chamber in announcing the formation of this group,

The United States has been, and should continue to be, a global leader in the development and implementation of sound competition law and policy. . . . When competition law is applied in a discriminatory manner or relies upon non-competition factors to engineer outcomes in support of national champions or industrial policy objectives, the impact of such instances arguably goes beyond the role of U.S. antitrust agencies. The Chamber believes it is critical for the United States to develop a coordinated trade and competition law approach to international economic policy.

The International Competition Policy Expert Group (“ICPEG”) was encouraged to develop “practical and actionable steps forward that will serve to advance sound trade and competition policy.”

The Report accompanying this letter is the result of ICPEG’s work. Although the U.S. Chamber suggested the project and recruited participants, it made no effort to steer the content of ICPEG’s recommendations.

The Report is addressed specifically to the interaction of competition law and international trade law and proposes greater coordination and cooperation between them in the formulation and implementation of U.S. international trade policy. It focuses on the use of international trade and other appropriate tools to address problems in the application of foreign competition policies through 12 concrete recommendations.

Recommendations 1 through 6 urge the Trump Administration to prioritize the coordination of international competition policy through a new, cabinet-level White House working group (the “Working Group”) to be chaired by an Assistant to the President. Among other things, the Working Group would:

  • set a government-wide, high-level strategy for articulating and promoting policies to address the misuse of competition law by other nations that impedes international trade and competition and harms U.S. companies;
  • undertake a 90-day review of existing and potential new trade policy tools available to address the challenge, culminating in a recommended “action list” for the President and Congress; and
  • address not only broader substantive concerns regarding the abuse of competition policy for protectionist and discriminatory purposes, but also the denial of fundamental due process rights and the extraterritorial imposition of remedies that are not necessary to protect a country’s legitimate competition law objectives.

Recommendations 7 through 12 focus on steps that should be taken with international organizations and bilateral initiatives. For example, the United States should consider:

  • the feasibility and value of expanding the World Trade Organization’s regular assessment of each member government by the Trade Policy Review Body to include national competition policies, and encouraging the Organisation for Economic Cooperation and Development (OECD) to undertake specific peer reviews of national procedural or substantive policies, including of non-OECD countries;
  • encouraging the OECD and/or other multilateral bodies to adopt a code enumerating transparent, accurate, and impartial procedures; and
  • promoting the application of agreements under which nations would cooperate with and take into account legitimate interests of other nations affected by a competition investigation.

The competition and trade law issues addressed in the Report are complex and the consequences of taking any particular action vis-a-vis another country must be carefully considered in light of a number of factors beyond the scope of this Report. ICPEG does not take a view on the actions of any particular country nor propose specific steps with respect to any actual dispute or matter. In addition, reasonable minds can differ on ICPEG’s assessment and recommendations. But we hope that this Report will prompt appropriate prioritization of the issues it addresses and serve as the basis for the further development of a successful policy and action plan and improved coordination and cooperation between U.S. competition and trade agencies.

Last week the International Center for Law & Economics and I filed an amicus brief in the DC Circuit in support of en banc review of the court’s decision to uphold the FCC’s 2015 Open Internet Order.

In our previous amicus brief before the panel that initially reviewed the OIO, we argued, among other things, that

In order to justify its Order, the Commission makes questionable use of important facts. For instance, the Order’s ban on paid prioritization ignores and mischaracterizes relevant record evidence and relies on irrelevant evidence. The Order also omits any substantial consideration of costs. The apparent necessity of the Commission’s aggressive treatment of the Order’s factual basis demonstrates the lengths to which the Commission must go in its attempt to fit the Order within its statutory authority.

Our brief supporting en banc review builds on these points to argue that

By reflexively affording substantial deference to the FCC in affirming the Open Internet Order (“OIO”), the panel majority’s opinion is in tension with recent Supreme Court precedent….

The panel majority need not have, and arguably should not have, afforded the FCC the level of deference that it did. The Supreme Court’s decisions in State Farm, Fox, and Encino all require a more thorough vetting of the reasons underlying an agency change in policy than is otherwise required under the familiar Chevron framework. Similarly, Brown and Williamson, Utility Air Regulatory Group, and King all indicate circumstances in which an agency construction of an otherwise ambiguous statute is not due deference, including when the agency interpretation is a departure from longstanding agency understandings of a statute or when the agency is not acting in an expert capacity (e.g., its decision is based on changing policy preferences, not changing factual or technical considerations).

In effect, the panel majority based its decision whether to afford the FCC deference upon deference to the agency’s poorly supported assertions that it was due deference. We argue that this is wholly inappropriate in light of recent Supreme Court cases.

Moreover,

The panel majority failed to appreciate the importance of granting Chevron deference to the FCC. That importance is most clearly seen at an aggregate level. In a large-scale study of every Court of Appeals decision between 2003 and 2013, Professors Kent Barnett and Christopher Walker found that a court’s decision to defer to agency action is uniquely determinative in cases where, as here, an agency is changing established policy.

Kent Barnett & Christopher J. Walker, Chevron In the Circuit Courts 61, Figure 14 (2016), available at ssrn.com/abstract=2808848.

[Figure 14 from Barnett & Walker, as reproduced in our brief.]

As that study demonstrates,

agency decisions to change established policy tend to present serious, systematic defects — and [thus that] it is incumbent upon this court to review the panel majority’s decision to reflexively grant Chevron deference. Further, the data underscore the importance of the Supreme Court’s command in Fox and Encino that agencies show good reason for a change in policy; its recognition in Brown & Williamson and UARG that departures from existing policy may fall outside of the Chevron regime; and its command in King that policies not made by agencies acting in their capacity as technical experts may fall outside of the Chevron regime. In such cases, the Court essentially holds that reflexive application of Chevron deference may not be appropriate because these circumstances may tend toward agency action that is arbitrary, capricious, in excess of statutory authority, or otherwise not in accordance with law.

As we conclude:

The present case is a clear example where greater scrutiny of an agency’s decision-making process is both warranted and necessary. The panel majority all too readily afforded the FCC great deference, despite the clear and unaddressed evidence of serious flaws in the agency’s decision-making process. As we argued in our brief before the panel, and as Judge Williams recognized in his partial dissent, the OIO was based on factually inaccurate, contradicted, and irrelevant record evidence.

Read our full — and very short — amicus brief here.

Introduction

In my role as a “non-governmental advisor” (NGA), I was privileged to attend and participate actively in the 15th Annual ICN Conference, held in Singapore from April 26-29.  (I have blogged previously on ICN annual conferences and policy initiatives; see here, here, and here.)  As a virtual network of national competition law agencies (“national competition authorities,” or NCAs, such as the U.S. Federal Trade Commission (FTC) and the U.S. Justice Department’s Antitrust Division (DOJ)) and expert NGAs from around the world, the ICN supports convergence toward consensus-based “best practices” in competition law enforcement and policy:

The ICN is unique as it is the only international body devoted exclusively to competition law enforcement and its members represent national and multinational competition authorities. Members produce work products through their involvement in flexible project-oriented and results-based working groups. Working group members work together largely by Internet, telephone, teleseminars and webinars.

Annual conferences and workshops provide opportunities to discuss working group projects and their implications for enforcement. The ICN does not exercise any rule-making function. Where the ICN reaches consensus on recommendations, or “best practices”, arising from the projects, individual competition authorities decide whether and how to implement the recommendations, through unilateral, bilateral or multilateral arrangements, as appropriate.

Since its founding in 2001, the ICN has evolved from a small club of fifteen agencies and a few NGAs (mainly from North America and Europe) to a robust organization comprising over 130 NCAs and numerous NGAs from around the world (although admittedly the majority of NGAs continue to come from developed countries).  Due to its lack of a centralized bureaucracy and the absence of top-down control by national governments, the ICN has been able to tackle concrete issues in a pragmatic and incremental fashion, drawing upon the insights of leading scholars as well as former and current NCA heads.

Summary Assessment of ICN Achievements

As the ICN turns fifteen, a bit of stock-taking is in order.  Here are some of my observations, based upon my decade-long involvement with the ICN:

  1. The ICN has significantly promoted the reduction of transactions costs involved in merger filings, through its recommended practices for merger notification and review procedures. This tangible benefit has gained importance with the proliferation of merger filing requirements around the world.  Although many regimes have yet to fully adopt the recommended practices, their influence has grown steadily.  Furthermore, the ICN’s Merger Working Group has leveraged this success to promote greater cooperation among NCAs in merger evaluation and a greater acceptance of economics-based merger analysis principles – factors which may also be expected to reduce transactions and error costs.
  2. The ICN’s Cartel Working Group has done an impressive job in promoting cooperation among NCAs in the detection, investigation, and prosecution of international cartels. Hard-core cartel agreements involve the one type of business arrangement that unequivocally diminishes economic welfare, and therefore merits condemnation.  ICN work products related to cartel enforcement, including detection, punishment, investigative techniques, and information sharing (supplemented by hands-on workshops), continue to raise the quality of anti-cartel cooperation and new agency involvement in cartel enforcement.  Future challenges for this Working Group include the strengthening of corporate compliance programs worldwide to deter cartel activity, and dealing with private enforcement as a supplement to public anti-cartel efforts (European Union nations and other jurisdictions are beginning to introduce private competition law enforcement).
  3. The ICN has developed taped training modules and related documentary resources, centered on case-specific hypotheticals and economic analysis, that may over time raise the quality of substantive antitrust analyses carried out by NCAs, especially new and inexperienced ones. Although the influence of these materials may only be felt gradually, and cannot be quantified, discussion at the Singapore Conference suggests that they are being consulted more frequently.  These materials ideally will help reduce error costs in enforcement by dissuading enforcers from adopting economically irrational approaches to case analysis (although the materials cannot, of course, guarantee against the possible interjection of non-economic policy factors and extraneous political considerations in the assessment of particular matters).
  4. The ICN’s Working Group on Agency Effectiveness holds real promise for enhancement of the quality and efficiency of NCA decision-making. In particular, the Working Group’s recently developed ICN Guidance on Investigative Process provides useful guidance on investigative transparency and due process that, if followed, would help reduce widespread concerns about lack of procedural fairness in competition investigations, particularly those carried out by inexperienced NCAs.  Nevertheless, it must be recognized that calls for increased attention to due process in such investigations, by DOJ, the FTC, and corporate representatives, have met with limited success at best.  This may partly reflect the different view of due process found in civil law systems, which rely on inquisitorial proceedings guided by government officials rather than the common law adversary process.  It may also reflect concerns about having the nature of agency decision-making exposed to too much “sunshine,” particularly in young NCAs that have limited resources and inexperienced staff.  Improvements in procedural fairness thus may be expected to proceed slowly.  Even recognizing this, however, the Agency Effectiveness Group is engaging proactively in such issues as strategic planning and prioritization that could lead to improved substantive NCA outcomes from an economic welfare point of view, quite apart from due process issues.
  5. The ICN’s Advocacy Working Group (AWG) is expanding its efforts to enable NCAs to better assess the anticompetitive potential of government laws and regulations. The AWG has over the years produced excellent and succinct sets of Recommended Practices on Competition Assessment, centered on identifying features of proposed regulations and laws that prevent competitive forces from operating effectively, such as barriers to entry by new businesses.  The Working Group has also done valuable work on inculcating public support for procompetitive government policies (featuring release of a “competition culture” report in 2015).  Recently, the AWG has also worked closely with other multinational organizations involved in promoting international economic cooperation and development, including in particular the World Bank and the Organization for Economic Cooperation and Development (OECD).  The AWG has adapted analysis from the OECD’s Competition Assessment Toolkit (methodologies for identifying anticompetitive government actions) in its recommended practices and has involved OECD Competition Committee experts in its work.  Moreover, for several years now the World Bank has held one-day programs immediately preceding the ICN Annual Conference, which have highlighted how regulations and legislation that undermine competition greatly reduce economic growth potential in developing countries.  Starting in 2015, the World Bank and ICN AWG cooperated in launching annual “competition advocacy contests” in which NCAs compete in presenting case studies on how their successful public advocacy initiatives have enhanced competition and raised economic welfare within their jurisdictions.  The Working Group has also launched a web-based “Benefits Platform,” which “seeks to provide ICN members with knowledge, strategies and arguments for explaining the benefits of competition to support their competition advocacy efforts with government and non-government stakeholders, as well as in the evaluation of competition interventions.”  All told, among all of the ICN’s initiatives, the AWG’s projects hold out the greatest potential for enhancing economic welfare, since government interference in competitive processes tends to be far more serious, distortive, and long-lasting than mere private restraints.  (In a related vein, Shanker Singham and I have authored a Heritage Foundation essay on how procompetitive regulatory reform can advance both economic freedom and prosperity.)  The greatest limitation on the utility of AWG guidance is, of course, political opposition to regulatory reform from private rent seekers and their government allies.
  6. The ICN’s Unilateral Conduct Working Group (UCWG) has done solid and generally sound work on state-created monopolies and predatory pricing, and on the assessment of dominance/substantial market power, and is turning now to a broader policy initiative. In particular, the UCWG merits praise for recommending that competition analysis not treat state-owned enterprises more favorably than their private competitors (although the feasibility of true “neutrality” analysis may be questioned given the array of implicit benefits that state-supported enterprises may enjoy).  The UCWG is now developing a potentially ambitious “workbook” on the general analysis of unilateral conduct, which holds out both promise and risk.  Unilateral conduct analysis is particularly prone to high error costs, because procompetitive and anticompetitive conduct often look the same.  Given that fact, and the importance of aggressive unilateral initiatives to a vibrant competitive process, there is good reason to err on the side of caution in evaluating the competitive ramifications of unilateral conduct.  DOJ’s 2008 Report on Single-Firm Conduct Under Section 2 of the Sherman Act presents an excellent template for unilateral conduct analysis, which could profitably be adopted by the UCWG.  Regrettably, however, there are those who believe that unilateral conduct analysis should rely heavily on sophisticated theoretical models of potential competitive harm.  Such models typically ignore the lessons of decision theory and the problem of error costs (see here), and threaten to condemn particular instances (if not broad categories) of single-firm business initiatives that are welfare-enhancing.  What’s worse, such condemnations tend to chill efficient unilateral actions by other firms, which fear that the efficiencies underlying their actions would be misunderstood or ignored by competition law enforcers.  The members of the UCWG should keep these considerations in mind in drafting the workbook, and rely on decision theory and error cost analysis in deriving their recommendations.

Conclusion

In sum, the ICN has done a commendable job in promoting sensible procedural and substantive principles in competition law analysis around the world.  Although its achievements inevitably have been and will continue to be constrained by the differences among national legal regimes (particularly the civil law and common law divide) and political limitations that individual NCAs face, there is every reason to believe that it has enhanced overall economic welfare through its efforts.  Accordingly, continued support for the ICN by the United States antitrust agencies and American antitrust scholars is clearly warranted.  It is to be hoped that active participants in ICN initiatives will continue to rely on sound economic analysis as their lodestar – and, in particular, that UCWG members will employ appropriate caution and a decision-theoretic template in developing future recommendations.

It appears that the White House’s zeal for progressive-era legal theory has … progressed (or regressed?) further. Late last week President Obama signed an Executive Order that nominally directs executive agencies (and “strongly encourages” independent agencies) to adopt “pro-competitive” policies. It’s called Steps to Increase Competition and Better Inform Consumers and Workers to Support Continued Growth of the American Economy, and was produced alongside an issue brief from the Council of Economic Advisers titled Benefits of Competition and Indicators of Market Power.

TL;DR version: the Order and its accompanying brief appear aimed not so much at protecting consumers or competition as at providing justification for favored regulatory adventures.

In truth, it’s not exactly clear what problem the President is trying to solve. There is language in both the Order and the brief that could be interpreted in a positive light and, conversely, language that reads more as a shot across the bow of “unruly” corporate citizens who have not gotten in line with the President’s agenda. Most of the Order and the corresponding CEA brief read as a rote recital of basic antitrust principles: price fixing bad, collusion bad, competition good. That said, two items in the Order particularly stood out.

The (Maybe) Good

Section 2 of the Order states that

Executive departments … with authorities that could be used to enhance competition (agencies) shall … use those authorities to promote competition, arm consumers and workers with the information they need to make informed choices, and eliminate regulations that restrict competition without corresponding benefits to the American public. (emphasis added)

Obviously this is music to the ears of anyone who has thought that agencies should be required to do a basic economic analysis before undertaking brave voyages of regulatory adventure. And this is what the Supreme Court was getting at in Michigan v. EPA when it examined the meaning of the phrase “appropriate” in connection with environmental regulations:

One would not say that it is even rational, never mind “appropriate,” to impose billions of dollars in economic costs in return for a few dollars in health or environmental benefits.

Thus, if this Order follows the direction of Michigan v. EPA, and it becomes the standard for agencies to conduct cost-benefit analyses before issuing regulation (and to review old regulations through such an analysis), then wonderful! Moreover, this mandate to agencies to reduce regulations that restrict competition could lead to an unexpected reformation of a variety of regulations – even outside of the agencies themselves. For instance, the FTC deserves praise for its ongoing efforts both to correct anticompetitive state licensing laws and to resist state-protected incumbents, such as taxi-cab companies.

Still, I have trouble believing that the President — and this goes for any president, really, regardless of party — would truly intend for agencies under his control to actually cede regulatory ground when a little thing like economic reality points in a different direction than official policy. After all, there was ample information available that the Title II requirements on broadband providers would be both costly and result in reduced capital expenditures, and the White House nonetheless encouraged the FCC to go ahead with reclassification.

And this isn’t the first time that the President has directed agencies to perform retrospective review of regulation (see the Identifying and Reducing Regulatory Burdens Order of 2012). To date, however, there appears to be little evidence that the burdens of the regulatory state have lessened. Last year set a record for the page count of the Federal Register (80k+ pages), and the data suggest that the cost of the regulatory state is only increasing. Thus, despite the pleasant noises the Order makes with regard to imposing economic discipline on agencies – and despite the good example Canada has set for us in this regard – I am not optimistic about the actual result.

And the (maybe) good builds an important bridge to the (probably) bad of the Order. It is well and good to direct agencies to engage in economic calculation when they write and administer regulations, but such calculation must be in earnest, and must be guided by the hard-earned lessons of antitrust jurisprudence in the US. As Geoffrey Manne and Josh Wright have noted:

Without a serious methodological commitment to economic science, the incorporation of economics into antitrust is merely a façade, allowing regulators and judges to select whichever economic model fits their earlier beliefs or policy preferences rather than the model that best fits the real‐world data. Still, economic theory remains essential to antitrust law. Economic analysis constrains and harnesses antitrust law so that it protects consumers rather than competitors.

Unfortunately, the brief does not indicate that its authors are interested in more than a façade of economic rigor. For instance, it relies on the outmoded 50-firm revenue concentration numbers gathered by the Census Bureau to support the proposition that certain industries are highly concentrated and, therefore, anticompetitive. But it’s been fairly well understood since the 1970s that concentration says nothing directly about monopoly power and its exercise. In fact, concentration can often be an indicator of superior efficiency that results in better outcomes for consumers (depending on the industry).
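To see why such concentration numbers are blunt instruments, consider a minimal sketch (all market shares invented for illustration): two hypothetical markets can post an identical top-four concentration ratio while having radically different structures, which is part of why concentration alone says little about monopoly power.

```python
# Hypothetical illustration: two markets with the same 4-firm concentration
# ratio (CR4) but very different structures, which the HHI distinguishes.
# All market shares below are invented for illustration.

def cr(shares, n):
    """Concentration ratio: combined share (in %) of the n largest firms."""
    return sum(sorted(shares, reverse=True)[:n])

def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared percentage shares."""
    return sum(s ** 2 for s in shares)

market_a = [20, 20, 20, 20, 20]       # five symmetric mid-sized firms
market_b = [65, 5, 5, 5, 5, 5, 5, 5]  # one dominant firm plus a fringe

for name, shares in (("A", market_a), ("B", market_b)):
    print(f"Market {name}: CR4 = {cr(shares, 4)}%, HHI = {hhi(shares)}")
# Market A: CR4 = 80%, HHI = 2000
# Market B: CR4 = 80%, HHI = 4400
```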

The (Probably) Bad

Apart from general concerns (such as having a host of federal agencies with no antitrust expertise now engaging in competition turf wars), there is one specific area that could have a dramatically bad result for long-term policy, and that moreover reflects either ignorance of, or willful blindness toward, antitrust jurisprudence. Specifically, the Order directs agencies to

identify specific actions that they can take in their areas of responsibility to build upon efforts to detect abuses such as price fixing, anticompetitive behavior in labor and other input markets, exclusionary conduct, and blocking access to critical resources that are needed for competitive entry. (emphasis added).

It then goes on to say that

agencies shall submit … an initial list of … any specific practices, such as blocking access to critical resources, that potentially restrict meaningful consumer or worker choice or unduly stifle new market entrants (emphasis added)

The generally uncontroversial language regarding price fixing and exclusionary conduct is a bromide – after all, as the Order notes, we already have the FTC and DOJ very actively policing this sort of conduct. What’s novel here, however, is that the highlighted language above seems to amount to a mandate to executive agencies (and a strong suggestion to independent agencies) that they begin to seek out “essential facilities” within their regulated industries.

But “critical resources … needed for competitive entry” could mean nearly anything, depending on how you define competition and relevant markets. And asking non-antitrust agencies to integrate one of the more esoteric (and controversial) parts of antitrust law into their mission is going to be a recipe for disaster.

In fact, this may be one of the reasons why the Supreme Court declined to recognize the essential facilities doctrine as a distinct rule in Trinko, where it instead characterized the exclusionary conduct in Aspen Skiing as ‘at or near the outer boundary’ of Sherman Act § 2 liability.

In short, the essential facilities doctrine is widely criticized by pretty much everyone. In their respected treatise, Antitrust Law, Herbert Hovenkamp and Phillip Areeda have said that “the essential facility doctrine is both harmful and unnecessary and should be abandoned”; Michael Boudin has noted that the doctrine is full of “embarrassing weaknesses”; and Gregory Werden has opined that “Courts should reject the doctrine.” One important reason for the broad criticism is that

At bottom, a plaintiff … is saying that the defendant has a valuable facility that it would be difficult to reproduce … But … the fact that the defendant has a highly valued facility is a reason to reject sharing, not to require it, since forced sharing “may lessen the incentive for the monopolist, the rival, or both to invest in those economically beneficial facilities.” (quoting Trinko)

Further, it’s really hard to say when one business is so critical to a particular market that its own internal functions need to be exposed for competitors’ advantage. For instance, is Big Data – which the CEA brief specifically notes as a potential “critical resource” – an essential facility when one company serves so many consumers that it has effectively developed an entire market that it dominates? (In case you are wondering, it’s actually not.) When exactly does a firm so outcompete its rivals that access to its business infrastructure can be seen by regulators as “essential” to competition? And is this just a set-up for punishing success – which hardly promotes competition, innovation or consumer welfare?

And, let’s be honest here, when the CEA is considering Big Data as an essential facility they are at least partially focused on Google and its various search properties. Google is frequently the target for “essentialist” critics who argue, among other things, that Google’s prioritization of its own properties in its own search results violates antitrust rules. The story goes that Google search is so valuable that when Google publishes its own shopping results ahead of its various competitors, it is engaging in anticompetitive conduct. But this is a terribly myopic view of what the choices are for search services because, as Geoffrey Manne has so ably noted before, “competitors denied access to the top few search results at Google’s site are still able to advertise their existence and attract users through a wide range of other advertising outlets[.]”

Moreover, as more and more users migrate to specialized apps on their mobile devices for a variety of content, Google’s desktop search becomes just one choice among many for finding information. All of this leaves to one side, of course, the fact that for some categories, Google has incredibly stiff competition.

Thus it is that

to the extent that inclusion in Google search results is about “Stiglerian” search-cost reduction for websites (and it can hardly be anything else), the range of alternate facilities for this function is nearly limitless.

The troubling thing here is that, given the breezy analysis of the Order and the CEA brief, I don’t think the White House is really considering the long-term legal and economic implications of its command; the Order appears to be much more about political support for favored agency actions already under way.

Indeed, despite the length of the CEA brief and the variety of antitrust principles recited in the Order itself, an accompanying release points to what is really going on (at least in part). The White House, along with the FCC, seems to think that the embedded streams in a cable or satellite broadcast should be considered a form of essential facility that is an indispensable component of video consumers’ choice (which is laughable given the magnitude of choice in video consumption options that consumers enjoy today).

And, to the extent that courts might apply the (controversial) essential facilities doctrine, an “indispensable requirement … is the unavailability of access to the ‘essential facilities’[.]” This is clearly not the case with much of what the CEA brief points to as examples of ostensibly laudable pro-competitive regulation.

The doctrine wouldn’t apply, for instance, to the FCC’s Open Internet Order since edge providers have access to customers over networks, even where network providers want to zero-rate, employ usage-based billing or otherwise negotiate connection fees and prioritization. And it also doesn’t apply to the set-top box kerfuffle; while third parties aren’t able to access the video streams that make up a cable broadcast, the market for consuming those streams is a single part of the entire video ecosystem. What really matters there is access to viewers, and the ability to provide services to consumers and compete for their business.

Yet, according to the White House, “the set-top box is the mascot” for the administration’s competition Order, because, apparently, cable boxes represent “what happens when you don’t have the choice to go elsewhere.” (“Elsewhere” to the White House, I assume, cannot include Roku, Apple TV, Hulu, Netflix, and a myriad of other video options that consumers can currently choose among.)

The set-top box is, according to the White House, a prime example of the problem that

[a]cross our economy, too many consumers are dealing with inferior or overpriced products, too many workers aren’t getting the wage increases they deserve, too many entrepreneurs and small businesses are getting squeezed out unfairly by their bigger competitors, and overall we are not seeing the level of innovative growth we would like to see.

This is, of course, nonsense. Consumers enjoy an incredible array of low-cost, high-quality goods (including video options) – far more than at any point in history.  After all:

From cable to Netflix to Roku boxes to Apple TV to Amazon FireStick, we have more ways to find and watch TV than ever — and we can do so in our living rooms, on our phones and tablets, and on seat-back screens at 30,000 feet. Oddly enough, FCC Chairman Tom Wheeler … agrees: “American consumers enjoy unprecedented choice in how they view entertainment, news and sports programming. You can pretty much watch what you want, where you want, when you want.”

Thus, I suspect that the White House has its eye on a broader regulatory agenda.

For instance, the Department of Labor recently announced that it would be extending its reach in the financial services industry by changing the standard for when financial advice might give rise to a fiduciary relationship under ERISA. It seems obvious that the SEC or FINRA could have taken up the slack for any financial services regulatory issues – it’s certainly within their respective wheelhouses. But that’s not the direction the administration took, possibly because SEC and FINRA are independent agencies. Thus, the DOL – an agency with substantially less financial and consumer protection experience than either the SEC or FINRA – has expansive new authority.

And that’s where more of the language in the Order comes into focus. It directs agencies to “ensur[e] that consumers and workers have access to the information needed to make informed choices[.]” The text of the DOL rule develops for itself a basis in competition law as well:

The current proposal’s defined boundaries between fiduciary advice, education, and sales activity directed at large plans, may bring greater clarity to the IRA and plan services markets. Innovation in new advice business models, including technology-driven models, may be accelerated, and nudged away from conflicts and toward transparency, thereby promoting healthy competition in the fiduciary advice market.

Thus, it’s hard to see what the White House is doing in the Order, other than laying the groundwork for expansive authority of non-independent executive agencies under the thin guise of promoting competition. Perhaps the President believes that couching this expansion in free market terms (i.e., that it’s “pro-competition”) will somehow help the initiatives go through with minimal friction. But there is nothing in the Order or the CEA brief to provide any confidence that competition will, in fact, be promoted. And in the end I have trouble seeing how this sort of regulatory adventurism does not run afoul of separation of powers issues, as well as assorted other legal challenges.

Finally, conjuring up a regulatory version of the essential facilities doctrine as a support for this expansion is simply a terrible idea — one that smacks much more of industrial policy than of sound regulatory reform or consumer protection.

The International Center for Law & Economics (ICLE) and TechFreedom filed two joint comments with the FCC today, explaining why the FCC has no sound legal basis for micromanaging the Internet and why “net neutrality” regulation would actually prove counter-productive for consumers.

The Policy Comments are available here, and the Legal Comments are here. See our previous post, Net Neutrality Regulation Is Bad for Consumers and Probably Illegal, for a distillation of many of the key points made in the comments.

New regulation is unnecessary. “An open Internet and the idea that companies can make special deals for faster access are not mutually exclusive,” said Geoffrey Manne, Executive Director of ICLE. “If the Internet really is ‘open,’ shouldn’t all companies be free to experiment with new technologies, business models and partnerships?”

“The media frenzy around this issue assumes that no one, apart from broadband companies, could possibly question the need for more regulation,” said Berin Szoka, President of TechFreedom. “In fact, increased regulation of the Internet will incite endless litigation, which will slow both investment and innovation, thus harming consumers and edge providers.”

Title II would be a disaster. The FCC has proposed re-interpreting the Communications Act to classify broadband ISPs under Title II as common carriers. But reinterpretation might unintentionally ensnare edge providers, weighing them down with onerous regulations. “So-called reclassification risks catching other Internet services in the crossfire,” explained Szoka. “The FCC can’t easily forbear from Title II’s most onerous rules because the agency has set a high bar for justifying forbearance. Rationalizing a changed approach would be legally and politically difficult. The FCC would have to simultaneously find the broadband market competitive enough to forbear, yet fragile enough to require net neutrality rules. It would take years to sort out this mess — essentially hitting the pause button on better broadband.”

Section 706 is not a viable option. In 2010, the FCC claimed Section 706 as an independent grant of authority to regulate any form of “communications” not directly barred by the Act, provided only that the Commission assert that regulation would somehow promote broadband. “This is an absurd interpretation,” said Szoka. “This could allow the FCC to essentially invent a new Communications Act as it goes, regulating not just broadband, but edge companies like Google and Facebook, too, and not just neutrality but copyright, cybersecurity and more. The courts will eventually strike down this theory.”

A better approach. “The best policy would be to maintain the ‘Hands off the Net’ approach that has otherwise prevailed for 20 years,” said Manne. “That means a general presumption that innovative business models and other forms of ‘prioritization’ are legal. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears.” “If the FCC thinks it can justify regulating the Internet, it should ask Congress to grant such authority through legislation,” added Szoka. “A new communications act is long overdue anyway. The FCC could also convene a multistakeholder process to produce a code enforceable by the Federal Trade Commission,” he continued, noting that the White House has endorsed such processes for setting Internet policy in general.

Manne concluded: “The FCC should focus on doing what Section 706 actually commands: clearing barriers to broadband deployment. Unleashing more investment and competition, not writing more regulation, is the best way to keep the Internet open, innovative and free.”

For some of our other work on net neutrality, see:

“Understanding Net(flix) Neutrality,” an op-ed by Geoffrey Manne in the Detroit News on Netflix’s strategy to confuse interconnection costs with neutrality issues.

“The Feds Lost on Net Neutrality, But Won Control of the Internet,” an op-ed by Berin Szoka and Geoffrey Manne in Wired.com.

“That startup investors’ letter on net neutrality is a revealing look at what the debate is really about,” a post by Geoffrey Manne in Truth on the Market.

“Bipartisan Consensus: Rewrite of ’96 Telecom Act is Long Overdue,” a post on TF’s blog highlighting the key points from TechFreedom and ICLE’s joint comments on updating the Communications Act.

The Net Neutrality Comments are available here:

ICLE/TF Net Neutrality Policy Comments

TF/ICLE Net Neutrality Legal Comments

With Berin Szoka.

TechFreedom and the International Center for Law & Economics will shortly file two joint comments with the FCC, explaining why the FCC has no sound legal basis for micromanaging the Internet—now called “net neutrality regulation”—and why such regulation would be counter-productive as a policy matter. The following summarizes some of the key points from both sets of comments.

No one’s against an open Internet. The notion that anyone can put up a virtual shingle—and that the good ideas will rise to the top—is a bedrock principle with broad support; it has made the Internet essential to modern life. Key to Internet openness is the freedom to innovate. An open Internet and the idea that companies can make special deals for faster access are not mutually exclusive. If the Internet really is “open,” shouldn’t all companies be free to experiment with new technologies, business models and partnerships? Shouldn’t the FCC allow companies to experiment in building the unknown—and unknowable—Internet of the future?

The best approach would be to maintain the “Hands off the Net” approach that has otherwise prevailed for 20 years. That means a general presumption that innovative business models and other forms of “prioritization” are legal. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears. And they should start with existing legal tools—like antitrust and consumer protection laws—before imposing prior restraints on innovation.

But net neutrality regulation hurts more than it helps. Counterintuitively, a blanket rule that ISPs treat data equally could actually harm consumers. Consider the innovative business models ISPs are introducing. T-Mobile’s unRadio lets users listen to all the on-demand music and radio they want without taking a hit against their monthly data plan. Yet so-called consumer advocates insist that’s a bad thing because it favors some content providers over others. In fact, “prioritizing” one service when there is congestion frees up data for subscribers to consume even more content—from whatever source. You know regulation may be out of control when a company is demonized for offering its users a freebie.
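To make the data-cap arithmetic concrete, here is a minimal sketch of the logic of zero-rating (the cap and usage figures are invented for illustration and do not describe any actual T-Mobile plan):

```python
# Hypothetical illustration of zero-rating: exempting one service from a
# subscriber's data cap leaves more of the cap for all other content.
# The cap and usage numbers below are invented for illustration.

CAP_GB = 5.0           # monthly data allowance
MUSIC_GB = 2.0         # streaming-music usage (the zero-rated candidate)
OTHER_DEMAND_GB = 4.0  # everything else the subscriber would like to consume

def other_content_available(zero_rated: bool) -> float:
    """GB left under the cap for non-music content."""
    counted_music = 0.0 if zero_rated else MUSIC_GB
    return min(OTHER_DEMAND_GB, CAP_GB - counted_music)

print(f"without zero-rating: {other_content_available(False):.1f} GB for other content")
print(f"with zero-rating:    {other_content_available(True):.1f} GB for other content")
# without zero-rating: 3.0 GB for other content
# with zero-rating:    4.0 GB for other content
```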

Treating each bit of data neutrally ignores the reality of how the Internet is designed, and how consumers use it.  Net neutrality proponents insist that all Internet content must be available to consumers neutrally, whether those consumers (or content providers) want it or not. They also argue against usage-based pricing. Together, these restrictions force all users to bear the costs of access for other users’ requests, regardless of who actually consumes the content, as the FCC itself has recognized:

[P]rohibiting tiered or usage-based pricing and requiring all subscribers to pay the same amount for broadband service, regardless of the performance or usage of the service, would force lighter end users of the network to subsidize heavier end users. It would also foreclose practices that may appropriately align incentives to encourage efficient use of networks.
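The cross-subsidy the FCC describes is simple arithmetic. A minimal sketch (all usage figures and the per-GB cost are invented for illustration) shows how a uniform flat price transfers costs from heavy users to light ones:

```python
# Hypothetical illustration of the FCC's point: under a uniform flat price,
# light users pay more than their usage-based share and heavy users pay less.
# All usage figures and the per-GB cost below are invented.

usage_gb = {"light": 10, "typical": 75, "heavy": 400}  # monthly GB per subscriber
COST_PER_GB = 0.20  # assumed usage-sensitive network cost, in $/GB

total_cost = sum(usage_gb.values()) * COST_PER_GB
flat_price = total_cost / len(usage_gb)  # every subscriber pays the same

for user, gb in usage_gb.items():
    usage_based = gb * COST_PER_GB
    transfer = flat_price - usage_based  # positive => this user subsidizes others
    print(f"{user:>7}: usage-based ${usage_based:6.2f}, "
          f"flat ${flat_price:6.2f}, net transfer ${transfer:+7.2f}")
#   light: usage-based $  2.00, flat $ 32.33, net transfer $ +30.33
# typical: usage-based $ 15.00, flat $ 32.33, net transfer $ +17.33
#   heavy: usage-based $ 80.00, flat $ 32.33, net transfer $ -47.67
```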

The rules that net neutrality advocates want would hurt startups as well as consumers. Imagine a new entrant, clamoring for market share. Without the budget for a major advertising blitz, the archetypical “next Netflix” might never get the exposure it needs to thrive. But for a relatively small fee, the startup could sign up to participate in a sponsored data program, with its content featured and its customers’ data usage exempted from their data plans. This common business strategy could mean the difference between success and failure for a startup. Yet it would be prohibited by net neutrality rules banning paid prioritization.

The FCC lacks sound legal authority. The FCC is essentially proposing to do what can only properly be done by Congress: invent a new legal regime for broadband. Each of the options the FCC proposes to justify this—Section 706 of the Telecommunications Act and common carrier classification—is deeply problematic.

First, Section 706 isn’t sustainable. Until 2010, the FCC understood Section 706 as a directive to use its other grants of authority to promote broadband deployment. But in its zeal to regulate net neutrality, the FCC reversed itself in 2010, claiming Section 706 as an independent grant of authority. This would allow the FCC to regulate any form of “communications” in any way not directly barred by the Act — not just broadband but “edge” companies like Google and Facebook. This might mean going beyond neutrality to regulate copyright, cybersecurity and more. The FCC need only assert that regulation would somehow promote broadband.

If Section 706 is a grant of authority, it’s almost certainly a power to deregulate. But even if its power is as broad as the FCC claims, the FCC still hasn’t made the case that, on balance, its proposed regulations would actually do what it asserts: promote broadband. The FCC has stubbornly refused to conduct serious economic analysis on the net effects of its neutrality rules.

And Title II would be a disaster. The FCC has asked whether Title II of the Act, which governs “common carriers” like the old monopoly telephone system, is a workable option. It isn’t.

In the first place, regulations that impose design limitations meant for single-function networks simply aren’t appropriate for the constantly evolving Internet. Moreover, if the FCC re-interprets the Communications Act to classify broadband ISPs as common carriers, it risks catching other Internet services in the cross-fire, inadvertently making them common carriers, too. Surely net neutrality proponents can appreciate the harmful effects of treating Skype as a common carrier.

Forbearance can’t clean up the Title II mess. In theory the FCC could “forbear” from Title II’s most onerous rules, promising not to apply them when it determines there’s enough competition in a market to make the rules unnecessary. But the agency has set a high bar for justifying forbearance.

Most recently, in 2012, the Commission refused to grant Qwest forbearance even in the highly competitive telephony market, disregarding competition from wireless providers, and concluding that a cable-telco “duopoly” is inadequate to protect consumers. It’s unclear how the FCC could justify reaching the opposite conclusion about the broadband market—simultaneously finding it competitive enough to forbear, yet fragile enough to require net neutrality rules. Such contradictions would be difficult to explain, even if the FCC generally gets discretion on changing its approach.

But there is another path forward. If the FCC can really make the case for regulation, it should go to Congress, armed with the kind of independent economic and technical expert studies Commissioner Pai has urged, and ask for new authority. A new Communications Act is long overdue anyway. In the meantime, the FCC could convene the kind of multistakeholder process generally endorsed by the White House to produce a code enforceable by the Federal Trade Commission. A consensus is possible — just not inside the FCC, where the policy questions can’t be separated from the intractable legal questions.

Meanwhile, the FCC should focus on doing what Section 706 actually demands: clearing barriers to broadband deployment and competition. The 2010 National Broadband Plan laid out an ambitious pro-deployment agenda. It’s just too bad the FCC was so obsessed with net neutrality that it didn’t focus on the plan. Unleashing more investment and competition, not writing more regulation, is the best way to keep the Internet open, innovative and free.

[Cross-posted at TechFreedom.]

Last week a group of startup investors wrote a letter to protest what they assume FCC Chairman Tom Wheeler’s proposed, revised Open Internet NPRM will say.

Bear in mind that an NPRM is a proposal, not a final rule, and its issuance starts a public comment period. Bear in mind, as well, that the proposal isn’t public yet, presumably none of the signatories to this letter has seen it, and the devil is usually in the details. That said, the letter has been getting a lot of press.

I found the letter seriously wanting, and seriously disappointing. But it’s a perfect example of what’s so wrong with this interminable debate on net neutrality.

Below I reproduce the letter in full, in quotes, with my comments interspersed. The key take-away: Neutrality (or non-discrimination) isn’t what’s at stake here. What’s at stake is zero-cost access by content providers to broadband networks. One can surely understand why content providers and those who fund them want their costs of doing business to be lower. But the rhetoric of net neutrality is mismatched with this goal. It’s no wonder they don’t just come out and say it – it’s quite a remarkable claim.

Open Internet Investors Letter

The Honorable Tom Wheeler, Chairman
Federal Communications Commission
445 12th Street, SW
Washington D.C. 20554

May 8, 2014

Dear Chairman Wheeler:

We write to express our support for a free and open Internet.

We invest in entrepreneurs, investing our own funds and those of our investors (who are individuals, pension funds, endowments, and financial institutions).  We often invest at the earliest stages, when companies include just a handful of founders with largely unproven ideas. But, without lawyers, large teams or major revenues, these small startups have had the opportunity to experiment, adapt, and grow, thanks to equal access to the global market.

“Equal” access has nothing to do with it. No startup is inherently benefited by being “equal” to others. Maybe this is just careless drafting. But frankly, as I’ll discuss, there are good reasons to think, contra the pro-net neutrality narrative, that startups will be helped by inequality, just as payola, contrary to the accepted (and totally wrong) narrative, helps new artists. That these investors advocate “equality” despite the harm it may impose on startups says more than they would like about what they really want (more on this later).

Presumably what “equal” really means here is “zero cost”: “As long as my startup pays nothing for access to ISPs’ subscribers, it’s fine that we’re all on equal footing.” Wheeler has stated his intent that his proposal would require any prioritization to be available to any who want it, on equivalent, commercially reasonable terms. That’s “equal,” too, so what’s to complain about? But it isn’t really inequality that’s gotten everyone so upset.

Of course, access is never really “zero cost”; startups wouldn’t need investors if their costs were zero. In that sense, why is equality of ISP access any more important than other forms of potential equality? Why not mandate price controls on rent? Why not mandate equal rent? A cost is a cost. What’s really going on here is that, like Netflix, these investors want to lower their costs and raise their returns as much as possible, and they want the government to do it for them.

As a result, some of the startups we have invested in have managed to become among the most admired, successful, and influential companies in the world.

No startup became successful as a result of “equality” or even zero-cost access to broadband. No doubt some of their business models were predicated on that assumption. But it didn’t cause their success.

We have made our investment decisions based on the certainty of a level playing field and of assurances against discrimination and access fees from Internet access providers.

And they would make investment decisions based on the possibility of an un-level playing field if that were the status quo. More importantly, the businesses vying for investment dollars might be different ones if they built their business models in a different legal/economic environment. So what? This says nothing about the amount of investment, the types of businesses, the quality of businesses that would arise under a different set of rules. It says only that past specific investments might not have been made.

Unless the contention is that businesses would be systematically worse off under a different rule, this is irrelevant. I have seen that claim made, and it’s implicit here, of course, but I’ve seen no evidence to actually support it. Businesses thrive in unequal, cost-laden environments all the time. It costs about $4 million to run a 30-second ad during the Super Bowl. Budweiser and PepsiCo paid multiple millions this year to do so; many of their competitors didn’t. With inequality like that, it’s a wonder Sierra Nevada and Dr Pepper haven’t gone bankrupt.

Indeed, our investment decisions in Internet companies are dependent upon the certainty of an equal-opportunity marketplace.

Again, no they’re not. Equal opportunity is a euphemism for zero cost, or else this is simply absurd on its face. Are these investors so lacking in creativity and ability that they can invest only when there is certainty of equal opportunity? Don’t investors thrive – aren’t they most needed – in environments where arbitrage is possible, where a creative entrepreneur can come up with a risky, novel way to take advantage of differential conditions better than his competitors? Moreover, the implicit equating of “equal-opportunity marketplace” with net neutrality rules is far-fetched. Is that really all that matters?

This is a good time to make a point that is so often missed: The loudest voices for net neutrality are the biggest companies – Google, Netflix, Amazon, etc. That fact should give these investors and everyone else serious pause. Their claim rests on the idea that “equality” is needed, so big companies can’t use an Internet “fast lane” to squash them. Google is decidedly a big company. So why do the big boys want this so much?

The battle is often pitched as one of ISPs vs. (small) content providers. But content providers have far less to worry about and face far less competition from broadband providers than from big, incumbent competitors. It is often claimed that “Netflix was able to pay Comcast’s toll, but a small startup won’t have that luxury.” But Comcast won’t even notice or care about a small startup; its traffic demands will be inconsequential. Netflix can afford to pay for Internet access for precisely the same reason it came to Comcast’s attention: It’s hugely successful, and thus creates a huge amount of traffic.

Based on news reports and your own statements, we are worried that your proposed rules will not provide the necessary certainty that we need to make investment decisions and that these rules will stifle innovation in the Internet sector.

Now, there’s little doubt that legal certainty aids investment decisions. But “certainty” is not in danger here. The rules have to change because the court said so – with pretty clear certainty. And a new rule is not inherently any more or less likely to offer certainty than the previous Open Internet Order, which itself was subject to intense litigation (obviously) and would have been subject to interpretation and inconsistent enforcement (and would have allowed all kinds of paid prioritization, too!). Certainty would be good, but Wheeler’s proposed rule won’t likely do anything about the amount of certainty one way or the other.

If established companies are able to pay for better access speeds or lower latency, the Internet will no longer be a level playing field. Start-ups with applications that are advantaged by speed (such as games, video, or payment systems) will be unlikely to overcome that deficit no matter how innovative their service.

Again, it’s notable that some of the strongest advocates for net neutrality are established companies. Another letter sent out last week included signatures from a bunch of startups, but also Google, Microsoft, Facebook and Yahoo!, among others.

In truth it’s hard to see why startup investors would think this helps them. Non-neutrality offers the prospect that a startup might be able to buy priority access to overcome the inherent disadvantage of newness, and to better compete with an established company. Neutrality means that that competitive advantage is impossible, and the baseline relative advantages and disadvantages remain – which helps incumbents, not startups. With a neutral Internet, the advantages of an incumbent competitor can’t be dissipated by a startup buying a favorable leg up in speed, and the Netflixes of the world will be more likely to continue to dominate.

Of course the claim is that incumbents will use their huge resources to gain even more advantage with prioritized access. Implicit in this must be the assumption that the advantage a startup could gain by buying priority offers less return than the cost imposed on it by its inherent disadvantages in reputation, brand awareness, customer base, and the like. But that’s not plausible for all or even most startups. And investors exist precisely because they are able to provide funds where there is a likelihood of a good return – so if paying for priority would help overcome inherent disadvantages, there would be money for it.

Also implicit is the claim that the benefits to incumbents (over and above their natural advantages) from paying for priority, in terms of hamstringing new entrants, will outweigh the cost. This, too, is generally unlikely to be true. Incumbents already have advantages. Sure, sometimes they might want to pay for more, but precisely in the cases where it would be worth it for them to do so, the new entrant would also be most benefited by buying priority itself – ensuring, again, that investment funds will be available.
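To make the comparison implicit in the letter concrete, here’s a deliberately stylized sketch (the notation is mine, not the letter’s): let p be the price an ISP charges for priority, g_s the gain a startup would realize from buying it, and g_i the gain an incumbent would realize, including whatever value comes from hamstringing the entrant. The letter’s fear is warranted only when

g_i > p > g_s,

that is, only when priority is worth more to the firm that already enjoys reputation, brand awareness, and an installed customer base than to the firm for which priority would offset precisely those disadvantages. And whenever g_s > p, buying priority earns a positive return – and funding positive-return bets is exactly what investors are for.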

Of course if both incumbents and startups decide paying for priority is better, we’re back to a world of “equality,” so what’s to complain about, based on this letter? This puts into stark relief that what these investors really want is government-mandated, subsidized broadband access, not “equality.”

Now, it’s conceivable that that is the optimal state of affairs, but if it is, it isn’t for the reasons given here, nor has anyone actually demonstrated that it is the case.

Entrepreneurs will need to raise money to buy fast lane services before they have proven that consumers want their product. Investors will extract more equity from entrepreneurs to compensate for the risk.

Internet applications will not be able to afford to create a relationship with millions of consumers by making their service freely available and then build a business over time as they better understand the value consumers find in their service (which is what Facebook, Twitter, Tumblr, Pinterest, Reddit, Dropbox and virtually every other consumer Internet service did to achieve scale).

In other words: “Subsidize us. We’re worth it.” Maybe. But this is probably more revealing than intended. The Internet cost someone something to build. (Actually, it cost broadband providers more than a trillion dollars.) This just says “we shouldn’t have to pay them for it now.” Fine, but who should, then? And how do you know that forcing someone else to subsidize these startups will actually lead to better results? Mightn’t we get less broadband investment, such that there is little Internet available for these companies to take advantage of in the first place? If broadband consumers instead of content consumers foot the bill, is that clearly preferable, either from a social-welfare perspective or even from the perspective of these investors’ self-interest? After all, they ultimately rely on consumer spending to earn their returns.

Moreover, why is this “build for free, then learn how to monetize over time” business model necessarily better than any other? These startup investors know better than anyone that enshrining existing business models just because they exist is the antithesis of innovation and progress. But that’s exactly what they’re saying – “the successful companies of the past did it this way, so we should get a government guarantee to preserve our ability to do it, too!”

This is the most depressing implication of this letter. These investors and others like them have been responsible for financing enormously valuable innovations. If even they can’t see the hypocrisy of these claims for net neutrality – and, worse, choose to propagate them further – then we really have come to a sad place. When innovators argue passionately for stagnation, we’re in trouble.

Instead, creators will have to ask permission of an investor or corporate hierarchy before they can launch. Ideas will be vetted by committees and quirky passion projects will not get a chance. An individual in dorm room or a design studio will not be able to experiment out loud on the Internet. The result will be greater conformity, fewer surprises, and less innovation.

This is just a little too much protest. Creators already have to ask “permission” – or are these investors just opening up their bank accounts to whoever wants their money? The ones that are able to do it on a shoestring, with money saved up from babysitting gigs, may find higher costs, and the need to do more babysitting. But again, there is nothing special about the Internet in this. Let’s mandate zero-cost office space and office supplies and developer services and design services and so on for all – then we’ll have way more “permission-less” startups. If it’s just a handout they want, they should say so, instead of pretending there is a moral or economic-welfare basis for their claims.

Further, investors like us will be wary of investing in anything that access providers might consider part of their future product plans for fear they will use the same technical infrastructure to advantage their own services or use network management as an excuse to disadvantage competitive offerings.

This is crazy. For the same reasons I mentioned above, the big access provider (and big incumbent competitor, for that matter) already has huge advantages. If these investors aren’t already wary of investing in anything that Google or Comcast or Apple or… might plan to compete with, they must be terrible at their jobs.

What’s more, Wheeler’s much-reviled proposal (what we know of it, that is), to say nothing of antitrust law, clearly contemplates exactly this sort of foreclosure and addresses it. “Pure” net neutrality adds little, if anything, to the limits that antitrust law already provides and that Wheeler’s rule would provide.

Policing this will be almost impossible (even using a standard of “commercial reasonableness”) and access providers do not need to successfully disadvantage their competition; they just need to create a credible threat so that investors like us will be less inclined to back those companies.

You think policing the world of non-neutrality is hard – try policing neutrality. It’s not as easy as proponents make it out to be. It’s simply never been the case that all bits at all times have been treated “neutrally” on the Internet. Any version of an Open Internet Order (just like the last one, for example) will have to recognize this.

Larry Downes compiled a list of the exceptions included in the last Open Internet Order when he testified before the House Judiciary Committee on the rules in 2011. There are 16 categories of exemption, covering a wide range of fundamental components of broadband connectivity, from CDNs to free Wi-Fi at Starbucks. His testimony is a tour de force, and should be required reading for everyone involved in this debate.

But think about how the manifest advantages of these non-neutral aspects of broadband networks would be squared with “real” neutrality. On their face, if these investors are to be taken at their word, these arguments would preclude all of the Open Internet Order’s exemptions, too. And if any sort of inequality is going to be deemed ok, how accurately would regulators distinguish between “illegitimate” inequality and the acceptable kind that lets coffee shops subsidize broadband? How does the simplistic logic of net equality distinguish between, say, Netflix’s colocated servers and a startup like Uber being integrated into Google Maps? The simple answer is that it doesn’t, and the claims and arguments of this letter are woefully inadequate to the task.

We need simple, strong, enforceable rules against discrimination and access fees, not merely against blocking.

No, we don’t. Or, at least, no one has made that case. These investors want a handout; that is the only case this letter makes.

We encourage the Commission to consider all available jurisdictional tools at its disposal in ensuring a free and open Internet that rewards, not disadvantages, investment and entrepreneurship.

… But not investment in broadband, and not entrepreneurship that breaks with the business models of the past. In reality, this letter is simple rent-seeking: “We want to invest in what we know, in what’s been done before, and we don’t want you to do anything to make that any more costly for us. If that entails impairing broadband investment or imposing costs on others, so be it – we’ll still make our outsized returns, and they can write their own letter complaining about ‘inequality.’”

A final point I have to make. Although the investors don’t come right out and say it, many others have, and it’s implicit in the investors’ letter: “Content providers shouldn’t have to pay for broadband. Users already pay for the service, so making content providers pay would just let ISPs double dip.” The claim is deeply problematic.

For starters, it’s another form of the status quo mentality: “Users have always paid and content hasn’t, so we object to any deviation from that.” But it needn’t be that way. And of course models in which different parties pay for the same or similar services frequently coexist. Some periodicals are paid for by readers and carry little or no advertising; others charge a subscription and carry paid ads; still others are free, funded entirely by ads. All of these models work. None is “better” than the others. There is no reason the same can’t be true for broadband and content.

Net neutrality claims that the only proper price to charge on the content side of the market is zero. (Congratulations: You’re in the same club as that cutting-edge, innovative technology, the check, which is cleared at par by government fiat – a subsidy that no doubt explains why checks have managed to last this long.) As an economic matter, that’s possible; it could be that zero is the right price. But it most certainly needn’t be, and the issues surrounding Netflix’s traffic and the ability of ISPs and Netflix to handle it cost-effectively are evidence that zero may well not be the right price.
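For the skeptics, here’s a minimal sketch of that last point in code. Everything in it is invented for illustration (the linear participation functions, the particular parameter values, the fixed user-side price, the function names); nothing is estimated from real broadband data. It simply shows that when each side of a platform values the other side’s participation, the platform’s profit-maximizing price to content providers is generally not zero:

# A toy two-sided-market pricing model. Everything here is invented
# for illustration; nothing is estimated from real broadband data.
def participation(price, base, sensitivity, cross, other_side):
    # A side's participation falls in its own price and rises with the
    # other side's participation (the cross-side network effect).
    return max(0.0, base - sensitivity * price + cross * other_side)

def platform_profit(p_users, p_content, iters=100):
    # Find mutually consistent participation levels by fixed-point
    # iteration; the cross effects chosen here (0.5 * 0.8 < 1) converge.
    users, content = 1.0, 1.0
    for _ in range(iters):
        users = participation(p_users, 10.0, 2.0, 0.5, content)
        content = participation(p_content, 6.0, 1.0, 0.8, users)
    return p_users * users + p_content * content

# Hold the user-side price fixed and search over content-side prices,
# including negative ones (i.e., the platform subsidizing content).
best_profit, best_price = max(
    (platform_profit(3.0, pc / 10), pc / 10) for pc in range(-20, 61)
)
print(f"profit-maximizing content-side price: {best_price:.2f} (profit {best_profit:.2f})")

Fiddle with the made-up parameters and the optimum moves around – sometimes toward zero, sometimes below it (a content subsidy). That’s the point: the right content-side price is an empirical question, not something to be fixed at zero by fiat.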

The reality is that these sorts of claims are devoid of economic logic — which is presumably why they, like the whole net neutrality “movement” generally, appeal so gratuitously to emotion rather than reason. But it doesn’t seem unreasonable to hope for more from a bunch of savvy financiers.