
Every five years, Congress must reauthorize the sunsetting provisions of the Satellite Television Extension and Localism Act (STELA), and the deadline for renewing the law (Dec. 31) is quickly approaching. While sunsetting is, in the abstract, a good way to ensure that rules don't become outdated, an interlocking set of interest groups supports reauthorizing the law, generally speaking, only because they are locked in a regulatory stalemate. STELA no longer represents an optimal outcome for many, if not most, of the affected parties. The time has come to finally allow STELA to sunset and to use the occasion to reform the underlying regulatory morass on which it is built.

Much has changed in the marketplace since STELA's original incarnation was created in 1988. At the time of the 1992 Cable Act (the first year for which data from the FCC's Video Competition Reports are available), cable providers served 95% of multichannel video subscribers. Today, cable's dominance has waned to the point that two of the top four multichannel video programming distributors (MVPDs) are satellite providers, and that is before even considering the explosion in competition from online video distributors like Netflix and Amazon Prime.

Given these developments, Congress should reconsider whether STELA is necessary at all, along with the complex regulatory structure undergirding it, and recognize the relative simplicity with which copyright and antitrust law could adequately govern negotiations over broadcast content. An approach building on the bipartisan Modern Television Act of 2019, introduced by Congressman Steve Scalise (R-LA) and Congresswoman Anna Eshoo (D-CA)—which would repeal the compulsory license/retransmission consent regime for both cable and satellite—would be a step in the right direction.

A brief history of STELA

STELA, which began life as the 1988 Satellite Home Viewer Act, was justified as necessary to promote satellite competition against incumbent cable operators and to give satellite companies a stronger negotiating position against network broadcasters. In particular, the goal was to give satellite providers the ability to transmit terrestrial network broadcasts to subscribers. To do this, the law modified both the Communications Act and the Copyright Act.

With the 1988 Satellite Home Viewer Act, Congress created a compulsory license for satellite retransmissions under Section 119 of the Copyright Act. This compulsory license, like the one available to cable providers, gives satellite providers the right to certain network broadcast content in exchange for a government-set price (despite the fact that local network affiliates don't necessarily own the copyrights themselves). The retransmission consent provision, in turn, requires satellite providers (and cable providers under the Cable Act) to negotiate with network broadcasters over the fee to be paid for the right to carry network broadcast content.

Alternatively, broadcasters can opt to impose must-carry provisions on cable and satellite operators in lieu of retransmission consent negotiations. These provisions require satellite and cable operators to carry many channels from network broadcasters in order to have access to their content. As ICLE President Geoffrey Manne previously explained to Congress:

The must-carry rules require that, for cable providers offering 12 or more channels in their basic tier, at least one-third of these be local broadcast retransmissions. The forced carriage of additional, less-favored local channels results in a “tax on capacity,” and at the margins causes a reduction in quality… In the end, must-carry rules effectively transfer significant programming decisions from cable providers to broadcast stations, to the detriment of consumers… Although the ability of local broadcasters to opt in to retransmission consent in lieu of must-carry permits negotiation between local broadcasters and cable providers over the price of retransmission, must-carry sets a floor on this price, ensuring that payment never flows from broadcasters to cable providers for carriage, even though for some content this is surely the efficient transaction.

The essential questions surrounding STELA reauthorization concern the following provisions:

  1. an exemption from retransmission consent requirements for satellite operators for the carriage of distant network signals to “unserved households” while maintaining the compulsory license right for those signals (modification of the compulsory license/retransmission consent regime);
  2. the prohibition on exclusive retransmission consent contracts between MVPDs and network broadcasters (per se ban on a business model); and
  3. the requirement that television broadcast stations and MVPDs negotiate in good faith (nebulous negotiating standard reviewed by FCC).

This regulatory scheme was supposed to sunset after five years. Instead of allowing it to do so, Congress has consistently reauthorized STELA (in 1994, 1999, 2004, 2010, and 2014).

Each time, satellite companies like DirecTV and Dish Network, as well as interest groups representing rural customers who depend heavily on satellite for television service, strongly supported renewal of the legislation. Over time, though, reauthorization has brought amendments supported by major players on each side of the negotiating table, and broad support for what is widely considered "must-pass" legislation. In other words, every affected industry has found something it likes about the compromise legislation.

As it stands, STELA's sunset provision gives each side negotiating leverage heading into the next round of reauthorization talks, from which concessions are often extracted. But rather than simplifying this regulatory morass, each reauthorization simply extends rules that have outlived their purpose.

Current marketplace competition undermines the necessity of STELA reauthorization

The marketplace is very different in 2019 than it was when STELA’s predecessors were adopted and reauthorized. No longer is it the case that cable dominates and that satellite and other providers need a leg up just to compete. Moreover, there are now services that didn’t even exist when the STELA framework was first developed. Competition is thriving.

Source: Wikipedia

| Rank | Service | Subscribers | Provider | Type |
| --- | --- | --- | --- | --- |
| 1 | Xfinity | 21,986,000 | Comcast | Cable |
| 2 | DirecTV | 19,222,000 | AT&T | Satellite |
| 3 | Spectrum | 16,606,000 | Charter | Cable |
| 4 | Dish | 9,905,000 | Dish Network | Satellite |
| 5 | Verizon Fios TV | 4,451,000 | Verizon | Fiber-Optic |
| 6 | Cox Cable TV | 4,015,000 | Cox Enterprises | Cable |
| 7 | U-Verse TV | 3,704,000 | AT&T | Fiber-Optic |
| 8 | Optimum/Suddenlink | 3,307,500 | Altice USA | Cable |
| 9 | Sling TV* | 2,417,000 | Dish Network | Live Streaming |
| 10 | Hulu with Live TV | 2,000,000 | Hulu (Disney, Comcast, AT&T) | Live Streaming |
| 11 | DirecTV Now | 1,591,000 | AT&T | Live Streaming |
| 12 | YouTube TV | 1,000,000 | Google (Alphabet) | Live Streaming |
| 13 | Frontier FiOS | 838,000 | Frontier | Fiber-Optic |
| 14 | Mediacom | 776,000 | Mediacom | Cable |
| 15 | PlayStation Vue | 500,000 | Sony | Live Streaming |
| 16 | CableOne Cable TV | 326,423 | Cable One | Cable |
| 17 | FuboTV | 250,000 | FuboTV | Live Streaming |

A 2018 accounting of the largest MVPDs by subscribers shows that satellite providers account for two of the top four, and that over-the-top services like Sling TV, Hulu with Live TV, and YouTube TV are gaining significant ground. And this does not even count (non-live) streaming services such as Netflix (approximately 60 million US subscribers), Hulu (about 28 million US subscribers), and Amazon Prime Video (about 40 million US users). It is not at all clear from these numbers that satellite needs special rules in order to compete with cable, or that the complex regulatory regime underlying STELA remains necessary.
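To make the arithmetic concrete, here is a short, purely illustrative Python script (not part of the original analysis) that tallies the figures from the table above; it simply confirms that satellite holds two of the top four slots and that the listed live-streaming services already account for a meaningful share of subscribers.

```python
# Back-of-the-envelope tally; figures copied from the 2018 subscriber table above.

providers = [
    ("Xfinity", 21_986_000, "Cable"),
    ("DirecTV", 19_222_000, "Satellite"),
    ("Spectrum", 16_606_000, "Cable"),
    ("Dish", 9_905_000, "Satellite"),
    ("Verizon Fios TV", 4_451_000, "Fiber-Optic"),
    ("Cox Cable TV", 4_015_000, "Cable"),
    ("U-Verse TV", 3_704_000, "Fiber-Optic"),
    ("Optimum/Suddenlink", 3_307_500, "Cable"),
    ("Sling TV", 2_417_000, "Live Streaming"),
    ("Hulu with Live TV", 2_000_000, "Live Streaming"),
    ("DirecTV Now", 1_591_000, "Live Streaming"),
    ("YouTube TV", 1_000_000, "Live Streaming"),
    ("Frontier FiOS", 838_000, "Fiber-Optic"),
    ("Mediacom", 776_000, "Cable"),
    ("PlayStation Vue", 500_000, "Live Streaming"),
    ("CableOne Cable TV", 326_423, "Cable"),
    ("FuboTV", 250_000, "Live Streaming"),
]

top4 = providers[:4]
satellite_in_top4 = sum(1 for _, _, kind in top4 if kind == "Satellite")
streaming_subs = sum(subs for _, subs, kind in providers if kind == "Live Streaming")
total_subs = sum(subs for _, subs, _ in providers)

print(f"Satellite providers among the top 4 MVPDs: {satellite_in_top4}")
print(f"Live-streaming subscribers (listed services): {streaming_subs:,}")
print(f"Live-streaming share of listed subscribers: {streaming_subs / total_subs:.1%}")
```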

On the contrary, there is ample reason to believe that content is king and that the market for distributing that content is thriving. Competition among platforms is intense, not only among MVPDs like Comcast, DirecTV, Charter, and Dish Network, but also from streaming services like Netflix, Amazon Prime Video, Hulu, and HBO Now. Distribution networks invest heavily in exclusive content to attract consumers. There is no reason to think we need selective forbearance from the byzantine regulations in this space in order to promote satellite adoption when satellite companies are as adept as anyone at contracting for high-demand content (see, for instance, DirecTV's NFL Sunday Ticket).

A better way forward: Streamlined regulation in the form of copyright and antitrust

As Geoffrey Manne said in his Congressional testimony on STELA reauthorization back in 2013: 

behind all these special outdated regulations are laws of general application that govern the rest of the economy: antitrust and copyright. These are better, more resilient rules. They are simple rules for a complex world. They will stand up far better as video technology evolves–and they don’t need to be sunsetted.

Copyright law establishes clearly defined rights, thereby permitting efficient bargaining between content owners and distributors. But under the compulsory license system, the copyright holders' right to license performances is fundamentally abridged. Retransmission consent normally requires fees to be paid for the content that MVPDs carry. But STELA exempts certain network broadcasts ("distant signals" for "unserved households") from retransmission consent requirements. This reduces incentives to develop content subject to STELA, which at the margin harms both content creators and viewers. It also gives satellite an unfair advantage vis-à-vis cable in those cases where it does not need to pay ever-rising retransmission consent fees. Ironically, it also reduces the incentive for satellite providers (DirecTV, at least) to work to provide local content to some rural consumers. Congress should reform the law so that copyright holders once again enjoy their full rights under the Copyright Act. Congress should also repeal the compulsory license and must-carry provisions, which work at cross-purposes, and allow true marketplace negotiations.

The initial allocation of property rights guaranteed under copyright law would allow for MVPDs, including satellite providers, to negotiate with copyright holders for content, and thereby realize a more efficient set of content distribution outcomes than is otherwise possible. Under the compulsory license/retransmission consent regime underlying both STELA and the Cable Act, the outcomes at best approximate those that would occur through pure private ordering but in most cases lead to economically inefficient results because of the thumb on the scale in favor of the broadcasters. 

In a similar way, just as copyright law provides a superior set of bargaining conditions for content negotiation, antitrust law provides a superior mechanism for policing potentially problematic conduct between the firms involved. Under STELA, the FCC polices transactions with a “good faith” standard. In an important sense, this ambiguous regulatory discretion provides little information to prospective buyers and sellers of licenses as to what counts as “good faith” negotiations (aside from the specific practices listed).

By contrast, antitrust law, guided by the consumer welfare standard and decades of case law, is designed both to deter potential anticompetitive foreclosure and to provide a clear standard for firms operating in the marketplace. The effect of relying on antitrust law to police competitive harms is — as the name of the standard suggests — a net increase in the welfare of consumers, the ultimate beneficiaries of a well-functioning market.

For instance, consider a hypothetical dispute between a network broadcaster and a satellite provider. Under the FCC's "good faith" oversight, bargaining disputes, which increasingly result in blackouts, are reviewed both for certain negotiating practices deemed to be unfair, 47 CFR § 76.65(b)(1), and under a more general "totality of the circumstances" standard, 47 CFR § 76.65(b)(2). This is both over- and under-inclusive: the negotiating practices listed in (b)(1) may have procompetitive benefits in certain circumstances, and the (b)(2) totality-of-the-circumstances standard is vague and ill-defined. By comparison, antitrust claims would be adjudicated through a foreseeable process with reference to a consumer welfare standard illuminated by economic evidence and case law.

If a satellite provider alleges anticompetitive foreclosure through a refusal to license, its claims would be subject to analysis under the Sherman Act. To prove its case, it would need to show that the network broadcaster has power in a properly defined market and is using that power to foreclose competition by leveraging its ownership of network content to the detriment of consumer welfare. A court would then analyze whether this refusal to deal violates antitrust law under the Trinko and Aspen Skiing standards. Economic evidence would need to be introduced to support the allegation.

And, critically, in this process the defendants would be entitled to present evidence of their own — both evidence suggesting that there was no foreclosure and evidence of procompetitive justifications for decisions that might otherwise be considered foreclosure. Ultimately, a court, bound by established, nondiscretionary standards, would weigh the evidence and make a determination. It is, of course, possible that a review for "good faith" conduct could reach the correct result, but there is simply no similarly rigorous process available to consistently push it in that direction.

The above-mentioned Modern Television Act of 2019 represents a step in the right direction, as it would repeal the compulsory license/retransmission consent regime applied to both cable and satellite operators. It is imperfect, however, in that it leaves must-carry requirements in place for local content and retains the "good faith" negotiating standard enforced by the FCC.

Expiration is better than the status quo even if fundamental reform is not possible

Some scholars who have written on this issue, and who very much agree that fundamental reform is needed, nonetheless argue that STELA should be renewed if more fundamental reforms like those described above can't be achieved. For instance, George Ford recently wrote:

With limited days left in the legislative calendar before STELAR expires, there is insufficient time for a sensible solution to this complex issue. Senate Commerce Committee Chairman Roger Wicker (R-Miss.) has offered a “clean” STELAR reauthorization bill to maintain the status quo, which would provide Congress with some much-needed breathing room to begin tackling the gnarly issue of how broadcast signals can be both widely retransmitted and compensated. Congress and the Trump administration should welcome this opportunity.

However, even in a world without more fundamental reform, it is not clear that satellite needs distant signals in order to compete with cable. The number of "short markets"—i.e., those without access to all four local network broadcasts—implicated by the loss of distant signals is relatively small. Regardless of how badly the overall regulatory scheme needs updating, it makes no sense to continue preserving STELA's provisions that benefit satellite when they are no longer necessary on competition grounds.

Conclusion

Congress should not only let STELA sunset, but it should consider reforming the entire compulsory license/retransmission consent regime as the Modern Television Act of 2019 aims to do. In fact, reformers should look to go even further in repealing must-carry provisions and the good faith negotiating standard enforced by the FCC. Copyright and antitrust law are much better rules for this constantly evolving space than the current sector-specific rules. 

For previous work from ICLE on STELA see The Future of Video Marketplace Regulation (written testimony of ICLE President Geoffrey Manne from June 12, 2013) and Joint Comments of ICLE and TechFreedom, In the Matter of STELA Reauthorization and Video Programming Reform (March 19, 2014). 

[TOTM: The following is the sixth in a series of posts by TOTM guests and authors on the FTC v. Qualcomm case recently decided by Judge Lucy Koh in the Northern District of California. Other posts in this series are here.

This post is authored by Jonathan M. Barnett, Torrey H. Webb Professor of Law at the University of Southern California Gould School of Law.]

There is little doubt that the decision in May 2019 by the Northern District of California in FTC v. Qualcomm is of historical importance. Unless reversed or modified on appeal, the decision would require that the lead innovator behind 3G and 4G smartphone technology renegotiate hundreds of existing licenses with device producers and offer new licenses to any interested chipmakers.

The court's sweeping order caps off a global campaign by implementers to re-engineer the property-rights infrastructure of the wireless markets. Those efforts have deployed the instruments of antitrust and patent law to override existing licensing arrangements and thereby reduce the input costs borne by device producers in the downstream market. This has occurred both directly, through arguments made by those firms in antitrust and patent litigation and in amicus briefs, and indirectly, through advocacy urging regulators to bring antitrust actions against IP licensors.

Whether or not FTC v. Qualcomm was correctly decided largely depends on whether downstream firms' interest in minimizing the costs of obtaining technology inputs from upstream R&D specialists aligns with the public interest in preserving dynamically efficient innovation markets. As I discuss below, there are three reasons to believe those interests are not aligned in this case. If so, the court's order would simply engineer a wealth transfer from firms that have led innovation in wireless markets to producers that have borne few of the costs and risks involved in doing so. Members of the former group each exhibit R&D intensities (R&D expenditures as a percentage of sales) in the high teens to low twenties; members of the latter group, approximately five percent. Of greater concern, the court's upending of long-established licensing arrangements endangers business models that monetize R&D by licensing technology to a large pool of device producers (see Qualcomm), rather than earning returns through self-contained hardware and software ecosystems (see Apple). There is no apparent antitrust rationale for picking and choosing among these business models in innovation markets.

Reason #1: FRAND is a Two-Sided Deal

To fully appreciate the recent litigations involving the FTC and Apple on the one hand, and Qualcomm on the other hand, it is necessary to return to the origins of modern wireless markets.

Starting in the late 1980s, various firms were engaged in the launch of the GSM wireless network in Western Europe. At that time, each European telecom market typically consisted of a national monopoly carrier and a favored group of local equipment suppliers. The GSM project, which envisioned a trans-national wireless communications market, challenged this model. In particular, the national carrier and equipment monopolies were threatened by the fact that the GSM standard relied in part on patented technology held by an outside innovator—namely, Motorola. As I describe in a forthcoming publication, the “FRAND” (fair, reasonable and nondiscriminatory) principles that today govern the licensing of standard-essential patents in wireless markets emerged from a negotiation between, on the one hand, carriers and producers who sought a royalty cap and, on the other hand, a technology innovator that sought to preserve its licensing freedom going forward.

This negotiation history is important. Any informed discussion of the meaning of FRAND must recognize that this principle was adopted as something akin to a “good faith” contractual term designed to promote two objectives:

  1. Protect downstream adopters from holdup tactics by upstream innovators; and
  2. Enable upstream innovators to enjoy an appreciable portion of the value generated by sales in the consumer market.

Any interpretation of FRAND that does not meet these conditions will induce upstream firms to reduce R&D investment, limit participation in standard-setting activities, or vertically integrate forward to capture directly a return on R&D dollars.

Reason #2: No Evidence of Actual Harm

In the December 2018 appellate court proceedings in which the Department of Justice unsuccessfully challenged the AT&T/Time-Warner merger, Judge David Sentelle of the D.C. Circuit said to the government’s legal counsel:

If you’re going to rely on an economic model, you have to rely on it with quantification. The bare theorem . . . doesn’t prove anything in a particular case.

The government could not credibly reply to that query in the AT&T case and, if appropriately challenged, could not do so in this case.

Far from being a market that calls out for federal antitrust intervention, the smartphone market offers what appears to be an almost textbook case of dynamic efficiency. For over a decade, implementers, along with sympathetic regulators and commentators, have argued that the market suffers (or, in a variation, will imminently suffer) from inflated prices, reduced output, and delayed innovation as a result of "patent hold-up" and "royalty stacking" by opportunistic patent owners. In the several decades since the launch of the GSM network, none of these predictions has materialized. To the contrary: the market has exhibited expanding output, declining prices (adjusted for increased functionality), constant innovation, and regular entry into the production market. Multiple empirical studies (e.g. this, this and this) have found that device producers bear, on average, an aggregate royalty burden in the single to mid digits.

This hardly seems like a market in which producers and consumers are being “victimized” by what the Northern District of California calls “unreasonably high” licensing fees (compared to an unspecified, and inherently unspecifiable, dynamically efficient benchmark). Rather, it seems more likely that device producers—many of whom provided the testimony which the court referenced in concluding that royalty rates were “unreasonably high”—would simply prefer to pay an even lower fee to R&D input suppliers (with no assurance that any of the cost-savings would flow to consumers).

Reason #3: The “License as Tax” Fallacy

The rhetorical centerpiece of the FTC’s brief relied on an analogy between the patent license fees earned by Qualcomm in the downstream device market and the tax that everyone pays to the IRS. The court’s opinion wholeheartedly adopted this narrative, determining that Qualcomm imposes a tax (or, as Judge Koh terms it, a “surcharge”) on the smartphone market by demanding a fee from OEMs for use of its patent portfolio whether or not the OEM purchases chipsets from Qualcomm or another firm. The tax analogy is fundamentally incomplete, both in general and in this case in particular.

It is true that much of the economic literature applies monopoly taxation models to assess the deadweight losses attributed to patents. While this analogy facilitates analytical tractability, a “zero-sum” approach to patent licensing overlooks the value-creating “multiplier” effect that licensing generates in real-world markets. Specifically, broad-based downstream licensing by upstream patent owners—something to which SEP owners commit under FRAND principles—ensures that device makers can obtain the necessary technology inputs and, in doing so, facilitates entry by producers that do not have robust R&D capacities. All of that ultimately generates gains for consumers.

This “positive-sum” multiplier effect appears to be at work in the smartphone market. Far from acting as a tax, Qualcomm’s licensing policies appear to have promoted entry into the smartphone market, which has experienced fairly robust turnover in market leadership. While Apple and Samsung may currently dominate the U.S. market, they face intense competition globally from Chinese firms such as Huawei, Xiaomi and Oppo. That competitive threat is real. As of 2007, Nokia and Blackberry were the overwhelming market leaders and appeared to be indomitable. Yet neither can be found in the market today. That intense “gale of competition”, sustained by the fact that any downstream producer can access the required technology inputs upon payment of licensing fees to upstream innovators, challenges the view that Qualcomm’s licensing practices have somehow restrained market growth.

Concluding Thoughts: Antitrust Flashback

When competitive harms are so unclear (and competitive gains so evident), modern antitrust law sensibly prescribes forbearance. A famous “bad case” from antitrust history shows why.

In 1953, the Department of Justice won an antitrust suit against United Shoe Machinery Corporation, which had led innovation in shoe manufacturing equipment and subsequently dominated that market. United Shoe's purportedly anticompetitive practices included a lease-only policy that incorporated training and repair services at no incremental charge. The court found this to be a coercive tie that preserved United Shoe's dominant position, despite the absence of any evidence of competitive harm. Scholars have subsequently shown (e.g. this and this; see also this) that the court did not adequately consider (at least) two efficiency explanations: (1) lease-only policies were widespread in the market because they facilitated access by smaller, capital-constrained manufacturers, and (2) tying support services to equipment enabled United Shoe to avoid free-riding on its training services by other equipment suppliers. In retrospect, courts ultimately relied on a mere possibility theorem to order the break-up of a technological pioneer, with potentially adverse consequences for manufacturers that relied on its R&D efforts.

The court’s decision in FTC v. Qualcomm is a flashback to cases like United Shoe in which courts found liability and imposed dramatic remedies with little economic inquiry into competitive harm. It has become fashionable to assert that current antitrust law is too cautious in finding liability. Yet there is a sound reason why, outside price-fixing, courts generally insist that theories of antitrust liability include compelling evidence of competitive harm. Antitrust remedies are strong medicine and should be administered with caution. If courts and regulators do not zealously scrutinize the factual support for antitrust claims, then they are vulnerable to capture by private entities whose business objectives may depart from the public interest in competitive markets. While no antitrust fact-pattern is free from doubt, over two decades of market performance strongly favor the view that long-standing licensing arrangements in the smartphone market have resulted in substantial net welfare gains for consumers. If so, the prudent course of action is simply to leave the market alone.

As I explain in my new book, How to Regulate, sound regulation requires thinking like a doctor.  When addressing some “disease” that reduces social welfare, policymakers should catalog the available “remedies” for the problem, consider the implementation difficulties and “side effects” of each, and select the remedy that offers the greatest net benefit.

If we followed that approach in deciding what to do about the way Internet Service Providers (ISPs) manage traffic on their networks, we would conclude that FCC Chairman Ajit Pai is exactly right:  The FCC should reverse its order classifying ISPs as common carriers (Title II classification) and leave matters of non-neutral network management to antitrust, the residual regulator of practices that may injure competition.

Let’s walk through the analysis.

Diagnose the Disease.  The primary concern of net neutrality advocates is that ISPs will block some Internet content or will slow or degrade transmission from content providers who do not pay for a “fast lane.”  Of course, if an ISP’s non-neutral network management impairs the user experience, it will lose business; the vast majority of Americans have access to multiple ISPs, and competition is growing by the day, particularly as mobile broadband expands.

But an ISP might still play favorites, despite the threat of losing some subscribers, if it has a relationship with content providers. Comcast, for example, could opt to speed up content from Hulu, which streams programming of Comcast's NBC subsidiary, or might slow down content from Netflix, whose streaming video competes with Comcast's own cable programming. Comcast's losses in the distribution market (from angry consumers switching ISPs) might be less than its gains in the content market (from reducing competition there).

It seems, then, that the “disease” that might warrant a regulatory fix is an anticompetitive vertical restraint of trade: a business practice in one market (distribution) that could restrain trade in another market (content production) and thereby reduce overall output in that market.
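The foreclosure logic sketched above boils down to a simple comparison of gains and losses. The toy calculation below (all figures hypothetical, chosen only for illustration) restates that tradeoff: degrading a rival's content pays off for a vertically integrated ISP only if the content-side gain exceeds the distribution margin lost when subscribers switch.

```python
# Toy vertical-foreclosure tradeoff; every number here is hypothetical.

subs_lost = 50_000           # subscribers who switch ISPs in response
margin_per_sub = 40.0        # monthly distribution margin per subscriber ($)
content_gain = 2_500_000.0   # added monthly content-side profit from a weakened rival ($)

distribution_loss = subs_lost * margin_per_sub
net_change = content_gain - distribution_loss

print(f"Distribution-side loss: ${distribution_loss:,.0f} per month")
print(f"Content-side gain:      ${content_gain:,.0f} per month")
verdict = "profitable" if net_change > 0 else "unprofitable"
print(f"Foreclosure is {verdict} (net ${net_change:,.0f} per month)")
```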

Catalog the Available Remedies.  The statutory landscape provides at least three potential remedies for this disease.

The simplest approach would be to leave the matter to antitrust, which applies in the absence of more focused regulation.  In recent decades, courts have revised the standards governing vertical restraints of trade so that antitrust, which used to treat such restraints in a ham-fisted fashion, now does a pretty good job separating pro-consumer restraints from anti-consumer ones.

A second legally available approach would be to craft narrowly tailored rules precluding ISPs from blocking, degrading, or favoring particular Internet content. The U.S. Court of Appeals for the D.C. Circuit held that Section 706 of the 1996 Telecommunications Act empowered the FCC to adopt targeted net neutrality rules, even if ISPs are not classified as common carriers. The court insisted that the rules not treat ISPs as common carriers (if they are not officially classified as such), but it provided a road map for tailored net neutrality rules. The FCC pursued this targeted, rules-based approach until President Obama pushed for a third approach.

In November 2014, reeling from a shellacking in the midterm elections and hoping to shore up his base, President Obama posted a video calling on the Commission to assure net neutrality by reclassifying ISPs as common carriers. Such reclassification would subject ISPs to Title II of the 1934 Communications Act, giving the FCC broad power to assure that their business practices are "just and reasonable." Prodded by the President, the nominally independent commissioners abandoned their targeted, rules-based approach and voted to regulate ISPs like utilities. They then used their enhanced regulatory authority to impose rules forbidding the blocking, throttling, or paid prioritization of Internet content.

Assess the Remedies’ Limitations, Implementation Difficulties, and Side Effects.   The three legally available remedies — antitrust, tailored rules under Section 706, and broad oversight under Title II — offer different pros and cons, as I explained in How to Regulate:

The choice between antitrust and direct regulation generally (under either Section 706 or Title II) involves a tradeoff between flexibility and determinacy. Antitrust is flexible but somewhat indeterminate; it would condemn non-neutral network management practices that are likely to injure consumers, but it would permit such practices if they would lower costs, improve quality, or otherwise enhance consumer welfare. The direct regulatory approaches are rigid but clearer; they declare all instances of non-neutral network management to be illegal per se.

Determinacy and flexibility influence decision and error costs.  Because they are more determinate, ex ante rules should impose lower decision costs than would antitrust. But direct regulation’s inflexibility—automatic condemnation, no questions asked—will generate higher error costs. That’s because non-neutral network management is often good for end users. For example, speeding up the transmission of content for which delivery lags are particularly detrimental to the end-user experience (e.g., an Internet telephone call, streaming video) at the expense of content that is less lag-sensitive (e.g., digital photographs downloaded from a photo-sharing website) can create a net consumer benefit and should probably be allowed. A per se rule against non-neutral network management would therefore err fairly frequently. Antitrust’s flexible approach, informed by a century of economic learning on the output effects of contractual restraints between vertically related firms (like content producers and distributors), would probably generate lower error costs.

Although both antitrust and direct regulation offer advantages vis-à-vis each other, this isn’t simply a wash. The error cost advantage antitrust holds over direct regulation likely swamps direct regulation’s decision cost advantage. Extensive experience with vertical restraints on distribution have shown that they are usually good for consumers. For that reason, antitrust courts in recent decades have discarded their old per se rules against such practices—rules that resemble the FCC’s direct regulatory approach—in favor of structured rules of reason that assess liability based on specific features of the market and restraint at issue. While these rules of reason (standards, really) may be less determinate than the old, error-prone per se rules, they are not indeterminate. By relying on past precedents and the overarching principle that legality turns on consumer welfare effects, business planners and adjudicators ought to be able to determine fairly easily whether a non-neutral network management practice passes muster. Indeed, the fact that the FCC has uncovered only four instances of anticompetitive network management over the commercial Internet’s entire history—a period in which antitrust, but not direct regulation, has governed ISPs—suggests that business planners are capable of determining what behavior is off-limits. Direct regulation’s per se rule against non-neutral network management is thus likely to add error costs that exceed any reduction in decision costs. It is probably not the remedy that would be selected under this book’s recommended approach.

In any event, direct regulation under Title II, the currently prevailing approach, is certainly not the optimal way to address potentially anticompetitive instances of non-neutral network management by ISPs. Whereas any ex ante regulation of network management will confront the familiar knowledge problem, opting for direct regulation under Title II, rather than the more cabined approach under Section 706, adds adverse public choice concerns to the mix.

As explained earlier, reclassifying ISPs to bring them under Title II empowers the FCC to scrutinize the “justice” and “reasonableness” of nearly every aspect of every arrangement between content providers, ISPs, and consumers. Granted, the current commissioners have pledged not to exercise their Title II authority beyond mandating network neutrality, but public choice insights would suggest that this promised forbearance is unlikely to endure. FCC officials, who remain self-interest maximizers even when acting in their official capacities, benefit from expanding their regulatory turf; they gain increased power and prestige, larger budgets to manage, a greater ability to “make or break” businesses, and thus more opportunity to take actions that may enhance their future career opportunities. They will therefore face constant temptation to exercise the Title II authority that they have committed, as of now, to leave fallow. Regulated businesses, knowing that FCC decisions are key to their success, will expend significant resources lobbying for outcomes that benefit them or impair their rivals. If they don’t get what they want because of the commissioners’ voluntary forbearance, they may bring legal challenges asserting that the Commission has failed to assure just and reasonable practices as Title II demands. Many of the decisions at issue will involve the familiar “concentrated benefits/diffused costs” dynamic that tends to result in underrepresentation by those who are adversely affected by a contemplated decision. Taken together, these considerations make it unlikely that the current commissioners’ promised restraint will endure. Reclassification of ISPs so that they are subject to Title II regulation will probably lead to additional constraints on edge providers and ISPs.

It seems, then, that mandating net neutrality under Title II of the 1934 Communications Act is the least desirable of the three statutorily available approaches to addressing anticompetitive network management practices. The Title II approach combines the inflexibility and ensuing error costs of the Section 706 direct regulation approach with the indeterminacy and higher decision costs of an antitrust approach. Indeed, the indeterminacy under Title II is significantly greater than that under antitrust because the “just and reasonable” requirements of the Communications Act, unlike antitrust’s reasonableness requirements (no unreasonable restraint of trade, no unreasonably exclusionary conduct) are not constrained by the consumer welfare principle. Whereas antitrust always protects consumers, not competitors, the FCC may well decide that business practices in the Internet space are unjust or unreasonable solely because they make things harder for the perpetrator’s rivals. Business planners are thus really “at sea” when it comes to assessing the legality of novel practices.

All this implies that Internet businesses regulated by Title II need to court the FCC’s favor, that FCC officials have more ability than ever to manipulate government power to private ends, that organized interest groups are well-poised to secure their preferences when the costs are great but widely dispersed, and that the regulators’ dictated outcomes—immune from market pressures reflecting consumers’ preferences—are less likely to maximize net social welfare. In opting for a Title II solution to what is essentially a market power problem, the powers that be gave short shrift to an antitrust approach, even though there was no natural monopoly justification for direct regulation. They paid little heed to the adverse consequences likely to result from rigid per se rules adopted under a highly discretionary (and politically manipulable) standard. They should have gone back to basics, assessing the disease to be remedied (market power), the full range of available remedies (including antitrust), and the potential side effects of each. In other words, they could’ve used this book.

How to Regulate‘s full discussion of net neutrality and Title II is here:  Net Neutrality Discussion in How to Regulate.

Discussion

In recent years, U.S. government policymakers have recounted various alleged market deficiencies associated with patent licensing practices, as part of a call for patent policy “reforms” – with the “reforms” likely to have the effect of weakening patent rights.  In particular, antitrust enforcers have expressed concerns that:  (1) the holder of a patent covering the technology needed to implement some aspect of a technical standard (a “standard-essential patent,” or SEP) could “hold up” producers that utilize the standard by demanding  anticompetitively high royalty payments; (2) the accumulation of royalties for multiple complementary patent licenses needed to make a product exceeds the bundled monopoly rate that would be charged if all patents were under common control (“royalty stacking”); (3) an overlapping set of patent rights requiring that producers seeking to commercialize a new technology obtain licenses from multiple patentees deters innovation (“patent thickets”); and (4) the dispersed ownership of complementary patented inventions results in “excess” property rights, the underuse of resources, and economic inefficiency (“the tragedy of the anticommons”).  (See, for example, Federal Trade Commission and U.S. Justice Department reports on antitrust and intellectual property policy, here, here, and here).

Although some commentators have expressed skepticism about the actual real world incidence of these scenarios, relatively little attention has been paid to the underlying economic assumptions that give rise to the “excessive royalty” problem that is portrayed.  Very recently, however, Professor Daniel F. Spulber of Northwestern University circulated a paper that questions those assumptions.  The paper points out that claims of economic harm due to excessive royalty charges critically rest on the assumption that individual patent owners choose royalties using posted prices, thereby generating total royalties that are above the monopoly level that would be charged for all complementary patents if they were owned in common.  In other words, it is assumed that interdependencies among complements are ignored, with individual patent monopoly prices being separately charged – the “Cournot complements” problem.
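For readers who want to see the posted-price logic spelled out, the sketch below works through the textbook linear-demand version of the Cournot complements problem (my own illustration, not taken from Professor Spulber's paper): when n owners of complementary patents each post a royalty independently, the total royalty stack exceeds what a single owner of the whole bundle would charge, and output falls accordingly.

```python
# Minimal sketch (illustrative assumptions only): linear demand Q = a - P,
# zero costs, n independent owners of perfectly complementary patents.

def posted_price_total_royalty(a: float, n: int) -> float:
    """Symmetric Nash equilibrium when n complement owners post royalties.

    Owner i maximizes r_i * (a - r_i - R_others); the first-order condition
    a - 2*r_i - (n - 1)*r = 0 gives r = a / (n + 1), so the stack is n*a/(n+1).
    """
    return n * a / (n + 1)


def bundled_monopoly_royalty(a: float) -> float:
    """A single owner of all the patents maximizes R * (a - R), so R = a / 2."""
    return a / 2


if __name__ == "__main__":
    a = 100.0  # demand intercept (hypothetical units)
    for n in (2, 5, 10):
        stack = posted_price_total_royalty(a, n)
        bundle = bundled_monopoly_royalty(a)
        print(
            f"n={n:2d}: posted-price royalty stack = {stack:5.1f}, "
            f"bundled monopoly royalty = {bundle:5.1f}, "
            f"output {a - stack:5.1f} vs {a - bundle:5.1f}"
        )
```

The point of the exercise is simply that the "excessive royalty" result depends on the posted-price assumption; it is that assumption, not the math, that Spulber's bargaining analysis calls into question.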

In reality, however, Professor Spulber explains that patent licensing usually involves bargaining rather than posted prices, because such licensing involves long-term contractual relationships between patentees and producers, rather than immediate exchange.  Significantly, the paper shows that bargaining procedures reflecting long-term relationships maximize the joint profits of inventors (patentees) and producers, with licensing royalties being less than (as opposed to more than under posted prices) bundled monopoly royalties.  In short, bargaining over long-term patent licensing contracts yields an efficient market outcome, in marked contrast to the inefficient outcome posited by those who (wrongly) assume patent licensing under posted prices.  In other words, real world patent holders (as opposed to the inward-looking, non-cooperative, posted-price patentees of government legend) tend to engage in highly fruitful licensing negotiations that yield socially efficient outcomes.  This finding neatly explains why examples of economically-debilitating patent thickets, royalty stacks, hold-ups, and patent anti-commons, like unicorns (or perhaps, to be fair, black swans), are amazingly hard to spot in the real world.  It also explains why the business sector that should in theory be most prone to such “excessive patent” problems, the telecommunications industry (which involves many different patentees and producers, and tens of thousands of patents), has been (and remains) a leader in economic growth and innovation.  (See also here, for an article explaining that smartphone innovation has soared because of the large number of patents.)

Professor Spulber’s concluding section highlights the policy implications of his research:

The efficiency of the bargaining outcome differs from the outcome of the Cournot posted prices model. Understanding the role of bargaining helps address a host of public policy concerns, including SEP holdup, royalty stacking, patent thickets, the tragedy of the anticommons, and justification for patent pools. The efficiency of the bargaining outcome suggests the need for antitrust forbearance toward industries that combine multiple inventions, including SEPs.

Professor Spulber’s reference to “antitrust forbearance” is noteworthy.  As I have previously pointed out (see, for example, here, here, and here), in recent years U.S. antitrust enforcers have taken positions that tend to favor the weakening of patent rights.  Those positions are justified by the “patent policy problems” that Professor Spulber’s paper debunks, as well as an emphasis on low quality “probabilistic patents” (see, for example, here) that ignores a growing body of literature (both theoretical and empirical) on the economic benefits of a strong patent system (see, for example, here and here).

In sum, Professor Spulber’s impressive study is one more piece of compelling evidence that the federal government’s implicitly “anti-patent” positions are misguided.  The government should reject those positions and restore its previous policy of respect for robust patent rights – a policy that promotes American innovation and economic growth.

Appendix

While Professor Spulber’s long paper is well worth a careful read, key italicized excerpts from his debunking of prominent “excessive patent” stories are set forth below.

SEP Holdups

Standard Setting Organizations (SSOs) are voluntary organizations that establish and disseminate technology standards for industries. Patent owners may declare that their patents are essential to manufacturing products that conform to the standard. Many critics of SSOs suggest that inclusion of SEPs in technology standards allows patent owners to charge much higher royalties than if the SEPs were not included in the standard. SEPs are said to cause a form of “holdup” if producers using the patented technology would incur high costs of switching to alternative technologies. . . . [Academic] discussions of the effects of SEPs [summarized by the author] depend on patent owners choosing royalties using posted prices, generating total royalties above the bundled monopoly level. When IP owners and producers engage in bargaining, the present analysis suggests that total royalties will be less than the bundled monopoly level. Efficiencies in choosing licensing royalties should mitigate concerns about the effects of SEPs on total royalties when patent licensing involves bargaining. The present analysis further suggests bargaining should reduce or eliminate concerns about SEP “holdup”. Efficiencies in choosing patent licensing royalties also should help mitigate concerns about whether or not SSOs choose efficient technology standards.

Royalty Stacking

“Royalty stacking” refers to the situation in which total royalties are excessive in comparison to some benchmark, typically the bundled monopoly rate. . . . The present analysis shows that the perceived royalty stacking problem is due to the posted prices assumption in Cournot’s model. . . . The present analysis shows that royalty stacking need not occur with different market institutions, notably bargaining between IP owners and producers. In particular, with non-cooperative licensing offers and negotiation of royalty rates between IP owners and producers, total royalties will be less than the royalties chosen by a bundled monopoly IP owner. The result that total royalties are less than the bundled monopoly benchmark holds even if there are many patented inventions. Total royalties are less than the benchmark with innovative complements and substitutes.

Patent Thickets

The patent thickets view considers patents as deterrents to innovation. This view differs substantially from the view that patents function as property rights that stimulate innovation. . . . The bargaining analysis presented here suggests that multiple patents should not be viewed as deterring innovation. Multiple inventors can coordinate with producers through market transactions. This means that by making licensing offers to producers and negotiating patent royalties, inventors and producers can achieve efficient outcomes. There is no need for government regulation to restrict the total number of patents. Arbitrarily limiting the total number of patents by various regulatory mechanisms would likely discourage invention and innovation.

Tragedy of the Anticommons

The “Tragedy of the Anticommons” describes the situation in which dispersed ownership of complementary inventions results in underuse of resources[.] . . . . The present analysis shows that patents need not create excess property rights when there is bargaining between IP owners and producers. Bargaining results in a total output that maximizes the joint returns of inventors and producers. Social welfare and final output are greater with bargaining than in Cournot’s posted prices model. This contradicts the “Tragedy of the Anticommons” result and shows that there need not be underutilization of resources due to high royalties.

The FCC doesn’t have authority over the edge and doesn’t want authority over the edge. Well, that is until it finds itself with no choice but to regulate the edge as a result of its own policies. As the FCC begins to explore its new authority to regulate privacy under the Open Internet Order (“OIO”), for instance, it will run up against policy conflicts and inconsistencies that will make it increasingly hard to justify forbearance from regulating edge providers.

Take, for example, the recently announced NPRM titled "Expanding Consumers' Video Navigation Choices" — a proposal that seeks to force cable companies to provide video programming to third-party set-top box manufacturers. Under the proposed rules, MVPDs would be required to expose three data streams to competitors: (1) listing information about what is available to particular customers; (2) the rights associated with accessing such content; and (3) the actual video content. As Geoff Manne has aptly noted, this seems to be much more of an effort to eliminate the "nightmare" of "too many remote controls" than an effort to actually expand consumer choice in a market that is essentially drowning in consumer choice. But of course even so innocuous a goal—which is probably more about picking on cable companies because… "eww cable companies"—raises some very important questions.

First, the market for video on cable systems is governed by a highly interdependent web of contracts that assures to a wide variety of parties that their bargained-for rights are respected. Among other things, channels negotiate for particular placements and channel numbers in a cable system’s lineup, IP rights holders bargain for content to be made available only at certain times and at certain locations, and advertisers pay for their ads to be inserted into channel streams and broadcasts.

Moreover, to a large extent, the content industry develops its content based on a stable regime of bargained-for contractual terms with cable distribution networks (among others). Disrupting the ability of cable companies to control access to their video streams will undoubtedly alter the underlying assumptions upon which IP companies rely when planning and investing in content development. And, of course, the physical networks and their related equipment have been engineered around the current cable-access regimes. Some non-trivial amount of re-engineering will have to take place to make the cable-networks compatible with a more “open” set-top box market.

The FCC nods to these concerns in its NPRM, when it notes that its “goal is to preserve the contractual arrangements between programmers and MVPDs, while creating additional opportunities for programmers[.]” But this aspiration is not clearly given effect in the NPRM, and, as noted, some contractual arrangements are simply inconsistent with the NPRM’s approach.

Second, the FCC proposes to bind third-party manufacturers to the public interest privacy commitments in §§ 629, 551 and 338(i) of the Communications Act ("Act") through a self-certification process. MVPDs would be required to pass the three data streams to third-party providers only once such a certification is received. To the extent that these sections, enforced via self-certification, do not sufficiently curtail third parties' undesirable behavior, the FCC appears to believe that "the strictest state regulatory regime[s]" and the "European Union privacy regulations" will serve as the necessary regulatory gap fillers.

This seems hard to believe, however, particularly given the recently announced privacy and cybersecurity NPRM, through which the FCC will adopt rules detailing the agency’s new authority (under the OIO) to regulate privacy at the ISP level. Largely, these rules will grow out of §§ 222 and 201 of the Act, which the FCC in Terracom interpreted together to be a general grant of privacy and cybersecurity authority.

I’m apprehensive of the asserted scope of the FCC’s power over privacy — let alone cybersecurity — under §§ 222 and 201. In truth, the FCC makes an admirable showing in Terracom of demonstrating its reasoning; it does a far better job than the FTC in similar enforcement actions. But there remains a problem. The FTC’s authority is fundamentally cabined by the limitations contained within the FTC Act (even if it frequently chooses to ignore them, they are there and are theoretically a protection against overreach).

But the FCC's enforcement decisions are restrained (if at all) only by a vague "public interest" mandate and a claim that it will enforce these privacy principles on a case-by-case basis. Thus, the FCC's proposed regime is inherently one based on vast agency discretion. As in many other contexts, enforcers with wide discretion and tremendous power to penalize exert a chilling effect on innovation and openness, as well as a frightening power over a tremendous swath of the economy. For the FCC to claim anything like an unbounded UDAP authority for itself has got to be outside the archaic grant of authority from § 201, and is certainly a long stretch for the language of § 706 (a provision of the Act that it used as one of the fundamental justifications for the OIO)—leading very possibly to a bout of Chevron problems under precedents such as King v. Burwell and UARG v. EPA.

And there is a real risk here of, if not hypocrisy, then… deep conflict in the way the FCC will strike out on the set-top box and privacy NPRMs. The Commission has already noted in its NPRM that it will not be able to bind third-party providers of set-top boxes under the same privacy requirements that apply to current MVPD providers. Self-certification will go a certain length, but even there agitation from privacy absolutists will possibly sway the FCC to consider more stringent requirements. For instance, §§ 551 and 338 of the Act — which the FCC focuses on in the set-top box NPRM — are really only about disclosing intended uses of consumer data. And disclosures can come in many forms, including burying them in long terms of service that customers frequently do not read. Such “weak” guarantees of consumer privacy will likely become a frequent source of complaint (and FCC filings) for privacy absolutists.  

Further, many of the new set-top box entrants are going to be current providers of OTT video or devices that redistribute OTT video. And many of these providers make a huge share of their revenue from data mining and selling access to customer data. This means one of two things: either the FCC is going to allow us to live in a world of double standards, where these self-certifying entities are permitted significantly more leeway in their use of consumer data than MVPD providers, or the FCC is going to discover that it does in fact need to "do something." If only there were a creative way to extend the new privacy authority under Title II to these providers of set-top boxes… Oh! There is: bring edge providers into the regulatory fold under the OIO.

It’s interesting that Wheeler’s announcement of the FCC’s privacy NPRM explicitly noted that the rules would not be extended to edge providers. That Wheeler felt the need to be explicit in this suggests that he believes that the FCC has the authority to extend the privacy regulations to edge providers, but that it will merely forbear (for now) from doing so.

If edge providers are swept into the scope of Title II they would be subject to the brand new privacy rules the FCC is proposing. Thus, despite itself (or perhaps not), the FCC may find itself in possession of a much larger authority over some edge providers than any of the pro-Title II folks would have dared admit was possible. And the hook (this time) could be the privacy concerns embedded in the FCC’s ill-advised attempt to “open” the set-top box market.

This is a complicated set of issues, and it’s contingent on a number of moving parts. This week, Chairman Wheeler will be facing an appropriations hearing where I hope he will be asked to unpack his thinking regarding the true extent to which the OIO may in fact be extended to the edge.

Yesterday, the International Center for Law & Economics, together with Professor Gus Hurwitz, Nebraska College of Law, and nine other scholars of law and economics, filed an amicus brief in the DC Circuit explaining why the court should vacate the FCC’s 2015 Open Internet Order.

A few key points from ICLE’s brief follow, but you can read a longer summary of the brief here.

If the 2010 Order was a limited incursion into neighboring territory, the 2015 Order represents the outright colonization of a foreign land, extending FCC control over the Internet far beyond what the Telecommunications Act authorizes.

The Commission asserts vast powers — powers that Congress never gave it — not just over broadband but also over the very ‘edge’ providers it claims to be protecting. The court should be very skeptical of the FCC’s claims to pervasive powers over the Internet.

In the 2015 Order, the FCC invoked Title II, admitted that it was unworkable for the Internet, and then tried to ‘tailor’ the statute to avoid its worst excesses.

That the FCC felt the need for such sweeping forbearance should have indicated to it that it had ‘taken an interpretive wrong turn’ in understanding the statute Congress gave it. Last year, the Supreme Court blocked a similar attempt by the EPA to ‘modernize’ old legislation in a way that gave it expansive new powers. In its landmark UARG decision, the Court made clear that it won’t allow regulatory agencies to rewrite legislation in an effort to retrofit their statutes to their preferred regulatory regimes.

Internet regulation is a question of ‘vast economic and political significance,’ yet the FCC  didn’t even bother to weigh the costs and benefits of its rule. 

FCC Chairman Tom Wheeler never misses an opportunity to talk about the Internet as ‘the most important network known to Man.’ So why did he and the previous FCC Chairman ignore requests from other commissioners for serious, independent economic analysis of the supposed problem and the best way to address it? Why did the FCC rush to adopt a plan that had the effect of blocking the Federal Trade Commission from applying its consumer protection laws to the Internet? For all the FCC’s talk about protecting consumers, it appears that its real agenda may be simply expanding its own power.

Joining ICLE on the brief are:

  • Richard Epstein (NYU Law)
  • James Huffman (Lewis & Clark Law)
  • Gus Hurwitz (Nebraska Law)
  • Thom Lambert (Missouri Law)
  • Daniel Lyons (Boston College Law)
  • Geoffrey Manne (ICLE)
  • Randy May (Free State Foundation)
  • Jeremy Rabkin (GMU Law)
  • Ronald Rotunda (Chapman Law)
  • Ilya Somin (GMU Law)

Read the brief here, and the summary here.

Read more of ICLE’s work on net neutrality and Title II, including:

  • Highlights from policy and legal comments filed by ICLE and TechFreedom on net neutrality
  • “Regulating the Most Powerful Network Ever,” a scholarly essay by Gus Hurwitz for the Free State Foundation
  • “How to Break the Internet,” an essay by Geoffrey Manne and Ben Sperry, in Reason Magazine
  • “The FCC’s Net Neutrality Victory is Anything But,” an op-ed by Geoffrey Manne, in Wired
  • “The Feds Lost on Net Neutrality, But Won Control of the Internet,” an op-ed by Geoffrey Manne and Berin Szoka in Wired
  • “Net Neutrality’s Hollow Promise to Startups,” an op-ed by Geoffrey Manne and Berin Szoka in Computerworld
  • Letter signed by 32 scholars urging the FTC to caution the FCC against adopting per se net neutrality rules by reclassifying ISPs under Title II
  • The FCC’s Open Internet Roundtables, Policy Approaches, Panel 3, Enhancing Transparency, with Geoffrey Manne

Remember when net neutrality wasn’t going to involve rate regulation and it was crazy to say that it would? Or that it wouldn’t lead to regulation of edge providers? Or that it was only about the last mile and not interconnection? Well, if the early petitions and complaints are a preview of more to come, the Open Internet Order may end up having the FCC regulating rates for interconnection and extending the reach of its privacy rules to edge providers.

On Monday, Consumer Watchdog petitioned the FCC to not only apply Customer Proprietary Network Information (CPNI) rules originally meant for telephone companies to ISPs, but to also start a rulemaking to require edge providers to honor Do Not Track requests in order to “promote broadband deployment” under Section 706. Of course, we warned of this possibility in our joint ICLE-TechFreedom legal comments:

For instance, it is not clear why the FCC could not, through Section 706, mandate “network level” copyright enforcement schemes or the DNS blocking that was at the heart of the Stop Online Piracy Act (SOPA). . . Thus, it would appear that Section 706, as re-interpreted by the FCC, would, under the D.C. Circuit’s Verizon decision, allow the FCC sweeping power to regulate the Internet up to and including (but not beyond) the process of “communications” on end-user devices. This could include not only copyright regulation but everything from cybersecurity to privacy to technical standards. (emphasis added).

While the merits of Do Not Track are debatable, it is worth noting that privacy regulation can go too far and drastically change the Internet ecosystem. Indeed, it is entirely plausible that overregulating online data collection could lead to greater use of paywalls to access content. That may be a greater threat to Internet Openness than anything ISPs have done.

And then yesterday, the first complaint under the new Open Internet rule was brought against Time Warner Cable by a small streaming video company called Commercial Network Services. According to several news stories, CNS “plans to file a peering complaint against Time Warner Cable under the Federal Communications Commission’s new network-neutrality rules unless the company strikes a free peering deal ASAP.” In other words, CNS is asking for rate regulation for interconnection. Under the Open Internet Order, the FCC can rule on such complaints, but only on a case-by-case basis. Either TWC assents to free peering, or the FCC intervenes and sets the rate for them, or the FCC dismisses the complaint altogether and pushes such decisions down the road.

This was another predictable development that many critics of the Open Internet Order warned about: there was no way to really avoid rate regulation once the FCC reclassified ISPs. While the FCC could reject this complaint, it clearly has the ability to impose de facto rate regulation through case-by-case adjudication. Whether that counts as rate regulation under Title II (from which the FCC ostensibly forbore) is beside the point: it will have the same practical economic effects and will be functionally indistinguishable if and when it occurs.

In sum, while neither of these actions was contemplated by the FCC (or so it claims), such abstract rules are going to invite complaints like these, and companies are going to have to use the “ask the FCC for permission” process to try to figure out beforehand whether they should be investing or whether they’re going to be slammed. As Geoff Manne said in Wired:

That’s right—this new regime, which credits itself with preserving “permissionless innovation,” just put a bullet in its head. It puts innovators on notice, and ensures that the FCC has the authority (if it holds up in court) to enforce its vague rule against whatever it finds objectionable.

I mean, I don’t wanna brag or nothin, but it seems to me that we critics have been right so far. The reclassification of broadband Internet service under Title II has had the (supposedly) unintended consequence of sweeping in far more, in both scope of application and substance of the rules, than was bargained for. Hopefully the FCC rejects the petition and the complaint and reverses course before it breaks the Internet.

The International Center for Law & Economics (ICLE) and TechFreedom filed two joint comments with the FCC today, explaining why the FCC has no sound legal basis for micromanaging the Internet and why “net neutrality” regulation would actually prove counter-productive for consumers.

The Policy Comments are available here, and the Legal Comments are here. See our previous post, Net Neutrality Regulation Is Bad for Consumers and Probably Illegal, for a distillation of many of the key points made in the comments.

New regulation is unnecessary. “An open Internet and the idea that companies can make special deals for faster access are not mutually exclusive,” said Geoffrey Manne, Executive Director of ICLE. “If the Internet really is ‘open,’ shouldn’t all companies be free to experiment with new technologies, business models and partnerships?”

“The media frenzy around this issue assumes that no one, apart from broadband companies, could possibly question the need for more regulation,” said Berin Szoka, President of TechFreedom. “In fact, increased regulation of the Internet will incite endless litigation, which will slow both investment and innovation, thus harming consumers and edge providers.”

Title II would be a disaster. The FCC has proposed re-interpreting the Communications Act to classify broadband ISPs under Title II as common carriers. But reinterpretation might unintentionally ensnare edge providers, weighing them down with onerous regulations. “So-called reclassification risks catching other Internet services in the crossfire,” explained Szoka. “The FCC can’t easily forbear from Title II’s most onerous rules because the agency has set a high bar for justifying forbearance. Rationalizing a changed approach would be legally and politically difficult. The FCC would have to simultaneously find the broadband market competitive enough to forbear, yet fragile enough to require net neutrality rules. It would take years to sort out this mess — essentially hitting the pause button on better broadband.”

Section 706 is not a viable option. In 2010, the FCC claimed Section 706 as an independent grant of authority to regulate any form of “communications” not directly barred by the Act, provided only that the Commission assert that regulation would somehow promote broadband. “This is an absurd interpretation,” said Szoka. “This could allow the FCC to essentially invent a new Communications Act as it goes, regulating not just broadband, but edge companies like Google and Facebook, too, and not just neutrality but copyright, cybersecurity and more. The courts will eventually strike down this theory.”

A better approach. “The best policy would be to maintain the ‘Hands off the Net’ approach that has otherwise prevailed for 20 years,” said Manne. “That means a general presumption that innovative business models and other forms of ‘prioritization’ are legal. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears.” “If the FCC thinks it can justify regulating the Internet, it should ask Congress to grant such authority through legislation,” added Szoka. “A new communications act is long overdue anyway. The FCC could also convene a multistakeholder process to produce a code enforceable by the Federal Trade Commission,” he continued, noting that the White House has endorsed such processes for setting Internet policy in general.

Manne concluded: “The FCC should focus on doing what Section 706 actually commands: clearing barriers to broadband deployment. Unleashing more investment and competition, not writing more regulation, is the best way to keep the Internet open, innovative and free.”

For some of our other work on net neutrality, see:

“Understanding Net(flix) Neutrality,” an op-ed by Geoffrey Manne in the Detroit News on Netflix’s strategy to confuse interconnection costs with neutrality issues.

“The Feds Lost on Net Neutrality, But Won Control of the Internet,” an op-ed by Berin Szoka and Geoffrey Manne in Wired.com.

“That startup investors’ letter on net neutrality is a revealing look at what the debate is really about,” a post by Geoffrey Manne in Truth on the Market.

“Bipartisan Consensus: Rewrite of ‘96 Telecom Act is Long Overdue,” a post on TF’s blog highlighting the key points from TechFreedom and ICLE’s joint comments on updating the Communications Act.

The Net Neutrality Comments are available here:

ICLE/TF Net Neutrality Policy Comments

TF/ICLE Net Neutrality Legal Comments

With Berin Szoka.

TechFreedom and the International Center for Law & Economics will shortly file two joint comments with the FCC, explaining why the FCC has no sound legal basis for micromanaging the Internet—now called “net neutrality regulation”—and why such regulation would be counter-productive as a policy matter. The following summarizes some of the key points from both sets of comments.

No one’s against an open Internet. The notion that anyone can put up a virtual shingle—and that the good ideas will rise to the top—is a bedrock principle with broad support; it has made the Internet essential to modern life. Key to Internet openness is the freedom to innovate. An open Internet and the idea that companies can make special deals for faster access are not mutually exclusive. If the Internet really is “open,” shouldn’t all companies be free to experiment with new technologies, business models and partnerships? Shouldn’t the FCC allow companies to experiment in building the unknown—and unknowable—Internet of the future?

The best approach would be to maintain the “Hands off the Net” approach that has otherwise prevailed for 20 years. That means a general presumption that innovative business models and other forms of “prioritization” are legal. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears. And they should start with existing legal tools—like antitrust and consumer protection laws—before imposing prior restraints on innovation.

But net neutrality regulation hurts more than it helps. Counterintuitively, a blanket rule that ISPs treat data equally could actually harm consumers. Consider the innovative business models ISPs are introducing. T-Mobile’s unRadio lets users listen to all the on-demand music and radio they want without taking a hit against their monthly data plan. Yet so-called consumer advocates insist that’s a bad thing because it favors some content providers over others. In fact, “prioritizing” one service when there is congestion frees up data for subscribers to consume even more content—from whatever source. You know regulation may be out of control when a company is demonized for offering its users a freebie.

Treating each bit of data neutrally ignores the reality of how the Internet is designed, and how consumers use it.  Net neutrality proponents insist that all Internet content must be available to consumers neutrally, whether those consumers (or content providers) want it or not. They also argue against usage-based pricing. Together, these restrictions force all users to bear the costs of access for other users’ requests, regardless of who actually consumes the content, as the FCC itself has recognized:

[P]rohibiting tiered or usage-based pricing and requiring all subscribers to pay the same amount for broadband service, regardless of the performance or usage of the service, would force lighter end users of the network to subsidize heavier end users. It would also foreclose practices that may appropriately align incentives to encourage efficient use of networks.
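
To make the FCC’s point concrete, here is a minimal arithmetic sketch of the cross-subsidy. Every figure in it (the per-gigabyte cost and the two usage levels) is a hypothetical chosen purely for illustration; none comes from the FCC or from our comments.

```python
# Hypothetical illustration of the cross-subsidy described in the FCC passage above.
# The per-GB cost and the two usage levels are invented for this example.

cost_per_gb = 0.50                                  # assumed network cost per GB delivered
usage_gb = {"light user": 10, "heavy user": 300}    # assumed monthly usage per subscriber

total_cost = cost_per_gb * sum(usage_gb.values())
flat_price = total_cost / len(usage_gb)             # uniform price that just covers total cost

for name, gb in usage_gb.items():
    own_cost = cost_per_gb * gb
    net_contribution = flat_price - own_cost        # positive => pays more than own cost
    print(f"{name}: {gb} GB, cost ${own_cost:.2f}, pays ${flat_price:.2f}, net ${net_contribution:+.2f}")
```

Under these made-up numbers the light user pays $72.50 more than it costs to serve him and the heavy user pays $72.50 less; usage-based pricing is simply the pricing structure that eliminates that transfer.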

The rules that net neutrality advocates want would hurt startups as well as consumers. Imagine a new entrant, clamoring for market share. Without the budget for a major advertising blitz, the archetypical “next Netflix” might never get the exposure it needs to thrive. But for a relatively small fee, the startup could sign up to participate in a sponsored data program, with its content featured and its customers’ data usage exempted from their data plans. This common business strategy could mean the difference between success and failure for a startup. Yet it would be prohibited by net neutrality rules banning paid prioritization.

The FCC lacks sound legal authority. The FCC is essentially proposing to do what can only properly be done by Congress: invent a new legal regime for broadband. Each of the options the FCC proposes to justify this—Section 706 of the Telecommunications Act and common carrier classification—is deeply problematic.

First, Section 706 isn’t sustainable. Until 2010, the FCC understood Section 706 as a directive to use its other grants of authority to promote broadband deployment. But in its zeal to regulate net neutrality, the FCC reversed itself in 2010, claiming Section 706 as an independent grant of authority. This would allow the FCC to regulate any form of “communications” in any way not directly barred by the Act — not just broadband but “edge” companies like Google and Facebook. This might mean going beyond neutrality to regulate copyright, cybersecurity and more. The FCC need only assert that regulation would somehow promote broadband.

If Section 706 is a grant of authority, it’s almost certainly a power to deregulate. But even if its power is as broad as the FCC claims, the FCC still hasn’t made the case that, on balance, its proposed regulations would actually do what it asserts: promote broadband. The FCC has stubbornly refused to conduct serious economic analysis on the net effects of its neutrality rules.

And Title II would be a disaster. The FCC has asked whether Title II of the Act, which governs “common carriers” like the old monopoly telephone system, is a workable option. It isn’t.

In the first place, regulations that impose design limitations meant for single-function networks simply aren’t appropriate for the constantly evolving Internet. Moreover, if the FCC re-interprets the Communications Act to classify broadband ISPs as common carriers, it risks catching other Internet services in the cross-fire, inadvertently making them common carriers, too. Surely net neutrality proponents can appreciate the harmful effects of treating Skype as a common carrier.

Forbearance can’t clean up the Title II mess. In theory the FCC could “forbear” from Title II’s most onerous rules, promising not to apply them when it determines there’s enough competition in a market to make the rules unnecessary. But the agency has set a high bar for justifying forbearance.

Most recently, in 2012, the Commission refused to grant Qwest forbearance even in the highly competitive telephony market, disregarding competition from wireless providers, and concluding that a cable-telco “duopoly” is inadequate to protect consumers. It’s unclear how the FCC could justify reaching the opposite conclusion about the broadband market—simultaneously finding it competitive enough to forbear, yet fragile enough to require net neutrality rules. Such contradictions would be difficult to explain, even if the FCC generally gets discretion on changing its approach.

But there is another path forward. If the FCC can really make the case for regulation, it should go to Congress, armed with the kind of independent economic and technical expert studies Commissioner Pai has urged, and ask for new authority. A new Communications Act is long overdue anyway. In the meantime, the FCC could convene the kind of multistakeholder process generally endorsed by the White House to produce a code enforceable by the Federal Trade Commission. A consensus is possible — just not inside the FCC, where the policy questions can’t be separated from the intractable legal questions.

Meanwhile, the FCC should focus on doing what Section 706 actually demands: clearing barriers to broadband deployment and competition. The 2010 National Broadband Plan laid out an ambitious pro-deployment agenda. It’s just too bad the FCC was so obsessed with net neutrality that it didn’t focus on the plan. Unleashing more investment and competition, not writing more regulation, is the best way to keep the Internet open, innovative and free.

[Cross-posted at TechFreedom.]

Angelo’s escape

Larry Ribstein —  23 February 2011

So Mozilo won’t be criminally prosecuted for Countrywide.  Holman Jenkins writes in today’s WSJ:

The incentive to bring a case against a vilified public figure, of course, is huge. Weighed against this, however, must be the chance of being humiliated by a judge, possibly censured, now that the legal system has started blowing the whistle on cases that strain to arrange the ill-fitting garments of criminal law around business defendants. * * *

Mr. Mozilo’s first thank-you note should probably go to Mark Belnick, the former Tyco executive whom a jury found did not commit “grand larceny” by accepting a bonus authorized by the company’s CEO. Mr. Belnick’s acquittal in 2004, in retrospect, was a harbinger.

There followed a pair of stock-option backdating cases thrown out by judges on grounds of “prosecutorial misconduct” that amounted to trying to make innocuous behavior appear nefarious. Two Bear Stearns executives were acquitted of the non-crime of losing money in the housing bubble. Capping off the trend was a Supreme Court decision last year disarming the “honest services” blunderbuss, which threatened to turn any self-interested behavior by a CEO into a crime.

Jenkins explains that Mozilo’s supposed crime was trafficking in “whatever types of loans seemed to be acceptable in the marketplace to consumers and sellable to Wall Street as securities.” Mozilo basically got caught in a bubble.  He expected the company to be around to “pick up the pieces when the ‘innovators’ had exited,” as Jenkins says. He didn’t foresee “unprecedented collapse in home prices” or “Countrywide’s own funding drying up overnight.”

So Mozilo avoided jail for following the crowd and not foreseeing the unexpected.  That left him better off than Jeff Skilling or Ken Lay.  Hopefully this prosecutorial forbearance will become a trend.  But based on the incentives and institutions I discuss in my Agents Prosecuting Agents, I’m not so sure about that.

Maybe it was too easy anyway.  I mean, who hasn’t heard this one before?

Jinsoo Kim begins his opening brief by stating, “Blood may be thicker than water, but here it’s far weightier than a peppercorn.”1 Kim appeals from the trial court’s refusal to enforce a gratuitous promise, handwritten in his friend’s own blood, to repay money Kim loaned and lost in two failed business ventures. He faults the trial court for not discussing or deciding in its statement of decision the issue of whether Kim’s forbearance (waiting over a year to file a meritless lawsuit against his friend, Stephen Son), supplied adequate consideration for Son’s blood-written document.

Yikes.  HT: Volokh (via How Appealing).

I’ve had the pleasure of spending the last few weeks curled up with Herbert Hovenkamp’s wonderful new book, The Antitrust Enterprise: Principle and Execution, which I’m reviewing for the Texas Law Review. Hovenkamp is a sharp thinker and a wonderfully clear writer, and the book is a fantastic read for scholars and students alike. As a reviewer, though, I’m charged with pointing out the weak spots. Hovenkamp’s discussion of Illinois Brick‘s indirect purchaser rule is, I believe, one of those spots.

For the uninitiated, the Supreme Court’s famous Illinois Brick decision held that those who purchase only indirectly from a monopolist or cartel may not recover overcharge damages; instead, the direct purchaser may collect the entire amount of any overcharge, even if that purchaser has passed some of the overcharge on to downstream (i.e., indirect) purchasers. As Hovenkamp notes, the primary rationale for the Court’s holding in Illinois Brick was the tremendous difficulty of accurately determining, in a judicial proceeding, the proportion of an overcharge passed on to downstream purchasers.

While he agrees that computing passed-on damages is extraordinarily difficult, Hovenkamp maintains that that difficulty does not justify the rule precluding indirect purchaser actions. He asserts that the Illinois Brick Court’s reasoning relied on two false assumptions: first, that “overcharge” was the proper measure of damages for every firm in the defendant’s distribution chain; and second, that calculating downstream damages would require tracing and apportionment of the initial overcharge among the direct purchaser and the various downstream purchasers.

With respect to the first assumption, Hovenkamp argues that overcharge is not the proper measure of damages for intermediary purchasers (e.g., assemblers, distributors, or retailers), who will generally respond to supracompetitive pricing by passing along at least some of the price increase and suffering reduced sales as a result of their higher prices. For example, if a cartel of liquor manufacturers raises the price of a bottle of liquor from $10 to $14, retailers may respond by raising the retail price of a bottle from $13 to $16. Their losses will consist of an absorbed overcharge of $1 per bottle sold plus the profit losses resulting from reduced sales at a $16 retail price (less any incremental profits from increased sales of alternative liquors). An overcharge measure, Hovenkamp observes, “never captures the losses resulting from lost volume.” By contrast, a “lost profits” measure would do so and would account for the degree to which middlemen are able to pass overcharge on to downstream purchasers.
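
To see how far apart the two measures can be, here is a minimal sketch of the arithmetic in that example. The wholesale and retail prices are Hovenkamp’s; the before-and-after sales volumes (1,000 and 800 bottles) are hypothetical figures added purely for illustration.

```python
# Hovenkamp's liquor example: the cartel raises the wholesale price from $10 to $14,
# and the retailer responds by raising its retail price from $13 to $16.
# The sales volumes below are assumed for illustration only.

wholesale_before, wholesale_after = 10.0, 14.0
retail_before, retail_after = 13.0, 16.0
volume_before, volume_after = 1000, 800              # hypothetical bottles sold per period

margin_before = retail_before - wholesale_before     # $3 per bottle
margin_after = retail_after - wholesale_after        # $2 per bottle

# Overcharge measure: the $1 per bottle the retailer absorbed, on the bottles it still sold.
absorbed_overcharge = (wholesale_after - wholesale_before) - (retail_after - retail_before)
overcharge_damages = absorbed_overcharge * volume_after

# Lost-profits measure: the full decline in the retailer's profit, lost volume included.
lost_profit_damages = margin_before * volume_before - margin_after * volume_after

print(f"Overcharge measure:   ${overcharge_damages:,.0f}")    # $800
print(f"Lost-profits measure: ${lost_profit_damages:,.0f}")   # $1,400
```

The $600 difference between the two figures is precisely the lost-volume loss that, on Hovenkamp’s account, an overcharge measure never captures.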

With respect to the second assumption, Hovenkamp asserts that indirect purchaser actions do not require tracing and apportionment of the overcharge. Rather than calculating what percentage of an overcharge was absorbed by middlemen and what percentage was ultimately paid by indirect purchaser plaintiffs, courts could use the familiar “yardstick” or “before-and-after” methods to determine the amount of overcharge paid by indirect purchaser plaintiffs. The yardstick method calculates damages by comparing the price the plaintiff paid to the prevailing price in some different but similar market in which the anticompetitive practice at issue is not occurring. The before-and-after method compares the plaintiff’s price to those prevailing in the same market prior to and subsequent to the violation period. Neither method would require determination of pass-on percentages. Hovenkamp therefore contemplates a system in which injured middlemen would recover lost profits and consumers would recover overcharges, with both lost profits and overcharges measured using either the yardstick or before-and-after method.
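
As a rough sketch of how either benchmark would work for an indirect purchaser, consider a consumer who bought the cartelized liquor at retail. All of the figures below are hypothetical; the point is only that neither calculation requires apportioning the cartel’s initial overcharge between the retailer and the consumer.

```python
# Hypothetical indirect-purchaser damages under the two benchmarking methods described above.
# All prices and quantities are invented for illustration.

price_paid = 16.00          # retail price during the violation period
yardstick_price = 13.50     # price in a comparable market untouched by the violation
before_after_price = 13.00  # price in the same market before/after the violation period
bottles_bought = 50         # the plaintiff's purchases during the violation period

# Yardstick method: benchmark against a similar but unaffected market.
yardstick_damages = (price_paid - yardstick_price) * bottles_bought

# Before-and-after method: benchmark against the same market outside the violation period.
before_after_damages = (price_paid - before_after_price) * bottles_bought

print(f"Yardstick estimate:        ${yardstick_damages:,.2f}")      # $125.00
print(f"Before-and-after estimate: ${before_after_damages:,.2f}")   # $150.00
```

Either way, the damages are estimated directly from the price the consumer paid, so no pass-on percentage ever has to be computed.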

I believe Hovenkamp’s proposal to abandon the indirect purchaser rule is a bad idea. The indirect purchaser rule likely provides a closer-to-optimal level of antitrust deterrence, and at a lower administrative cost, than Hovenkamp’s proposed approach. To see why, read below the fold.