Archives For net neutrality

This week, the International Center for Law & Economics filed an ex parte notice in the FCC’s Restoring Internet Freedom docket. In it, we reviewed two of the major points from our formal comments. First, we noted that

the process by which [the Commission] enacted the 2015 [Open Internet Order]… demonstrated scant attention to empirical evidence, and even less attention to a large body of empirical and theoretical work by academics. The 2015 OIO, in short, was not supported by reasoned analysis.

Further, on the issue of preemption, we stressed that

[F]ollowing the adoption of an Order in this proceeding, a number of states may enact their own laws or regulations aimed at regulating broadband service… The resulting threat of a patchwork of conflicting state regulations, many of which would be unlikely to further the public interest, is a serious one…

[T]he Commission should explicitly state that… broadband services may not be subject to certain forms of state regulations, including conduct regulations that prescribe how ISPs can use their networks. This position would also be consistent with the FCC’s treatment of interstate information services in the past.

Our full ex parte comments can be viewed here.

Today the International Center for Law & Economics (ICLE) submitted an amicus brief urging the Supreme Court to review the DC Circuit’s 2016 decision upholding the FCC’s 2015 Open Internet Order. The brief was authored by Geoffrey A. Manne, Executive Director of ICLE, and Justin (Gus) Hurwitz, Assistant Professor of Law at the University of Nebraska College of Law and ICLE affiliate, with able assistance from Kristian Stout and Allen Gibby of ICLE. Jeffrey A. Mandell of the Wisconsin law firm of Stafford Rosenbaum collaborated in drafting the brief and provided invaluable pro bono legal assistance, for which we are enormously grateful. Laura Lamansky of Stafford Rosenbaum also assisted. 

The following post discussing the brief was written by Jeff Mandell (originally posted here).

Courts generally defer to agency expertise when reviewing administrative rules that regulate conduct in areas where Congress has delegated authority to specialized executive-branch actors. An entire body of law—administrative law—governs agency actions and judicial review of those actions. And at the federal level, courts grant agencies varying degrees of deference, depending on what kind of function the agency is performing, how much authority Congress delegated, and the process by which the agency adopts or enforces policies.

Should courts be more skeptical when an agency changes a policy position, especially if the agency is reversing prior policy without a corresponding change to the governing statute? Daniel Berninger v. Federal Communications Commission, No. 17-498 (U.S.), raises these questions. And this week Stafford Rosenbaum was honored to serve as counsel of record for the International Center for Law & Economics (“ICLE”) in filing an amicus curiae brief urging the U.S. Supreme Court to hear the case and to answer these questions.

ICLE’s amicus brief highlights new academic research suggesting that systematic problems undermine judicial review of agency changes in policy. The brief also points out that judicial review is complicated by conflicting signals from the Supreme Court about the degree of deference that courts should accord agencies in reviewing reversals of prior policy. And the brief argues that the specific policy change at issue in this case lacks a sufficient basis but was affirmed by the court below as the result of a review that was, but should not have been, “particularly deferential.”

In 2015, the Federal Communications Commission (“FCC”) issued the Open Internet Order (“OIO”), which required Internet Service Providers to abide by a series of regulations popularly referred to as net neutrality. To support these regulations, the FCC interpreted the Communications Act of 1934 to grant it authority to heavily regulate broadband internet service. This interpretation reversed a long-standing agency understanding of the statute as permitting only limited regulation of broadband service.

The FCC ostensibly based the OIO on factual and legal analysis. However, ICLE argues, the OIO actually rests on questionable reinterpretations of the factual record and strained readings of the statute, adopted more to support radical changes in FCC policy than for their descriptive accuracy. When a variety of interested parties challenged the OIO, the U.S. Court of Appeals for the D.C. Circuit affirmed the regulations. In doing so, the court afforded substantial deference to the FCC—so much that the D.C. Circuit never addressed the reasonableness of the FCC’s decisionmaking process in reversing prior policy.

ICLE’s amicus brief argues that the D.C. Circuit’s decision “is both in tension with [the Supreme] Court’s precedents and, more, raises exceptionally important and previously unaddressed questions about th[e] Court’s precedents on judicial review of agency changes of policy.” Without further guidance from the Supreme Court, the brief argues, “there is every reason to believe” the FCC will again reverse its position on broadband regulation, such that “the process will become an endless feedback loop—in the case of this regulation and others—at great cost not only to regulated entities and their consumers, but also to the integrity of the regulatory process.”

The ramifications of the Supreme Court accepting this case would be twofold. First, administrative agencies would gain guidance for their decisionmaking processes in considering changes to existing policies. Second, lower courts would gain clarity on agency deference issues, making judicial review more uniform and appropriate where agencies reverse prior policy positions.

Read the full brief here.

It’s fitting that FCC Chairman Ajit Pai recently compared his predecessor’s evidence-free jettisoning of the FCC’s light-touch framework for Internet access regulation to the Oklahoma City Thunder’s James Harden trade. That infamous deal broke up a young nucleus of three of the best players in the NBA in 2012 because keeping all three might someday create salary cap concerns. What few saw coming was a new TV deal in 2015 that sent the salary cap soaring.

If it’s hard to predict how the market will evolve in the closed world of professional basketball, predictions about the path of Internet innovation are an order of magnitude harder — especially for those making crucial decisions with a lot of money at stake.

The FCC’s answer for what it considered to be the dangerous unpredictability of Internet innovation was to write itself a blank check of authority to regulate ISPs in the 2015 Open Internet Order (OIO), embodied in what is referred to as the “Internet conduct standard.” This standard expanded the scope of Internet access regulation well beyond the core principle of preserving openness (i.e., ensuring that any legal content can be accessed by all users) by granting the FCC the unbounded, discretionary authority to define and address “new and novel threats to the Internet.”

When asked about what the standard meant (not long after writing it), former Chairman Tom Wheeler replied,

We don’t really know. We don’t know where things will go next. We have created a playing field where there are known rules, and the FCC will sit there as a referee and will throw the flag.

Somehow, former Chairman Wheeler would have us believe that an amorphous standard that means whatever the agency (or its Enforcement Bureau) says it means created a playing field with “known rules.” But claiming such broad authority is hardly the light-touch approach marketed to the public. Instead, this ill-conceived standard allows the FCC to wade as deeply as it chooses into how an ISP organizes its business and how it manages its network traffic.

Such an approach is destined to undermine, rather than further, the objectives of Internet openness, as embodied in Chairman Powell’s 2005 Internet Policy Statement:

To foster creation, adoption and use of Internet broadband content, applications, services and attachments, and to ensure consumers benefit from the innovation that comes from competition.

Instead, the Internet conduct standard is emblematic of how an off-the-rails quest to heavily regulate one specific component of the complex Internet ecosystem results in arbitrary regulatory imbalances — e.g., between ISPs and over-the-top (OTT) or edge providers that offer similar services such as video streaming or voice calling.

As Boston College law professor Dan Lyons puts it:

While many might assume that, in theory, what’s good for Netflix is good for consumers, the reality is more complex. To protect innovation at the edge of the Internet ecosystem, the Commission’s sweeping rules reduce the opportunity for consumer-friendly innovation elsewhere, namely by facilities-based broadband providers.

This is no recipe for innovation, nor does it coherently distinguish between practices that might impede competition and innovation on the Internet and those that are merely politically disfavored, for any reason or no reason at all.

Free data madness

The Internet conduct standard’s unholy combination of unfettered discretion and the impulse to micromanage can (and will) be deployed without credible justification to the detriment of consumers and innovation. Nowhere has this been more evident than in the confusion surrounding the regulation of “free data.”

Free data, like T-Mobile’s Binge On program, is data consumed by a user that has been subsidized by a mobile operator or a content provider. The vertical arrangements between operators and content providers that create free data offerings provide many benefits to consumers: they enable subscribers to consume more data (or, for low-income users, to consume data in the first place); facilitate product differentiation by mobile operators that offer a variety of free data plans (including allowing smaller operators the chance to get a leg up on competitors by assembling a market-share-winning plan); increase the overall consumption of content; and reduce users’ cost of obtaining information. It’s also fundamentally about experimentation. As the International Center for Law & Economics (ICLE) recently explained:

Offering some services at subsidized or zero prices frees up resources (and, where applicable, data under a user’s data cap) enabling users to experiment with new, less-familiar alternatives. Where a user might not find it worthwhile to spend his marginal dollar on an unfamiliar or less-preferred service, differentiated pricing loosens the user’s budget constraint, and may make him more, not less, likely to use alternative services.

In December 2015 then-Chairman Tom Wheeler used his newfound discretion to launch a 13-month “inquiry” into free data practices before preliminarily finding some to be in violation of the standard. Without identifying any actual harm, Wheeler concluded that free data plans “may raise” economic and public policy issues that “may harm consumers and competition.”

After assuming the reins at the FCC, Chairman Pai swiftly put an end to that nonsense, saying that the Commission had better things to do (like removing barriers to broadband deployment) than going after free data plans that expand Internet access and are immensely popular, especially among low-income Americans.

The global morass of free data regulation

But as long as the Internet conduct standard remains on the books, it implicitly grants the US’s imprimatur to harmful policies and regulatory capriciousness in other countries that look to the US for persuasive authority. While Chairman Pai’s decisive intervention resolved the free data debate in the US (at least for now), other countries are still grappling with whether to prohibit the practice, allow it, or allow it with various restrictions.

In Europe, the 2016 EC guidelines left the decision of whether to allow the practice in the hands of national regulators. Consequently, some regulators — in Hungary, Sweden, and the Netherlands (although there the ban was recently overturned in court) — have banned free data practices  while others — in Denmark, Germany, Spain, Poland, the United Kingdom, and Ukraine — have not. And whether or not they allow the practice, regulators (e.g., Norway’s Nkom and the UK’s Ofcom) have lamented the lack of regulatory certainty surrounding free data programs, a state of affairs that is compounded by a lack of data on the consequences of various approaches to their regulation.

In Canada this year, the CRTC issued a decision adopting restrictive criteria under which to evaluate free data plans. The criteria include assessing the degree to which the treatment of data is agnostic, whether the free data offer is exclusive to certain customers or certain content providers, the impact on Internet openness and innovation, and whether there is financial compensation involved. The standard is open-ended, and free data plans as they are offered in the US would “likely raise concerns.”

Other regulators are contributing to the confusion through ambiguously framed rules, such as those of the Chilean regulator, Subtel. In a 2014 decision, it found that a free data offer of specific social network apps was in breach of Chile’s Internet rules. In contrast to what is commonly reported, however, Subtel did not ban free data. Instead, it required mobile operators to change how they promote such services, requiring them to state that access to Facebook, Twitter and WhatsApp was offered “without discounting the user’s balance” instead of “at no cost.” It also required them to disclose the amount of time the offer would be available, but imposed no mandatory limit.

In addition to this confusing regulatory make-work governing how operators market free data plans, the Chilean measures also require that mobile operators offer free data only to subscribers who pay for a data plan, in order to ensure free data isn’t the only option users have to access the Internet.

The result is that in Chile today free data plans are widely offered by Movistar, Claro, and Entel and include access to apps such as Facebook, WhatsApp, Twitter, Instagram, Pokemon Go, Waze, Snapchat, Apple Music, Spotify, Netflix or YouTube — even though Subtel has nominally declared such plans to be in violation of Chile’s net neutrality rules.

Other regulators are searching for palatable alternatives that allow them to flex their regulatory muscle over Internet access while still making free data work. The Indian regulator, TRAI, famously banned free data in February 2016. But the story doesn’t end there. After seeing the potential value of free data in unserved and underserved, low-income areas, TRAI proposed implementing government-sanctioned free data. The proposed scheme would provide rural subscribers with 100 MB of free data per month, funded through the country’s universal service fund. To ensure that there would be no vertical agreements between content providers and mobile operators, TRAI recommended introducing third parties, referred to as “aggregators,” that would facilitate mobile-operator-agnostic arrangements.

The result is a nonsensical, if vaguely well-intentioned, threading of the needle between the perceived need to (over-)regulate access providers and the determination to expand access. In other words, even though the Indian government recognizes that free data will help to close the digital divide and enhance Internet access, it has banned private markets from employing private capital to achieve that very result, preferring instead non-market processes which are unlikely to be nearly as nimble or as effective — and yet still ultimately offer “non-neutral” options for consumers.

Thinking globally, acting locally (by ditching the Internet conduct standard)

Where it is permitted, free data is undergoing explosive adoption among mobile operators. Currently in the US, for example, all major mobile operators offer some form of free data or unlimited plan to subscribers. And, as a result, free data is proving itself as a business model for users’ early-stage experimentation with, and adoption of, augmented reality, virtual reality and other cutting-edge technologies that represent the Internet’s next wave — but that also use vast amounts of data. Were the US to cut free data off at the knees under the OIO absent hard evidence of harm, it would substantially undermine this innovation.

The application of the nebulous Internet conduct standard to free data is a microcosm of the current incoherence: It is a rule that generates a parade of uncertainties to address merely theoretical problems, needlessly saddling companies with enforcement risk, all in the name of preserving and promoting innovation and openness. As even some of the staunchest proponents of net neutrality have recognized, only companies that can afford years of litigation can be expected to thrive in such an environment.

In the face of confusion and uncertainty globally, the US is now poised to provide leadership grounded in sound policy that promotes innovation. As ICLE noted last month, Chairman Pai took a crucial step toward re-imposing economic rigor and the rule of law at the FCC by questioning the unprecedented and ill-supported expansion of FCC authority that undergirds the OIO in general and the Internet conduct standard in particular. Today the agency will take the next step by voting on Chairman Pai’s proposed rulemaking. Wherever the new proceeding leads, it’s a welcome opportunity to analyze the issues with a degree of rigor that has thus far been appallingly absent.

And we should not forget that there’s a direct solution to these ambiguities that would avoid the undulations of subsequent FCC policy fights: Congress could (and should) pass legislation implementing a regulatory framework grounded in sound economics and empirical evidence that allows for consumers to benefit from the vast number of procompetitive vertical agreements (such as free data plans), while still facilitating a means for policing conduct that may actually harm consumers.

The Golden State Warriors are the heavy odds-on favorite to win another NBA Championship this summer, led by former OKC player Kevin Durant. And James Harden is a contender for league MVP. We can’t always turn back the clock on a terrible decision, hastily made before enough evidence has been gathered, but Chairman Pai’s efforts present a rare opportunity to do so.

Last week the International Center for Law & Economics and I filed an amicus brief in the DC Circuit in support of en banc review of the court’s decision to uphold the FCC’s 2015 Open Internet Order.

In our previous amicus brief before the panel that initially reviewed the OIO, we argued, among other things, that

In order to justify its Order, the Commission makes questionable use of important facts. For instance, the Order’s ban on paid prioritization ignores and mischaracterizes relevant record evidence and relies on irrelevant evidence. The Order also omits any substantial consideration of costs. The apparent necessity of the Commission’s aggressive treatment of the Order’s factual basis demonstrates the lengths to which the Commission must go in its attempt to fit the Order within its statutory authority.

Our brief supporting en banc review builds on these points to argue that

By reflexively affording substantial deference to the FCC in affirming the Open Internet Order (“OIO”), the panel majority’s opinion is in tension with recent Supreme Court precedent….

The panel majority need not have, and arguably should not have, afforded the FCC the level of deference that it did. The Supreme Court’s decisions in State Farm, Fox, and Encino all require a more thorough vetting of the reasons underlying an agency change in policy than is otherwise required under the familiar Chevron framework. Similarly, Brown and Williamson, Utility Air Regulatory Group, and King all indicate circumstances in which an agency construction of an otherwise ambiguous statute is not due deference, including when the agency interpretation is a departure from longstanding agency understandings of a statute or when the agency is not acting in an expert capacity (e.g., its decision is based on changing policy preferences, not changing factual or technical considerations).

In effect, the panel majority based its decision whether to afford the FCC deference upon deference to the agency’s poorly supported assertions that it was due deference. We argue that this is wholly inappropriate in light of recent Supreme Court cases.

Moreover,

The panel majority failed to appreciate the importance of granting Chevron deference to the FCC. That importance is most clearly seen at an aggregate level. In a large-scale study of every Court of Appeals decision between 2003 and 2013, Professors Kent Barnett and Christopher Walker found that a court’s decision to defer to agency action is uniquely determinative in cases where, as here, an agency is changing established policy.

Kent Barnett & Christopher J. Walker, Chevron In the Circuit Courts 61, Figure 14 (2016), available at ssrn.com/abstract=2808848.

Figure 14 from Barnett & Walker, as reproduced in our brief.

As that study demonstrates,

agency decisions to change established policy tend to present serious, systematic defects — and [thus that] it is incumbent upon this court to review the panel majority’s decision to reflexively grant Chevron deference. Further, the data underscore the importance of the Supreme Court’s command in Fox and Encino that agencies show good reason for a change in policy; its recognition in Brown & Williamson and UARG that departures from existing policy may fall outside of the Chevron regime; and its command in King that policies not made by agencies acting in their capacity as technical experts may fall outside of the Chevron regime. In such cases, the Court essentially holds that reflexive application of Chevron deference may not be appropriate because these circumstances may tend toward agency action that is arbitrary, capricious, in excess of statutory authority, or otherwise not in accordance with law.

As we conclude:

The present case is a clear example where greater scrutiny of an agency’s decision-making process is both warranted and necessary. The panel majority all too readily afforded the FCC great deference, despite the clear and unaddressed evidence of serious flaws in the agency’s decision-making process. As we argued in our brief before the panel, and as Judge Williams recognized in his partial dissent, the OIO was based on factually inaccurate, contradicted, and irrelevant record evidence.

Read our full — and very short — amicus brief here.

Last week the International Center for Law & Economics filed comments on the FCC’s Broadband Privacy NPRM. ICLE was joined in its comments by the following scholars of law & economics:

  • Babette E. Boliek, Associate Professor of Law, Pepperdine School of Law
  • Adam Candeub, Professor of Law, Michigan State University College of Law
  • Justin (Gus) Hurwitz, Assistant Professor of Law, Nebraska College of Law
  • Daniel Lyons, Associate Professor, Boston College Law School
  • Geoffrey A. Manne, Executive Director, International Center for Law & Economics
  • Paul H. Rubin, Samuel Candler Dobbs Professor of Economics, Emory University Department of Economics

As we note in our comments:

The Commission’s NPRM would shoehorn the business models of a subset of new economy firms into a regime modeled on thirty-year-old CPNI rules designed to address fundamentally different concerns about a fundamentally different market. The Commission’s hurried and poorly supported NPRM demonstrates little understanding of the data markets it proposes to regulate and the position of ISPs within that market. And, what’s more, the resulting proposed rules diverge from analogous rules the Commission purports to emulate. Without mounting a convincing case for treating ISPs differently than the other data firms with which they do or could compete, the rules contemplate disparate regulatory treatment that would likely harm competition and innovation without evident corresponding benefit to consumers.

In particular, we focus on the FCC’s failure to justify treating ISPs differently than other competitors, and its failure to justify more stringent treatment for ISPs in general:

In short, the Commission has not made a convincing case that discrimination between ISPs and edge providers makes sense for the industry or for consumer welfare. The overwhelming body of evidence upon which other regulators have relied in addressing privacy concerns urges against a hard opt-in approach. That same evidence and analysis supports a consistent regulatory approach for all competitors, and nowhere advocates for a differential approach for ISPs when they are participating in the broader informatics and advertising markets.

With respect to the proposed opt-in regime, the NPRM ignores the weight of economic evidence on opt-in rules and fails to justify the specific rules it prescribes. Of most significance is the imposition of this opt-in requirement for the sharing of non-sensitive data.

On net opt-in regimes may tend to favor the status quo, and to maintain or grow the position of a few dominant firms. Opt-in imposes additional costs on consumers and hurts competition — and it may not offer any additional protections over opt-out. In the absence of any meaningful evidence or rigorous economic analysis to the contrary, the Commission should eschew imposing such a potentially harmful regime on broadband and data markets.

Finally, we explain that, although the NPRM purports to embrace a regulatory regime consistent with the current “federal privacy regime,” and particularly the FTC’s approach to privacy regulation, it actually does no such thing — a sentiment echoed by a host of current and former FTC staff and commissioners, including the Bureau of Consumer Protection staff, Commissioner Maureen Ohlhausen, former Chairman Jon Leibowitz, former Commissioner Josh Wright, and former BCP Director Howard Beales.

Our full comments are available here.

While we all wait on pins and needles for the DC Circuit to issue its long-expected ruling on the FCC’s Open Internet Order, another federal appeals court has pushed back on Tom Wheeler’s FCC for its unremitting “just trust us” approach to federal rulemaking.

The case, round three of Prometheus, et al. v. FCC, involves the FCC’s long-standing rules restricting common ownership of local broadcast stations and their extension by Tom Wheeler’s FCC to the use of joint sales agreements (JSAs). (For more background see our previous post here). Once again the FCC lost (it’s now only 1 for 3 in this case…), as the Third Circuit Court of Appeals took the Commission to task for failing to establish that its broadcast ownership rules were still in the public interest, as required by law, before it decided to extend those rules.

While much of the opinion deals with the FCC’s unreasonable delay (of more than 7 years) in completing two Quadrennial Reviews in relation to its diversity rules, the court also vacated the FCC’s expansion of its duopoly rule (also known as the local television ownership rule) to ban joint sales agreements, because the Commission imposed the ban without first undertaking those reviews.

We (the International Center for Law and Economics, along with affiliated scholars of law, economics, and communications) filed an amicus brief arguing for precisely this result, noting that

the 2014 Order [] dramatically expands its scope by amending the FCC’s local ownership attribution rules to make the rule applicable to JSAs, which had never before been subject to it. The Commission thereby suddenly declares unlawful JSAs in scores of local markets, many of which have been operating for a decade or longer without any harm to competition. Even more remarkably, it does so despite the fact that both the DOJ and the FCC itself had previously reviewed many of these JSAs and concluded that they were not likely to lessen competition. In doing so, the FCC also fails to examine the empirical evidence accumulated over the nearly two decades some of these JSAs have been operating. That evidence shows that many of these JSAs have substantially reduced the costs of operating TV stations and improved the quality of their programming without causing any harm to competition, thereby serving the public interest.

The Third Circuit agreed that the FCC utterly failed to justify its continued foray into banning potentially pro-competitive arrangements, finding that

the Commission violated § 202(h) by expanding the reach of the ownership rules without first justifying their preexisting scope through a Quadrennial Review. In Prometheus I we made clear that § 202(h) requires that “no matter what the Commission decides to do to any particular rule—retain, repeal, or modify (whether to make more or less stringent)—it must do so in the public interest and support its decision with a reasoned analysis.” Prometheus I, 373 F.3d at 395. Attribution of television JSAs modifies the Commission’s ownership rules by making them more stringent. And, unless the Commission determines that the preexisting ownership rules are sound, it cannot logically demonstrate that an expansion is in the public interest. Put differently, we cannot decide whether the Commission’s rationale—the need to avoid circumvention of ownership rules—makes sense without knowing whether those rules are in the public interest. If they are not, then the public interest might not be served by closing loopholes to rules that should no longer exist.

Perhaps this decision will be a harbinger of good things to come. The FCC — and especially Tom Wheeler’s FCC — has a history of failing to justify its rules with anything approaching rigorous analysis. The Open Internet Order is a case in point. We will all be better off if courts begin to hold the Commission’s feet to the fire and throw out its rules when the FCC fails to do the work needed to justify them.

Netflix’s latest net neutrality hypocrisy (yes, there have been others. See here and here, for example) involves its long-term, undisclosed throttling of its video traffic on AT&T’s and Verizon’s wireless networks, while it lobbied heavily for net neutrality rules from the FCC that would prevent just such throttling by ISPs.

It was Netflix that coined the term “strong net neutrality,” in an effort to import interconnection (the connections between ISPs and edge provider networks) into the net neutrality fold. That alone was a bastardization of what net neutrality purportedly stood for, as I previously noted:

There is a reason every iteration of the FCC’s net neutrality rules, including the latest, have explicitly not applied to backbone interconnection agreements: Interconnection over the backbone has always been open and competitive, and it simply doesn’t give rise to the kind of discrimination concerns net neutrality is meant to address.

That Netflix would prefer not to pay for delivery of its content isn’t surprising. But net neutrality regulations don’t — and shouldn’t — have anything to do with it.

But Netflix did something else with “strong net neutrality.” It tied it to consumer choice:

This weak net neutrality isn’t enough to protect an open, competitive Internet; a stronger form of net neutrality is required. Strong net neutrality additionally prevents ISPs from charging a toll for interconnection to services like Netflix, YouTube, or Skype, or intermediaries such as Cogent, Akamai or Level 3, to deliver the services and data requested by ISP residential subscribers. Instead, they must provide sufficient access to their network without charge. (Emphasis added).

A focus on consumers is laudable, of course, but if consumers are the concern, there’s no reason to differentiate between ISPs (to whom net neutrality rules apply) and content providers that enter into contracts with ISPs to deliver their content (to whom net neutrality rules don’t apply).

And Netflix has just shown us exactly why that’s the case.

Netflix can and does engage in management of its streams in order (presumably) to optimize consumer experience as users move between networks, devices and viewers (e.g., native apps vs Internet browser windows) with very different characteristics and limitations. That’s all well and good. But as we noted in our Policy Comments in the FCC’s Open Internet Order proceeding,

In this circumstance, particularly when the content in question is Netflix, with 30% of network traffic, both the network’s and the content provider’s transmission decisions may be determinative of network quality, as may the users’ device and application choices.

As a 2011 paper by a group of network engineers studying the network characteristics of video streaming data from Netflix and YouTube noted:

This is a concern as it means that a sudden change of application or container in a large population might have a significant impact on the network traffic. Considering the very fast changes in trends this is a real possibility, the most likely being a change from Flash to HTML5 along with an increase in the use of mobile devices…. [S]treaming videos at high resolutions can result in smoother aggregate traffic while at the same time linearly increase the aggregate data rate due to video streaming.

Again, a concern with consumers is admirable, but Netflix isn’t concerned with consumers. It’s concerned at most with consumers of Netflix, while they are consuming Netflix. But the reality is that Netflix’s content management decisions can adversely affect consumers overall, including its own subscribers when they aren’t watching Netflix.

And here’s the huge irony. The FCC’s net neutrality rules are tailor-made to guarantee that Netflix will never have any incentive to take these externalities into account in its own decisions. What’s more, they ensure that ISPs are severely hamstrung in managing their networks for the benefit of all consumers, not least because their interconnection deals with large content providers like Netflix are now being closely scrutinized.

It’s great that Netflix thinks it should manage its video delivery to optimize viewing under different network conditions. But net neutrality rules ensure that Netflix bears no cost for overwhelming the network in the process. Essentially, short of building new capacity — at great expense to all ISP subscribers, of course — ISPs can’t do much about it, either, under the rules. And, of course, the rules also make it impossible for ISPs to negotiate for financial help from Netflix (or its heaviest users) in paying for those upgrades.

On top of this, net neutrality advocates have taken aim at usage-based billing and other pricing practices that would help with the problem by enabling ISPs to charge their heaviest users more in order to alleviate the inherent subsidy by normal users that flat-rate billing entails. (Netflix itself, as one of the articles linked above discusses at length, is hypocritically inconsistent on this score).

As we also noted in our OIO Policy Comments:

The idea that consumers and competition generally are better off when content providers face no incentive to take account of congestion externalities in their pricing (or when users have no incentive to take account of their own usage) runs counter to basic economic logic and is unsupported by the evidence. In fact, contrary to such claims, usage-based pricing, congestion pricing and sponsored content, among other nonlinear pricing models, would, in many circumstances, further incentivize networks to expand capacity (not create artificial scarcity).

Some concern for consumers. Under Netflix’s approach consumers get it coming and going: Either their non-Netflix traffic is compromised for the sake of Netflix’s traffic, or they have to pay higher subscription fees to ISPs for the privilege of accommodating Netflix’s ever-expanding traffic loads (4K videos, anyone?) — whether they ever use Netflix or not.

Sometimes, apparently, Netflix throttles its own traffic in order to “help” a few consumers. (That it does so without disclosing the practice is pretty galling, especially given the enhanced transparency rules in the Open Internet Order — something Netflix also advocated for, and which also apply only to ISPs and not to content providers). But its self-aggrandizing advocacy for the FCC’s latest net neutrality rules reveals that its first priority is to screw over consumers, so long as it can shift the blame and the cost to others.

The FCC doesn’t have authority over the edge and doesn’t want authority over the edge. Well, that is until it finds itself with no choice but to regulate the edge as a result of its own policies. As the FCC begins to explore its new authority to regulate privacy under the Open Internet Order (“OIO”), for instance, it will run up against policy conflicts and inconsistencies that will make it increasingly hard to justify forbearance from regulating edge providers.

Take for example the recently announced NPRM titled “Expanding Consumers’ Video Navigation Choices” — a proposal that seeks to force cable companies to provide video programming to third-party set-top box manufacturers. Under the proposed rules, MVPDs would be required to expose three data streams to competitors: (1) listing information about what is available to particular customers; (2) the rights associated with accessing such content; and (3) the actual video content. As Geoff Manne has aptly noted, this seems to be much more of an effort to eliminate the “nightmare” of “too many remote controls” than it is to actually expand consumer choice in a market that is essentially drowning in consumer choice. But of course even so innocuous a goal—which is probably more about picking on cable companies because… “eww cable companies”—raises some very important questions.

First, the market for video on cable systems is governed by a highly interdependent web of contracts that assures a wide variety of parties that their bargained-for rights will be respected. Among other things, channels negotiate for particular placements and channel numbers in a cable system’s lineup, IP rights holders bargain for content to be made available only at certain times and at certain locations, and advertisers pay for their ads to be inserted into channel streams and broadcasts.

Moreover, to a large extent, the content industry develops its content based on a stable regime of bargained-for contractual terms with cable distribution networks (among others). Disrupting the ability of cable companies to control access to their video streams will undoubtedly alter the underlying assumptions upon which IP companies rely when planning and investing in content development. And, of course, the physical networks and their related equipment have been engineered around the current cable-access regimes. Some non-trivial amount of re-engineering will have to take place to make cable networks compatible with a more “open” set-top box market.

The FCC nods to these concerns in its NPRM, when it notes that its “goal is to preserve the contractual arrangements between programmers and MVPDs, while creating additional opportunities for programmers[.]” But this aspiration is not clearly given effect in the NPRM, and, as noted, some contractual arrangements are simply inconsistent with the NPRM’s approach.

Second, the FCC proposes to bind third-party manufacturers to the public interest privacy commitments in §§ 629, 551 and 338(i) of the Communications Act (“Act”) through a self-certification process. MVPDs would be required to pass the three data streams to third-party providers only once such a certification is received. To the extent that these sections, enforced via self-certification, do not sufficiently curtail third parties’ undesirable behavior, the FCC appears to believe that “the strictest state regulatory regime[s]” and the “European Union privacy regulations” will serve as the necessary regulatory gap fillers.

This seems hard to believe, however, particularly given the recently announced privacy and cybersecurity NPRM, through which the FCC will adopt rules detailing the agency’s new authority (under the OIO) to regulate privacy at the ISP level. Largely, these rules will grow out of §§ 222 and 201 of the Act, which the FCC in Terracom interpreted together to be a general grant of privacy and cybersecurity authority.

I’m apprehensive about the asserted scope of the FCC’s power over privacy — let alone cybersecurity — under §§ 222 and 201. In truth, the FCC makes an admirable showing in Terracom of explaining its reasoning; it does a far better job than the FTC in similar enforcement actions. But there remains a problem. The FTC’s authority is fundamentally cabined by the limitations contained within the FTC Act (even if it frequently chooses to ignore them, they are there and are theoretically a protection against overreach).

But the FCC’s enforcement decisions are restrained (if at all) by a vague “public interest” mandate, and a claim that it will enforce these privacy principles on a case-by-case basis. Thus, the FCC’s proposed regime is inherently one based on vast agency discretion. As in many other contexts, enforcers with wide discretion and a tremendous power to penalize exert a chilling effect on innovation and openness, as well as a frightening power over a tremendous swath of the economy. For the FCC to claim anything like an unbounded UDAP authority for itself has got to be outside of the archaic grant of authority from § 201, and is certainly a long stretch for the language of § 706 (a provision of the Act which it used as one of the fundamental justifications for the OIO) — leading very possibly to a bout of Chevron problems under precedent such as King v. Burwell and UARG v. EPA.

And there is a real risk here of, if not hypocrisy, then… deep conflict in the way the FCC will strike out on the set-top box and privacy NPRMs. The Commission has already noted in its NPRM that it will not be able to bind third-party providers of set-top boxes under the same privacy requirements that apply to current MVPD providers. Self-certification will go a certain length, but even there agitation from privacy absolutists will possibly sway the FCC to consider more stringent requirements. For instance, §§ 551 and 338 of the Act — which the FCC focuses on in the set-top box NPRM — are really only about disclosing intended uses of consumer data. And disclosures can come in many forms, including burying them in long terms of service that customers frequently do not read. Such “weak” guarantees of consumer privacy will likely become a frequent source of complaint (and FCC filings) for privacy absolutists.  

Further, many of the new set-top box entrants are going to be current providers of OTT video or devices that redistribute OTT video. And many of these providers make a huge share of their revenue from data mining and selling access to customer data. Which means one of two things: Either the FCC is going to just allow us to live in a world of double standards where these self-certifying entities are permitted significantly more leeway in their uses of consumer data than MVPDs, or, alternatively, the FCC is going to discover that it does in fact need to “do something.” If only there were a creative way to extend the new privacy authority under Title II to these providers of set-top boxes… Oh! There is: bring edge providers into the regulatory fold under the OIO.

It’s interesting that Wheeler’s announcement of the FCC’s privacy NPRM explicitly noted that the rules would not be extended to edge providers. That Wheeler felt the need to be explicit in this suggests that he believes that the FCC has the authority to extend the privacy regulations to edge providers, but that it will merely forbear (for now) from doing so.

If edge providers are swept into the scope of Title II they would be subject to the brand new privacy rules the FCC is proposing. Thus, despite itself (or perhaps not), the FCC may find itself in possession of a much larger authority over some edge providers than any of the pro-Title II folks would have dared admit was possible. And the hook (this time) could be the privacy concerns embedded in the FCC’s ill-advised attempt to “open” the set-top box market.

This is a complicated set of issues, and it’s contingent on a number of moving parts. This week, Chairman Wheeler will be facing an appropriations hearing where I hope he will be asked to unpack his thinking regarding the true extent to which the OIO may in fact be extended to the edge.

Last week, FCC General Counsel Jonathan Sallet pulled back the curtain on the FCC staff’s analysis behind its decision to block Comcast’s acquisition of Time Warner Cable. As the FCC staff sets out on its reported Rainbow Tour to reassure regulated companies that it’s not “hostile to the industries it regulates,” Sallet’s remarks suggest it will have an uphill climb. Unfortunately, the staff’s analysis appears to have been unduly speculative, disconnected from critical market realities, and decidedly biased — not characteristics in a regulator that tend to offer much reassurance.

Merger analysis is inherently speculative, but, as courts have repeatedly had occasion to find, the FCC has a penchant for stretching speculation beyond the breaking point, adopting theories of harm that are vaguely possible, even if unlikely and inconsistent with past practice, and poorly supported by empirical evidence. The FCC’s approach here seems to fit this description.

The FCC’s fundamental theory of anticompetitive harm

To begin with, as he must, Sallet acknowledged that there was no direct competitive overlap in the areas served by Comcast and Time Warner Cable, and no consumer would have seen the number of providers available to her changed by the deal.

But the FCC staff viewed this critical fact as “not outcome determinative.” Instead, Sallet explained that the staff’s opposition was based primarily on a concern that the deal might enable Comcast to harm “nascent” OVD competitors in order to protect its video (MVPD) business:

Simply put, the core concern came down to whether the merged firm would have an increased incentive and ability to safeguard its integrated Pay TV business model and video revenues by limiting the ability of OVDs to compete effectively, especially through the use of new business models.

The justification for the concern boiled down to an assumption that the addition of TWC’s subscriber base would be sufficient to render an otherwise too-costly anticompetitive campaign against OVDs worthwhile:

Without the merger, a company taking action against OVDs for the benefit of the Pay TV system as a whole would incur costs but gain additional sales – or protect existing sales — only within its footprint. But the combined entity, having a larger footprint, would internalize more of the external “benefits” provided to other industry members.

The FCC theorized that, by acquiring a larger footprint, Comcast would gain enough bargaining power and leverage, as well as the means to profit from an exclusionary strategy, leading it to employ a range of harmful tactics — such as impairing the quality/speed of OVD streams, imposing data caps, limiting OVD access to TV-connected devices, imposing higher interconnection fees, and saddling OVDs with higher programming costs. It’s difficult to see how such conduct would be permitted under the FCC’s Open Internet Order/Title II regime, but, nevertheless, the staff apparently believed that Comcast would possess a powerful “toolkit” with which to harm OVDs post-transaction.

Comcast’s share of the MVPD market wouldn’t have changed enough to justify the FCC’s purported fears

First, the analysis turned on what Comcast could and would do if it were larger. But Comcast was already the largest ISP and MVPD (now second largest MVPD, post AT&T/DIRECTV) in the nation, and presumably it has approximately the same incentives and ability to disadvantage OVDs today.

In fact, there’s no reason to believe that the growth of Comcast’s MVPD business would cause any material change in its incentives with respect to OVDs. Whatever nefarious incentives the merger allegedly would have created by increasing Comcast’s share of the MVPD market (which is where the purported benefits in the FCC staff’s anticompetitive story would be realized), those incentives would be proportional to the size of the increase in Comcast’s national MVPD market share — which, here, would be about eight percentage points: from 22% to under 30% of the national market.

It’s difficult to believe that Comcast would gain the wherewithal to engage in this costly strategy by adding such a relatively small fraction of the MVPD market (which would still leave other MVPDs serving fully 70% of the market to reap the purported benefits instead of Comcast), but wouldn’t have it at its current size – and there’s no evidence that it has ever employed such strategies with its current market share.

It bears highlighting that the D.C. Circuit has already twice rejected FCC efforts to impose a 30% market cap on MVPDs, based on the Commission’s inability to demonstrate that a greater-than-30% share would create competitive problems, especially given the highly dynamic nature of the MVPD market. In vacating the FCC’s most recent effort to do so in 2009, the D.C. Circuit was resolute in its condemnation of the agency, noting:

In sum, the Commission has failed to demonstrate that allowing a cable operator to serve more than 30% of all [MVPD] subscribers would threaten to reduce either competition or diversity in programming.

The extent of competition and the amount of available programming (including original programming distributed by OVDs themselves) has increased substantially since 2009; this makes the FCC’s competitive claims even less sustainable today.

It’s damning enough to the FCC’s case that there is no marketplace evidence of such conduct or its anticompetitive effects in today’s market. But it’s truly impossible to square the FCC’s assertions about Comcast’s anticompetitive incentives with the fact that, over the past decade, Comcast has made massive investments in broadband, steadily increased broadband speeds, and freely licensed its programming, among other things that have served to enhance OVDs’ long-term viability and growth. Chalk it up to the threat of regulatory intervention or corporate incompetence if you can’t believe that competition alone could be responsible for this largesse, but, whatever the reason, the FCC staff’s fears appear completely unfounded in a marketplace not significantly different than the landscape that would have existed post-merger.

OVDs aren’t vulnerable, and don’t need the FCC’s “help”

After describing the “new entrants” in the market — such unfamiliar and powerless players as Dish, Sony, HBO, and CBS — Sallet claimed that the staff was principally animated by the understanding that

Entrants are particularly vulnerable when competition is nascent. Thus, staff was particularly concerned that this transaction could damage competition in the video distribution industry.

Sallet’s description of OVDs makes them sound like struggling entrepreneurs working in garages. But, in fact, OVDs have radically reshaped the media business and wield enormous clout in the marketplace.

Netflix, for example, describes itself as “the world’s leading Internet television network with over 65 million members in over 50 countries.” New services like Sony Vue and Sling TV are affiliated with giant, well-established media conglomerates. And whatever new offerings emerge from the FCC-approved AT&T/DIRECTV merger will be as well-positioned as any in the market.

In fact, we already know that the concerns of the FCC are off-base because they are of a piece with the misguided assumptions that underlie the Chairman’s recent NPRM to rewrite the MVPD rules to “protect” just these sorts of companies. But the OVDs themselves — the ones with real money and their competitive futures on the line — don’t see the world the way the FCC does, and they’ve resolutely rejected the Chairman’s proposal. Notably, the proposed rules would “protect” these services from exactly the sort of conduct that Sallet claims would have been a consequence of the Comcast-TWC merger.

If they don’t want or need broad protection from such “harms” in the form of revised industry-wide rules, there is surely no justification for the FCC to throttle a merger based on speculation that the same conduct could conceivably arise in the future.

The realities of the broadband market post-merger wouldn’t have supported the FCC’s argument, either

While a larger Comcast might be in a position to realize more of the benefits from the exclusionary strategy Sallet described, it would also incur more of the costs — likely in direct proportion to the increased size of its subscriber base.

Think of it this way: To the extent that an MVPD can possibly constrain an OVD’s scope of distribution for programming, doing so also necessarily makes the MVPD’s own broadband offering less attractive, forcing it to incur a cost that would increase in proportion to the size of the distributor’s broadband market. In this case, as noted, Comcast would have gained MVPD subscribers — but it would have also gained broadband subscribers. In a world where cable is consistently losing video subscribers (as Sallet acknowledged), and where broadband offers higher margins and faster growth, it makes no economic sense that Comcast would have valued the trade-off the way the FCC claims it would have.
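
To make the structure of that trade-off explicit, consider a deliberately stylized back-of-the-envelope sketch (the notation and the linear-proportionality assumptions are illustrative only, not a reconstruction of the staff’s undisclosed analysis; the 22% and 30% figures are the national MVPD shares discussed above). Let the payoff to an exclusionary campaign be

\[
\pi(s, b) \;=\; B \cdot s \;-\; C \cdot b,
\]

where \(s\) is the share of national MVPD subscribers through which the industry-wide benefit \(B\) of impairing OVDs is internalized, \(b\) is the broadband subscriber base over which the cost \(C\) of degrading one’s own broadband offering is incurred, and \(B\) and \(C\) are unobserved (hypothetical) scale parameters. The merger would have moved \(s\) from roughly 0.22 to just under 0.30 while also increasing \(b\), so the change in the payoff is approximately

\[
\Delta\pi \;\approx\; 0.08\,B \;-\; C \cdot \Delta b .
\]

On these stylized assumptions, the transaction makes the exclusionary strategy newly worthwhile only if the benefit captured through roughly eight additional percentage points of MVPD share outweighs the additional broadband-side cost that comes with a larger broadband subscriber base.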

Moreover, in light of the existing conditions imposed on Comcast under the Comcast/NBCU merger order from 2011 (which last for a few more years) and the restrictions adopted in the Open Internet Order, Comcast’s ability to engage in the sort of exclusionary conduct described by Sallet would be severely limited, if not non-existent. Nor, of course, is there any guarantee that former or would-be OVD subscribers would choose to subscribe to, or pay more for, any MVPD in lieu of OVDs. Meanwhile, many of the relevant substitutes in the MVPD market (like AT&T and Verizon FiOS) also offer broadband services – thereby increasing the costs that would be incurred in the broadband market even more, as many subscribers would shift not only their MVPD, but also their broadband service, in response to Comcast degrading OVDs.

And speaking of the Open Internet Order — wasn’t that supposed to prevent ISPs like Comcast from acting on their alleged incentives to impede the quality of, or access to, edge providers like OVDs? Why is merger enforcement necessary to accomplish the same thing once Title II and the rest of the Open Internet Order are in place? And if the argument is that the Open Internet Order might be defeated, aside from the completely speculative nature of such a claim, why wouldn’t a merger condition that imposed the same constraints on Comcast – as was done in the Comcast/NBCU merger order by imposing the former net neutrality rules on Comcast – be perfectly sufficient?

While the FCC staff analysis accepted as true (again, contrary to current marketplace evidence) that a bigger Comcast would have more incentive to harm OVDs post-merger, it rejected arguments that there could be countervailing benefits to OVDs and others from this same increase in scale. Thus, things like incremental broadband investments and speed increases, a larger Wi-Fi network, and greater business services market competition – things that Comcast is already doing and would have done on a greater and more-accelerated scale in the acquired territories post-transaction – were deemed insufficient to outweigh the expected costs of the staff’s entirely speculative anticompetitive theory.

In reality, however, not only OVDs, but consumers – and especially TWC subscribers – would have benefited from the merger through access to Comcast’s faster broadband speeds, its new investments, and its superior video offerings on the X1 platform, among other things. Many low-income families would have benefited from expansion of Comcast’s Internet Essentials program, and many businesses would have benefited from the addition of a more effective competitor to the incumbent providers that currently dominate the business services market. Yet these and other verifiable benefits were given short shrift in the agency’s analysis because they “were viewed by staff as incapable of outweighing the potential harms.”

The assumptions underlying the FCC staff’s analysis of the broadband market are arbitrary and unsupportable

Sallet’s claim that the combined firm would have 60% of all high-speed broadband subscribers in the U.S. necessarily assumes a national broadband market defined at speeds of 25 Mbps or higher, and that market definition is a red herring.

The FCC has not explained why 25 Mbps is a meaningful benchmark for antitrust analysis. The FCC itself endorsed a 10 Mbps baseline for its Connect America Fund last December, noting that over 70% of current broadband users subscribe to speeds of less than 25 Mbps, even in areas where faster speeds are available. And streaming online video, the most oft-cited reason for needing high bandwidth, doesn’t require 25 Mbps: Netflix says 5 Mbps is the most that’s required for an HD stream, while Amazon (3.5 Mbps) and Hulu (1.5 Mbps) recommend even less.
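To put those figures in perspective, here is a minimal, purely illustrative Python sketch (not drawn from the record; the stream names and rates are simply the ones cited above) comparing those bitrates against a few common connection-speed tiers. It assumes away protocol overhead and simultaneous household use:

```python
# Illustrative only: compares the streaming bitrates cited above with a few
# connection-speed tiers discussed in the text (all figures in Mbps).
STREAM_REQUIREMENTS = {"Netflix HD": 5.0, "Amazon": 3.5, "Hulu": 1.5}
CONNECTION_SPEEDS = [1.5, 3.5, 5, 10, 25]

for speed in CONNECTION_SPEEDS:
    supported = [name for name, rate in STREAM_REQUIREMENTS.items() if rate <= speed]
    label = ", ".join(supported) if supported else "none of the cited streams"
    print(f"{speed:>4} Mbps connection can carry: {label}")
```

Even a 10 Mbps connection clears every one of the cited streaming requirements, which is precisely why the 25 Mbps cutoff does so little analytical work here.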

What’s more, by choosing an arbitrary, faster speed to define the scope of the broadband market (in an effort to assert the non-competitiveness of the market, and thereby justify its broadband regulations), the agency has – without proper analysis or grounding, in my view – unjustifiably shrunk the size of the relevant market. But, as it happens, doing so also shrinks the size of the increase in “national market share” that the merger would have brought about.

Recall that the staff’s theory was premised on the idea that the merger would give Comcast control over enough of the broadband market that it could unilaterally impose costs on OVDs sufficient to impair their ability to reach or sustain minimum viable scale. But Comcast would have added only one percent of this invented “market” as a result of the merger. It strains credulity to assert that there could be any transaction-specific harm from an increase in market share equivalent to a rounding error.
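A back-of-the-envelope calculation, using only the figures already discussed (a roughly 60% post-merger share of the 25-Mbps-and-up “market,” of which the transaction itself would contribute about one percentage point), makes the point concrete:

\[
\Delta s \;=\; s_{\text{post-merger}} - s_{\text{pre-merger}} \;\approx\; 60\% - 59\% \;=\; 1\ \text{percentage point}
\]

On these numbers, any transaction-specific theory of harm must rest on that single-point increment, not on the share Comcast already held before the deal.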

In any case, basing its rejection of the merger on a manufactured 25 Mbps relevant market creates perverse incentives and will likely do far more to harm OVDs than realization of even the staff’s worst fears about the merger ever could have.

The FCC says it wants higher speeds, and it wants firms to invest in faster broadband. But here Comcast did just that, and then was punished for it. Rather than acknowledging Comcast’s ongoing broadband investments as strong indication that the FCC staff’s analysis might be on the wrong track, the FCC leadership simply sidestepped that inconvenient truth by redefining the market.

The lesson is that if you make your product too good, you’ll end up with an impermissibly high share of the market you create and be punished for it. This can’t possibly promote the public interest.

Furthermore, the staff’s analysis of competitive effects, even within this ersatz market, isn’t likely supportable. As noted, most subscribers access OVDs over connections that deliver content at speeds well below the invented 25 Mbps benchmark, and they pay the same prices for OVD subscriptions as subscribers who receive their content at 25 Mbps. Confronted with the choice between 25 Mbps and 10 Mbps (or less), the majority of consumers voluntarily opt for slower speeds, and they purchase service from Netflix and other OVDs in droves nonetheless.

The upshot? Contrary to the implications on which the staff’s analysis rests, if Comcast were to somehow “degrade” OVD content on 25 Mbps networks so that it was delivered with the characteristics of video delivered over a 10 Mbps network, real-world, observed consumer preferences suggest it wouldn’t harm OVDs’ access to consumers at all. This is especially true given that OVDs often have a global focus and reach (again, Netflix has 65 million subscribers in over 50 countries), making any claims that Comcast could successfully foreclose them from the relevant market even more suspect.

At the same time, while the staff apparently viewed the broadband alternatives as “limited,” the reality is that Comcast, like other broadband providers, is surrounded by capable competitors, including, among others, AT&T, Verizon, CenturyLink, Google Fiber, many advanced VDSL and fiber-based Internet service providers, and high-speed mobile wireless providers. The FCC understated the impact of this robust, dynamic, and ever-increasing competition, and its analysis entirely ignored rapidly growing mobile wireless broadband.

Finally, as noted, Sallet claimed that the staff determined that merger conditions would be insufficient to remedy its concerns, without any further explanation. Yet the Commission identified similar concerns about OVDs in both the Comcast/NBCUniversal and AT&T/DIRECTV transactions, and adopted remedies to address those concerns. We know the agency is capable of drafting behavioral conditions, and we know they have teeth, as demonstrated by prior FCC enforcement actions. It’s hard to understand why similar, adequate conditions could not have been fashioned for this transaction.

In the end, while I appreciate Sallet’s attempt to explain the FCC’s decision to reject the Comcast/TWC merger, based on the foregoing I’m not sure that Comcast could have made any argument or showing that would have dissuaded the FCC from challenging it. Comcast presented a strong economic analysis answering the staff’s concerns discussed above, all to no avail. It’s difficult to escape the conclusion that this was a politically driven result, not one rigorously grounded in the facts or marketplace reality.

Yesterday, the International Center for Law & Economics, together with Professor Gus Hurwitz, Nebraska College of Law, and nine other scholars of law and economics, filed an amicus brief in the DC Circuit explaining why the court should vacate the FCC’s 2015 Open Internet Order.

A few key points from ICLE’s brief follow, but you can read a longer summary of the brief here.

If the 2010 Order was a limited incursion into neighboring territory, the 2015 Order represents the outright colonization of a foreign land, extending FCC control over the Internet far beyond what the Telecommunications Act authorizes.

The Commission asserts vast powers — powers that Congress never gave it — not just over broadband but also over the very ‘edge’ providers it claims to be protecting. The court should be very skeptical of the FCC’s claims to pervasive powers over the Internet.

In the 2015 Order, the FCC invoked Title II, admitted that it was unworkable for the Internet, and then tried to ‘tailor’ the statute to avoid its worst excesses.

That the FCC felt the need for such sweeping forbearance should have indicated to it that it had ‘taken an interpretive wrong turn’ in understanding the statute Congress gave it. Last year, the Supreme Court blocked a similar attempt by the EPA to ‘modernize’ old legislation in a way that gave it expansive new powers. In its landmark UARG decision, the Court made clear that it won’t allow regulatory agencies to rewrite legislation in an effort to retrofit their statutes to their preferred regulatory regimes.

Internet regulation is a question of ‘vast economic and political significance,’ yet the FCC didn’t even bother to weigh the costs and benefits of its rule.

FCC Chairman Tom Wheeler never misses an opportunity to talk about the Internet as ‘the most important network known to Man.’ So why did he and the previous FCC Chairman ignore requests from other commissioners for serious, independent economic analysis of the supposed problem and the best way to address it? Why did the FCC rush to adopt a plan that had the effect of blocking the Federal Trade Commission from applying its consumer protection laws to the Internet? For all the FCC’s talk about protecting consumers, it appears that its real agenda may be simply expanding its own power.

Joining ICLE on the brief are:

  • Richard Epstein (NYU Law)
  • James Huffman (Lewis & Clark Law)
  • Gus Hurwitz (Nebraska Law)
  • Thom Lambert (Missouri Law)
  • Daniel Lyons (Boston College Law)
  • Geoffrey Manne (ICLE)
  • Randy May (Free State Foundation)
  • Jeremy Rabkin (GMU Law)
  • Ronald Rotunda (Chapman Law)
  • Ilya Somin (GMU Law)

Read the brief here, and the summary here.

Read more of ICLE’s work on net neutrality and Title II, including:

  • Highlights from policy and legal comments filed by ICLE and TechFreedom on net neutrality
  • “Regulating the Most Powerful Network Ever,” a scholarly essay by Gus Hurwitz for the Free State Foundation
  • “How to Break the Internet,” an essay by Geoffrey Manne and Ben Sperry, in Reason Magazine
  • “The FCC’s Net Neutrality Victory is Anything But,” an op-ed by Geoffrey Manne, in Wired
  • “The Feds Lost on Net Neutrality, But Won Control of the Internet,” an op-ed by Geoffrey Manne and Berin Szoka in Wired
  • “Net Neutrality’s Hollow Promise to Startups,” an op-ed by Geoffrey Manne and Berin Szoka in Computerworld
  • Letter signed by 32 scholars urging the FTC to caution the FCC against adopting per se net neutrality rules by reclassifying ISPs under Title II
  • The FCC’s Open Internet Roundtables, Policy Approaches, Panel 3, Enhancing Transparency, with Geoffrey Manne