Archives For FCC

The FCC’s “Open Internet Order,” which would impose heavy-handed “common carrier” regulation of Internet service providers in order to promote “net neutrality” (the Order is being appealed in federal court, and there are good arguments for striking it down), is fundamentally misconceived.  If upheld, it will slow innovation, impose substantial costs, and harm consumers (see Heritage Foundation commentaries on FCC Internet regulation here, here, here, and here).  What’s more, it is not needed to protect consumers and competition from potential future abuse by Internet firms.  As I explain in a Heritage Foundation Legal Memorandum published yesterday, should the Open Internet Order be struck down, the U.S. Federal Trade Commission (FTC) has ample authority under Section 5 of the Federal Trade Commission Act (FTC Act) to challenge any harmful conduct by entities involved in Internet broadband services markets when such conduct undermines competition or harms consumers.

Section 5 of the FTC Act authorizes the FTC to prevent persons, partnerships, or corporations from engaging in “unfair methods of competition” or “unfair or deceptive acts or practices” in or affecting commerce.  This gives it ample authority to challenge Internet abuses raising antitrust (unfair methods) and consumer protection (unfair acts or practices) issues.

On the antitrust side, in evaluating individual business restraints under a “rule of reason,” the FTC relies on objective fact-specific analyses of the actual economic and consumer protection implications of a particular restraint.  Thus, FTC evaluations of broadband industry restrictions are likely to be more objective and predictable than highly subjective “public interest” assessments by the FCC, leading to reduced error and lower planning costs for purveyors of broadband and related services.  Appropriate antitrust evaluation should accord broad leeway to most broadband contracts.  As FTC Commissioner Josh Wright put it in testifying before Congress, “fundamental observation and market experience [demonstrate] that the business practices at the heart of the net neutrality debate are generally procompetitive.”  This suggests application of a rule of reason that will fully weigh efficiencies but not shy away from challenging broadband-related contractual arrangements that undermine the competitive process.

On the consumer protection side, the FTC can attack statements made by businesses that mislead and thereby impose harm on consumers (including business purchasers) who are acting reasonably.  It can also challenge practices that, though not literally false or deceptive, impose substantial harm on consumers (including business purchasers) that they cannot reasonably avoid, assuming the harm is greater than any countervailing benefits.  These are carefully designed and cabined sources of authority that require the FTC to determine the presence of actual consumer harm before acting.  Application of the FTC’s unfairness and deception powers therefore lacks the uncertainty associated with the FCC’s uncabined and vague “public interest” standard of evaluation.  As in the case of antitrust, the existence of greater clarity and a well-defined analytic methodology suggests that reliance on FTC rather than FCC enforcement in this area is preferable from a policy standpoint.

Finally, arguments for relying on FTC Internet policing are based on experience as well – the FTC is no Internet policy novice.  It closely monitors Internet activity and, over the years, it has developed substantial expertise in Internet topics through research, hearings, and enforcement actions.

Most recently, for example, the FTC sued AT&T in federal court for allegedly slowing wireless customers’ Internet speeds even though those customers had subscribed to “unlimited” data usage plans.  The FTC asserted that in offering renewals to unlimited-plan customers, AT&T did not adequately inform them of a new policy to “throttle” (drastically reduce the speed of) customer data service once a certain monthly data usage cap was met. The direct harm of throttling came on top of the high fees that dissatisfied customers would face for terminating their service early.  The FTC characterized this behavior as both “unfair” and “deceptive.”  Moreover, the Commission claimed that throttling-related speed reductions and data restrictions were not determined by real-time network congestion and thus did not even qualify as reasonable network management activity.  This case illustrates that the FTC is perfectly capable of challenging potential “network neutrality” violations that harm consumer welfare (since “throttled” customers receive service that is inferior to the service afforded customers on “tiered” service plans), and thus FCC involvement is unwarranted.

In sum, if a court strikes down the latest FCC effort to regulate the Internet, the FTC has ample authority to address competition and consumer protection problems in the area of broadband, including questions related to net neutrality.  The FTC’s highly structured, analytic, fact-based approach to these issues is superior to FCC net neutrality regulation based on vague and unfocused notions of the public interest.  If a court does not act, Congress might wish to consider legislation to prohibit FCC Internet regulation and leave oversight of potential competitive and consumer abuses to the FTC.

Recently, Commissioner Pai praised the introduction of bipartisan legislation to protect joint sales agreements (“JSAs”) between local television stations. He explained that

JSAs are contractual agreements that allow broadcasters to cut down on costs by using the same advertising sales force. The efficiencies created by JSAs have helped broadcasters to offer services that benefit consumers, especially in smaller markets…. JSAs have served communities well and have promoted localism and diversity in broadcasting. Unfortunately, the FCC’s new restrictions on JSAs have already caused some stations to go off the air and other stations to carry less local news.

The “new restrictions” to which Commissioner Pai refers were recently challenged in court by the National Association of Broadcasters (NAB), et al., and on April 20, the International Center for Law & Economics and a group of law and economics scholars filed an amicus brief with the D.C. Circuit Court of Appeals in support of the petition, asking the court to review the FCC’s local media ownership duopoly rule restricting JSAs.

Much as it did with net neutrality, the FCC is looking to extend another set of rules with no basis in sound economic theory or established facts.

At issue is the FCC’s decision both to retain the duopoly rule and to extend that rule to certain JSAs, all without completing a legally mandated review of the local media ownership rules, due since 2010 (but last completed in 2007).

The duopoly rule is at odds with sound competition policy because it fails to account for drastic changes in the media market that necessitate redefinition of the market for television advertising. Moreover, its extension will bring a halt to JSAs currently operating (and operating well) in nearly 100 markets.  As the evidence on the FCC rulemaking record shows, many of these JSAs offer public interest benefits and actually foster, rather than stifle, competition in broadcast television markets.

In the world of media mergers generally, competition law hasn’t yet caught up to the obvious truth that new media is competing with old media for eyeballs and advertising dollars in basically every marketplace.

For instance, the FTC has relied on very narrow market definitions to challenge newspaper mergers without recognizing competition from television and the Internet. Similarly, the generally accepted market in which Google’s search conduct has been investigated is something like “online search advertising” — a market definition that excludes traditional marketing channels, despite the fact that advertisers shift their spending between these channels on a regular basis.

But the FCC fares even worse here. The FCC’s duopoly rule is premised on an “eight voices” test for local broadcast stations regardless of the market shares of the merging stations. In other words, one entity cannot own FCC licenses to two or more TV stations in the same local market unless there are at least eight independently owned stations in that market, even if the stations’ combined share of the audience or of advertising is below the level that could conceivably give rise to any inference of market power.

Such a rule is completely unjustifiable under any sensible understanding of competition law.

Can you even imagine the FTC or DOJ bringing an 8-to-7 merger challenge in any marketplace? The rule is also inconsistent with the contemporary economic learning incorporated into the 2010 Merger Guidelines, which look at competitive effects rather than just counting competitors.

Not only did the FCC fail to analyze the marketplace to understand how much competition there is between local broadcasters, cable, and online video, but, on top of that, the FCC applied this outdated duopoly rule to JSAs without considering their benefits.

The Commission offers no explanation as to why it now believes that extending the duopoly rule to JSAs, many of which it had previously approved, is suddenly necessary to protect competition or otherwise serve the public interest. Nor does the FCC cite any evidence to support its position. In fact, the record evidence actually points overwhelmingly in the opposite direction.

As a matter of sound regulatory practice, this is bad enough. But Congress directed the FCC in Section 202(h) of the Telecommunications Act of 1996 to review all of its local ownership rules every four years to determine whether they were still “necessary in the public interest as the result of competition,” and to repeal or modify those that weren’t. During this review, the FCC must examine the relevant data and articulate a satisfactory explanation for its decision.

So what did the Commission do? It announced that, instead of completing its statutorily mandated 2010 quadrennial review of its local ownership rules, it would roll that review into a new 2014 quadrennial review (which it has yet to perform). Meanwhile, the Commission decided to retain its duopoly rule pending completion of that review because it had “tentatively” concluded that it was still necessary.

In other words, the FCC hasn’t conducted its mandatory quadrennial review in more than seven years, and won’t, under the new rules, conduct one for another year and a half (at least). Oh, and, as if nothing of relevance has changed in the market since then, it “tentatively” maintains its already suspect duopoly rule in the meantime.

In short, because the FCC didn’t conduct the review mandated by statute, there is no factual support for the 2014 Order. By relying on the outdated findings from its earlier review, the 2014 Order fails to examine the significant changes both in competition policy and in the market for video programming that have occurred since the current form of the rule was first adopted, rendering the rulemaking arbitrary and capricious under well-established case law.

Had the FCC examined the record of the current rulemaking, it would have found substantial evidence that undermines, rather than supports, the FCC’s rule.

Economic studies have shown that JSAs can help small broadcasters compete more effectively with cable and online video in a world where their advertising revenues are drying up and where temporary economies of scale (through limited contractual arrangements like JSAs) can help smaller, local advertising outlets better implement giant, national advertising campaigns. A ban on JSAs will actually make it less likely that competition among local broadcasters can survive, not more.

Commissioner Pai, in his dissenting statement to the 2014 Order, offered a number of examples of the benefits of JSAs (all of them studiously ignored by the Commission in its Order). In one of these, a JSA enabled two stations in Joplin, Missouri to use $3.5 million in cost savings to upgrade their Doppler radar system, which helped save lives when a devastating tornado hit the town in 2011. But such benefits figure nowhere in the FCC’s “analysis.”

Several econometric studies also provide empirical support for the (also neglected) contention that duopolies and JSAs enable stations to improve the quality and prices of their programming.

One study, by Jeff Eisenach and Kevin Caves, shows that stations operating under these agreements are likely to carry significantly more news, public affairs, and current affairs programming than other stations in their markets. The same study found an 11 percent increase in audience shares for stations acquired through a duopoly. Meanwhile, a study by Hal Singer and Kevin Caves shows that markets with JSAs have advertising prices that are, on average, roughly 16 percent lower than in non-duopoly markets — not higher, as would be expected if JSAs harmed competition.

And again, Commissioner Pai provides several examples of these benefits in his dissenting statement. In one of these, a JSA in Wichita, Kansas enabled one of the two stations to provide Spanish-language HD programming, including news, weather, emergency and community information, in a market where that Spanish-language programming had not previously been available. Again — benefit ignored.

Moreover, in retaining its duopoly rule on the basis of woefully outdated evidence, the FCC completely ignores the continuing evolution in the market for video programming.

In reality, competition from non-broadcast sources of programming has increased dramatically since 1999. Among other things:

  • Today, over 85 percent of American households watch TV over cable or satellite. Most households now have access to nearly 200 cable channels that compete with broadcast TV for programming content and viewers.
  • In 2014, these cable channels attracted twice as many viewers as broadcast channels.
  • Online video services such as Netflix, Amazon Prime, and Hulu have begun to emerge as major new competitors for video programming, leading 179,000 households to “cut the cord” and cancel their cable subscriptions in the third quarter of 2014 alone.
  • Today, 40 percent of U.S. households subscribe to an online streaming service; as a result, cable ratings among adults fell by nine percent in 2014.
  • At the end of 2007, when the FCC completed its last quadrennial review, the iPhone had just been introduced, and the launch of the iPad was still more than two years away. Today, two-thirds of Americans have a smartphone or tablet over which they can receive video content, using technology that didn’t even exist when the FCC last amended its duopoly rule.

In the face of this evidence, and without any contrary evidence of its own, the Commission’s action in reversing 25 years of agency practice and extending its duopoly rule to most JSAs is arbitrary and capricious.

The law is pretty clear that the extent of support adduced by the FCC in its 2014 Rule is insufficient. Among other relevant precedent (and there is a lot of it):

The Supreme Court has held that an agency

must examine the relevant data and articulate a satisfactory explanation for its action, including a rational connection between the facts found and the choice made.

In the D.C. Circuit:

the agency must explain why it decided to act as it did. The agency’s statement must be one of ‘reasoning’; it must not be just a ‘conclusion’; it must ‘articulate a satisfactory explanation’ for its action.

And:

[A]n agency acts arbitrarily and capriciously when it abruptly departs from a position it previously held without satisfactorily explaining its reason for doing so.

Also:

The FCC ‘cannot silently depart from previous policies or ignore precedent’ . . . .

And most recently in Judge Silberman’s concurrence/dissent in Verizon v. FCC, the case reviewing the 2010 Open Internet Order:

factual determinations that underly [sic] regulations must still be premised on demonstrated — and reasonable — evidential support

None of these standards is met in this case.

It will be interesting to see what the D.C. Circuit does with these arguments, given the pending Petitions for Review of the latest Open Internet Order. There, too, the FCC acted without sufficient evidentiary support for its actions. The NAB/Stirk Holdings case may well turn out to be a bellwether for how the court views the FCC’s evidentiary failings in that case, as well.

The scholars joining ICLE on the brief are:

  • Babette E. Boliek, Associate Professor of Law, Pepperdine School of Law
  • Henry N. Butler, George Mason University Foundation Professor of Law and Executive Director of the Law & Economics Center, George Mason University School of Law (and newly appointed dean).
  • Richard Epstein, Laurence A. Tisch Professor of Law, Classical Liberal Institute, New York University School of Law
  • Stan Liebowitz, Ashbel Smith Professor of Economics, University of Texas at Dallas
  • Fred McChesney, de la Cruz-Mentschikoff Endowed Chair in Law and Economics, University of Miami School of Law
  • Paul H. Rubin, Samuel Candler Dobbs Professor of Economics, Emory University
  • Michael E. Sykuta, Associate Professor in the Division of Applied Social Sciences and Director of the Contracting and Organizations Research Institute, University of Missouri

The full amicus brief is available here.

The Wall Street Journal dropped an FCC bombshell last week, although I’m not sure anyone noticed. In an article ostensibly about the possible role that MFNs might play in the Comcast/Time Warner Cable merger, the Journal noted that

The FCC is encouraging big media companies to offer feedback confidentially on Comcast’s $45-billion offer for Time Warner Cable.

Not only is the FCC holding secret meetings, but it is encouraging Comcast’s and TWC’s commercial rivals to hold confidential meetings and to submit information under seal. This is not a normal part of ex parte proceedings at the FCC.

In the typical proceeding of this sort – known as a “permit-but-disclose proceeding” – ex parte communications are subject to a host of disclosure requirements delineated in 47 CFR 1.1206. But section 1.1200(a) of the Commission’s rules permits the FCC, in its discretion, to modify the applicable procedures if the public interest so requires.

If you dig deeply into the Public Notice seeking comments on the merger, you find a single sentence stating that

Requests for exemptions from the disclosure requirements pursuant to section 1.1204(a)(9) may be made to Jonathan Sallet [the FCC’s General Counsel] or Hillary Burchuk [who heads the transaction review team].

Similar language appears in the AT&T/DirecTV transaction Public Notice.

This leads to the cited rule exempting certain ex parte presentations from the usual disclosure requirements in such proceedings, including the referenced one that exempts ex partes from disclosure when

The presentation is made pursuant to an express or implied promise of confidentiality to protect an individual from the possibility of reprisal, or there is a reasonable expectation that disclosure would endanger the life or physical safety of an individual

So the FCC is inviting “media companies” to offer confidential feedback and to hold secret meetings that the FCC will hold confidential because of “the possibility of reprisal” based on language intended to protect individuals.

Such deviations from the standard permit-but-disclose procedures are extremely rare. As in non-existent. I guess there might be other examples, but I was unable to find a single one in a quick search. And I’m willing to bet that the language inviting confidential communications in the PN hasn’t appeared before – and certainly not in a transaction review.

It is worth pointing out that the language in 1.1204(a)(9) is remarkably similar to language that appears in the Freedom of Information Act. As the DOJ notes regarding that exemption:

Exemption 7(D) provides protection for “records or information compiled for law enforcement purposes [which] could reasonably be expected to disclose the identity of a confidential source”… to ensure that “confidential sources are not lost through retaliation against the sources for past disclosure or because of the sources’ fear of future disclosure.”

Surely the fear-of-reprisal rationale for confidentiality makes sense in that context – but here? Invoked to elicit secret meetings and to keep confidential information submitted by corporations rather than individuals, it makes even less sense (and doesn’t even obviously comply with the rule itself). It is not as though – as far as I know – someone approached the Commission with stated fears and requested that it implement a procedure for confidentiality in these particular reviews.

Rather, this is the Commission inviting non-transparent process in the midst of a heated, politicized and heavily-scrutinized transaction review.

The optics are astoundingly bad.

Unfortunately, this kind of behavior seems to be par for the course for the current FCC. As Commissioner Pai has noted on more than one occasion, the minority commissioners have been routinely kept in the dark with respect to important matters at the Commission – not coincidentally, in other highly-politicized proceedings.

What’s particularly troubling is that, for all its faults, the FCC’s process is typically extremely open and transparent. Public comments, endless ex parte meetings, and regular Open Commission Meetings are all the norm. And this is as it should be. Particularly when it comes to transactions and other regulated conduct for which the regulated entity bears the burden of proving that its behavior does not offend the public interest, it is obviously necessary to have all of the information – to know what might concern the Commission and to make a case respecting those matters.

The kind of arrogance on display of late, and the seeming abuse of process that goes along with it, hearkens back to the heady days of Kevin Martin’s tenure as FCC Chairman – a tenure described as “dysfunctional” and noted for its abuse of process.

All of which should stand as a warning to the vocal, pro-regulatory minority pushing for the FCC to proclaim enormous power to regulate net neutrality – and broadband generally – under Title II. Just as Chairman Martin tried to manipulate diversity rules to accomplish his pet project of cable channel unbundling, some future Chairman will undoubtedly claim authority under Title II to accomplish some other unintended, but politically expedient, objective — and it may not be one the self-proclaimed consumer advocates like, when it happens.

Bad as that risk may be, it is only made more likely by regulatory reviews undertaken in secret. Whatever impelled the Chairman to invite unprecedented secrecy into these transaction reviews, it seems to be of a piece with a deepening politicization and abuse of process at the Commission. It’s both shameful – and deeply worrying.

The International Center for Law & Economics (ICLE) and TechFreedom filed two joint comments with the FCC today, explaining why the FCC has no sound legal basis for micromanaging the Internet and why “net neutrality” regulation would actually prove counter-productive for consumers.

The Policy Comments are available here, and the Legal Comments are here. See our previous post, Net Neutrality Regulation Is Bad for Consumers and Probably Illegal, for a distillation of many of the key points made in the comments.

New regulation is unnecessary. “An open Internet and the idea that companies can make special deals for faster access are not mutually exclusive,” said Geoffrey Manne, Executive Director of ICLE. “If the Internet really is ‘open,’ shouldn’t all companies be free to experiment with new technologies, business models and partnerships?”

“The media frenzy around this issue assumes that no one, apart from broadband companies, could possibly question the need for more regulation,” said Berin Szoka, President of TechFreedom. “In fact, increased regulation of the Internet will incite endless litigation, which will slow both investment and innovation, thus harming consumers and edge providers.”

Title II would be a disaster. The FCC has proposed re-interpreting the Communications Act to classify broadband ISPs under Title II as common carriers. But reinterpretation might unintentionally ensnare edge providers, weighing them down with onerous regulations. “So-called reclassification risks catching other Internet services in the crossfire,” explained Szoka. “The FCC can’t easily forbear from Title II’s most onerous rules because the agency has set a high bar for justifying forbearance. Rationalizing a changed approach would be legally and politically difficult. The FCC would have to simultaneously find the broadband market competitive enough to forbear, yet fragile enough to require net neutrality rules. It would take years to sort out this mess — essentially hitting the pause button on better broadband.”

Section 706 is not a viable option. In 2010, the FCC claimed Section 706 as an independent grant of authority to regulate any form of “communications” not directly barred by the Act, provided only that the Commission assert that regulation would somehow promote broadband. “This is an absurd interpretation,” said Szoka. “This could allow the FCC to essentially invent a new Communications Act as it goes, regulating not just broadband, but edge companies like Google and Facebook, too, and not just neutrality but copyright, cybersecurity and more. The courts will eventually strike down this theory.”

A better approach. “The best policy would be to maintain the ‘Hands off the Net’ approach that has otherwise prevailed for 20 years,” said Manne. “That means a general presumption that innovative business models and other forms of ‘prioritization’ are legal. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears.” “If the FCC thinks it can justify regulating the Internet, it should ask Congress to grant such authority through legislation,” added Szoka. “A new communications act is long overdue anyway. The FCC could also convene a multistakeholder process to produce a code enforceable by the Federal Trade Commission,” he continued, noting that the White House has endorsed such processes for setting Internet policy in general.

Manne concluded: “The FCC should focus on doing what Section 706 actually commands: clearing barriers to broadband deployment. Unleashing more investment and competition, not writing more regulation, is the best way to keep the Internet open, innovative and free.”

For some of our other work on net neutrality, see:

“Understanding Net(flix) Neutrality,” an op-ed by Geoffrey Manne in the Detroit News on Netflix’s strategy to confuse interconnection costs with neutrality issues.

“The Feds Lost on Net Neutrality, But Won Control of the Internet,” an op-ed by Berin Szoka and Geoffrey Manne in Wired.com.

“That startup investors’ letter on net neutrality is a revealing look at what the debate is really about,” a post by Geoffrey Manne in Truth on the Market.

“Bipartisan Consensus: Rewrite of ‘96 Telecom Act is Long Overdue,” a post on TF’s blog highlighting the key points from TechFreedom and ICLE’s joint comments on updating the Communications Act.

The Net Neutrality Comments are available here:

ICLE/TF Net Neutrality Policy Comments

TF/ICLE Net Neutrality Legal Comments

With Berin Szoka.

TechFreedom and the International Center for Law & Economics will shortly file two joint comments with the FCC, explaining why the FCC has no sound legal basis for micromanaging the Internet—now called “net neutrality regulation”—and why such regulation would be counter-productive as a policy matter. The following summarizes some of the key points from both sets of comments.

No one’s against an open Internet. The notion that anyone can put up a virtual shingle—and that the good ideas will rise to the top—is a bedrock principle with broad support; it has made the Internet essential to modern life. Key to Internet openness is the freedom to innovate. An open Internet and the idea that companies can make special deals for faster access are not mutually exclusive. If the Internet really is “open,” shouldn’t all companies be free to experiment with new technologies, business models and partnerships? Shouldn’t the FCC allow companies to experiment in building the unknown—and unknowable—Internet of the future?

The best policy would be to maintain the “Hands off the Net” approach that has otherwise prevailed for 20 years. That means a general presumption that innovative business models and other forms of “prioritization” are legal. Innovation could thrive, and regulators could still keep a watchful eye, intervening only where there is clear evidence of actual harm, not just abstract fears. And they should start with existing legal tools—like antitrust and consumer protection laws—before imposing prior restraints on innovation.

But net neutrality regulation hurts more than it helps. Counterintuitively, a blanket rule that ISPs treat data equally could actually harm consumers. Consider the innovative business models ISPs are introducing. T-Mobile’s unRadio lets users listen to all the on-demand music and radio they want without taking a hit against their monthly data plan. Yet so-called consumer advocates insist that’s a bad thing because it favors some content providers over others. In fact, “prioritizing” one service when there is congestion frees up data for subscribers to consume even more content—from whatever source. You know regulation may be out of control when a company is demonized for offering its users a freebie.

Treating each bit of data neutrally ignores the reality of how the Internet is designed, and how consumers use it.  Net neutrality proponents insist that all Internet content must be available to consumers neutrally, whether those consumers (or content providers) want it or not. They also argue against usage-based pricing. Together, these restrictions force all users to bear the costs of access for other users’ requests, regardless of who actually consumes the content, as the FCC itself has recognized:

[P]rohibiting tiered or usage-based pricing and requiring all subscribers to pay the same amount for broadband service, regardless of the performance or usage of the service, would force lighter end users of the network to subsidize heavier end users. It would also foreclose practices that may appropriately align incentives to encourage efficient use of networks.

The rules that net neutrality advocates want would hurt startups as well as consumers. Imagine a new entrant, clamoring for market share. Without the budget for a major advertising blitz, the archetypical “next Netflix” might never get the exposure it needs to thrive. But for a relatively small fee, the startup could sign up to participate in a sponsored data program, with its content featured and its customers’ data usage exempted from their data plans. This common business strategy could mean the difference between success and failure for a startup. Yet it would be prohibited by net neutrality rules banning paid prioritization.

The FCC lacks sound legal authority. The FCC is essentially proposing to do what can only properly be done by Congress: invent a new legal regime for broadband. Each of the options the FCC proposes to justify this—Section 706 of the Telecommunications Act and common carrier classification—is deeply problematic.

First, Section 706 isn’t sustainable. Until 2010, the FCC understood Section 706 as a directive to use its other grants of authority to promote broadband deployment. But in its zeal to regulate net neutrality, the FCC reversed itself in 2010, claiming Section 706 as an independent grant of authority. This would allow the FCC to regulate any form of “communications” in any way not directly barred by the Act — not just broadband but “edge” companies like Google and Facebook. This might mean going beyond neutrality to regulate copyright, cybersecurity and more. The FCC need only assert that regulation would somehow promote broadband.

If Section 706 is a grant of authority, it’s almost certainly a power to deregulate. But even if its power is as broad as the FCC claims, the FCC still hasn’t made the case that, on balance, its proposed regulations would actually do what it asserts: promote broadband. The FCC has stubbornly refused to conduct serious economic analysis on the net effects of its neutrality rules.

And Title II would be a disaster. The FCC has asked whether Title II of the Act, which governs “common carriers” like the old monopoly telephone system, is a workable option. It isn’t.

In the first place, regulations that impose design limitations meant for single-function networks simply aren’t appropriate for the constantly evolving Internet. Moreover, if the FCC re-interprets the Communications Act to classify broadband ISPs as common carriers, it risks catching other Internet services in the cross-fire, inadvertently making them common carriers, too. Surely net neutrality proponents can appreciate the harmful effects of treating Skype as a common carrier.

Forbearance can’t clean up the Title II mess. In theory the FCC could “forbear” from Title II’s most onerous rules, promising not to apply them when it determines there’s enough competition in a market to make the rules unnecessary. But the agency has set a high bar for justifying forbearance.

Most recently, in 2012, the Commission refused to grant Qwest forbearance even in the highly competitive telephony market, disregarding competition from wireless providers, and concluding that a cable-telco “duopoly” is inadequate to protect consumers. It’s unclear how the FCC could justify reaching the opposite conclusion about the broadband market—simultaneously finding it competitive enough to forbear, yet fragile enough to require net neutrality rules. Such contradictions would be difficult to explain, even if the FCC generally gets discretion on changing its approach.

But there is another path forward. If the FCC can really make the case for regulation, it should go to Congress, armed with the kind of independent economic and technical expert studies Commissioner Pai has urged, and ask for new authority. A new Communications Act is long overdue anyway. In the meantime, the FCC could convene the kind of multistakeholder process generally endorsed by the White House to produce a code enforceable by the Federal Trade Commission. A consensus is possible — just not inside the FCC, where the policy questions can’t be separated from the intractable legal questions.

Meanwhile, the FCC should focus on doing what Section 706 actually demands: clearing barriers to broadband deployment and competition. The 2010 National Broadband Plan laid out an ambitious pro-deployment agenda. It’s just too bad the FCC was so obsessed with net neutrality that it didn’t focus on the plan. Unleashing more investment and competition, not writing more regulation, is the best way to keep the Internet open, innovative and free.

[Cross-posted at TechFreedom.]

Last week a group of startup investors wrote a letter to protest what they assume FCC Chairman Tom Wheeler’s proposed, revised Open Internet NPRM will say.

Bear in mind that an NPRM is a proposal, not a final rule, and its issuance starts a public comment period. Bear in mind, as well, that the proposal isn’t public yet, presumably none of the signatories to this letter has seen it, and the devil is usually in the details. That said, the letter has been getting a lot of press.

I found the letter seriously wanting, and seriously disappointing. But it’s a perfect example of what’s so wrong with this interminable debate on net neutrality.

Below I reproduce the letter in full, in quotes, with my comments interspersed. The key take-away: Neutrality (or non-discrimination) isn’t what’s at stake here. What’s at stake is zero-cost access by content providers to broadband networks. One can surely understand why content providers and those who fund them want their costs of doing business to be lower. But the rhetoric of net neutrality is mismatched with this goal. It’s no wonder they don’t just come out and say it – it’s quite a remarkable claim.

Open Internet Investors Letter

The Honorable Tom Wheeler, Chairman
Federal Communications Commission
445 12th Street, SW
Washington D.C. 20554

May 8, 2014

Dear Chairman Wheeler:

We write to express our support for a free and open Internet.

We invest in entrepreneurs, investing our own funds and those of our investors (who are individuals, pension funds, endowments, and financial institutions).  We often invest at the earliest stages, when companies include just a handful of founders with largely unproven ideas. But, without lawyers, large teams or major revenues, these small startups have had the opportunity to experiment, adapt, and grow, thanks to equal access to the global market.

“Equal” access has nothing to do with it. No startup is inherently benefitted by being “equal” to others. Maybe this is just careless drafting. But frankly, as I’ll discuss, there are good reasons to think (contra the pro-net neutrality narrative) that startups will be helped by inequality (just like contra the (totally wrong) accepted narrative, payola helps new artists). It says more than they would like about what these investors really want that they advocate “equality” despite the harm it may impose on startups (more on this later).

Presumably what “equal” really means here is “zero cost”: “As long as my startup pays nothing for access to ISPs’ subscribers, it’s fine that we’re all on equal footing.” Wheeler has stated his intent that his proposal would require any prioritization to be available to any who want it, on equivalent, commercially reasonable terms. That’s “equal,” too, so what’s to complain about? But it isn’t really inequality that’s gotten everyone so upset.

Of course, access is never really “zero cost;” start-ups wouldn’t need investors if their costs were zero. In that sense, why is equality of ISP access any more important than other forms of potential equality? Why not mandate price controls on rent? Why not mandate equal rent? A cost is a cost. What’s really going on here is that, like Netflix, these investors want to lower their costs and raise their returns as much as possible, and they want the government to do it for them.

As a result, some of the startups we have invested in have managed to become among the most admired, successful, and influential companies in the world.

No startup became successful as a result of “equality” or even zero-cost access to broadband. No doubt some of their business models were predicated on that assumption. But it didn’t cause their success.

We have made our investment decisions based on the certainty of a level playing field and of assurances against discrimination and access fees from Internet access providers.

And they would make investment decisions based on the possibility of an un-level playing field if that were the status quo. More importantly, the businesses vying for investment dollars might be different ones if they built their business models in a different legal/economic environment. So what? This says nothing about the amount of investment, the types of businesses, the quality of businesses that would arise under a different set of rules. It says only that past specific investments might not have been made.

Unless the contention is that businesses would be systematically worse under a different rule, this is irrelevant. I have seen that claim made, and it’s implicit here, of course, but I’ve seen no evidence to actually support it. Businesses thrive in unequal, cost-laden environments all the time. It costs about $4 million/30 seconds to advertise during the Super Bowl. Budweiser and PepsiCo paid multiple millions this year to do so; many of their competitors didn’t. With inequality like that, it’s a wonder Sierra Nevada and Dr. Pepper haven’t gone bankrupt.

Indeed, our investment decisions in Internet companies are dependent upon the certainty of an equal-opportunity marketplace.

Again, no they’re not. Equal opportunity is a euphemism for zero cost, or else this is simply absurd on its face. Are these investors so lacking in creativity and ability that they can invest only when there is certainty of equal opportunity? Don’t investors thrive – aren’t they most needed – in environments where arbitrage is possible, where a creative entrepreneur can come up with a risky, novel way to take advantage of differential conditions better than his competitors? Moreover, the implicit equating of “equal-opportunity marketplace” with net neutrality rules is far-fetched. Is that really all that matters?

This is a good time to make a point that is so often missed: The loudest voices for net neutrality are the biggest companies – Google, Netflix, Amazon, etc. That fact should give these investors and everyone else serious pause. Their claim rests on the idea that “equality” is needed, so big companies can’t use an Internet “fast lane” to squash them. Google is decidedly a big company. So why do the big boys want this so much?

The battle is often pitched as one of ISPs vs. (small) content providers. But content providers have far less to worry about and face far less competition from broadband providers than from big, incumbent competitors. It is often claimed that “Netflix was able to pay Comcast’s toll, but a small startup won’t have that luxury.” But Comcast won’t even notice or care about a small startup; its traffic demands will be inconsequential. Netflix can afford to pay for Internet access for precisely the same reason it came to Comcast’s attention: It’s hugely successful, and thus creates a huge amount of traffic.

Based on news reports and your own statements, we are worried that your proposed rules will not provide the necessary certainty that we need to make investment decisions and that these rules will stifle innovation in the Internet sector.

Now, there’s little doubt that legal certainty aids investment decisions. But “certainty” is not in danger here. The rules have to change because the court said so – with pretty clear certainty. And a new rule is not inherently any more or less likely to offer certainty than the previous Open Internet Order, which itself was subject to intense litigation (obviously) and would have been subject to interpretation and inconsistent enforcement (and would have allowed all kinds of paid prioritization, too!). Certainty would be good, but Wheeler’s proposed rule won’t likely do anything about the amount of certainty one way or the other.

If established companies are able to pay for better access speeds or lower latency, the Internet will no longer be a level playing field. Start-ups with applications that are advantaged by speed (such as games, video, or payment systems) will be unlikely to overcome that deficit no matter how innovative their service.

Again, it’s notable that some of the strongest advocates for net neutrality are established companies. Another letter sent out last week included signatures from a bunch of startups, but also Google, Microsoft, Facebook and Yahoo!, among others.

In truth it’s hard to see why startup investors would think this helps them. Non-neutrality offers the prospect that a startup might be able to buy priority access to overcome the inherent disadvantage of newness, and to better compete with an established company. Neutrality means that that competitive advantage is impossible, and the baseline relative advantages and disadvantages remain – which helps incumbents, not startups. With a neutral Internet, the advantages of the incumbent competitor can’t be dissipated by a startup buying a favorable leg-up in speed, and the Netflixes of the world will be more likely to continue to dominate.

Of course the claim is that incumbents will use their huge resources to gain even more advantage with prioritized access. Implicit in this must be the assumption that the advantage that could be gained by a startup buying priority offers less return for the startup than the cost imposed on it by the inherent disadvantages of reputation, brand awareness, customer base, etc. But that’s not plausible for all or even most startups. And investors exist precisely because they are able to provide funds for which there is a likelihood of a good return – so if paying for priority would help overcome inherent disadvantages, there would be money for it.

Also implicit is the claim that the benefits to incumbents (over and above their natural advantages) from paying for priority, in terms of hamstringing new entrants, will outweigh the cost. This, too, is generally unlikely to be true. They already have advantages. Sure, sometimes they might want to pay for more, but in precisely the cases where it would be worth it to do so, the new entrant would also be most benefited by doing so itself – ensuring, again, that investment funds will be available.

Of course if both incumbents and startups decide paying for priority is better, we’re back to a world of “equality,” so what’s to complain about, based on this letter? This puts into stark relief that what these investors really want is government-mandated, subsidized broadband access, not “equality.”

Now, it’s conceivable that that is the optimal state of affairs, but if it is, it isn’t for the reasons given here, nor has anyone actually demonstrated that it is the case.

Entrepreneurs will need to raise money to buy fast lane services before they have proven that consumers want their product. Investors will extract more equity from entrepreneurs to compensate for the risk.

Internet applications will not be able to afford to create a relationship with millions of consumers by making their service freely available and then build a business over time as they better understand the value consumers find in their service (which is what Facebook, Twitter, Tumblr, Pinterest, Reddit, Dropbox and virtually every other consumer Internet service did to achieve scale).

In other words: “Subsidize us. We’re worth it.” Maybe. But this is probably more revealing than intended. The Internet cost something to someone to build. (Actually, it cost broadband providers more than a trillion dollars.) This just says “we shouldn’t have to pay them for it now.” Fine, but who should pay, then, and how do you know that forcing someone else to subsidize these startup companies will actually lead to better results? Mightn’t we get less broadband investment, such that there is little Internet available for these companies to take advantage of in the first place? If broadband consumers instead of content consumers foot the bill, is that clearly preferable, either from a social welfare perspective or even from the self-interest of these investors who, after all, do ultimately rely on consumer spending to earn their return?

Moreover, why is this “build for free, then learn how to monetize over time” business model necessarily better than any other? These startup investors know better than anyone that enshrining existing business models just because they exist is the antithesis of innovation and progress. But that’s exactly what they’re saying – “the successful companies of the past did it this way, so we should get a government guarantee to preserve our ability to do it, too!”

This is the most depressing implication of this letter. These investors and others like them have been responsible for financing enormously valuable innovations. If even they can’t see the hypocrisy of these claims for net neutrality – and worse, choose to propagate it further – then we really have come to a sad place. When innovators argue passionately for stagnation, we’re in trouble.

Instead, creators will have to ask permission of an investor or corporate hierarchy before they can launch. Ideas will be vetted by committees and quirky passion projects will not get a chance. An individual in a dorm room or a design studio will not be able to experiment out loud on the Internet. The result will be greater conformity, fewer surprises, and less innovation.

This is just a little too much protest. Creators already have to ask “permission” – or are these investors just opening up their bank accounts to whoever wants their money? The ones that are able to do it on a shoestring, with money saved up from babysitting gigs, may find higher costs, and the need to do more babysitting. But again, there is nothing special about the Internet in this. Let’s mandate zero-cost office space and office supplies and developer services and design services and . . . etc. for all – then we’ll have way more “permission-less” startups. If it’s just a handout they want, they should say so, instead of pretending there is a moral or economic welfare basis for their claims.

Further, investors like us will be wary of investing in anything that access providers might consider part of their future product plans for fear they will use the same technical infrastructure to advantage their own services or use network management as an excuse to disadvantage competitive offerings.

This is crazy. For the same reasons I mentioned above, the big access provider (and big incumbent competitor, for that matter) already has huge advantages. If these investors aren’t already wary of investing in anything that Google or Comcast or Apple or… might plan to compete with, they must be terrible at their jobs.

What’s more, Wheeler’s much-reviled proposal (what we know about it, that is), to say nothing of antitrust law, clearly contemplates exactly this sort of foreclosure and addresses it. “Pure” net neutrality doesn’t add much, if anything, to the limits those laws already do or would provide.

Policing this will be almost impossible (even using a standard of “commercial reasonableness”) and access providers do not need to successfully disadvantage their competition; they just need to create a credible threat so that investors like us will be less inclined to back those companies.

You think policing the world of non-neutrality is hard – try policing neutrality. It’s not as easy as proponents make it out to be. It’s simply never been the case that all bits at all times have been treated “neutrally” on the Internet. Any version of an Open Internet Order (just like the last one, for example) will have to recognize this.

Larry Downes compiled a list of the exceptions included in the last Open Internet Order when he testified before the House Judiciary Committee on the rules in 2011. There are 16 categories of exemption, covering a wide range of fundamental components of broadband connectivity, from CDNs to free Wi-Fi at Starbucks. His testimony is a tour de force, and should be required reading for everyone involved in this debate.

But think about how the manifest advantages of these non-neutral aspects of broadband networks would be squared with “real” neutrality. On their face, if these investors are to be taken at their word, these arguments would preclude all of the Open Internet Order’s exemptions, too. And if any sort of inequality is going to be deemed ok, how accurately would regulators distinguish between “illegitimate” inequality and the acceptable kind that lets coffee shops subsidize broadband? How does the simplistic logic of net equality distinguish between, say, Netflix’s colocated servers and a startup like Uber being integrated into Google Maps? The simple answer is that it doesn’t, and the claims and arguments of this letter are woefully inadequate to the task.

We need simple, strong, enforceable rules against discrimination and access fees, not merely against blocking.

No, we don’t. Or, at least, no one has made that case. These investors want a handout; that is the only case this letter makes.

We encourage the Commission to consider all available jurisdictional tools at its disposal in ensuring a free and open Internet that rewards, not disadvantages, investment and entrepreneurship.

… But not investment in broadband, and not entrepreneurship that breaks with the business models of the past. In reality, this letter is simple rent-seeking: “We want to invest in what we know, in what’s been done before, and we don’t want you to do anything to make that any more costly for us. If that entails impairing broadband investment or imposing costs on others, so be it – we’ll still make our outsized returns, and they can write their own letter complaining about ‘inequality.’”

A final point I have to make. Although the investors don’t come right out and say it, many others have, and it’s implicit in the investors’ letter: “Content providers shouldn’t have to pay for broadband. Users already pay for the service, so making content providers pay would just let ISPs double dip.” The claim is deeply problematic.

For starters, it’s another form of the status quo mentality: “Users have always paid and content hasn’t, so we object to any deviation from that.” But it needn’t be that way. And of course models frequently coexist where different parties pay for the same or similar services. Some periodicals are paid for by readers and offer little or no advertising; others charge a subscription and offer paid ads; and still others are offered for free, funded entirely by ads. All of these models work. None is “better” than the other. There is no reason the same isn’t true for broadband and content.

Net neutrality claims that the only proper price to charge on the content side of the market is zero. (Congratulations: You’re in the same club as that cutting-edge, innovative technology, the check, which is cleared at par by government fiat. A subsidy that no doubt explains why checks have managed to last this long). As an economic matter, that’s possible; it could be that zero is the right price. But it most certainly needn’t be, and issues revolving around Netflix’s traffic and the ability of ISPs and Netflix to handle it cost-effectively are evidence that zero may well not be the right price.

The reality is that these sorts of claims are devoid of economic logic — which is presumably why they, like the whole net neutrality “movement” generally, appeal so gratuitously to emotion rather than reason. But it doesn’t seem unreasonable to hope for more from a bunch of savvy financiers.

 

I have a new article on the Comcast/Time Warner Cable merger in the latest edition of the CPI Antitrust Chronicle, which includes several other articles on the merger, as well.

In a recent essay, Allen Grunes & Maurice Stucke (who also have an essay in the CPI issue) pose a thought experiment: If Comcast can acquire TWC, what’s to stop it acquiring all cable companies? The authors’ assertion is that the arguments being put forward to support the merger contain no “limiting principle,” and that the same arguments, if accepted here, would unjustifiably permit further consolidation. But there is a limiting principle: competitive harm. Size doesn’t matter, as courts and economists have repeatedly pointed out.

The article explains why the merger doesn’t give rise to any plausible theory of anticompetitive harm under modern antitrust analysis. Instead, arguments against the merger amount to little more than the usual “big-is-bad” naysaying.

In summary, I make the following points:

Horizontal Concerns

The absence of any reduction in competition should end the inquiry into any potentially anticompetitive effects in consumer markets resulting from the horizontal aspects of the transaction.

  • It’s well understood at this point that Comcast and TWC don’t compete directly for subscribers in any relevant market; in terms of concentration and horizontal effects, the transaction will neither reduce competition nor restrict consumer choice.
  • Even if Comcast were a true monopolist provider of broadband service in certain geographic markets, the DOJ would have to show that the merger would be substantially likely to lessen competition—a difficult showing to make where Comcast and TWC are neither actual nor potential competitors in any of these markets.
  • Whatever market power Comcast may currently possess, the proposed merger simply does nothing to increase it, nor to facilitate its exercise.

Comcast doesn’t currently have substantial bargaining power in its dealings with content providers, and the merger won’t change that. The claim that the combined entity will gain bargaining leverage against content providers from the merger, resulting in lower prices paid to programmers for their content, fails for similar reasons.

  • After the transaction, Comcast will serve fewer than 30 percent of total MVPD subscribers in the United States. This share is insufficient to give Comcast market power over sellers of video programming.
  • The FCC has tried to impose a 30 percent cable ownership cap, and twice it has been rejected by the courts. The D.C. Circuit concluded more than a decade ago—in far less competitive conditions than exist today—that the evidence didn’t justify a horizontal ownership limit lower than 60 percent on the basis of buyer power.
  • The recent exponential growth in OVDs like Google, Netflix, Amazon and Apple gives content providers even more ways to distribute their programming.
  • In fact, greater concentration among cable operators has coincided with an enormous increase in the output and quality of video programming.
  • Moreover, because the merger doesn’t alter the competitive make-up of any relevant consumer market, Comcast will have no greater ability to threaten to withhold carriage of content in order to extract better terms.
  • Finally, programmers with valuable content have significant bargaining power and have been able to extract the prices to prove it. None of that will change post-merger.

Vertical Concerns

The merger won’t give Comcast the ability (or the incentive) to foreclose competition from other content providers for its NBCUniversal content.

  • Because the merged firm would account for only 30 percent of the national market for MVPD services, 70 percent of the market would remain available for content distribution through other providers.
  • But even this significantly overstates the extent of possible foreclosure. OVD providers increasingly vie for the same content as cable (and satellite).
  • In the past when regulators have considered foreclosure effects for localized content (regional sports networks, primarily)—for example, in the 2005 Adelphia/Comcast/TWC deal, under far less competitive conditions—the FTC found no substantial threat of anticompetitive harm. And while the FCC did identify a potential risk of harm in its review of the Adelphia deal, its solution was to impose arbitration requirements for access to this programming—which are already part of the NBCUniversal deal conditions and which will be extended to the new territory and new programming from TWC.

The argument that the merger will increase Comcast’s incentive and ability to impair access to its users by online video competitors or other edge providers is similarly without merit.

  • Fundamentally, Comcast benefits from providing its users access to edge providers, and it would harm itself if it were to constrain access to these providers.
  • Foreclosure effects would be limited, even if they did arise. On a national level, the combined firm would have only about 40 percent of broadband customers, at most (and considerably less if wireless broadband is included in the market).
  • This leaves at least 60 percent—and quite possibly far more—of customers available to purchase content and support edge providers reaching minimum viable scale, even if Comcast were to attempt to foreclose access.

Some have also argued that because Comcast has a monopoly on access to its customers, transit providers are beholden to it, giving it the ability to degrade or simply block content from companies like Netflix. But these arguments misunderstand the market.

  • The transit market through which edge providers bring their content into the Comcast network is highly competitive. Edge providers can access Comcast’s network through multiple channels, undermining Comcast’s ability to deny access or degrade service to such providers.
  • The transit market is also almost entirely populated by big players engaged in repeat interactions and, despite a large number of transactions over the years, marked by a trivial number of disputes.
  • The recent Comcast/Netflix agreement demonstrates that the sophisticated commercial entities in this market are capable of resolving conflicts—conflicts that appear to affect only the distribution of profits among the contracting parties without raising anticompetitive concerns.
  • If Netflix does end up paying more to access Comcast’s network over time, it won’t be because of market power or this merger. Rather, it’s an indication of the evolving market and the increasing popularity of OTT providers.
  • The Comcast/Netflix deal has procompetitive justifications, as well. Charging Netflix allows Comcast to better distinguish between the high-usage Netflix customers (two percent of Netflix users account for 20 percent of all broadband traffic) and everyone else. This should lower cable bills on average, improve incentives for users, and lead to more efficient infrastructure investments by both Comcast and Netflix.
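To make the cost-allocation logic in the last bullet concrete, here is a deliberately stylized sketch. Every number in it (the subscriber count, the monthly delivery cost, the share of that cost driven by heavy streaming, and the fraction recovered from the content provider) is a hypothetical of mine, not an estimate of Comcast’s or Netflix’s actual costs; the only point is the mechanism by which recovering part of the cost from the content provider can lower the average subscriber bill.

```python
# Stylized illustration only: all figures below are hypothetical.
subscribers = 1_000_000
network_cost = 30_000_000.0   # assumed monthly delivery cost ($)
heavy_traffic_share = 0.20    # assumed share of cost driven by heavy streaming
provider_contribution = 0.5   # assumed fraction of that cost recovered from the provider

# Flat pricing: every subscriber bears an equal slice of the full cost.
flat_bill = network_cost / subscribers

# Paid interconnection: the content provider covers part of the heavy-traffic
# cost, and only the remainder is spread across subscribers.
recovered = network_cost * heavy_traffic_share * provider_contribution
bill_with_deal = (network_cost - recovered) / subscribers

print(f"Network cost borne per subscriber, flat pricing: ${flat_bill:.2f}")
print(f"Network cost borne per subscriber, with deal:    ${bill_with_deal:.2f}")
```

With these assumed figures the per-subscriber cost falls from $30 to $27; the direction, not the magnitude, is the point.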

Critics have also alleged that the vertically integrated Comcast may withhold its own content from competing MVPDs or OVDs, or deny carriage to unaffiliated programming. In theory, by denying competitors or potential competitors access to popular programming, a vertically integrated MVPD might gain a competitive advantage over its rivals. Similarly, an MVPD that owns cable channels may refuse to carry at least some unaffiliated content to benefit its own channels. But these claims also fall flat.

  • Once again, these issues are not transaction-specific.
  • But, regardless, Comcast will not be able to engage in successful foreclosure strategies following the transaction.
  • The merger has no effect on Comcast’s share of national programming. And while it will have a larger share of national distribution post-merger, a 30 percent market share is nonetheless insufficient to confer buyer power in today’s highly competitive MVPD market.
  • Moreover, the programming market is highly dynamic and competitive, and Comcast’s affiliated programming networks face significant competition.
  • Comcast already has no ownership interest in the overwhelming majority of content it distributes. This won’t measurably change post-transaction.

Procompetitive Justifications

While the proposed transaction doesn’t give rise to plausible anticompetitive harms, it should bring well-understood procompetitive benefits. Most notably:

  • The deal will bring significant scale efficiencies in a marketplace that requires large, fixed-cost investments in network infrastructure and technology.
  • And bringing a more vertical structure to TWC will likely be beneficial, as well. Vertical integration can increase efficiency, and the elimination of double marginalization often leads to lower prices for consumers.
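The double-marginalization point can be made concrete with a standard textbook sketch. The linear demand curve and all of the numbers below (intercept, slope, upstream marginal cost) are assumptions of mine chosen for illustration; this is the generic economics of successive markups, not a model of Comcast or TWC.

```python
# Textbook double marginalization under assumed linear demand P = a - b*Q.
a, b, c = 100.0, 1.0, 20.0  # hypothetical demand intercept, slope, upstream marginal cost

# Separate firms: the upstream firm marks up the wholesale price w,
# and the downstream firm marks up again on top of w.
w = (a + c) / 2                  # upstream's profit-maximizing wholesale price
q_separate = (a - w) / (2 * b)   # downstream's profit-maximizing quantity given w
p_separate = a - b * q_separate  # resulting retail price

# Vertically integrated firm: a single markup over marginal cost c.
q_integrated = (a - c) / (2 * b)
p_integrated = a - b * q_integrated

print(f"Separate firms:  wholesale={w:.0f}, retail price={p_separate:.0f}, quantity={q_separate:.0f}")
print(f"Integrated firm: retail price={p_integrated:.0f}, quantity={q_integrated:.0f}")
```

With these assumed numbers, integration drops the retail price from 80 to 60 and doubles output, which is the sense in which eliminating double marginalization can benefit consumers.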

Let’s be clear about the baseline here. Remember all those years ago when Netflix was a mail-order DVD company? Before either Netflix or Comcast even considered using the internet to distribute Netflix’s video content, Comcast invested in the technology and infrastructure that ultimately enabled the Netflix of today. It did so at enormous cost (tens of billions of dollars over the last 20 years) and risk. Absent broadband we’d still be waiting for our Netflix DVDs to be delivered by snail mail, and Netflix would still be spending three-quarters of a billion dollars a year on shipping.

The ability to realize returns—including returns from scale—is essential to incentivizing continued network and other quality investments. The cable industry today operates with a small positive annual return on invested capital (“ROIC”), but its ROIC has been negative on a cumulative basis over the entirety of the last decade. In fact, on invested capital of $127 billion between 2000 and 2009, cable has seen economic profits of negative $62 billion and a weighted average ROIC of negative 5 percent. Meanwhile, Comcast’s stock has significantly underperformed the S&P 500 over the same period, outperforming the S&P only over the last two years.
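Those ROIC figures hang together on a simple back-of-envelope basis. The sketch below assumes (my simplification, not the underlying analysis’s method) that the cumulative economic loss is spread evenly over the decade and measured against a single invested-capital base rather than a year-by-year weighted average.

```python
# Back-of-envelope check of the cable ROIC figures cited above.
invested_capital_bn = 127.0            # invested capital, 2000-2009 ($bn)
cumulative_economic_profit_bn = -62.0  # cumulative economic profit ($bn)
years = 10

# Simplifying assumption: spread the loss evenly and divide by a single capital base.
avg_annual_economic_profit_bn = cumulative_economic_profit_bn / years
approx_roic = avg_annual_economic_profit_bn / invested_capital_bn

print(f"Average annual economic profit: {avg_annual_economic_profit_bn:.1f} $bn")
print(f"Approximate economic ROIC: {approx_roic:.1%}")  # about -4.9%, i.e., roughly -5%
```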

Comcast is far from being a rapacious and endlessly profitable monopolist. This merger should help it (and TWC) improve its cable and broadband services, not harm consumers.

No matter how many times Al Franken and Susan Crawford say it, neither the broadband market nor the MVPD market is imperiled by vertical or horizontal integration. The proposed merger won’t create cognizable antitrust harms. Comcast may get bigger, but bigness alone simply isn’t grounds for blocking the merger.

Today the D.C. Circuit struck down most of the FCC’s 2010 Open Internet Order, rejecting rules that required broadband providers to carry all traffic for edge providers (“anti-blocking”) and prevented providers from negotiating deals for prioritized carriage. However, the appeals court did conclude that the FCC has statutory authority to issue “Net Neutrality” rules under Section 706(a) and let stand the FCC’s requirement that broadband providers clearly disclose their network management practices.

The following statement may be attributed to Geoffrey Manne and Berin Szoka:

The FCC may have lost today’s battle, but it just won the war over regulating the Internet. By recognizing Section 706 as an independent grant of statutory authority, the court has given the FCC near limitless power to regulate not just broadband, but the Internet itself, as Judge Silberman recognized in his dissent.

The court left the door open for the FCC to write new Net Neutrality rules, provided the Commission doesn’t treat broadband providers as common carriers. This means that, even without reclassifying broadband as a Title II service, the FCC could require that any deals between broadband and content providers be reasonable and non-discriminatory, just as it has required wireless carriers to provide data roaming services to their competitors’ customers on that basis. In principle, this might be a sound approach, if the rule resembles antitrust standards. But even that limitation could easily be evaded if the FCC regulates through case-by-case enforcement actions, as it tried to do before issuing the Open Internet Order. Either way, the FCC need only make a colorable argument under Section 706 that its actions are designed to “encourage the deployment… of advanced telecommunications services.” If the FCC’s tenuous “triple cushion shot” argument could satisfy that test, there is little limit to the deference the FCC will receive.

But that’s just for Net Neutrality. Section 706 covers “advanced telecommunications,” which seems to include any information service, from broadband to the interconnectivity of smart appliances like washing machines and home thermostats. If the court’s ruling on Section 706 is really as broad as it sounds, and as the dissent fears, the FCC just acquired wide authority over these, as well — in short, the entire Internet, including the “Internet of Things.” While the court’s “no common carrier rules” limitation is a real one, the FCC clearly just gained enormous power that it didn’t have before today’s ruling.

Today’s decision essentially rewrites the Communications Act in a way that will, ironically, do the opposite of what the FCC claims: hurt, not help, deployment of new Internet services. Whatever the FCC’s role ought to be, such decisions should be up to our elected representatives, not three unelected FCC Commissioners. So if there’s a silver lining in any of this, it may be that the true implications of today’s decision are so radical that Congress finally writes a new Communications Act — a long-overdue process Congressmen Fred Upton and Greg Walden have recently begun.

Szoka and Manne are available for comment at media@techfreedom.org.

The debates over mobile spectrum aggregation and the auction rules for the FCC’s upcoming incentive auction — like all regulatory rent-seeking — can be farcical. One aspect of the debate in particular is worth highlighting, as it puts into stark relief the tendentiousness of self-interested companies making claims about the public interestedness of their preferred policies: The debate over how and whether to limit the buying and aggregating of lower frequency (in this case 600 MHz) spectrum.

A little technical background is in order. At its most basic, a signal carried in higher frequency spectrum doesn’t travel as well as a signal carried in lower frequency spectrum. The higher the frequency, the closer together cell towers need to be to maintain a good signal.

600 MHz is relatively low frequency for wireless communications. In rural areas it is helpful in reducing infrastructure costs for wide area coverage because cell towers can be placed further apart and thus fewer towers must be built. But in cities, population density trumps frequency, and propagation range is essentially irrelevant for infrastructure costs. In other words, it doesn’t matter how far your signal will travel if congestion alleviation demands you build cell towers closer together than even the highest frequency spectrum requires anyway. The optimal — nay, the largest usable — cell radius in urban and suburban areas is considerably smaller than the sort of cell radius that low frequency spectrum allows for.
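To make the propagation point concrete, here is a minimal sketch using the free-space path-loss model, FSPL(dB) ≈ 20·log10(d_km) + 20·log10(f_MHz) + 32.44. The comparison band (1900 MHz), the 120 dB link budget, and the 0.5 km capacity-limited urban cell radius are all illustrative assumptions of mine; real network planning uses empirical models that account for terrain, clutter, and building penetration. The scaling is the point: for pure coverage, tower count grows roughly with the square of the frequency, while in capacity-limited urban areas the cell radius is set by demand rather than propagation, so the low-frequency advantage largely disappears.

```python
import math

# Illustrative-only comparison of coverage-limited cell radii at two frequencies,
# using the free-space path-loss model (d in km, f in MHz):
#   FSPL(dB) = 20*log10(d) + 20*log10(f) + 32.44

def max_radius_km(f_mhz: float, max_path_loss_db: float) -> float:
    """Largest cell radius (km) that stays within the assumed path-loss budget."""
    return 10 ** ((max_path_loss_db - 32.44 - 20 * math.log10(f_mhz)) / 20)

LINK_BUDGET_DB = 120.0  # hypothetical allowable path loss

r_600 = max_radius_km(600, LINK_BUDGET_DB)
r_1900 = max_radius_km(1900, LINK_BUDGET_DB)

print(f"600 MHz coverage-limited radius:  {r_600:.1f} km")
print(f"1900 MHz coverage-limited radius: {r_1900:.1f} km")
# Towers needed for pure coverage scale with 1/radius^2:
print(f"Relative tower count (1900 MHz vs 600 MHz): {(r_600 / r_1900) ** 2:.1f}x")

# In dense urban areas the radius is capped by capacity (user demand), not
# propagation, so the frequency-driven advantage largely disappears.
CAPACITY_LIMITED_RADIUS_KM = 0.5  # hypothetical urban cell radius
print(f"Urban radius at 600 MHz:  {min(r_600, CAPACITY_LIMITED_RADIUS_KM):.1f} km")
print(f"Urban radius at 1900 MHz: {min(r_1900, CAPACITY_LIMITED_RADIUS_KM):.1f} km")
```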

It is important to note, of course, that signal distance isn’t the only propagation characteristic imparting value to lower frequency spectrum; in particular, it is also valuable even in densely populated settings for its ability to travel through building walls. That said, however, the primary arguments made in favor of spreading the 600 MHz wealth — of effectively subsidizing its purchase by smaller carriers — are rooted in its value in offering more efficient coverage in less-populated areas. Thus the FCC has noted that while there may be significant infrastructure cost savings associated with deploying lower frequency networks in rural areas, this lower frequency spectrum provides little cost advantage in urban or suburban areas (even though, as noted, it has building-penetrating value there).

It is primarily because of these possible rural network cost advantages that certain entities (e.g., the Department of Justice, Free Press, and the Competitive Carriers Association) have proposed that AT&T and Verizon (both of whom have significant lower frequency spectrum holdings) should be restricted from winning “too much” spectrum in the FCC’s upcoming 600 MHz incentive auctions. The argument goes that, in order to ensure national competition — that is, to give other companies a financial incentive to build out their networks into rural areas — the auction should be structured to favor Sprint and T-Mobile (both of whose spectrum holdings are mostly in the upper frequency bands) as awardees of this low-frequency spectrum, at commensurately lower cost.

Shockingly, T-Mobile and Sprint are on board with this plan.

So, to recap: 600 MHz spectrum confers cost savings when used in rural areas. It has much less effect on infrastructure costs in urban and suburban areas. T-Mobile and Sprint don’t have much of it; AT&T and Verizon have lots. If we want T-Mobile and Sprint to create the competing national networks that the government seems dead set on engineering, we need to put a thumb on the scale in the 600 MHz auctions. So they can compete in rural areas. Because that’s where 600 MHz spectrum offers cost advantages. In rural areas.

So what does T-Mobile plan to do if it wins the spectrum lottery? Certainly not build in rural areas. As Craig Moffett notes, currently “T-Mobile’s U.S. network is fast…but coverage is not its strong suit, particularly outside of metro areas.” And for the future? T-Mobile’s breakneck LTE coverage ramp-up since the failed merger with AT&T is expected to top out at 225 million people, or the 71% of consumers living in the most-populated areas (it’s currently somewhere over 200 million). “Although sticking to a smaller network, T-Mobile plans to keep increasing the depth of its LTE coverage” (emphasis added). Depth. That means more bandwidth in high-density areas. It does not mean broader coverage. Obviously.

Sprint, meanwhile, is devoting all of its resources to playing LTE catch-up in the most-populated areas; it isn’t going to waste valuable spectrum resources on expanded rural build out anytime soon.

The kicker is that T-Mobile relies on AT&T’s network to provide its urban and suburban customers with coverage (3G) when they do roam into rural areas, taking advantage of a merger break-up provision that gives it roaming access to AT&T’s 3G network. In other words, T-Mobile’s national network is truly “national” only insofar as it piggybacks on AT&T’s broader coverage. And because AT&T will get the blame for congestion when T-Mobile’s customers roam onto its network, the cost to T-Mobile of hamstringing AT&T’s network is low.

The upshot is that T-Mobile seems not to need, nor does it intend to deploy, lower frequency spectrum to build out its network in less-populated areas. Defenders say that rigging the auction rules to benefit T-Mobile and Sprint will allow them to build out in rural areas to compete with AT&T’s and Verizon’s broader networks. But this is a red herring. They may get the spectrum, but they won’t use it to extend their coverage in rural areas; they’ll use it to add “depth” to their overloaded urban and suburban networks.

But for AT&T, the need for additional spectrum is made all the more acute by the roaming deal, which requires it to serve not only its own customers but T-Mobile’s roaming customers as well.

This makes clear the reason underlying T-Mobile’s advocacy for rigging the 600 MHz auction: it simply wants to acquire this spectrum on the cheap to use in urban and suburban areas, not to deploy a wide rural network. And the beauty of it is that by hamstringing AT&T’s ability to acquire this spectrum, T-Mobile makes it more expensive for AT&T to serve T-Mobile’s own customers!

Two birds, one stone: lower your costs, raise your competitor’s costs.

The lesson is this: If we want 600 MHz spectrum to be used efficiently, including to provide rural LTE service, we should trust the highest bidder to put the spectrum to its most valuable use. The experience of the relatively unrestricted 700 MHz auction in 2008 confirms this: the purchase of 700 MHz spectrum by AT&T and Verizon led to the US becoming the world leader in LTE. Why mess with success?

[Cross-posted at RedState]

I have a new post up at TechPolicyDaily.com, excerpted below, in which I discuss the growing body of (surprisingly uncontroversial) work showing that broadband in the US compares favorably to that in the rest of the world. My conclusion, which is frankly more cynical than I like, is that concern about the US “falling behind” is a manufactured debate. It’s a compelling story that the media likes and that plays well for (some) academics.

Before the excerpt, I’d also like to quote one of today’s headlines from Slashdot:

“Google launched the citywide Wi-Fi network with much fanfare in 2006 as a way for Mountain View residents and businesses to connect to the Internet at no cost. It covers most of the Silicon Valley city and worked well until last year, as Slashdot readers may recall, when connectivity got rapidly worse. As a result, Mountain View is installing new Wi-Fi hotspots in parts of the city to supplement the poorly performing network operated by Google. Both the city and Google have blamed the problems on the design of the network. Google, which is involved in several projects to provide Internet access in various parts of the world, said in a statement that it is ‘actively in discussions with the Mountain View city staff to review several options for the future of the network.'”

The added emphasis is mine. It is added to draw attention to the simple point that designing and building networks is hard. Like, really, really hard. Folks think that it’s easy, because they have small networks in their homes or offices — so surely they can scale to a nationwide network without much trouble. But all sorts of crazy stuff starts to happen when we substantially increase the scale of IP networks. This is just one of the very many things that should give us pause about calls for the buildout of a government-run or government-sponsored Internet infrastructure.

Another of those things is whether there’s any need for that. Which brings us to my TechPolicyDaily.com post:

In the week or so since TPRC, I’ve found myself dwelling on an observation I made during the conference: how much agreement there was, especially on issues usually thought of as controversial. I want to take a few paragraphs to consider what was probably the most surprisingly non-controversial panel of the conference, the final Internet Policy panel, in which two papers – one by ITIF’s Rob Atkinson and the other by James McConnaughey from NTIA – were presented that showed that broadband Internet service in the US (and Canada, though I will focus on the US) compares quite well to that offered in the rest of the world. […]

But the real question that this panel raised for me was: given how well the US actually compares to other countries, why does concern about the US falling behind dominate so much discourse in this area? When you get technical, economic, legal, and policy experts together in a room – which is what TPRC does – the near consensus seems to be that the “kids are all right”; but when you read the press, or much of the high-profile academic literature, “the sky is falling.”

The gap between these assessments could not be larger. I think that we need to think about why this is. I hate to be cynical or disparaging – especially since I know strong advocates on both sides and believe that their concerns are sincere and their efforts earnest. But after this year’s conference, I’m having trouble shaking the feeling that ongoing concern about how US broadband stacks up to the rest of the world is a manufactured debate. It’s a compelling, media- and public-friendly narrative that supports a powerful political agenda. And the clear incentives, for academics and media alike, are to find problems and raise concerns. […]

Compare this to the Chicken Little narrative. As I was writing this, I received a message from a friend asking my views on an Economist blog post that shares data from the ITU’s just-released Measuring the Information Society 2013 report. This data shows that the US has some of the highest prices for pre-paid handset-based mobile data around the world. That is, it reports the standard narrative – and it does so without looking at the report’s methodology. […]

Even more problematic than what the Economist blog reports, however, is what it doesn’t report. [The report contains data showing the US has some of the lowest cost fixed broadband and mobile broadband prices in the world. See the full post at TechPolicyDaily.com for the numbers.]

Now, there are possible methodological problems with these rankings, too. My point here isn’t to debate over the relative position of the United States. It’s to ask why the “story” about this report cherry-picks the alarming data, doesn’t consider its methodology, and ignores the data that contradicts its story.

Of course, I answered that question above: It’s a compelling, media- and public-friendly narrative that supports a powerful political agenda. And the clear incentives, for academics and media alike, are to find problems and raise concerns. Manufacturing debate sells copy and ads, and advances careers.