Archives For Jonathan Sallet

Municipal broadband has been heavily promoted by its advocates as a potential source of competition against Internet service providers (“ISPs”) with market power. Jonathan Sallet argued in Broadband for America’s Future: A Vision for the 2020s, for instance, that municipal broadband has a huge role to play in boosting broadband competition, with attendant lower prices, faster speeds, and economic development. 

Municipal broadband, of course, can mean more than one thing: from “direct consumer” government-run systems, to “open access” arrangements in which the government builds the back-end but leaves it to private firms to bring connections to consumers, to “middle mile” networks in which the government network reaches only some parts of the community but allows private firms to connect and serve other consumers. The focus of this blog post is on the “direct consumer” model.

There have been many economic studies of municipal broadband, both theoretical and empirical. The literature largely finds that municipal broadband poses serious risks to taxpayers, often relies heavily on cross-subsidies from government-owned electric utilities, crowds out private ISP investment in the areas where it operates, and largely fails cost-benefit analysis. While advocates have defended municipal broadband on the grounds of its speed, price, and resulting attractiveness to consumers and businesses, others have noted that many of those benefits come at the expense of other parts of the country from which businesses move.

What this literature has not touched upon is a more fundamental problem: municipal broadband lacks the price signals necessary for economic calculation. The insights of the Austrian school of economics help explain why this model is incapable of providing efficient outcomes for society. Rather than creating a valuable source of competition, municipal broadband creates “islands of chaos” undisciplined by the market test of profit-and-loss. As a result, municipal broadband is a poor model for promoting competition and innovation in broadband markets.

The importance of profit-and-loss to economic calculation

One of the things often assumed away in economic analysis is the very thing the market process depends upon: the discovery of knowledge. Knowledge, in this context, is not the technical knowledge of how to build or maintain a broadband network, but the more fundamental knowledge which is discovered by those exercising entrepreneurial judgment in the marketplace. 

This type of knowledge is dependent on prices throughout the market. In the market process, prices coordinate exchange between market participants without each knowing the full plan of anyone else. For consumers, prices allow for incremental choices between different options. For producers, prices in capital markets similarly allow for choices between different ways of producing their goods for the next stage of production. Interest rates, themselves prices, help coordinate present consumption, investment, and saving. And the price signal of profit-and-loss allows producers to know whether they have cost-effectively served consumer needs.

The broadband marketplace can’t be considered in isolation from the greater marketplace in which it is situated. But it can be analyzed under the framework of prices and the knowledge they convey.

For broadband consumers, prices are important for determining the relative importance of Internet access compared to other felt needs. The quality of broadband connection demanded by consumers depends on its price. All other things being equal, consumers demand faster connections with less latency. But many consumers may prefer slower, higher-latency connections if they are cheaper. Even the relative weight placed on upload speeds versus download speeds may be highly asymmetrical when determined by consumers themselves.

While “High Performance Broadband for All” may be a great goal from a social planner’s perspective, individuals acting in the marketplace may prioritize other needs with their scarce resources. Even if consumers do need Internet access of some kind, the benefits of 100 Mbps download speeds over 25 Mbps, or of 100 Mbps upload speeds versus 3 Mbps, may not be worth the costs.

For broadband ISPs, prices for capital goods are important for building out the network. The relative prices of fiber, copper, wireless, and all the other factors of production help them choose among ways of building the network in light of anticipated profit.

All the decisions of broadband ISPs are made through the lens of pursuing profit. If they are successful, it is because the revenues generated are greater than the costs of production, including the cost of money represented in interest rates. Just as importantly, loss shows that an ISP was unsuccessful in cost-effectively serving consumers. While broadband companies may sustain losses over some period of time, they must ultimately turn a profit or exit the marketplace. Profit and loss each serve important functions.

Sallet misses the point when he states that the “full value of broadband lies not just in the number of jobs it directly creates or the profits it delivers to broadband providers but also in its importance as a mechanism that others use across the economy and society.” From an economic point of view, profits aren’t important because economists love it when broadband ISPs get rich. Profits are important as an incentive to build the networks we all benefit from, and as a signal for greater competition and innovation.

Municipal broadband as islands of chaos

Sallet believes the lack of high-speed broadband (as he defines it) is due to the monopoly power of broadband ISPs. He sees the entry of municipal broadband as pro-competitive. But the entry of a government-run broadband company actually creates “islands of chaos” within the market economy, reducing the ability of prices to coordinate disparate plans of action among participants. This, ultimately, makes society poorer.

The case against municipal broadband doesn’t rely on greater knowledge of how to build or maintain a network being in the hands of private engineers. It relies instead on the different institutional frameworks within which the manager of the government-run broadband network works as compared to the private broadband ISP. The type of knowledge gained in the market process comes from prices, including profit-and-loss. The manager of the municipal broadband network simply doesn’t have access to this knowledge and can’t calculate the best course of action as a result.

This is because the government-run municipal broadband network is not reliant on revenues generated by the free choices of consumers alone. Rather than ultimately needing to show that revenues cover costs in order to remain a going concern, government-run providers can instead base their ongoing operation on access to below-market loans backed by government power, cross-subsidies when the network is run by a government electric utility, and/or public money in the form of public borrowing (i.e., bonds) or taxes.

Municipal broadband, in fact, does rely heavily on subsidies from the government. As a result, municipal broadband is not subject to the discipline of the market’s profit-and-loss test. This frees the enterprise to focus on other goals, including higher speeds—especially upload speeds—and lower prices than private ISPs often offer in the same market. This is why municipal broadband networks build symmetrical high-speed fiber networks at higher rates than the private sector.

But far from representing a superior source of “competition,” municipal broadband is actually an example of “predatory entry.” In areas where there is already private provision of broadband, municipal broadband can “out-compete” those providers thanks to subsidies from the rest of society. Eventually, this could lead to exit by the private ISPs, from the least cost-efficient to the most. In areas where there is limited provision of Internet access, the entry of municipal broadband could reduce incentives for private entry altogether. In either case, there is little reason to believe municipal broadband actually increases consumer welfare in the long run.

Moreover, there are serious concerns in relying upon municipal broadband for the buildout of ISP networks. While Sallet describes fiber as “future-proof,” there is little reason to think that it is. The profit motive induces broadband ISPs to constantly innovate and improve their networks. Contrary to what you would expect from an alleged monopoly industry, broadband companies are consistently among the highest investors in the American economy. Similar incentives would not apply to municipal broadband, which lacks the profit motive to innovate. 

Conclusion

There is a definite need to improve public policy to promote more competition in broadband markets. But municipal broadband is not the answer. The lack of profit-and-loss prevents the public manager of municipal broadband from having the price signal necessary to know it is serving the public cost-effectively. No amount of bureaucratic management can replace the institutional incentives of the marketplace.

As Thomas Sowell has noted many times, political debates often involve the use of words which, if taken literally, mean something very different from the connotations they convey. Examples abound in the debate about broadband buildout.

There is a general consensus on the need to subsidize aspects of broadband buildout to rural areas in order to close the digital divide. But this real need allows for strategic obfuscation of key terms in this debate by parties hoping to achieve political or competitive gain. 

“Access” and “high-speed broadband”

For instance, nearly everyone would agree that Internet policy should “promote access to high-speed broadband.” But how some academics and activists define “access” and “high-speed broadband” is much different from what the average American would expect.

A commonsense definition of access is that consumers have the ability to buy broadband sufficient to meet their needs, considering the costs and benefits they face. In the context of the digital divide between rural and urban areas, the different options available to consumers in each area are a reflection of the very real costs and other challenges of providing service. In rural areas with low population density, it costs broadband providers considerably more per potential subscriber to build the infrastructure needed to provide service. At some point, depending on the technology, it is no longer profitable to build out to the next customer several miles down the road. The options and prices available to rural consumers reflect this unavoidable fact. Holding price constant, there is no doubt that many rural consumers would prefer higher speeds than are currently available to them. But this is not the real-world choice that presents itself.

But access in this debate instead means the availability of the same broadband options regardless of where people live. Rather than being seen as a reflection of underlying economic realities, the fact that rural Americans do not have the same options available to them that urban Americans do is seen as a problem which calls out for a political solution. Thus, billions of dollars are spent in an attempt to “close the digital divide” by subsidizing broadband providers to build infrastructure to rural areas.

“High-speed broadband” similarly has a meaning in this debate significantly different from what many consumers, especially those lacking “high speed” service, expect. For consumers, fast enough is what allows them to use the Internet in the ways they desire. What is fast enough does change over time as more and more uses for the Internet become common. This is why the FCC has changed the technical definition of broadband multiple times over the years as usage patterns and bandwidth requirements change. Currently, the FCC uses 25 Mbps down/3 Mbps up as the baseline for broadband.

However, for some, like Jonathan Sallet, this is thoroughly insufficient. In his Broadband for America’s Future: A Vision for the 2020s, he instead proposes “100 Mbps symmetrical service without usage limits.” The study does not explain the disconnect between this arbitrary number and consumer demand as measured in the marketplace, in light of real trade-offs between cost and performance. The assumption is simply that faster is better, and that building faster networks is a mere engineering issue once sufficiently funded and executed with enough political will.

But there is little evidence that consumers “need” faster Internet than the market is currently providing. In fact, one Wall Street Journal study suggests “typical U.S. households don’t use most of their bandwidth while streaming and get marginal gains from upgrading speeds.” Moreover, there is even less evidence that most consumers or businesses need anything close to upload speeds of 100 Mbps. For even intensive uses like high-resolution live streaming, recommended upload speeds still fall far short of 100 Mbps. 

“Competition” and “Overbuilding”

Similarly, no one objects to the importance of “competition in the broadband marketplace.” But what is meant by this term is subject to vastly different interpretations.

The number of competitors is not the same as the amount of competition. Competition is a process by which market participants discover the best way to serve consumers at the lowest cost. Specific markets are often subject to competition not only from the firms which exist within those markets, but also from potential competitors who may enter the market any time potential profits reach a point high enough to justify the costs of entry. An important inference from this is that a temporary monopoly, in the sense that one firm has a significant share of the market, is not in itself illegal under antitrust law, even if the firm charges monopoly prices. Potential entry is as real in its effects as actual competitors in forcing incumbents to continue to innovate and provide value to consumers.

However, many assume the best way to encourage competition in broadband buildout is to simply promote more competitors. A significant portion of Broadband for America’s Future emphasizes the importance of subsidizing new competition in order to increase buildout, increase quality, and bring down prices. In particular, Sallet emphasizes the benefits of municipal broadband, i.e. when local governments build and run their own networks. 

In fact, Sallet argues that fears of “overbuilding” are really just fears of competition by incumbent broadband ISPs:

Language here is important. There is a tendency to call the construction of new, competitive networks in a locality with an existing network “overbuilding”—as if it were an unnecessary thing, a useless piece of engineering. But what some call “overbuilding” should be called by a more familiar term: “Competition.” “Overbuilding” is an engineering concept; “competition” is an economic concept that helps consumers because it shifts the focus from counting broadband networks to counting the dollars that consumers save when they have competitive choices. The difference is fundamental—overbuilding asks whether the dollars spent to build another network are necessary for the delivery of a communications service; economics asks whether spending those dollars will lead to competition that allows consumers to spend less and get more. 

Sallet makes two rhetorical moves here to make his argument. 

The first is redefining “overbuilding,” which refers to literally building a new network on top of (that is, “over”) previously built architecture, as a ploy by ISPs to avoid competition. But this is truly Orwellian. When a new entrant can build over an incumbent and take advantage of the first-mover’s investments to enter at a lower cost, a failure to compensate the first-mover is free riding. If the government compels such free riding, it reduces incentives for firms to make the initial investment to build the infrastructure.

The second is defining competition as the number of competitors, even if those competitors need to be subsidized by the government in order to enter the marketplace.  

But there is no way to determine the “right” number of competitors in a given market in advance. In the real world, markets don’t match blackboard descriptions of perfect competition. In fact, there are sometimes high fixed costs which limit the number of firms likely to exist in a competitive market. In some markets, known as natural monopolies, high infrastructure costs and other barriers to entry relative to the size of the market make it cheaper for a single firm to provide a good or service than for multiple firms to do so. But it is important to note that only firms operating under market pressures can assess the viability of competition. This is why there is a significant risk in government-subsidized entry.

Competition drives sustained investment in the capital-intensive architecture of broadband networks, which suggests that ISPs are not natural monopolies. If they were, then having a monopoly provider regulated by the government to ensure the public interest, or government-run broadband companies, might make sense. In fact, Sallet denies ISPs are natural monopolies, stating that “the history of telecommunications regulation in the United States suggests that monopolies were a result of policy choices, not mandated by any iron law of economics” and that “it would be odd for public policy to treat the creation of a monopoly as a success.”

As noted by economist George Ford in his study, The Impact of Government-Owned Broadband Networks on Private Investment and Consumer Welfare, unlike the threat of entry which often causes incumbents to act competitively even in the absence of competitors, the threat of subsidized entry reduces incentives for private entities to invest in those markets altogether. This includes both the incentive to build the network and update it. Subsidized entry may, in fact, tip the scales from competition that promotes consumer welfare to that which could harm it. If the market only profitably sustains one or two competitors, adding another through municipal broadband or subsidizing a new entrant may reduce the profitability of the incumbent(s) and eventually lead to exit. When this happens, only the government-run or subsidized network may survive because the subsidized entrant is shielded from the market test of profit-and-loss.

The “Donut Hole” Problem

The term “donut hole” is a final example of how words can be used in this debate to confuse rather than enlighten.

There is broad agreement that to generate the positive externalities from universal service, there needs to be subsidies for buildout to high-cost rural areas. However, this seeming agreement masks vastly different approaches. 

For instance, some critics of the current subsidy approach have identified a phenomenon where the city center has multiple competitive ISPs and government policy extends subsidies to ISPs to build out broadband coverage into rural areas, but Internet service in between is relatively paltry due to a lack of private or public investment. They describe this as a “donut hole” because the “unserved” rural areas receive subsidies while “underserved” outlying parts immediately surrounding town centers receive nothing under current policy.

Conceptually, this is not a donut hole. It is actually more like a target or bullseye, where the city center is served by private investment and the rural areas receive subsidies to be served. 

Indeed, there is a different use of the term donut hole, which describes how public investment in city centers can create a donut hole of funding needed to support rural build-out. Most Internet providers rely on profits from providing lower-cost service to higher-population areas (like city centers) to cross-subsidize the higher cost of providing service in outlying and rural areas. But municipal providers generally only provide municipal service — they only provide lower-cost service. This hits the carriers that serve higher-cost areas with a double whammy. First, every customer that municipal providers take from private carriers cuts the revenue that those carriers rely on to provide service elsewhere. Second, and even more problematic, because the municipal providers have lower costs (because they tend not to serve the higher-cost outlying areas), they can offer lower prices for service. This “competition” exerts downward pressure on the private firms’ prices, further reducing revenue across their entire in-town customer base. 

This version of the “donut hole,” in which municipal entry siphons off the city-center revenues that private firms rely on to support the costs of providing service to outlying areas, has two simultaneous effects. First, it directly reduces the funding available to serve more rural areas. And, second, it increases the carrier’s average cost of providing service across its network (because it is no longer recovering as much of its costs from the lower-cost city core), which increases the prices that need to be charged to rural users in order to justify offering service at all.
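
To make the cross-subsidy mechanism concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (subscriber counts, per-subscriber costs, the uniform price) is hypothetical and chosen purely for illustration; the point is only to show how losing lower-cost city-center customers to a municipal entrant both raises a private carrier's average cost and shrinks the surplus available to cross-subsidize rural service.

```python
# Hypothetical illustration of the cross-subsidy "donut hole" mechanism.
# All figures are invented for exposition; none describe a real market.

def carrier_economics(city_subs, rural_subs, price,
                      city_cost=20.0, rural_cost=60.0):
    """Return (average cost per subscriber, monthly surplus available
    to support rural build-out) for a carrier charging a uniform price."""
    total_cost = city_subs * city_cost + rural_subs * rural_cost
    total_subs = city_subs + rural_subs
    revenue = total_subs * price
    return total_cost / total_subs, revenue - total_cost

# Before municipal entry: 8,000 low-cost city subscribers cross-subsidize
# 2,000 high-cost rural subscribers at a uniform $45/month price.
avg_before, surplus_before = carrier_economics(8_000, 2_000, price=45.0)

# After entry: a municipal network serving only the city core takes half
# the city-center customers, and price pressure pushes the uniform price
# down to $40/month.
avg_after, surplus_after = carrier_economics(4_000, 2_000, price=40.0)

print(f"Before entry: avg cost ${avg_before:.2f}/sub, surplus ${surplus_before:,.0f}/mo")
print(f"After entry:  avg cost ${avg_after:.2f}/sub, surplus ${surplus_after:,.0f}/mo")
```

Under these invented numbers, average cost per subscriber rises from $28 to roughly $33 and the monthly surplus available to fund rural service falls from $170,000 to $40,000, which captures both of the simultaneous effects described above.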

Conclusion

Overcoming the problem of the rural digital divide starts with understanding why it exists. It is simply more expensive to build networks in areas with low population density. If universal service is the goal, subsidies, whether explicit subsidies from government or implicit cross-subsidies by broadband companies, are necessary to build out to these areas. But obfuscations about increasing “access to high-speed broadband” by promoting “competition” shouldn’t control the debate.

Instead, there needs to be a nuanced understanding of how government-subsidized entry into the broadband marketplace can discourage private investment and grow the size of the “donut hole,” thereby leading to demand for even greater subsidies. Policymakers should avoid exacerbating the digital divide by prioritizing subsidized competition over market processes.

On Friday the International Center for Law & Economics filed comments with the FCC in response to Chairman Wheeler’s NPRM (proposed rules) to “unlock” the MVPD (i.e., cable and satellite subscription video, essentially) set-top box market. Plenty has been written on the proposed rulemaking—for a few quick hits (among many others) see, e.g., Richard Bennett, Glenn Manishin, Larry Downes, Stuart Brotman, Scott Wallsten, and me—so I’ll dispense with the background and focus on the key points we make in our comments.

Our comments explain that the proposal’s assertion that the MVPD set-top box market isn’t competitive is a product of its failure to appreciate the dynamics of the market (and its disregard for economics). Similarly, the proposal fails to acknowledge the complexity of the markets it intends to regulate, and, in particular, it ignores the harmful effects on content production and distribution the rules would likely bring about.

“Competition, competition, competition!” — Tom Wheeler

“Well, uh… just because I don’t know what it is, it doesn’t mean I’m lying.” — Claude Elsinore

At root, the proposal is aimed at improving competition in a market that is already hyper-competitive. As even Chairman Wheeler has admitted,

American consumers enjoy unprecedented choice in how they view entertainment, news and sports programming. You can pretty much watch what you want, where you want, when you want.

Of course, much of this competition comes from outside the MVPD market, strictly speaking—most notably from OVDs like Netflix. It’s indisputable that the statute directs the FCC to address the MVPD market and the MVPD set-top box market. But addressing competition in those markets doesn’t mean you simply disregard the world outside those markets.

The competitiveness of a market isn’t solely a function of the number of competitors in the market. Even relatively constrained markets like these can be “fully competitive” with only a few competing firms—as is the case in every market in which MVPDs operate (all of which are presumed by the Commission to be subject to “effective competition”).

The truly troubling thing, however, is that the FCC knows that MVPDs compete with OVDs, and thus that the competitiveness of the “MVPD market” (and the “MVPD set-top box market”) isn’t solely a matter of direct, head-to-head MVPD competition.

How do we know that? As I’ve recounted before, in a recent speech FCC General Counsel Jonathan Sallet approvingly explained that Commission staff recommended rejecting the Comcast/Time Warner Cable merger precisely because of the alleged threat it posed to OVD competitors. In essence, Sallet argued that Comcast sought to undertake a $45 billion merger primarily—if not solely—in order to ameliorate the competitive threat to its subscription video services from OVDs:

Simply put, the core concern came down to whether the merged firm would have an increased incentive and ability to safeguard its integrated Pay TV business model and video revenues by limiting the ability of OVDs to compete effectively.…

Thus, at least when it suits it, the Chairman’s office appears not only to believe that this competitive threat is real, but also that Comcast, once the largest MVPD in the country, believes so strongly that the OVD competitive threat is real that it was willing to pay $45 billion for a mere “increased ability” to limit it.

UPDATE 4/26/2016

And now the FCC has approved the Charter/Time Warner Cable merger, imposing conditions that, according to Wheeler,

focus on removing unfair barriers to video competition. First, New Charter will not be permitted to charge usage-based prices or impose data caps. Second, New Charter will be prohibited from charging interconnection fees, including to online video providers, which deliver large volumes of internet traffic to broadband customers. Additionally, the Department of Justice’s settlement with Charter both outlaws video programming terms that could harm OVDs and protects OVDs from retaliation—an outcome fully supported by the order I have circulated today.

If MVPDs and OVDs don’t compete, why would such terms be necessary? And even if the threat is merely potential competition, as we note in our comments (citing to this, among other things),

particularly in markets characterized by the sorts of technological change present in video markets, potential competition can operate as effectively as—or even more effectively than—actual competition to generate competitive market conditions.

/UPDATE

Moreover, the proposal asserts that the “market” for MVPD set-top boxes isn’t competitive because “consumers have few alternatives to leasing set-top boxes from their MVPDs, and the vast majority of MVPD subscribers lease boxes from their MVPD.”

But the MVPD set-top box market is an aftermarket—a secondary market; no one buys set-top boxes without first buying MVPD service—and always or almost always the two are purchased at the same time. As Ben Klein and many others have shown, direct competition in the aftermarket need not be plentiful for the market to nevertheless be competitive.

Whether consumers are fully informed or uninformed, consumers will pay a competitive package price as long as sufficient competition exists among sellers in the [primary] market.

The competitiveness of the MVPD market in which the antecedent choice of provider is made incorporates consumers’ preferences regarding set-top boxes, and makes the secondary market competitive.

The proposal’s superficial and erroneous claim that the set-top box market isn’t competitive thus reflects bad economics, not competitive reality.

But it gets worse. The NPRM doesn’t actually deny the importance of OVDs and app-based competitors wholesale — it only does so when convenient. As we note in our Comments:

The irony is that the NPRM seeks to give a leg up to non-MVPD distribution services in order to promote competition with MVPDs, while simultaneously denying that such competition exists… In order to avoid triggering [Section 629’s sunset provision,] the Commission is forced to pretend that we still live in the world of Blockbuster rentals and analog cable. It must ignore the Netflix behind the curtain—ignore the utter wealth of video choices available to consumers—and focus on the fact that a consumer might have a remote for an Apple TV sitting next to her Xfinity remote.

“Yes, but you’re aware that there’s an invention called television, and on that invention they show shows?” — Jules Winnfield

The NPRM proposes to create a world in which all of the content that MVPDs license from programmers, and all of their own additional services, must be provided to third-party device manufacturers under a zero-rate compulsory license. Apart from the complete absence of statutory authority to mandate such a thing (or, I should say, apart from statutory language specifically prohibiting such a thing), the proposed rules run roughshod over the copyrights and negotiated contract rights of content providers:

The current rulemaking represents an overt assault on the web of contracts that makes content generation and distribution possible… The rules would create a new class of intermediaries lacking contractual privity with content providers (or MVPDs), and would therefore force MVPDs to bear the unpredictable consequences of providing licensed content to third-parties without actual contracts to govern those licenses…

Because such nullification of license terms interferes with content owners’ right “to do and to authorize” their distribution and performance rights, the rules may facially violate copyright law… [Moreover,] the web of contracts that support the creation and distribution of content are complicated, extensively negotiated, and subject to destabilization. Abrogating the parties’ use of the various control points that support the financing, creation, and distribution of content would very likely reduce the incentive to invest in new and better content, thereby rolling back the golden age of television that consumers currently enjoy.

You’ll be hard-pressed to find any serious acknowledgement in the NPRM that its rules could have any effect on content providers, apart from this gem:

We do not currently have evidence that regulations are needed to address concerns raised by MVPDs and content providers that competitive navigation solutions will disrupt elements of service presentation (such as agreed-upon channel lineups and neighborhoods), replace or alter advertising, or improperly manipulate content…. We also seek comment on the extent to which copyright law may protect against these concerns, and note that nothing in our proposal will change or affect content creators’ rights or remedies under copyright law.

The Commission can’t rely on copyright to protect against these concerns, at least not without admitting that the rules require MVPDs to violate copyright law and to breach their contracts. And in fact, although it doesn’t acknowledge it, the NPRM does require the abrogation of content owners’ rights embedded in licenses negotiated with MVPD distributors to the extent that they conflict with the terms of the rule (which many of them must).   

“You keep using that word. I do not think it means what you think it means.” — Inigo Montoya

Finally, the NPRM derives its claimed authority for these rules from an interpretation of the relevant statute (Section 629 of the Communications Act) that is absurdly unreasonable. That provision requires the FCC to enact rules to assure the “commercial availability” of set-top boxes from MVPD-unaffiliated vendors. According to the NPRM,

we cannot assure a commercial market for devices… unless companies unaffiliated with an MVPD are able to offer innovative user interfaces and functionality to consumers wishing to access that multichannel video programming.

This baldly misconstrues a term plainly meant to refer to the manner in which consumers obtain their navigation devices, not how those devices should function. It also contradicts the Commission’s own, prior readings of the statute:

As structured, the rules will place a regulatory thumb on the scale in favor of third-parties and to the detriment of MVPDs and programmers…. [But] Congress explicitly rejected language that would have required unbundling of MVPDs’ content and services in order to promote other distribution services…. Where Congress rejected language that would have favored non-MVPD services, the Commission selectively interprets the language Congress did employ in order to accomplish exactly what Congress rejected.

And despite the above-noted problems (and more), the Commission has failed to do even a cursory economic evaluation of the relative costs of the NPRM, instead focusing narrowly on a single benefit it believes might occur (wider distribution of set-top boxes from third parties) despite the consistent failure of similar FCC efforts in the past.

All of the foregoing leads to a final question: At what point do the costs of these rules finally outweigh the perceived benefits? On the one hand are legal questions of infringement, inducements to violate agreements, and disruptions of complex contractual ecosystems supporting content creation. On the other hand are the presence of more boxes and apps that allow users to choose who gets to draw the UI for their video content…. At some point the Commission needs to take seriously the costs of its actions, and determine whether the public interest is really served by the proposed rules.

Our full comments are available here.

Last week, FCC General Counsel Jonathan Sallet pulled back the curtain on the FCC staff’s analysis behind its decision to block Comcast’s acquisition of Time Warner Cable. As the FCC staff sets out on its reported Rainbow Tour to reassure regulated companies that it’s not “hostile to the industries it regulates,” Sallet’s remarks suggest it will have an uphill climb. Unfortunately, the staff’s analysis appears to have been unduly speculative, disconnected from critical market realities, and decidedly biased — not characteristics in a regulator that tend to offer much reassurance.

Merger analysis is inherently speculative, but, as courts have repeatedly had occasion to find, the FCC has a penchant for stretching speculation beyond the breaking point, adopting theories of harm that are vaguely possible, even if unlikely and inconsistent with past practice, and poorly supported by empirical evidence. The FCC’s approach here seems to fit this description.

The FCC’s fundamental theory of anticompetitive harm

To begin with, as he must, Sallet acknowledged that there was no direct competitive overlap in the areas served by Comcast and Time Warner Cable, and no consumer would have seen the number of providers available to her changed by the deal.

But the FCC staff viewed this critical fact as “not outcome determinative.” Instead, Sallet explained that the staff’s opposition was based primarily on a concern that the deal might enable Comcast to harm “nascent” OVD competitors in order to protect its video (MVPD) business:

Simply put, the core concern came down to whether the merged firm would have an increased incentive and ability to safeguard its integrated Pay TV business model and video revenues by limiting the ability of OVDs to compete effectively, especially through the use of new business models.

The justification for the concern boiled down to an assumption that the addition of TWC’s subscriber base would be sufficient to render an otherwise too-costly anticompetitive campaign against OVDs worthwhile:

Without the merger, a company taking action against OVDs for the benefit of the Pay TV system as a whole would incur costs but gain additional sales – or protect existing sales — only within its footprint. But the combined entity, having a larger footprint, would internalize more of the external “benefits” provided to other industry members.

The FCC theorized that, by acquiring a larger footprint, Comcast would gain enough bargaining power and leverage, as well as the means to profit from an exclusionary strategy, leading it to employ a range of harmful tactics — such as impairing the quality/speed of OVD streams, imposing data caps, limiting OVD access to TV-connected devices, imposing higher interconnection fees, and saddling OVDs with higher programming costs. It’s difficult to see how such conduct would be permitted under the FCC’s Open Internet Order/Title II regime, but, nevertheless, the staff apparently believed that Comcast would possess a powerful “toolkit” with which to harm OVDs post-transaction.

Comcast’s share of the MVPD market wouldn’t have changed enough to justify the FCC’s purported fears

First, the analysis turned on what Comcast could and would do if it were larger. But Comcast was already the largest ISP and MVPD (now second largest MVPD, post AT&T/DIRECTV) in the nation, and presumably it has approximately the same incentives and ability to disadvantage OVDs today.

In fact, there’s no reason to believe that the growth of Comcast’s MVPD business would cause any material change in its incentives with respect to OVDs. Whatever nefarious incentives the merger allegedly would have created by increasing Comcast’s share of the MVPD market (which is where the purported benefits in the FCC staff’s anticompetitive story would be realized), those incentives would be proportional to the size of increase in Comcast’s national MVPD market share — which, here, would be about eight percentage points: from 22% to under 30% of the national market.

It’s difficult to believe that Comcast would gain the wherewithal to engage in this costly strategy by adding such a relatively small fraction of the MVPD market (which would still leave other MVPDs serving fully 70% of the market to reap the purported benefits instead of Comcast), but wouldn’t have it at its current size – and there’s no evidence that it has ever employed such strategies with its current market share.
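
To see why the eight-percentage-point increment does so little work in the staff’s internalization story, consider a minimal back-of-the-envelope sketch in Python. The 22% and roughly 30% national shares are the figures cited above; the industry-wide benefit and the campaign cost are invented, arbitrary-unit numbers used only to illustrate that the payoff from an exclusionary campaign scales with the acquirer’s share while the cost does not.

```python
# Back-of-the-envelope sketch of the staff's "internalization" theory.
# The 22% and ~30% shares are from the discussion above; the benefit
# and cost figures are invented, arbitrary-unit assumptions.

def net_payoff(share, industry_benefit, campaign_cost):
    """An MVPD internalizes only its own share of the industry-wide
    benefit of an anti-OVD campaign, but bears the full campaign cost."""
    return share * industry_benefit - campaign_cost

INDUSTRY_BENEFIT = 1_000.0  # hypothetical industry-wide gain
CAMPAIGN_COST = 400.0       # hypothetical cost of the exclusionary campaign

pre_merger = net_payoff(0.22, INDUSTRY_BENEFIT, CAMPAIGN_COST)   # -180.0
post_merger = net_payoff(0.30, INDUSTRY_BENEFIT, CAMPAIGN_COST)  # -100.0

print(f"Pre-merger payoff:  {pre_merger:+.0f}")
print(f"Post-merger payoff: {post_merger:+.0f}")
print(f"Benefit still accruing to rival MVPDs post-merger: {1 - 0.30:.0%}")
```

On these invented numbers, the campaign is unprofitable both before and after the merger; the merger improves the payoff only in proportion to the eight-point share gain, while 70% of any industry-wide benefit would still flow to rival MVPDs rather than Comcast.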

It bears highlighting that the D.C. Circuit has already twice rejected FCC efforts to impose a 30% market cap on MVPDs, based on the Commission’s inability to demonstrate that a greater-than-30% share would create competitive problems, especially given the highly dynamic nature of the MVPD market. In vacating the FCC’s most recent effort to do so in 2009, the D.C. Circuit was resolute in its condemnation of the agency, noting:

In sum, the Commission has failed to demonstrate that allowing a cable operator to serve more than 30% of all [MVPD] subscribers would threaten to reduce either competition or diversity in programming.

The extent of competition and the amount of available programming (including original programming distributed by OVDs themselves) has increased substantially since 2009; this makes the FCC’s competitive claims even less sustainable today.

It’s damning enough to the FCC’s case that there is no marketplace evidence of such conduct or its anticompetitive effects in today’s market. But it’s truly impossible to square the FCC’s assertions about Comcast’s anticompetitive incentives with the fact that, over the past decade, Comcast has made massive investments in broadband, steadily increased broadband speeds, and freely licensed its programming, among other things that have served to enhance OVDs’ long-term viability and growth. Chalk it up to the threat of regulatory intervention or corporate incompetence if you can’t believe that competition alone could be responsible for this largesse, but, whatever the reason, the FCC staff’s fears appear completely unfounded in a marketplace not significantly different than the landscape that would have existed post-merger.

OVDs aren’t vulnerable, and don’t need the FCC’s “help”

After describing the “new entrants” in the market — such unfamiliar and powerless players as Dish, Sony, HBO, and CBS — Sallet claimed that the staff was principally animated by the understanding that

Entrants are particularly vulnerable when competition is nascent. Thus, staff was particularly concerned that this transaction could damage competition in the video distribution industry.

Sallet’s description of OVDs makes them sound like struggling entrepreneurs working in garages. But, in fact, OVDs have radically reshaped the media business and wield enormous clout in the marketplace.

Netflix, for example, describes itself as “the world’s leading Internet television network with over 65 million members in over 50 countries.” New services like Sony Vue and Sling TV are affiliated with giant, well-established media conglomerates. And whatever new offerings emerge from the FCC-approved AT&T/DIRECTV merger will be as well-positioned as any in the market.

In fact, we already know that the concerns of the FCC are off-base because they are of a piece with the misguided assumptions that underlie the Chairman’s recent NPRM to rewrite the MVPD rules to “protect” just these sorts of companies. But the OVDs themselves — the ones with real money and their competitive futures on the line — don’t see the world the way the FCC does, and they’ve resolutely rejected the Chairman’s proposal. Notably, the proposed rules would “protect” these services from exactly the sort of conduct that Sallet claims would have been a consequence of the Comcast-TWC merger.

If they don’t want or need broad protection from such “harms” in the form of revised industry-wide rules, there is surely no justification for the FCC to throttle a merger based on speculation that the same conduct could conceivably arise in the future.

The realities of the broadband market post-merger wouldn’t have supported the FCC’s argument, either

While a larger Comcast might be in a position to realize more of the benefits from the exclusionary strategy Sallet described, it would also incur more of the costs — likely in direct proportion to the increased size of its subscriber base.

Think of it this way: To the extent that an MVPD can possibly constrain an OVD’s scope of distribution for programming, doing so also necessarily makes the MVPD’s own broadband offering less attractive, forcing it to incur a cost that would increase in proportion to the size of the distributor’s broadband market. In this case, as noted, Comcast would have gained MVPD subscribers — but it would have also gained broadband subscribers. In a world where cable is consistently losing video subscribers (as Sallet acknowledged), and where broadband offers higher margins and faster growth, it makes no economic sense that Comcast would have valued the trade-off the way the FCC claims it would have.

Moreover, in light of the existing conditions imposed on Comcast under the Comcast/NBCU merger order from 2011 (which last for a few more years) and the restrictions adopted in the Open Internet Order, Comcast’s ability to engage in the sort of exclusionary conduct described by Sallet would be severely limited, if not non-existent. Nor, of course, is there any guarantee that former or would-be OVD subscribers would choose to subscribe to, or pay more for, any MVPD in lieu of OVDs. Meanwhile, many of the relevant substitutes in the MVPD market (like AT&T and Verizon FiOS) also offer broadband services – thereby increasing the costs that would be incurred in the broadband market even more, as many subscribers would shift not only their MVPD, but also their broadband service, in response to Comcast degrading OVDs.

And speaking of the Open Internet Order — wasn’t that supposed to prevent ISPs like Comcast from acting on their alleged incentives to impede the quality of, or access to, edge providers like OVDs? Why is merger enforcement necessary to accomplish the same thing once Title II and the rest of the Open Internet Order are in place? And if the argument is that the Open Internet Order might be defeated, aside from the completely speculative nature of such a claim, why wouldn’t a merger condition that imposed the same constraints on Comcast – as was done in the Comcast/NBCU merger order by imposing the former net neutrality rules on Comcast – be perfectly sufficient?

While the FCC staff analysis accepted as true (again, contrary to current marketplace evidence) that a bigger Comcast would have more incentive to harm OVDs post-merger, it rejected arguments that there could be countervailing benefits to OVDs and others from this same increase in scale. Thus, things like incremental broadband investments and speed increases, a larger Wi-Fi network, and greater business services market competition – things that Comcast is already doing and would have done on a greater and more-accelerated scale in the acquired territories post-transaction – were deemed insufficient to outweigh the expected costs of the staff’s entirely speculative anticompetitive theory.

In reality, however, not only OVDs, but consumers – and especially TWC subscribers – would have benefited from the merger by access to Comcast’s faster broadband speeds, its new investments, and its superior video offerings on the X1 platform, among other things. Many low-income families would have benefited from expansion of Comcast’s Internet Essentials program, and many businesses would have benefited from the addition of a more effective competitor to the incumbent providers that currently dominate the business services market. Yet these and other verifiable benefits were given short shrift in the agency’s analysis because they “were viewed by staff as incapable of outweighing the potential harms.”

The assumptions underlying the FCC staff’s analysis of the broadband market are arbitrary and unsupportable

Sallet’s claim that the combined firm would have 60% of all high-speed broadband subscribers in the U.S. necessarily assumes a national broadband market measured at 25 Mbps or higher, which is a red herring.

The FCC has not explained why 25 Mbps is a meaningful benchmark for antitrust analysis. The FCC itself endorsed a 10 Mbps baseline for its Connect America fund last December, noting that over 70% of current broadband users subscribe to speeds less than 25 Mbps, even in areas where faster speeds are available. And streaming online video, the most oft-cited reason for needing high bandwidth, doesn’t require 25 Mbps: Netflix says that 5 Mbps is the most that’s required for an HD stream, and the same goes for Amazon (3.5 Mbps) and Hulu (1.5 Mbps).

What’s more, by choosing an arbitrary, faster speed to define the scope of the broadband market (in an effort to assert the non-competitiveness of the market, and thereby justify its broadband regulations), the agency has – without proper analysis or grounding, in my view – unjustifiably shrunk the size of the relevant market. But, as it happens, doing so also shrinks the size of the increase in “national market share” that the merger would have brought about.

Recall that the staff’s theory was premised on the idea that the merger would give Comcast control over enough of the broadband market that it could unilaterally impose costs on OVDs sufficient to impair their ability to reach or sustain minimum viable scale. But Comcast would have added only one percent of this invented “market” as a result of the merger. It strains credulity to assert that there could be any transaction-specific harm from an increase in market share equivalent to a rounding error.

In any case, basing its rejection of the merger on a manufactured 25 Mbps relevant market creates perverse incentives and will likely do far more to harm OVDs than realization of even the staff’s worst fears about the merger ever could have.

The FCC says it wants higher speeds, and it wants firms to invest in faster broadband. But here Comcast did just that, and then was punished for it. Rather than acknowledging Comcast’s ongoing broadband investments as strong indication that the FCC staff’s analysis might be on the wrong track, the FCC leadership simply sidestepped that inconvenient truth by redefining the market.

The lesson is that if you make your product too good, you’ll end up with an impermissibly high share of the market you create and be punished for it. This can’t possibly promote the public interest.

Furthermore, the staff’s analysis of competitive effects, even in this ersatz market, isn’t likely supportable. As noted, most subscribers access OVDs on connections that deliver content at speeds well below the invented 25 Mbps benchmark, and they pay the same prices for OVD subscriptions as subscribers who receive their content at 25 Mbps. Confronted with the choice to consume content at 25 Mbps or 10 Mbps (or less), the majority of consumers voluntarily opt for slower speeds — and they purchase service from Netflix and other OVDs in droves, nonetheless.

The upshot? Contrary to the implications on which the staff’s analysis rests, if Comcast were to somehow “degrade” OVD content on the 25 Mbps networks so that it was delivered with characteristics of video content delivered over a 10-Mbps network, real-world, observed consumer preferences suggest it wouldn’t harm OVDs’ access to consumers at all. This is especially true given that OVDs often have a global focus and reach (again, Netflix has 65 million subscribers in over 50 countries), making any claims that Comcast could successfully foreclose them from the relevant market even more suspect.

At the same time, while the staff apparently viewed the broadband alternatives as “limited,” the reality is that Comcast, as well as other broadband providers, are surrounded by capable competitors, including, among others, AT&T, Verizon, CenturyLink, Google Fiber, many advanced VDSL and fiber-based Internet service providers, and high-speed mobile wireless providers. The FCC understated the complex impact of this robust, dynamic, and ever-increasing competition, and its analysis entirely ignored rapidly growing mobile wireless broadband competition.

Finally, as noted, Sallet claimed that the staff determined that merger conditions would be insufficient to remedy its concerns, without any further explanation. Yet the Commission identified similar concerns about OVDs in both the Comcast/NBCUniversal and AT&T/DIRECTV transactions, and adopted remedies to address those concerns. We know the agency is capable of drafting behavioral conditions, and we know they have teeth, as demonstrated by prior FCC enforcement actions. It’s hard to understand why similar, adequate conditions could not have been fashioned for this transaction.

In the end, while I appreciate Sallet’s attempt to explain the FCC’s decision to reject the Comcast/TWC merger, based on the foregoing I’m not sure that Comcast could have made any argument or showing that would have dissuaded the FCC from challenging the merger. Comcast presented a strong economic analysis answering the staff’s concerns discussed above, all to no avail. It’s difficult to escape the conclusion that this was a politically-driven result, and not one rigorously based on the facts or marketplace reality.

The Wall Street Journal dropped an FCC bombshell last week, although I’m not sure anyone noticed. In an article ostensibly about the possible role that MFNs might play in the Comcast/Time-Warner Cable merger, the Journal noted that

The FCC is encouraging big media companies to offer feedback confidentially on Comcast’s $45-billion offer for Time Warner Cable.

Not only is the FCC holding secret meetings, but it is encouraging Comcast’s and TWC’s commercial rivals to hold confidential meetings and to submit information under seal. This is not a normal part of ex parte proceedings at the FCC.

In the typical proceeding of this sort – known as a “permit-but-disclose proceeding” – ex parte communications are subject to a host of disclosure requirements delineated in 47 CFR 1.1206. But section 1.1200(a) of the Commission’s rules permits the FCC, in its discretion, to modify the applicable procedures if the public interest so requires.

If you dig deeply into the Public Notice seeking comments on the merger, you find a single sentence stating that

Requests for exemptions from the disclosure requirements pursuant to section 1.1204(a)(9) may be made to Jonathan Sallet [the FCC’s General Counsel] or Hillary Burchuk [who heads the transaction review team].

Similar language appears in the AT&T/DirecTV transaction Public Notice.

This leads to the cited rule exempting certain ex parte presentations from the usual disclosure requirements in such proceedings, including the referenced one that exempts ex partes from disclosure when

The presentation is made pursuant to an express or implied promise of confidentiality to protect an individual from the possibility of reprisal, or there is a reasonable expectation that disclosure would endanger the life or physical safety of an individual

So the FCC is inviting “media companies” to offer confidential feedback and to hold secret meetings that the FCC will keep confidential because of “the possibility of reprisal,” based on language intended to protect individuals.

Such deviations from the standard permit-but-disclose procedures are extremely rare. As in non-existent. I guess there might be other examples, but I was unable to find a single one in a quick search. And I’m willing to bet that the language inviting confidential communications in the PN hasn’t appeared before – and certainly not in a transaction review.

It is worth pointing out that the language in 1.1204(a)(9) is remarkably similar to language that appears in the Freedom of Information Act. As the DOJ notes regarding that exemption:

Exemption 7(D) provides protection for “records or information compiled for law enforcement purposes [which] could reasonably be expected to disclose the identity of a confidential source… to ensure that “confidential sources are not lost through retaliation against the sources for past disclosure or because of the sources’ fear of future disclosure.”

Surely the fear-of-reprisal rationale for confidentiality makes sense in that context – but here? And invoked to elicit secret meetings and to keep information submitted by corporations, rather than individuals, confidential, it makes even less sense (and doesn’t even obviously comply with the rule itself). It is not as though – as far as I know – someone approached the Commission with stated fears and requested that it implement a procedure for confidentiality in these particular reviews.

Rather, this is the Commission inviting non-transparent process in the midst of a heated, politicized and heavily-scrutinized transaction review.

The optics are astoundingly bad.

Unfortunately, this kind of behavior seems to be par for the course for the current FCC. As Commissioner Pai has noted on more than one occasion, the minority commissioners have been routinely kept in the dark with respect to important matters at the Commission – not coincidentally, in other highly-politicized proceedings.

What’s particularly troubling is that, for all its faults, the FCC’s process is typically extremely open and transparent. Public comments, endless ex parte meetings, regular Open Commission Meetings are all the norm. And this is as it should be. Particularly when it comes to transactions and other regulated conduct for which the regulated entity bears the burden of proving that its behavior does not offend the public interest, it is obviously necessary to have all of the information – to know what might concern the Commission and to make a case respecting those matters.

The kind of arrogance on display of late, and the seeming abuse of process that goes along with it, hearkens back to the heady days of Kevin Martin’s tenure as FCC Chairman – a tenure described as “dysfunctional” and noted for its abuse of process.

All of which should stand as a warning to the vocal, pro-regulatory minority pushing for the FCC to proclaim enormous power to regulate net neutrality – and broadband generally – under Title II. Just as Chairman Martin tried to manipulate diversity rules to accomplish his pet project of cable channel unbundling, some future Chairman will undoubtedly claim authority under Title II to accomplish some other unintended, but politically expedient, objective — and it may not be one the self-proclaimed consumer advocates like, when it happens.

Bad as that risk may be, it is only made more likely by regulatory reviews undertaken in secret. Whatever impelled the Chairman to invite unprecedented secrecy into these transaction reviews, it seems to be of a piece with a deepening politicization and abuse of process at the Commission. It’s both shameful – and deeply worrying.