Archives For Broadband

On February 13 an administrative law judge (ALJ) at the California Public Utilities Commission (CPUC) issued a proposed decision regarding the Comcast/Time Warner Cable (TWC) merger. The proposed decision recommends that the CPUC approve the merger with conditions.

It’s laudable that the ALJ acknowledges at least some of the competitive merits of the proposed deal. But the conditions the proposed decision would impose on the combined company as the price of completing the merger amount to a remarkable set of unauthorized regulations, both inappropriate for the deal and at odds with California’s legislated approach to regulation of the Internet.

According to the proposed decision, every condition it imposes is aimed at mitigating a presumed harm arising from the merger:

The Applicants must meet the conditions adopted herein in order to provide reasonable assurance that the proposed transaction will be in the public interest in accordance with Pub. Util. Code § 854(a) and (c).… We only adopt conditions which mitigate an effect of the merger in order to satisfy the public interest requirements of § 854.

By any reasonable interpretation, this would mean that the CPUC can adopt only those conditions that address specific public interest concerns arising from the deal itself. But most of the conditions in the proposed decision fail this basic test and seem designed to address broader social policy issues that have nothing to do with the alleged competitive effects of the deal.

Instead, without undertaking any analysis of the merger’s competitive effects, the proposed decision effectively accepts that the merger serves the public interest, while simultaneously accepting the merger opponents’ assertions that it doesn’t. In the name of squaring that circle, the proposed decision would permit the merger to proceed, but only on the condition that the post-merger company conform to the critics’ rather arbitrary vision of the preferred market structure for cable broadband services in California.

For something — say, a merger — to be in the public interest, it need not further every conceivable public interest goal. Demanding that it do so is a perversion of the standard, one that turns “public interest” into an unconstrained license to impose a regulatory wish-list on particular actors, outside the scope of the usual regulatory processes.

While a few people may have no problem with the proposed decision’s expansive vision of Internet access regulation, California governor Jerry Brown and the overwhelming majority of the California state legislature cannot be counted among the supporters of this approach.

In 2012 the state legislature passed by an overwhelming margin — and Governor Brown signed — SB 1161 (codified as Section 710 of the California Public Utilities Code), which expressly prohibits the CPUC from regulating broadband:

The commission shall not exercise regulatory jurisdiction or control over Voice over Internet Protocol and Internet Protocol enabled services except as required or expressly delegated by federal law or expressly directed to do so by statute or as set forth in [certain enumerated exceptions].

The message is clear: The CPUC should not try to bypass clear state law and all institutional safeguards by misusing the merger clearance process.

While bipartisan majorities in the state legislature, backed by a Democratic governor, used SB 1161 to stop the CPUC from imposing new regulations on Internet and VoIP services, the proposed decision seeks to impose, through merger conditions, regulations that go far beyond anything this state law permits.

For instance, the proposed decision seeks to impose arbitrary retail price controls on broadband access:

Comcast shall offer to all customers of the merged companies, for a period of five years following the effective date of the parent company merger, the opportunity to purchase stand-alone broadband Internet service at a price not to exceed the price charged by Time Warner for providing that service to its customers, and at speeds, prices, and terms, at least comparable to that offered by Time Warner prior to the merger’s closing.

And the proposed decision seeks to dictate market structure in other insidious ways as well: mandating specific broadband speeds, requiring a break-neck geographic expansion of Comcast’s service area, and prescribing installation and service times, among other things — all without regard to the actual plausibility (or cost) of implementing such requirements.

But the problem is even more acute. Not only does the proposed decision seek to regulate Internet access issues irrelevant to the merger, it also proposes to impose conditions that would actually undermine competition.

The proposed decision would impose the following conditions on Comcast’s business VoIP and business Internet services:

Comcast shall offer Time Warner’s Business Calling Plan with Stand Alone Internet Access to interested CLECs throughout the combined service territories of the merging companies for a period of five years from the effective date of the parent company merger at existing prices, terms and conditions.

Comcast shall offer Time Warner’s Carrier Ethernet Last Mile Access product to interested CLECs throughout the combined service territories of the merging companies for a period of five years from the effective date of the parent company [merger] at the same prices, terms and conditions as offered by Time Warner prior to the merger.

But the proposed decision fails to recognize that Comcast is an also-ran in the business services market. Last year it served a small fraction of the business customers served by AT&T and Verizon, which have long dominated that market:

According to a Sept. 2011 ComScore survey, AT&T and Verizon had the largest market shares of all business services ISPs. AT&T held 20% of market share and Verizon held 12%. Comcast ranked 6th, with 5% of market share.

The proposed conditions would hamstring the upstart challenger Comcast by removing both product and pricing flexibility for five years – an eternity in rapidly evolving technology markets. That’s a sure-fire way to minimize competition, not promote it.

The proposed decision reiterates several times its concern that the combined Comcast/Time Warner Cable will serve more than 80% of California households, and “reduce[] the possibilities for content providers to reach the California broadband market.” The alleged concern is that the combined company could exercise anticompetitive market power — imposing artificially high fees for carrying content or degrading service of unaffiliated content and services.

The problem is Comcast and TWC don’t compete anywhere in California today, and they face competition from other providers everywhere they operate. As the decision matter-of-factly states:

Comcast and Time Warner do not compete with one another… [and] Comcast and Time Warner compete with other providers of Internet access services in their respective service territories.

As a result, the merger will actually have no effect on the number of competitive choices in the state; the increase in the statewide market share as a result of the deal is irrelevant. And so these purported competition concerns can’t be the basis for any conditions, let alone the sweeping ones set out in the proposed decision.

The stated concern about content providers finding it difficult to reach Californians is a red herring: the post-merger Comcast geographic footprint will be exactly the same as the combined, pre-merger Comcast/TWC/Charter footprint. Content providers will be able to access just as many Californians (and with greater speeds) as before the merger.

True, content providers that just want to reach some number of random Californians may have to reach more of them through Comcast than they would have before the merger. But what content provider just wants to reach some number of Californians in the first place? Moreover, this fundamentally misstates the way the Internet works: it is users who reach the content they prefer, not the other way around. And, once again, for literally every consumer in the state, the number of available options for doing so won’t change one iota following the merger.

Nothing shows more clearly how far the proposed decision has strayed from responding to merger-specific concerns toward broader social policy than the conditions aimed at expanding low-price broadband offerings for underserved households. Among other things, the proposed conditions dramatically increase the size and scope of Comcast’s Internet Essentials program, converting this laudable effort from a targeted program (which uses a host of tools to connect to the Internet families with a child eligible for the National School Lunch Program) into one that must serve all low-income adults.

Putting aside the damage this would do to Internet Essentials’ core mission of connecting school-age children by diverting resources from the program’s central purpose, such a condition is manifestly outside the scope of the CPUC’s review. Nothing in the deal affects the number of adults (or children, for that matter) in California without broadband.

It’s possible, of course, that Comcast might implement something like an expanded Internet Essentials program without any prodding; after all, companies implement (and expand) such programs all the time. But why on earth should regulators be able to define such an obligation arbitrarily, and to impose it on whatever ISP happens to be asking for a license transfer? That arbitrariness creates precisely the sort of business uncertainty that SB 1161 was meant to prevent.

The same thing applies to the proposed decision’s requirement regarding school and library broadband connectivity:

Comcast shall connect and/or upgrade Internet infrastructure for K-12 schools and public libraries in unserved and underserved areas in Comcast’s combined California service territory so that it is providing high speed Internet to at least the same proportion of K-12 schools and public libraries in such unserved and underserved areas as it provides to the households in its service territory.

No doubt improving school and library infrastructure is a noble goal — and there’s even a large federal subsidy program (E-Rate) devoted to it. But insisting that Comcast do so — and do so to an extent unsupported by the underlying federal subsidy program already connecting such institutions, and in contravention of existing provider contracts with schools — as a condition of the merger is simple extortion.

The CPUC is treating the proposed merger like a free-for-all, imposing in the name of the “public interest” a set of conditions that it would never be permitted to impose absent the gun-to-the-head of merger approval. Moreover, it seeks to remake California’s broadband access landscape in a fashion that would likely never materialize in the natural course of competition: if the merger doesn’t go through, none of the conditions set out in the proposed decision — conditions alleged to be necessary to protect the public interest — will exist.

Far from trying to ensure that Comcast’s merger with TWC doesn’t erode competitive forces to the detriment of the public, the proposed decision is trying to micromanage the market, simply asserting that the public interest demands imposition of its subjective and arbitrary laundry list of preferred items. This isn’t sensible regulation, it isn’t compliant with state law, and it doesn’t serve the people of California.

Today the D.C. Circuit struck down most of the FCC’s 2010 Open Internet Order, rejecting rules that required broadband providers to carry all traffic for edge providers (“anti-blocking”) and prevented providers from negotiating deals for prioritized carriage. However, the appeals court did conclude that the FCC has statutory authority to issue “Net Neutrality” rules under Section 706(a) and let stand the FCC’s requirement that broadband providers clearly disclose their network management practices.

The following statement may be attributed to Geoffrey Manne and Berin Szoka:

The FCC may have lost today’s battle, but it just won the war over regulating the Internet. By recognizing Section 706 as an independent grant of statutory authority, the court has given the FCC near limitless power to regulate not just broadband, but the Internet itself, as Judge Silberman recognized in his dissent.

The court left the door open for the FCC to write new Net Neutrality rules, provided the Commission doesn’t treat broadband providers as common carriers. This means that, even without reclassifying broadband as a Title II service, the FCC could require that any deals between broadband and content providers be reasonable and non-discriminatory, just as it has required wireless carriers to provide data roaming services to their competitors’ customers on that basis. In principle, this might be a sound approach, if the rule resembles antitrust standards. But even that limitation could easily be evaded if the FCC regulates through case-by-case enforcement actions, as it tried to do before issuing the Open Internet Order. Either way, the FCC need only make a colorable argument under Section 706 that its actions are designed to “encourage the deployment… of advanced telecommunications services.” If the FCC’s tenuous “triple cushion shot” argument could satisfy that test, there is little limit to the deference the FCC will receive.

But that’s just for Net Neutrality. Section 706 covers “advanced telecommunications,” which seems to include any information service, from broadband to the interconnectivity of smart appliances like washing machines and home thermostats. If the court’s ruling on Section 706 is really as broad as it sounds, and as the dissent fears, the FCC just acquired wide authority over these, as well — in short, the entire Internet, including the “Internet of Things.” While the court’s “no common carrier rules” limitation is a real one, the FCC clearly just gained enormous power that it didn’t have before today’s ruling.

Today’s decision essentially rewrites the Communications Act in a way that will, ironically, do the opposite of what the FCC claims: hurt, not help, deployment of new Internet services. Whatever the FCC’s role ought to be, such decisions should be up to our elected representatives, not three unelected FCC Commissioners. So if there’s a silver lining in any of this, it may be that the true implications of today’s decision are so radical that Congress finally writes a new Communications Act — a long-overdue process Congressmen Fred Upton and Greg Walden have recently begun.

Szoka and Manne are available for comment at media@techfreedom.org. Find/share this release on Facebook or Twitter.

I have a new post up at TechPolicyDaily.com, excerpted below, in which I discuss the growing body of (surprisingly uncontroversial) work showing that broadband in the US compares favorably to that in the rest of the world. My conclusion, which is frankly more cynical than I like, is that concern about the US “falling behind” is a manufactured debate. It’s a compelling story that the media likes and that plays well for (some) academics.

Before the excerpt, I’d also like to quote one of today’s headlines from Slashdot:

“Google launched the citywide Wi-Fi network with much fanfare in 2006 as a way for Mountain View residents and businesses to connect to the Internet at no cost. It covers most of the Silicon Valley city and worked well until last year, as Slashdot readers may recall, when connectivity got rapidly worse. As a result, Mountain View is installing new Wi-Fi hotspots in parts of the city to supplement the poorly performing network operated by Google. Both the city and Google have blamed the problems on the design of the network. Google, which is involved in several projects to provide Internet access in various parts of the world, said in a statement that it is ‘actively in discussions with the Mountain View city staff to review several options for the future of the network.'”

The line worth emphasizing is the one blaming the problems on the design of the network. It points to a simple truth: designing and building networks is hard. Like, really, really hard. Folks think it’s easy because they have small networks in their homes or offices, so surely they can scale to a nationwide network without much trouble. But all sorts of crazy stuff starts to happen when we substantially increase the scale of IP networks. This is just one of the very many things that should give us pause about calls for the buildout of a government-run or government-sponsored Internet infrastructure.

Another of those things is whether there’s any need for that. Which brings us to my TechPolicyDaily.com post:

In the week or so since TPRC, I’ve found myself dwelling on an observation I made during the conference: how much agreement there was, especially on issues usually thought of as controversial. I want to take a few paragraphs to consider what was probably the most surprisingly non-controversial panel of the conference, the final Internet Policy panel, in which two papers – one by ITIF’s Rob Atkinson and the other by James McConnaughey from NTIA – were presented that showed that broadband Internet service in the US (and Canada, though I will focus on the US) compares quite well to that offered in the rest of the world. […]

But the real question that this panel raised for me was: given how well the US actually compares to other countries, why does concern about the US falling behind dominate so much discourse in this area? When you get technical, economic, legal, and policy experts together in a room – which is what TPRC does – the near consensus seems to be that the “kids are all right”; but when you read the press, or much of the high-profile academic literature, “the sky is falling.”

The gap between these assessments could not be larger. We need to ask why that is. I hate to be cynical or disparaging – especially since I know strong advocates on both sides and believe that their concerns are sincere and efforts earnest. But after this year’s conference, I’m having trouble shaking the feeling that ongoing concern about how US broadband stacks up to the rest of the world is a manufactured debate. It’s a compelling, media- and public-friendly narrative that supports a powerful political agenda. And the clear incentives, for academics and media alike, are to find problems and raise concerns. […]

Compare this to the Chicken Little narrative. As I was writing this, I received a message from a friend asking my views on an Economist blog post that shares data from the ITU’s just-released Measuring the Information Society 2013 report. This data shows that the US has some of the highest prices for pre-paid handset-based mobile data around the world. That is, it reports the standard narrative – and it does so without looking at the report’s methodology. […]

Even more problematic than what the Economist blog reports, however, is what it doesn’t report. [The report contains data showing the US has some of the lowest cost fixed broadband and mobile broadband prices in the world. See the full post at TechPolicyDaily.com for the numbers.]

Now, there are possible methodological problems with these rankings, too. My point here isn’t to debate the relative position of the United States. It’s to ask why the “story” about this report cherry-picks the alarming data, doesn’t consider the report’s methodology, and ignores the data that contradict the narrative.

Of course, I answered that question above: it’s a compelling, media- and public-friendly narrative that supports a powerful political agenda. And the clear incentives, for academics and media alike, are to find problems and raise concerns. Manufacturing debate sells copy and ads, and advances careers.

Susan Crawford recently received the OneCommunity Broadband Hero Award for being a “tireless advocate for 21st century high capacity network access.” In her recent debate with Geoffrey Manne and Berin Szoka, she emphasized that there is little competition in broadband or between cable broadband and wireless, asserting that the main players have effectively divided the markets. As a result, she argues (as she did here at 17:29) that broadband and wireless providers “are deciding not to invest in the very expensive infrastructure because they are very happy with the profits they are getting now.” In the debate, Manne countered by pointing to substantial investment and innovation in both the wired and wireless broadband marketplaces, and arguing that this is not something monopolists insulated from competition do. So, who’s right?

The recently released 2013 Progressive Policy Institute Report, U.S. Investment Heroes of 2013: The Companies Betting on America’s Future, has two useful little tables that lend support to Manne’s counterargument.

The first shows the top 25 investors that are nonfinancial companies, and guess who comes in 1st, 2nd, 10th, 13th, and 17th place? None other than AT&T, Verizon Communications, Comcast, Sprint Nextel, and Time Warner, respectively.

And when the table is adjusted by removing energy companies, those ranks become 1st, 2nd, 5th, 6th, and 9th. In fact, cable and telecom combined to invest over $50.5 billion in 2012.

This high level of investment by supposed monopolists is not a new development. The Progressive Policy Institute’s 2012 Report, Investment Heroes: Who’s Betting on America’s Future? indicates that the same main players have been investing heavily for years. Since 1996, the cable industry has invested over $200 billion into infrastructure alone. These investments have allowed 99.5% of Americans to have access to broadband – via landline, wireless, or both – as of the end of 2012.

There’s more. Not only has there been substantial investment that has increased access, but the speeds of service have increased dramatically over the past few years. The National Broadband Map data show that by the end of 2012:

  • Landline service ≥ 25 megabits per second download was available to 81.7% of households, up from 72.9% at the end of 2011 and 58.4% at the end of 2010
  • Landline service ≥ 100 megabits per second download was available to 51.5% of households, up from 43.4% at the end of 2011 and only 12.9% at the end of 2010
  • Service ≥ 1 gigabit per second download was available to 6.8% of households, predominantly via fiber
  • Fiber at any speed was available to 22.9% of households, up from 16.8% at the end of 2011 and 14.8% at the end of 2010
  • Landline broadband service at the 3 megabits / 768 kilobits threshold was available to 93.4% of households, up from 92.8% at the end of 2011
  • Mobile wireless broadband at the 3 megabits / 768 kilobits threshold was available to 94.1% of households, up from 75.8% at the end of 2011
  • Mobile wireless broadband ≥ 10 megabits per second download was available to 87% of households, up from 70.6% at the end of 2011 and 8.9% at the end of 2010
  • Landline broadband ≥ 10 megabits per second download was available to 91.1% of households

This leaves only one question: Will the real broadband heroes please stand up?

Over at Forbes, Berin Szoka and I have a lengthy piece discussing “10 Reasons To Be More Optimistic About Broadband Than Susan Crawford Is.” Crawford has become the unofficial spokeswoman for a budding campaign to reshape broadband. She sees cable companies monopolizing broadband, charging too much, withholding content and keeping speeds low, all in order to suppress disruptive innovation — and argues for imposing 19th century common carriage regulation on the Internet. Berin and I begin (we expect to contribute much more to this discussion in the future) to explain both why her premises are erroneous and why her prescription is faulty. Here’s a taste:

Things in the US today are better than Crawford claims. While Crawford claims that broadband is faster and cheaper in other developed countries, her statistics are convincingly disputed. She neglects to mention the significant subsidies used to build out those networks. Crawford’s model is Europe, but as Europeans acknowledge, “beyond 100 Mbps supply will be very difficult and expensive. Western Europe may be forced into a second fibre build out earlier than expected, or will find themselves within the slow lane in 3-5 years time.” And while “blazing fast” broadband might be important for some users, broadband speeds in the US are plenty fast enough to satisfy most users. Consumers are willing to pay for speed, but, apparently, have little interest in paying for the sort of speed Crawford deems essential. This isn’t surprising. As the LSE study cited above notes, “most new activities made possible by broadband are already possible with basic or fast broadband: higher speeds mainly allow the same things to happen faster or with higher quality, while the extra costs of providing higher speeds to everyone are very significant.”

Even if she’s right, she wildly exaggerates the costs. Using a back-of-the-envelope calculation, Crawford claims that slow downloads (compared to other countries) could cost the U.S. $3 trillion/year in lost productivity from wasted time spent “waiting for a link to load or an app to function on your wireless device.” This intentionally sensationalist claim, however, rests on a purely hypothetical average wait time in the U.S. of 30 seconds (vs. 2 seconds in Japan). Whatever the actual numbers might be, her methodology would still be shaky, not least because time spent waiting for laggy content isn’t necessarily simply wasted. And for most of us, the opportunity cost of waiting for Angry Birds to load on our phones isn’t counted in wages — it’s counted in beers or time on the golf course or other leisure activities. These are important, to be sure, but does anyone seriously believe our GDP would grow 20% if only apps were snappier? Meanwhile, actual econometric studies looking at the productivity effects of faster broadband on businesses have found that higher broadband speeds are not associated with higher productivity.
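To see how sensitive that kind of back-of-the-envelope figure is to its key assumption, here is a minimal sketch in Python. Every parameter below (number of users, page loads per day, dollar value of an hour) is an illustrative assumption of ours, not Crawford’s actual input; the only point is that the headline number moves almost one-for-one with the hypothetical wait time.

    # Rough sensitivity check on a "lost productivity from waiting" estimate.
    # All parameters are illustrative assumptions, not Crawford's actual inputs.
    def annual_loss_usd(avg_wait_s, loads_per_day, users, value_per_hour):
        """Implied yearly 'cost' of waiting, naively treating every second of
        waiting as fully wasted, fully valued time."""
        wasted_hours_per_user = avg_wait_s * loads_per_day * 365 / 3600
        return wasted_hours_per_user * users * value_per_hour

    USERS = 250e6          # assumed number of U.S. Internet users
    VALUE_PER_HOUR = 25.0  # assumed dollar value of an hour of time
    LOADS_PER_DAY = 100    # assumed page/app loads per user per day

    for wait_s in (30, 10, 2):  # hypothetical average waits, in seconds
        loss = annual_loss_usd(wait_s, LOADS_PER_DAY, USERS, VALUE_PER_HOUR)
        print(f"{wait_s:>2}s average wait -> ~${loss / 1e12:.1f} trillion/year")

Run with these made-up inputs, the estimate swings from roughly $1.9 trillion a year at a 30-second average wait to about $0.1 trillion at 2 seconds. The result is dominated by the assumed wait time; the rest of the methodology barely matters.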

* * *

So how do we guard against the possibility of consumer harm without making things worse? For us, it’s a mix of promoting both competition and a smarter, subtler role for government.

Despite Crawford’s assertion that the DOJ should have blocked the Comcast-NBCU merger, antitrust and consumer protection laws do operate to constrain corporate conduct, not only through government enforcement but also through private rights of action. Antitrust works best in the background, discouraging harmful conduct without anyone ever suing. The same is true for using consumer protection law to punish deception and truly harmful practices (e.g., misleading billing or overstating speeds).

A range of regulatory reforms would also go a long way toward promoting competition. Most importantly, reform local franchising so competitors like Google Fiber can build their own networks. That means giving them “open access” not to existing networks but to the public rights of way under streets. Instead of requiring that franchisees build out to an entire franchise area—which often makes both new entry and service upgrades unprofitable—remove build-out requirements and craft smart subsidies to encourage competition to deliver high-quality universal service, and to deliver superfast broadband to the customers who want it. Rather than controlling prices, offer broadband vouchers to those that can’t afford it. Encourage telcos to build wireline competitors to cable by transitioning their existing telephone networks to all-IP networks, as we’ve urged the FCC to do (here and here). Let wireless reach its potential by opening up spectrum and discouraging municipalities from blocking tower construction. Clear the deadwood of rules that protect incumbents in the video marketplace—a reform with broad bipartisan appeal.

In short, there’s a lot of ground between “do nothing” and “regulate broadband like electricity—or railroads.” Crawford’s arguments simply don’t justify imposing 19th century common carriage regulation on the Internet. But that doesn’t leave us powerless to correct practices that truly harm consumers, should they actually arise.

Read the whole thing here.

By Geoffrey Manne & Berin Szoka

As Democrats insist that income taxes on the 1% must go up in the name of fairness, one Democratic Senator wants to make sure that the 1% of heaviest Internet users pay the same price as the rest of us. It’s ironic how confused social justice gets when the Internet’s involved.

Senator Ron Wyden is beloved by defenders of Internet freedom, most notably for blocking the Protect IP bill—sister to the more infamous SOPA—in the Senate. He’s widely celebrated as one of the most tech-savvy members of Congress. But his latest bill, the “Data Cap Integrity Act,” is a bizarre, reverse-Robin Hood form of price control for broadband. It should offend those who defend Internet freedom just as much as SOPA did.

Wyden worries that “data caps” will discourage Internet use and allow “Internet providers to extract monopoly rents,” quoting a New York Times editorial from July that stirred up a tempest in a teapot. But his fears are straw men, based on four false premises.

First, US ISPs aren’t “capping” anyone’s broadband; they’re experimenting with usage-based pricing—service tiers. If you want more than the basic tier, your usage isn’t capped: you can always pay more for more bandwidth. But few users will actually exceed that basic tier. For example, Comcast’s basic tier, 300 GB/month, is so generous that 98.5% of users will not exceed it. That’s enough for 130 hours of HD video each month (two full-length movies a day) or between 300 and 1000 hours of standard (compressed) video streaming. Continue Reading…
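For what it’s worth, the usage figures quoted above are easy to sanity-check with a little arithmetic. The streaming bitrates in this sketch are our own rough assumptions (ballpark figures for HD and compressed standard-definition video), not numbers published by Comcast:

    # Back-of-the-envelope check of the 300 GB/month figures quoted above.
    # Bitrates are assumed ballpark values, not Comcast-published numbers.
    TIER_GB = 300                # basic usage tier, in GB per month

    HD_GB_PER_HOUR = 2.3         # assumed HD streaming rate, GB per hour
    SD_GB_PER_HOUR = (0.3, 1.0)  # assumed compressed SD range, GB per hour

    print(f"HD hours/month: {TIER_GB / HD_GB_PER_HOUR:.0f}")            # ~130
    for rate in SD_GB_PER_HOUR:
        print(f"SD hours/month at {rate} GB/h: {TIER_GB / rate:.0f}")   # 1000, 300

At roughly 2.3 GB per hour of HD video, 300 GB works out to about 130 hours a month; at 0.3 to 1 GB per hour for compressed video, it comes to somewhere between 300 and 1,000 hours, consistent with the figures in the excerpt.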

At today’s Open Commission Meeting, the FCC is set to consider two apparently forthcoming Notices of Proposed Rulemaking that will shape the mobile broadband sector for years to come.  It’s not hyperbole to say that the FCC’s approach to the two issues at hand — the design of spectrum auctions and the definition of the FCC’s spectrum screen — can make or break wireless broadband in this country.  The FCC stands at a crossroads with respect to its role in this future, and it’s not clear that it will choose wisely.

Chairman Genachowski has recently jumped on the “psychology of abundance” bandwagon, suggesting that the firms that provide broadband service must (be forced by the FCC to) act as if spectrum and bandwidth were abundant (they aren’t), and must not engage in activities that are sensible responses to broadband scarcity. According to Genachowski, “Anything that depresses broadband usage is something that we need to be really concerned about. . . . We should all be concerned with anything that is incompatible with the psychology of abundance.” This is the idea — popularized by non-economists and ideologues like Susan Crawford — that we should require networks to act as if we have “abundant” capacity, and enact regulations and restraints that prevent network operators from responding to actual scarcity with business structures, rational pricing or usage rules that could in any way deviate from this imaginary Nirvana.

This is rhetorical bunk.  The culprit here, if there is one, isn’t the firms that plow billions into expanding scarce capacity to meet abundant demand and struggle to manage their networks to maximize capacity within these constraints (dubbed “investment heroes” by the more reasonable lefties at the Progressive Policy Institute).  Firms act like there is scarcity because there is — and the FCC is largely to blame.  What we should be concerned about is not the psychology of abundance, but rather the sources of actual scarcity.

The FCC faces a stark choice—starting with today’s meeting. The Commission can choose to continue to be the agency that micromanages scarcity as an activist intervenor in the market — screening out some market participants as “too big,” and scrutinizing every scarcity-induced merger, deal, spectrum transfer, usage cap, pricing decision and content restriction for how much it deviates from a fanciful ideal. Or it can position itself as the creator of true abundance and simply open the spectrum spigot that it has negligently blocked for years, delivering more bandwidth into the hands of everyone who wants it.

If the FCC chooses the latter course — if it designs effective auctions that attract sellers, permitting participation by all willing buyers — everyone benefits.  Firms won’t act like there is scarcity if there is no scarcity.  Investment in networks and the technology that maximizes their capacity will continue as long as those investments are secure and firms are allowed to realize a return — not lambasted every time they try to do so.

If, instead, the Commission remains in thrall to self-proclaimed consumer advocates (in truth, regulatory activists) who believe against all evidence that they can and should design industry’s structure (“big is bad!”) and second-guess every business decision (“psychology of abundance!”), everyone loses (except the activists, I suppose).  Firms won’t stop acting like there’s scarcity until there is no scarcity.  And investment will take a backseat to unpopular network management decisions that represent the only sensible responses to uncertain, over-regulated market conditions.