
Today, in Horne v. Department of Agriculture, the U.S. Supreme Court held that the Fifth Amendment requires the Government to pay just compensation when it takes personal property, just as when it takes real property, and that the Government cannot make raisin growers relinquish their property without just compensation as a condition of selling their raisins in interstate commerce. This decision represents a major victory for economic liberty, but it is at best a first step in reining in anticompetitive, cartel-like government regulation. (See my previous discussion of this matter at Truth on the Market here and a more detailed discussion of today’s decision here.) A capsule summary of the Court’s holding follows.

Most American raisins are grown in California. Under a United States Department of Agriculture Raisin Marketing Order, California raisin growers must give a percentage of their crop to a Raisin Administrative Committee (a government entity largely composed of raisin producers appointed by the Secretary of Agriculture) to sell, allocate, or dispose of, and the government sets the compensation price that growers are paid for these “reserved” raisins. After selling the reserved raisins and deducting expenses, the Committee returns any net proceeds to the growers. The Hornes were assessed a fine of $480,000 plus a $200,000 civil penalty for refusing to set aside raisins for the government in 2002. The Hornes sued, arguing that the reserve requirement violated the Fifth Amendment Takings Clause. The Ninth Circuit rejected the Hornes’ claim that this was a per se taking, reasoning that personal property is entitled to less protection than real property, and concluded instead that the requirement should be treated as a regulatory taking, akin to a government condition on the grant of a land use permit. The Supreme Court reversed, holding that neither the text nor the history of the Takings Clause suggests that appropriation of personal property is different from appropriation of real property. The Court also held that the government may not avoid its categorical duty to pay just compensation by reserving to the property owner a contingent interest in the property.
The Court further held that in this case, the government mandate to surrender property as a condition to engage in commerce effects a per se taking, noting that selling raisins in interstate commerce is “not a special governmental benefit that the Government may hold hostage, to be ransomed by the waiver of constitutional protection.” The Court majority determined that the case should not be remanded to the Ninth Circuit to calculate the amount of just compensation, because the government already did so when it fined the Hornes $480,000, the fair market value of the raisins.

The Horne decision is a victory for economic freedom and the right of individuals not to participate in government cartel schemes that harm the public interest. Unfortunately, however, it is a limited one. As the dissent by Justice Sotomayor indicates, “the Government . . . can permissibly achieve its market control goals by imposing a quota without offering raisin producers a way of reaping any return whatsoever on the raisins they cannot sell.” In short, today’s holding turns entirely on the conclusion that the raisin marketing order involves a “physical taking” of raisins. A more straightforward regulatory scheme under which the federal government directly limited production by raisin growers (much as the government did to a small wheat farmer in Wickard v. Filburn) likely would pass constitutional muster under modern Commerce Clause jurisprudence.

Thus, if it is truly interested in benefiting the American public and ferreting out special interest favoritism in agriculture, Congress should give serious consideration to prohibiting far more than production limitations in agricultural marketing orders. More generally, it should consider legislation to bar any regulatory restrictions that have the effect of limiting the freedom of individual farmers to grow and sell as much of their crop as they please. Such a rule would promote general free market competition, to the benefit of American consumers and the American economy.

Today, in Kimble v. Marvel Entertainment, a case involving the technology underlying the Spider-Man Web-Blaster, the Supreme Court invoked stare decisis to uphold an old precedent based on bad economics. In so doing, the Court spun a tangled web of formalism that trapped economic common sense within it, forgetting that, as Spider-Man was warned in 1962, “with great power there must also come – great responsibility.”

In 1990, Stephen Kimble obtained a patent on a toy that allows children (and young-at-heart adults) to role-play as “a spider person” by shooting webs—really, pressurized foam string—“from the palm of [the] hand.” Marvel Entertainment made and sold a “Web-Blaster” toy based on Kimble’s invention, without remunerating him. Kimble sued Marvel for patent infringement in 1997, and the parties settled, with Marvel agreeing to buy Kimble’s patent for a lump sum (roughly a half-million dollars) plus a 3% royalty on future sales, with no end date set for the payment of royalties.

Marvel subsequently sought a declaratory judgment in federal district court confirming that it could stop paying Kimble royalties after the patent’s expiration date. The district court granted relief, the Ninth Circuit Court of Appeals affirmed, and the Supreme Court affirmed the Ninth Circuit. In an opinion by Justice Kagan, joined by Justices Scalia, Kennedy, Ginsburg, Breyer, and Sotomayor, the Court held that a patentee cannot continue to receive royalties for sales made after his patent expires. Invoking stare decisis, the Court reaffirmed Brulotte v. Thys (1964), which held that a patent licensing agreement that provided for the payment of royalties accruing after the patent’s expiration was illegal per se, because it extended the patent monopoly beyond its statutory time period. The Kimble Court stressed that stare decisis is “the preferred course,” and noted that though the Brulotte rule may prevent some parties from entering into deals they desire, parties can often find ways to achieve similar outcomes.

Justice Alito, joined by Chief Justice Roberts and Justice Thomas, dissented, arguing that Brulotte is a “baseless and damaging precedent” that interferes with the ability of parties to negotiate licensing agreements that reflect the true value of a patent. More specifically:

“There are . . . good reasons why parties sometimes prefer post-expiration royalties over upfront fees, and why such arrangements have pro-competitive effects. Patent holders and licensees are often unsure whether a patented idea will yield significant economic value, and it often takes years to monetize an innovation. In those circumstances, deferred royalty agreements are economically efficient. They encourage innovators, like universities, hospitals, and other institutions, to invest in research that might not yield marketable products until decades down the line. . . . And they allow producers to hedge their bets and develop more products by spreading licensing fees over longer periods. . . . By prohibiting these arrangements, Brulotte erects an obstacle to efficient patent use. In patent law and other areas, we have abandoned per se rules with similarly disruptive effects. . . . [T]he need to avoid Brulotte is an economic inefficiency in itself. . . . And the suggested alternatives do not provide the same benefits as post-expiration royalty agreements. . . . The sort of agreements that Brulotte prohibits would allow licensees to spread their costs, while also allowing patent holders to capitalize on slow-developing inventions.”

Furthermore, in the Leegin case the Supreme Court was willing to overturn a nearly century-old antitrust precedent that absolutely barred resale price maintenance, despite the fact that the precedent was extremely well known (much better known than the Brulotte rule) and had prompted a vast array of contractual workarounds. Given the seemingly greater weight of that precedent, why was stare decisis set aside in Leegin, but not in Kimble? The Kimble majority’s argument that stare decisis should weigh more heavily in patent than in antitrust because, unlike the antitrust laws, “the patent laws do not turn over exceptional law-shaping authority to the courts,” is unconvincing. As the dissent explains:

“[T]his distinction is unwarranted. We have been more willing to reexamine antitrust precedents because they have attributes of common-law decisions. I see no reason why the same approach should not apply where the precedent at issue, while purporting to apply a statute, is actually based on policy concerns. Indeed, we should be even more willing to reconsider such a precedent because the role implicitly assigned to the federal courts under the Sherman [Antitrust] Act has no parallel in Patent Act cases.”

Stare decisis undoubtedly promotes predictability and the rule of law and, relatedly, institutional stability and efficiency – considerations that go to the costs of administering the legal system and of formulating private conduct in light of prior judicial precedents. The cost-based efficiency considerations that favor applying stare decisis to any particular rule must, however, be weighed against the net economic benefits associated with abandoning that rule. The dissent in Kimble did this, but the majority opinion regrettably did not.

In sum, let us hope that in the future the Court keeps in mind its prior advice, cited in Justice Alito’s dissent, that “stare decisis is not an ‘inexorable command’,” and that “[r]evisiting precedent is particularly appropriate where . . . a departure would not upset expectations, the precedent consists of a judge-made rule . . . , and experience has pointed up the precedent’s shortcomings.”

Remember when net neutrality wasn’t going to involve rate regulation and it was crazy to say that it would? Or that it wouldn’t lead to regulation of edge providers? Or that it was only about the last mile and not interconnection? Well, if the early petitions and complaints are a preview of more to come, the Open Internet Order may end up having the FCC regulating rates for interconnection and extending the reach of its privacy rules to edge providers.

On Monday, Consumer Watchdog petitioned the FCC to not only apply Customer Proprietary Network Information (CPNI) rules originally meant for telephone companies to ISPs, but to also start a rulemaking to require edge providers to honor Do Not Track requests in order to “promote broadband deployment” under Section 706. Of course, we warned of this possibility in our joint ICLE-TechFreedom legal comments:

For instance, it is not clear why the FCC could not, through Section 706, mandate “network level” copyright enforcement schemes or the DNS blocking that was at the heart of the Stop Online Piracy Act (SOPA). . . Thus, it would appear that Section 706, as re-interpreted by the FCC, would, under the D.C. Circuit’s Verizon decision, allow the FCC sweeping power to regulate the Internet up to and including (but not beyond) the process of “communications” on end-user devices. This could include not only copyright regulation but everything from cybersecurity to privacy to technical standards. (emphasis added).

While the merits of Do Not Track are debatable, it is worth noting that privacy regulation can go too far and drastically change the Internet ecosystem. In fact, it is a plausible scenario that overregulating data collection online could lead to the greater use of paywalls to access content. That may be a greater threat to Internet openness than anything ISPs have done.

And then yesterday, the first complaint under the new Open Internet rule was brought against Time Warner Cable by a small streaming video company called Commercial Network Services. According to several news stories, CNS “plans to file a peering complaint against Time Warner Cable under the Federal Communications Commission’s new network-neutrality rules unless the company strikes a free peering deal ASAP.” In other words, CNS is asking for rate regulation for interconnection. Under the Open Internet Order, the FCC can rule on such complaints, but only on a case-by-case basis. Either TWC assents to free peering, or the FCC intervenes and sets the rate for them, or the FCC dismisses the complaint altogether and pushes such decisions down the road.

This was another predictable development that many critics of the Open Internet Order warned about: there was no way to really avoid rate regulation once the FCC reclassified ISPs. While the FCC could reject this complaint, it is clear that they have the ability to impose de facto rate regulation through case-by-case adjudication. Whether it is rate regulation according to Title II (which the FCC ostensibly didn’t do through forbearance) is beside the point. This will have the same practical economic effects and will be functionally indistinguishable if/when it occurs.

In sum, while neither of these actions was contemplated by the FCC (they claim), such abstract rules are going to lead to random complaints like these, and companies are going to have to use the “ask FCC permission” process to try to figure out beforehand whether they should be investing or whether they’re going to be slammed. As Geoff Manne said in Wired:

That’s right—this new regime, which credits itself with preserving “permissionless innovation,” just put a bullet in its head. It puts innovators on notice, and ensures that the FCC has the authority (if it holds up in court) to enforce its vague rule against whatever it finds objectionable.

I mean, I don’t wanna brag or nothin, but it seems to me that we critics have been right so far. The reclassification of broadband Internet service as Title II has had the (supposedly) unintended consequence of sweeping in far more (both in scope of application and rules) than was supposedly bargained for. Hopefully the FCC rejects the petition and the complaint and reverses this course before it breaks the Internet.

The TCPA is an Antiquated Law


The Telephone Consumer Protection Act (“TCPA”) is back in the news following a letter sent to PayPal from the Enforcement Bureau of the FCC.  At issue are amendments that PayPal intends to introduce into its end user agreement. Specifically, PayPal plans to use an automated call and text message system to reach out to its users to inform them of account updates, perform quality assurance checks, and provide promotional offers.

Enter the TCPA, which, as the Enforcement Bureau noted in its letter, has been used for over twenty years by the FCC to “protect consumers from harassing, intrusive, and unwanted calls and text messages.” The FCC has two primary concerns in its warning to PayPal. First, there was no formal agreement between PayPal and its users that would satisfy the FCC’s rules and allow PayPal to use an automated call system. And, perhaps most importantly, PayPal is not entitled to simply attach an “automated calls” clause to its user agreement as a condition of providing the PayPal service (as it clearly intends to do with its amendments).

There are a number of things wrong with the TCPA and the FCC’s decision to enforce its provisions against PayPal in the current instance. The FCC has the power to provide for some limited exemptions to the TCPA’s prohibition on automated dialing systems. Most applicable here, the FCC has the discretion to provide exemptions where calls to cell phone users won’t result in those users being billed for the calls. Although most consumers still buy plans that allot minutes for their monthly use, the practical reality for most cell phone users is that they no longer need to count minutes for every call. Users typically have a large number of minutes on their plans, and certainly many of those minutes can go unused. It seems that the progression of technology and the economics of cellphones over the last twenty-five years should warrant a Congressional revisit to the underlying justifications of at least this prohibition in the TCPA.

However, exceptions aside, there remains a much larger issue with the TCPA, one that is also rooted in the outdated technological assumptions underlying the law. The TCPA was meant to prevent dedicated telemarketing companies from using the latest in “automated dialing” technology circa 1991 from harassing people. It was not intended to stymie legitimate businesses from experimenting with more efficient methods of contacting their own customers.

The text of the law underscores its technological antiquity:  according to the TCPA, an “automatic telephone dialing system” means equipment which “has the capacity” to store or produce telephone numbers using a random or sequential number generator and to dial them. That is to say, the equipment contemplated when the law was written was software-enabled phones that were purpose-built to enable telemarketing firms to make blanket cold calls to every number in a given area code. The language clearly doesn’t contemplate phones connected to general purpose computing resources, as most phone systems are today.

Modern phone systems, connected to intelligent computer backends, are designed to reach out flexibly to hundreds or thousands of existing customers at a time, in ways that efficiently enhance the customer’s experience with the company. Technically, yes, these systems are capable of auto-dialing a large number of random recipients; however, when a company like PayPal uses this technology, its purpose is clearly different from that of the phone system’s equivalent of spammers. The absence of any required nexus between an intent to random-dial and a particular harm experienced by an end user is a major hole in the TCPA. Particularly in this case, it seems fairly absurd that the TCPA could be used to prevent PayPal from interacting with its own customers.

Further, there is a lot at stake for those accused of violating the TCPA. In the PayPal warning letter, the FCC noted that it is empowered to levy a $16,000 fine per call or text message that it finds violates the terms of the TCPA. That’s bad, but it’s nowhere near as bad as it could get. The TCPA also contains a private right of action that was meant to encourage individual consumers to take telemarketers to small claims court in their local state.  Each individual consumer is entitled to receive provable damages or statutory damages of $500.00, whichever is greater. If willfulness can be proven, the damages are trebled, which in effect means that most individual plaintiffs in the know will plead willfulness, and wait for either a settlement conference or trial to sort the particulars out.

However, over the years a cottage industry has built up around class action lawyers aggregating “harmed” plaintiffs who had received unwanted automatic calls or texts, and forcing settlements in the tens of millions of dollars. The math is pretty simple. A large company with lots of customers may be tempted to use an automatic system to send out account information and offer alerts. If it sends out five hundred thousand auto calls or texts, that could result in “damages” in the amount of $250M in a class action suit. A settlement for five or ten million dollars is a deal by comparison. For instance, in 2013 Bank of America entered into a $32M settlement for texts and calls made between 2007 and 2013 to 7.7 million people.  If they had gone to trial and lost, the damages could have been as much as $3.8B!
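The settlement math described above is easy to verify. The sketch below applies the TCPA’s statutory figures ($500 per violating call or text, trebled to $1,500 if willfulness is proven) to the call volumes mentioned in the text; the function name and call volumes are purely illustrative.

```python
# Statutory amounts under the TCPA's private right of action:
# $500 per violating call or text, trebled if willfulness is proven.
STATUTORY_DAMAGES = 500
TREBLE_MULTIPLIER = 3

def tcpa_exposure(num_messages: int, willful: bool = False) -> int:
    """Total statutory damages, in dollars, for a given number of
    violating calls or texts."""
    per_message = STATUTORY_DAMAGES * (TREBLE_MULTIPLIER if willful else 1)
    return num_messages * per_message

# 500,000 auto calls or texts at the base statutory rate:
print(tcpa_exposure(500_000))              # $250,000,000

# 7.7 million recipients (the Bank of America class), base rate:
print(tcpa_exposure(7_700_000))            # $3,850,000,000 -- roughly the
                                           # $3.8B trial exposure noted above
```

Against numbers like these, a $32M settlement is, as the text notes, a bargain.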

The purpose of the TCPA was to prevent abusive telemarketers from harassing people, not to defeat the use of an entire technology that can be employed to increase efficiency for businesses and lower costs for consumers. The per call penalties associated with violating the TCPA, along with imprecise and antiquated language in the law, provide a major incentive to use the legal system to punish well-meaning companies that are just operating their non-telemarketing businesses in a reasonable manner. It’s time to seriously revise this law in light of the changes in technology over the past twenty-five years.

During the recent debate over whether to grant the Obama Administration “trade promotion authority” (TPA or fast track) to enter into major international trade agreements (such as the Trans-Pacific Partnership, or TPP), little attention has been directed to the problem of remaining anticompetitive governmental regulatory obstacles to liberalized trade and free markets.  Those remaining obstacles, which merit far more public attention, are highlighted in an article coauthored by Shanker Singham and me on competition policy and international trade distortions.

As our article explains, international trade agreements simply do not reach a variety of anticompetitive welfare-reducing government measures that create de facto trade barriers by favoring domestic interests over foreign competitors.  Moreover, many of these restraints are not in place to discriminate against foreign entities, but rather exist to promote certain favored firms. We dub these restrictions “anticompetitive market distortions” or “ACMDs,” in that they involve government actions that empower certain private interests to obtain or retain artificial competitive advantages over their rivals, be they foreign or domestic.  ACMDs are often a manifestation of cronyism, by which politically-connected enterprises successfully pressure government to shield them from effective competition, to the detriment of overall economic growth and welfare.  As we emphasize in our article, existing international trade rules have been unable to reach ACMDs, which include: (1) governmental restraints that distort markets and lessen competition; and (2) anticompetitive private arrangements that are backed by government actions, have substantial effects on trade outside the jurisdiction that imposes the restrictions, and are not readily susceptible to domestic competition law challenge.  Among the most pernicious ACMDs are those that artificially alter the cost-base as between competing firms. Such cost changes will have large and immediate effects on market shares, and therefore on international trade flows.

Likewise, with the growing internationalization of commerce, ACMDs not only diminish domestic consumer welfare – they increasingly may have a harmful effect on foreign enterprises that seek to do business in the country imposing the restraint.  The home nations of the affected foreign enterprises, moreover, may as a practical matter find it not feasible to apply their competition laws extraterritorially to curb the restraint, given issues of jurisdictional reach and comity (particularly if the restraint flies under the colors of domestic law).  Because ACMDs also have not been constrained by international trade liberalization initiatives, they pose a serious challenge to global welfare enhancement by curtailing potential trade and investment opportunities.

Interest group politics and associated rent-seeking by well-organized private actors are endemic to modern economic life, guaranteeing that ACMDs will not easily be dismantled.  What is to be done, then, to curb ACMDs?

As a first step, Shanker Singham and I have proposed the development of a metric to estimate the net welfare costs of ACMDs.  Such a metric could help strengthen the hand of international organizations (including the International Competition Network, the World Bank, and the OECD) – and of reform-minded public officials – in building the case for dismantling these restraints, or (as a last resort) replacing them with less costly means for benefiting favored constituencies.  (Singham, two other coauthors, and I have developed a draft paper that delineates a specific metric, which we hope will be suitable for public release in the near future.)

Furthermore, free market-oriented think tanks can also be helpful by highlighting the harm special interest governmental restraints impose on the economy and on economic freedom.  In that regard, the Heritage Foundation’s excellent work in opposing cronyism deserves special mention.

Working to eliminate ACMDs and thereby promoting economic liberty is an arduous long-term task – one that will only succeed in increments, one battle at a time (the current principled effort to eliminate the Ex-Im Bank, strongly supported by the Heritage Foundation, is one such example).  Nevertheless, it is very much worth the candle.

The FCC’s proposed “Open Internet Order,” which would impose heavy-handed “common carrier” regulation of Internet service providers (the Order is being appealed in federal court and there are good arguments for striking it down) in order to promote “net neutrality,” is fundamentally misconceived.  If upheld, it will slow innovation, impose substantial costs, and harm consumers (see Heritage Foundation commentaries on FCC Internet regulation here, here, here, and here).  What’s more, it is not needed to protect consumers and competition from potential future abuse by Internet firms.  As I explain in a Heritage Foundation Legal Memorandum published yesterday, should the Open Internet Order be struck down, the U.S. Federal Trade Commission (FTC) has ample authority under Section 5 of the Federal Trade Commission Act (FTC Act) to challenge any harmful conduct by entities involved in Internet broadband services markets when such conduct undermines competition or harms consumers.

Section 5 of the FTC Act authorizes the FTC to prevent persons, partnerships, or corporations from engaging in “unfair methods of competition” or “unfair or deceptive acts or practices” in or affecting commerce.  This gives it ample authority to challenge Internet abuses raising antitrust (unfair methods) and consumer protection (unfair acts or practices) issues.

On the antitrust side, in evaluating individual business restraints under a “rule of reason,” the FTC relies on objective fact-specific analyses of the actual economic and consumer protection implications of a particular restraint.  Thus, FTC evaluations of broadband industry restrictions are likely to be more objective and predictable than highly subjective “public interest” assessments by the FCC, leading to reduced error and lower planning costs for purveyors of broadband and related services.  Appropriate antitrust evaluation should accord broad leeway to most broadband contracts.  As FTC Commissioner Josh Wright put it in testifying before Congress, “fundamental observation and market experience [demonstrate] that the business practices at the heart of the net neutrality debate are generally procompetitive.”  This suggests application of a rule of reason that will fully weigh efficiencies but not shy away from challenging broadband-related contractual arrangements that undermine the competitive process.

On the consumer protection side, the FTC can attack statements made by businesses that mislead and thereby impose harm on consumers (including business purchasers) who are acting reasonably.  It can also challenge practices that, though not literally false or deceptive, impose substantial harm on consumers (including business purchasers) that they cannot reasonably avoid, assuming the harm is greater than any countervailing benefits.  These are carefully designed and cabined sources of authority that require the FTC to determine the presence of actual consumer harm before acting.  Application of the FTC’s unfairness and deception powers therefore lacks the uncertainty associated with the FCC’s uncabined and vague “public interest” standard of evaluation.  As in the case of antitrust, the existence of greater clarity and a well-defined analytic methodology suggests that reliance on FTC rather than FCC enforcement in this area is preferable from a policy standpoint.

Finally, arguments for relying on FTC Internet policing are based on experience as well – the FTC is no Internet policy novice.  It closely monitors Internet activity and, over the years, it has developed substantial expertise in Internet topics through research, hearings, and enforcement actions.

Most recently, for example, the FTC sued AT&T in federal court for allegedly slowing wireless customers’ Internet speeds, although the customers had subscribed to “unlimited” data usage plans.  The FTC asserted that in offering renewals to unlimited-plan customers, AT&T did not adequately inform them of a new policy to “throttle” (drastically reduce the speed of) customer data service once a certain monthly data usage cap was met. The direct harm of throttling came on top of the high fees that dissatisfied customers would face for early termination of their services.  The FTC characterized this behavior as both “unfair” and “deceptive.”  Moreover, the commission claimed that throttling-related speed reductions and data restrictions were not determined by real-time network congestion and thus did not even qualify as reasonable network management activity.  This case illustrates that the FTC is perfectly capable of challenging potential “network neutrality” violations that harm consumer welfare (since “throttled” customers are provided service that is inferior to the service afforded customers on “tiered” service plans) and thus FCC involvement is unwarranted.

In sum, if a court strikes down the latest FCC effort to regulate the Internet, the FTC has ample authority to address competition and consumer protection problems in the area of broadband, including questions related to net neutrality.  The FTC’s highly structured, analytic, fact-based approach to these issues is superior to FCC net neutrality regulation based on vague and unfocused notions of the public interest.  If a court does not act, Congress might wish to consider legislation to prohibit FCC Internet regulation and leave oversight of potential competitive and consumer abuses to the FTC.

Understanding the nature and extent of the growth of the federal regulatory state is vital to sound policymaking.  Taking that to heart, over the last decade the Heritage Foundation has issued a series of reports measuring trends in federal regulatory activity.  On May 11 of this year, Heritage released its most recent regulatory study, “Red Tape Rising: Six Years of Escalating Regulation Under Obama” (RTP).  RTP, as the title suggests, paints a grim picture of rapidly escalating federal regulation unmoored from sound cost-benefit analysis – regulatory policy that is detrimental to American economic health.  Fortunately, RTP also suggests potential prescriptions to tame the federal regulatory virus.  You should read the entire study, but a few key excerpts from RTP, highlighted below, merit particular attention:

“The number and cost of government regulations continued to climb in 2014, intensifying Washington’s control over the economy and Americans’ lives. The addition of 27 new major rules last year pushed the tally for the Obama Administration’s first six years to 184, with scores of other rules in the pipeline. The cost of just these 184 rules is estimated by regulators to be nearly $80 billion annually, although the actual cost of this massive expansion of the administrative state is obscured by the large number of rules for which costs have not been fully quantified. Absent substantial reform, economic growth and individual freedom will continue to suffer.”

“President Barack Obama has repeatedly demonstrated his willingness to act by regulatory fiat instead of executing laws as passed by Congress. But regulatory overreach by the executive branch is only part of the problem. A great deal of the excessive regulation in the past six years is the result of Congress granting broad powers to agencies through passage of vast and vaguely worded legislation. The misnamed Affordable Care Act and the Dodd–Frank financial-regulation law top the list.”

“Many more regulations are on the way, with another 126 economically significant rules on the Administration’s agenda, such as directives to farmers for growing and harvesting fruits and vegetables; strict limits on credit access for service members; and yet another redesign of light bulbs.”

“In many respects, the need for reform of the regulatory system has never been greater. The White House, Congress, and federal agencies routinely ignore regulatory costs, exaggerate benefits, and breach legislative and constitutional boundaries. They also increasingly dictate lifestyle choices rather than focusing on public health and safety.”

“Immediate reforms should include requiring legislation to undergo an analysis of regulatory impacts before a floor vote in Congress, and requiring every major regulation to obtain congressional approval before taking effect. Sunset deadlines should be set in law for all major rules, and independent agencies should be subject—as are executive branch agencies—to the White House regulatory review process.”

In light of its findings, RTP makes eight specific recommendations:

  1. Require congressional approval of new major regulations issued by agencies.
  2. Require regulatory impact assessments of proposed legislation.
  3. Establish a sunset date for regulations.
  4. Subject “independent” agencies to executive branch regulatory review.
  5. Codify stricter information-quality standards for rulemaking.
  6. Reform “sue and settle” practices (under which regulators work in concert with advocacy groups to produce settlements to lawsuits that result in greater regulation).
  7. Increase professional staff levels within the Office of Information and Regulatory Affairs (OIRA) (the small White House agency charged with reviewing proposed regulations).
  8. Codify the requirement now imposed by Executive Order 12866 mandating agencies to assess the costs and benefits of proposed rules and to consider alternatives.

These excellent recommendations merit serious consideration by federal policymakers.

Recently, Commissioner Pai praised the introduction of bipartisan legislation to protect joint sales agreements (“JSAs”) between local television stations. He explained that

JSAs are contractual agreements that allow broadcasters to cut down on costs by using the same advertising sales force. The efficiencies created by JSAs have helped broadcasters to offer services that benefit consumers, especially in smaller markets…. JSAs have served communities well and have promoted localism and diversity in broadcasting. Unfortunately, the FCC’s new restrictions on JSAs have already caused some stations to go off the air and other stations to carry less local news.

The “new restrictions” to which Commissioner Pai refers were recently challenged in court by the National Association of Broadcasters (NAB), et al., and on April 20, the International Center for Law & Economics and a group of law and economics scholars filed an amicus brief with the D.C. Circuit Court of Appeals in support of the petition, asking the court to review the FCC’s local media ownership duopoly rule restricting JSAs.

Much as it did with net neutrality, the FCC is looking to extend another set of rules with no basis in sound economic theory or established facts.

At issue is the FCC’s decision both to retain the duopoly rule and to extend that rule to certain JSAs, all without completing a legally mandated review of the local media ownership rules, due since 2010 (but last completed in 2007).

The duopoly rule is at odds with sound competition policy because it fails to account for drastic changes in the media market that necessitate redefinition of the market for television advertising. Moreover, its extension will bring a halt to JSAs currently operating (and operating well) in nearly 100 markets.  As the evidence on the FCC rulemaking record shows, many of these JSAs offer public interest benefits and actually foster, rather than stifle, competition in broadcast television markets.

In the world of media mergers generally, competition law hasn’t yet caught up to the obvious truth that new media is competing with old media for eyeballs and advertising dollars in basically every marketplace.

For instance, the FTC has relied on very narrow market definitions to challenge newspaper mergers without recognizing competition from television and the Internet. Similarly, the generally accepted market in which Google’s search conduct has been investigated is something like “online search advertising” — a market definition that excludes traditional marketing channels, despite the fact that advertisers shift their spending between these channels on a regular basis.

But the FCC fares even worse here. The FCC’s duopoly rule is premised on an “eight voices” test for local broadcast stations regardless of the market shares of the merging stations. In other words, one entity cannot own FCC licenses to two or more TV stations in the same local market unless there are at least eight independently owned stations in that market, even if their combined share of the audience or of advertising is below the level that could conceivably give rise to any inference of market power.

Such a rule is completely unjustifiable under any sensible understanding of competition law.

Can you even imagine the FTC or DOJ bringing an 8 to 7 merger challenge in any marketplace? The rule is also inconsistent with the contemporary economic learning incorporated into the 2010 Merger Guidelines, which look at competitive effects rather than just counting competitors.

Not only did the FCC fail to analyze the marketplace to understand how much competition there is between local broadcasters, cable, and online video, but, on top of that, the FCC applied this outdated duopoly rule to JSAs without considering their benefits.

The Commission offers no explanation as to why it now believes that extending the duopoly rule to JSAs, many of which it had previously approved, is suddenly necessary to protect competition or otherwise serve the public interest. Nor does the FCC cite any evidence to support its position. In fact, the record evidence actually points overwhelmingly in the opposite direction.

As a matter of sound regulatory practice, this is bad enough. But Congress directed the FCC in Section 202(h) of the Telecommunications Act of 1996 to review all of its local ownership rules every four years to determine whether they were still “necessary in the public interest as the result of competition,” and to repeal or modify those that weren’t. During this review, the FCC must examine the relevant data and articulate a satisfactory explanation for its decision.

So what did the Commission do? It announced that, instead of completing its statutorily mandated 2010 quadrennial review of its local ownership rules, it would roll that review into a new 2014 quadrennial review (which it has yet to perform). Meanwhile, the Commission decided to retain its duopoly rule pending completion of that review because it had “tentatively” concluded that it was still necessary.

In other words, the FCC hasn’t conducted its mandatory quadrennial review in more than seven years, and won’t, under the new rules, conduct one for another year and a half (at least). Oh, and, as if nothing of relevance has changed in the market since then, it “tentatively” maintains its already suspect duopoly rule in the meantime.

In short, because the FCC didn’t conduct the review mandated by statute, there is no factual support for the 2014 Order. By relying on the outdated findings from its earlier review, the 2014 Order fails to examine the significant changes both in competition policy and in the market for video programming that have occurred since the current form of the rule was first adopted, rendering the rulemaking arbitrary and capricious under well-established case law.

Had the FCC examined the record of the current rulemaking, it would have found substantial evidence that undermines, rather than supports, the FCC’s rule.

Economic studies have shown that JSAs can help small broadcasters compete more effectively with cable and online video in a world where their advertising revenues are drying up and where temporary economies of scale (through limited contractual arrangements like JSAs) can help smaller, local advertising outlets better implement giant, national advertising campaigns. A ban on JSAs will actually make it less likely that competition among local broadcasters can survive, not more.

Commissioner Pai, in his dissenting statement to the 2014 Order, offered a number of examples of the benefits of JSAs (all of them studiously ignored by the Commission in its Order). In one of these, a JSA enabled two stations in Joplin, Missouri to use their $3.5 million of cost savings from a JSA to upgrade their Doppler radar system, which helped save lives when a devastating tornado hit the town in 2011. But such benefits figure nowhere in the FCC’s “analysis.”

Several econometric studies also provide empirical support for the (also neglected) contention that duopolies and JSAs enable stations to improve the quality and prices of their programming.

One study, by Jeff Eisenach and Kevin Caves, shows that stations operating under these agreements are likely to carry significantly more news, public affairs, and current affairs programming than other stations in their markets. The same study found an 11 percent increase in audience shares for stations acquired through a duopoly. Meanwhile, a study by Hal Singer and Kevin Caves shows that markets with JSAs have advertising prices that are, on average, roughly 16 percent lower than in non-duopoly markets — not higher, as would be expected if JSAs harmed competition.

And again, Commissioner Pai provides several examples of these benefits in his dissenting statement. In one of these, a JSA in Wichita, Kansas enabled one of the two stations to provide Spanish-language HD programming, including news, weather, emergency and community information, in a market where that Spanish-language programming had not previously been available. Again — benefit ignored.

Moreover, in retaining its duopoly rule on the basis of woefully outdated evidence, the FCC completely ignores the continuing evolution in the market for video programming.

In reality, competition from non-broadcast sources of programming has increased dramatically since 1999. Among other things:

  • Today, over 85 percent of American households watch TV over cable or satellite. Most households now have access to nearly 200 cable channels that compete with broadcast TV for programming content and viewers.
  • In 2014, these cable channels attracted twice as many viewers as broadcast channels.
  • Online video services such as Netflix, Amazon Prime, and Hulu have begun to emerge as major new competitors for video programming, leading 179,000 households to “cut the cord” and cancel their cable subscriptions in the third quarter of 2014 alone.
  • Today, 40 percent of U.S. households subscribe to an online streaming service; as a result, cable ratings among adults fell by nine percent in 2014.
  • At the end of 2007, when the FCC completed its last quadrennial review, the iPhone had just been introduced, and the launch of the iPad was still more than two years away. Today, two-thirds of Americans have a smartphone or tablet over which they can receive video content, using technology that didn’t even exist when the FCC last amended its duopoly rule.

In the face of this evidence, and without any contrary evidence of its own, the Commission’s action in reversing 25 years of agency practice and extending its duopoly rule to most JSAs is arbitrary and capricious.

The law is pretty clear that the extent of support adduced by the FCC in its 2014 Order is insufficient. Among other relevant precedent (and there is a lot of it):

The Supreme Court has held that an agency

must examine the relevant data and articulate a satisfactory explanation for its action, including a rational connection between the facts found and the choice made.

In the DC Circuit:

the agency must explain why it decided to act as it did. The agency’s statement must be one of ‘reasoning’; it must not be just a ‘conclusion’; it must ‘articulate a satisfactory explanation’ for its action.

And:

[A]n agency acts arbitrarily and capriciously when it abruptly departs from a position it previously held without satisfactorily explaining its reason for doing so.

Also:

The FCC ‘cannot silently depart from previous policies or ignore precedent’ . . . .

And most recently in Judge Silberman’s concurrence/dissent in Verizon v. FCC, the case reviewing the FCC’s 2010 Open Internet Order:

factual determinations that underly [sic] regulations must still be premised on demonstrated — and reasonable — evidential support

None of these standards is met in this case.

It will be interesting to see what the DC Circuit does with these arguments given the pending Petitions for Review of the latest Open Internet Order. There, too, the FCC acted without sufficient evidentiary support for its actions. The NAB/Stirk Holdings case may well turn out to be a bellwether for how the court views the FCC’s evidentiary failings in that case, as well.

The scholars joining ICLE on the brief are:

  • Babette E. Boliek, Associate Professor of Law, Pepperdine School of Law
  • Henry N. Butler, George Mason University Foundation Professor of Law and Executive Director of the Law & Economics Center, George Mason University School of Law (and newly appointed dean)
  • Richard Epstein, Laurence A. Tisch Professor of Law, Classical Liberal Institute, New York University School of Law
  • Stan Liebowitz, Ashbel Smith Professor of Economics, University of Texas at Dallas
  • Fred McChesney, de la Cruz-Mentschikoff Endowed Chair in Law and Economics, University of Miami School of Law
  • Paul H. Rubin, Samuel Candler Dobbs Professor of Economics, Emory University
  • Michael E. Sykuta, Associate Professor in the Division of Applied Social Sciences and Director of the Contracting and Organizations Research Institute, University of Missouri

The full amicus brief is available here.

Last week the International Center for Law & Economics, joined by TechFreedom, filed comments with the Federal Aviation Administration (FAA) in its Operation and Certification of Small Unmanned Aircraft Systems (“UAS” — i.e., drones) proceeding to establish rules for the operation of small drones in the National Airspace System.

We believe that the FAA has failed to appropriately weigh the costs and benefits, as well as the First Amendment implications, of its proposed rules.

The FAA’s proposed drones rules fail to meet (or even undertake) adequate cost/benefit analysis

FAA regulations are subject to Executive Order 12866, which, among other things, requires that agencies:

  • “consider incentives for innovation”;
  • “propose or adopt a regulation only upon a reasoned determination that the benefits of the intended regulation justify its costs”;
  • “base [their] decisions on the best reasonably obtainable scientific, technical, economic, and other information”; and
  • “tailor [their] regulations to impose the least burden on society.”

The FAA’s proposed drone rules fail to meet these requirements.

An important, and fundamental, problem is that the proposed rules often seem to import “scientific, technical, economic, and other information” regarding traditional manned aircraft, rather than such knowledge specifically applicable to drones and their uses — what FTC Commissioner Maureen Ohlhausen has dubbed “The Procrustean Problem with Prescriptive Regulation.”

As such, not only do the rules often not make sense as a practical matter, they also seek to simply adapt existing standards, rules and understandings promulgated for manned aircraft to regulate drones — insufficiently tailoring the rules to “impose the least burden on society.”

In some cases the rules would effectively ban obviously valuable uses outright, disregarding the rules’ effect on innovation (to say nothing of their effect on current uses of drones) without adequately defending such prohibitions as necessary to protect public safety.

Importantly, the proposed rules would effectively prohibit the use of commercial drones for long-distance services (like package delivery and scouting large agricultural plots) and for uses in populated areas — undermining what may well be drones’ most economically valuable uses.

As our comments note:

By prohibiting UAS operation over people who are not directly involved in the drone’s operation, the rules dramatically limit the geographic scope in which UAS may operate, essentially limiting commercial drone operations to unpopulated or extremely sparsely populated areas. While that may be sufficient for important agricultural and forestry uses, for example, it effectively precludes all possible uses in more urban areas, including journalism, broadcasting, surveying, package delivery and the like. Even in nonurban areas, such a restriction imposes potentially insurmountable costs.

Mandating that operators not fly over other individuals not involved in the UAS operation is, in fact, the nail in the coffin of drone deliveries, an industry that is likely to offer a significant fraction of this technology’s potential economic benefit. Imposing such a blanket ban thus improperly ignores the important “incentives for innovation” suggested by Executive Order 12866 without apparent corresponding benefit.

The FAA’s proposed drone rules fail under First Amendment scrutiny

The FAA’s failure to tailor the rules according to an appropriate analysis of their costs and benefits also causes them to violate the First Amendment. Without proper tailoring based on the unique technological characteristics of drones and a careful assessment of their likely uses, the rules are considerably more broad than the Supreme Court’s “time, place and manner” standard would allow.

Several of the rules constitute a de facto ban on most — indeed, nearly all — of the potential uses of drones that most clearly involve the collection of information and/or the expression of speech protected by the First Amendment. As we note in our comments:

While the FAA’s proposed rules appear to be content-neutral, and will thus avoid the most-exacting Constitutional scrutiny, the FAA will nevertheless have a difficult time demonstrating that some of them are narrowly drawn and adequately tailored time, place, and manner restrictions.

Indeed, many of the rules likely amount to a prior restraint on protected commercial and non-commercial activity, both for obvious existing applications like news gathering and for currently unanticipated future uses.

Our friends Eli Dourado, Adam Thierer and Ryan Hagemann at Mercatus also filed comments in the proceeding, raising similar concerns:

As far as possible, we advocate an environment of “permissionless innovation” to reap the greatest benefit from our airspace. The FAA’s rules do not foster this environment. In addition, we believe the FAA has fallen short of its obligations under Executive Order 12866 to provide thorough benefit-cost analysis.

The full Mercatus comments, available here, are also recommended reading.

Read the full ICLE/TechFreedom comments here.

Ben Sperry and I have a long piece on net neutrality in the latest issue of Reason Magazine entitled, “How to Break the Internet.” It’s part of a special collection of articles and videos dedicated to the proposition “Don’t Tread on My Internet!”

Reason has put together a great bunch of material, and packaged it in a special retro-designed page that will make you think it’s the 1990s all over again (complete with flaming graphics and dancing Internet babies).

Here’s a taste of our article:

“Net neutrality” sounds like a good idea. It isn’t.

As political slogans go, the phrase net neutrality has been enormously effective, riling up the chattering classes and forcing a sea change in the government’s decades-old hands-off approach to regulating the Internet. But as an organizing principle for the Internet, the concept is dangerously misguided. That is especially true of the particular form of net neutrality regulation proposed in February by Federal Communications Commission (FCC) Chairman Tom Wheeler.

Net neutrality backers traffic in fear. Pushing a suite of suggested interventions, they warn of rapacious cable operators who seek to control online media and other content by “picking winners and losers” on the Internet. They proclaim that regulation is the only way to stave off “fast lanes” that would render your favorite website “invisible” unless it’s one of the corporate-favored. They declare that it will shelter startups, guarantee free expression, and preserve the great, egalitarian “openness” of the Internet.

No decent person, in other words, could be against net neutrality.

In truth, this latest campaign to regulate the Internet is an apt illustration of F.A. Hayek’s famous observation that “the curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.” Egged on by a bootleggers-and-Baptists coalition of rent-seeking industry groups and corporation-hating progressives (and bolstered by a highly unusual proclamation from the White House), Chairman Wheeler and his staff are attempting to design something they know very little about — not just the sprawling Internet of today, but also the unknowable Internet of tomorrow.

And the rest of the contents of the site are great, as well. Among other things, there’s:

  • “Why are Edward Snowden’s supporters so eager to give the government more control over the Internet?” Matt Welch’s take on the contradictions in the thinking of net neutrality’s biggest advocates.
  • “The Feds want a back door into your computer. Again.” Declan McCullagh on the eternal return of government attempts to pre-hack your technology.
  • “Uncle Sam wants your Fitbit.” Adam Thierer on the coming clampdown on data coursing through the Internet of Things.
  • Mike Godwin on how net neutrality can hurt developing countries most of all.
  • “How states are planning to grab tax dollars for online sales,” by Veronique de Rugy
  • FCC Commissioner Ajit Pai on why net neutrality is “a solution that won’t work to a problem that simply doesn’t exist.”
  • “8 great libertarian apps that make your world a little freer and a whole lot easier to navigate.”

There’s all that, plus enough flaming images and dancing babies to make your eyes bleed. Highly recommended!