The leading contribution to sound competition policy made by former U.S. Assistant Attorney General for Antitrust Makan Delrahim was his enunciation of the “New Madison Approach” to patent-antitrust enforcement—and, in particular, to the antitrust treatment of standard essential patent licensing (see, for example, here, here, and here). In short (citations omitted):
The New Madison Approach (“NMA”) advanced by former Assistant Attorney General for Antitrust Makan Delrahim is a simple analytical framework for understanding the interplay between patents and antitrust law arising out of standard setting. A key aspect of the NMA is its rejection of the application of antitrust law to the “hold-up” problem, whereby patent holders demand supposedly supra-competitive licensing fees to grant access to their patents that “read on” a standard – standard essential patents (“SEPs”). This scenario is associated with an SEP holder’s prior commitment to a standard setting organization (“SSO”), that is: if its patented technology is included in a proposed new standard, it will license its patents on fair, reasonable, and non-discriminatory (“FRAND”) terms. “Hold-up” is said to arise subsequently, when the SEP holder reneges on its FRAND commitment and demands that a technology implementer pay higher-than-FRAND licensing fees to access its SEPs.
The NMA has four basic premises that are aimed at ensuring that patent holders have adequate incentives to innovate and create welfare-enhancing new technologies, and that licensees have appropriate incentives to implement those technologies:
1. Hold-up is not an antitrust problem. Accordingly, an antitrust remedy is not the correct tool to resolve patent licensing disputes between SEP-holders and implementers of a standard.
2. SSOs should not allow collective actions by standard-implementers to disfavor patent holders in setting the terms of access to patents that cover a new standard.
3. A fundamental element of patent rights is the right to exclude. As such, SSOs and courts should be hesitant to restrict SEP holders’ right to exclude implementers from access to their patents, by, for example, seeking injunctions.
4. Unilateral and unconditional decisions not to license a patent should be per se legal.
Delrahim emphasizes that the threat of antitrust liability, specifically treble damages, distorts the incentives associated with good faith negotiations with SSOs over patent inclusion. Contract law, he goes on to note, is perfectly capable of providing an ex post solution to licensing disputes between SEP holders and implementers of a standard. Unlike antitrust law, a contract law framework allows all parties equal leverage in licensing negotiations.
[P]atented technology serves as a catalyst for the wealth-creating diffusion of innovation. This occurs through numerous commercialization methods; in the context of standardized technologies, the development of standards is a process of discovery. At each [SSO], the process of discussion and negotiation between engineers, businesspersons, and all other relevant stakeholders reveals the relative value of alternative technologies and tends to result in the best patents being integrated into a standard.
The NMA supports this process of discovery and implementation of the best patented technology born of the labors of the innovators who created it. As a result, the NMA ensures SEP valuations that allow SEP holders to obtain an appropriate return for the new economic surplus that results from the commercialization of standard-engendered innovations. It recognizes that dynamic economic growth is fostered through the incentivization of innovative activities backed by patents.
In sum, the NMA seeks to promote innovation by offering incentives for SEP-driven technological improvements. As such, it rejects as ill-founded prior Federal Trade Commission (FTC) litigation settlements and Obama-era U.S. Justice Department (DOJ) Antitrust Division policy statements that artificially favored implementor licensees’ interests over those of SEP licensors (see here).
In light of the NMA, DOJ cooperated with the U.S. Patent and Trademark Office and National Institute of Standards and Technology (NIST) in issuing a 2019 SEP Policy Statement clarifying that an SEP holder’s promise to license a patent on fair, reasonable, and non-discriminatory (FRAND) terms does not bar it from seeking any available remedy for patent infringement, including an injunction. This signaled that SEPs and non-SEP patents enjoy equivalent legal status.
DOJ also issued a 2020 supplement to its 2015 Institute of Electrical and Electronics Engineers (IEEE) business review letter. The 2015 letter had found no legal fault with revised IEEE standard-setting policies that implicitly favored implementers of standardized technology over SEP holders. The 2020 supplement characterized key elements of the 2015 letter as “outdated,” and noted that the anti-SEP bias of that document could “harm competition and chill innovation.”
Furthermore, DOJ issued a July 2019 Statement of Interest before the 9th U.S. Circuit Court of Appeals in FTC v. Qualcomm, explaining that unilateral and unconditional decisions not to license a patent are legal under the antitrust laws. In August 2020, the 9th Circuit reversed a district court decision and rejected the FTC’s monopolization suit against Qualcomm. The circuit court, among other findings, held that Qualcomm had no antitrust duty to license its SEPs to competitors.
Regrettably, the Biden Administration appears to be close to rejecting the NMA and to reinstituting the anti-strong-patent, SEP-skeptical views of the Obama administration (see here and here). DOJ already has effectively repudiated the 2020 supplement to the 2015 IEEE letter and the 2019 SEP Policy Statement. Furthermore, written responses to Senate Judiciary Committee questions by assistant attorney general nominee Jonathan Kanter suggest support for renewed antitrust scrutiny of SEP licensing. These developments are highly problematic if one supports dynamic economic growth.
Conclusion
The NMA represents a pro-American, pro-growth innovation policy prescription. Its abandonment would reduce incentives to invest in patents and standard-setting activities, to the detriment of the U.S. economy. Such a development would be particularly unfortunate at a time when U.S. Supreme Court decisions have weakened American patent rights (see here); China is taking steps to strengthen Chinese patents and raise incentives to obtain Chinese patents (see here); and China is engaging in litigation to weaken key U.S. patents and undermine American technological leadership (see here).
The rejection of the NMA would also be in tension with the logic of the 5th U.S. Circuit Court of Appeals’ 2021 HTC v. Ericsson decision, which rejected HTC’s claim that the non-discrimination portion of the FRAND commitment required Ericsson to give it the same licensing terms it gave larger mobile-device manufacturers. Furthermore, recent important European court decisions are generally consistent with NMA principles (see here).
Given the importance of dynamic competition in an increasingly globalized world economy, Biden administration officials may wish to take a closer look at the economic arguments supporting the NMA before taking final action to condemn it. Among other things, the administration might take note that major U.S. digital platforms, which are the subject of multiple U.S. and foreign antitrust enforcement investigations, tend to firmly oppose strong patent rights. As one major innovation economist recently pointed out:
If policymakers and antitrust gurus are so concerned about stemming the rising power of Big Tech platforms, they should start by first stopping the relentless attack on IP. Without the IP system, only the big and powerful have the privilege to innovate[.]
One of the key recommendations of the House Judiciary Committee’s antitrust report that seems to have bipartisan support (see Rep. Buck’s report) is shifting evidentiary burdens of proof to defendants with “monopoly power.” These recommended changes are aimed at helping antitrust enforcers and private plaintiffs “win” more. The result may well be more convictions, more jury verdicts, more consent decrees, and more settlements, but there is a cost.
A presumption of illegality for certain classes of defendants, unless they can prove otherwise, is inconsistent with the American traditions of the presumption of innocence and of allowing persons to dispose of their property as they wish. Forcing antitrust defendants to defend themselves from what is effectively a presumption of guilt will create an enormous burden upon them. But this burden will be felt far beyond antitrust defendants themselves. Consumers will forgo the benefits of the mergers that are deterred and the business conduct that is prevented.
The Presumption of Liberty in American Law
The Presumption of Innocence
There is nothing wrong with presumptions in law as a general matter. For instance, one of the most important presumptions in American law is that criminal defendants are presumed innocent until proven guilty. Prosecutors bear the burden of proof, and must prove guilt beyond a reasonable doubt. Even in the civil context, plaintiffs, whether public or private, have the burden of proving a violation of the law, by the preponderance of the evidence. In either case, the defendant is not required to prove they didn’t violate the law.
Fundamentally, the presumption of innocence is about liberty. As William Blackstone put it in his Commentaries on the Laws of England centuries ago: “the law holds that it is better that ten guilty persons escape than that one innocent suffer.”
In economic terms, society must balance the need to deter bad conduct, however defined, with the need not to deter good conduct. In a world of uncertainty, this includes the possibility that decision-makers will get it wrong. For instance, if a mere allegation of wrongdoing placed the burden upon a defendant to prove his or her innocence, much good conduct would be deterred out of fear of false allegations. In this sense, the presumption of innocence is important: it protects the innocent from being punished on the basis of mere allegations of wrongdoing, even if that means that in some cases the guilty escape judgment.
Presumptions in Property, Contract, and Corporate Law
Similarly, presumptions in other areas of law protect liberty and weigh against deterring good conduct in the name of preventing bad conduct. For instance, the presumption when it comes to how people dispose of their property is that, unless a law says otherwise, they may do as they wish. In other words, there is no presumption against a person using their property as they wish. The presumption is liberty, unless a valid law proscribes the behavior. The exceptions to this rule typically deal with situations where a use of property could harm someone else.
In contracts, the right of persons to come to a mutual agreement is the general rule, with rare exceptions. The presumption is in favor of enforcing voluntary agreements. Default rules in the absence of complete contracting supplement these agreements, but even the default rules can be contracted around in most cases.
Bringing the two together, corporate law—essentially the nexus of contract law and property law—allows persons to come together to dispose of property and make contracts, supplying default rules which can be contracted around. The presumption again is that people are free to do as they choose with their own property. The default is never that people can’t create firms to buy or sell or make agreements.
A corollary of the above is that people may start businesses and deal with others on whatever basis they choose, unless a generally applicable law says otherwise. In fact, they can even buy other businesses: mergers and acquisitions are generally allowed by the law.
Presumptions in Antitrust Law
Antitrust is a generally applicable set of laws which proscribe how people can use their property. But even there, the presumption is not that every merger or act by a large company is harmful.
On the contrary, antitrust laws allow groups of people to dispose of property as they wish unless it can be shown that a firm has “market power” that is likely to be exercised to the detriment of competition or consumers. Plaintiffs, whether public or private, bear the burden of proving all the elements of the antitrust violation alleged.
In particular, antitrust law has incorporated the error cost framework, which weighs the costs of getting decisions wrong. Much as the presumption of innocence accepts that some guilty persons will go unpunished in order to protect the innocent, the error cost framework accepts that some anticompetitive conduct will go unpunished in order to avoid condemning procompetitive conduct. American antitrust law is more concerned with avoiding the condemnation of procompetitive conduct than with letting the guilty escape condemnation.
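To make that tradeoff concrete, here is a minimal sketch in Python, using made-up probabilities and costs rather than any empirical estimates, of how the error cost framework compares two hypothetical legal rules:

```python
# Illustrative error-cost comparison (hypothetical numbers only).
# A legal rule is evaluated by the expected social cost of its mistakes:
#   E[cost] = P(false positive) * cost(false positive)
#           + P(false negative) * cost(false negative)

def expected_error_cost(p_false_positive, cost_false_positive,
                        p_false_negative, cost_false_negative):
    """Expected social cost of a rule's erroneous outcomes."""
    return (p_false_positive * cost_false_positive
            + p_false_negative * cost_false_negative)

# Hypothetical rule A: strict presumptions condemn more conduct,
# so fewer false negatives but more false positives.
rule_a = expected_error_cost(0.20, 100, 0.05, 60)

# Hypothetical rule B: the plaintiff bears the burden of proof,
# so more false negatives but fewer false positives.
rule_b = expected_error_cost(0.05, 100, 0.15, 60)

print(f"Rule A expected error cost: {rule_a}")  # 23.0
print(f"Rule B expected error cost: {rule_b}")  # 14.0
```

On these invented numbers, the rule that tolerates more false negatives produces the lower expected error cost; the point is only that the framework forces this comparison, not that these figures describe any real-world antitrust regime.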
For instance, to prove a merger or acquisition would violate the antitrust laws, a plaintiff must show the transaction will substantially lessen competition. This involves defining the relevant market, showing that the defendant has power in that market, and showing that the transaction would lessen competition in it. While concentration of the market is an important part of the analysis, antitrust law must consider the effect on consumer welfare as a whole. The law doesn’t simply condemn mergers or acquisitions by large companies just because they are large.
Similarly, to prove a monopolization claim, a plaintiff must establish the defendant has “monopoly power” in the relevant market. But monopoly power isn’t enough. As stated by the Supreme Court in Trinko:
The mere possession of monopoly power, and the concomitant charging of monopoly prices, is not only not unlawful; it is an important element of the free-market system. The opportunity to charge monopoly prices—at least for a short period— is what attracts “business acumen” in the first place; it induces risk taking that produces innovation and economic growth. To safeguard the incentive to innovate, the possession of monopoly power will not be found unlawful unless it is accompanied by an element of anticompetitive conduct.
The plaintiff must also prove the defendant has engaged in the “willful acquisition or maintenance of [market] power, as distinguished from growth or development as a consequence of a superior product, business acumen, or historical accident.” Antitrust law is careful to avoid mistaken inferences and false condemnations, which are especially costly because they “chill the very conduct antitrust laws are designed to protect.”
The presumption isn’t against mergers or business conduct even when those businesses are large. Antitrust law only condemns mergers or business conduct when it is likely to harm consumers.
How Changing Antitrust Presumptions will Harm Society
In light of all of this, the House Judiciary Committee’s Investigation of Competition in Digital Markets proposes some pretty radical departures from the law’s normal presumption in favor of people disposing of their property as they choose. Unfortunately, the minority report issued by Representative Buck agrees with the recommendations to shift burdens onto antitrust defendants in certain cases.
One of the recommendations from the Subcommittee is that Congress:
“codify[] bright-line rules for merger enforcement, including structural presumptions. Under a structural presumption, mergers resulting in a single firm controlling an outsized market share, or resulting in a significant increase in concentration, would be presumptively prohibited under Section 7 of the Clayton Act. This structural presumption would place the burden of proof upon the merging parties to show that the merger would not reduce competition. A showing that the merger would result in efficiencies should not be sufficient to overcome the presumption that it is anticompetitive. It is the view of Subcommittee staff that the 30% threshold established by the Supreme Court in Philadelphia National Bank is appropriate, although a lower standard for monopsony or buyer power claims may deserve consideration by the Subcommittee. By shifting the burden of proof to the merging parties in cases involving concentrated markets and high market shares, codifying the structural presumption would help promote the efficient allocation of agency resources and increase the likelihood that anticompetitive mergers are blocked.” (emphasis added)
Under this proposal, in cases where concentration meets an arbitrary benchmark based upon the market definition, the presumption will be that the merger is illegal. Defendants will now bear the burden of proof to show the merger won’t reduce competition, without even getting to refer to efficiencies that could benefit consumers.
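To illustrate how mechanical such a trigger would be, here is a small hypothetical sketch. The market shares are invented, and the 30% figure simply mirrors the threshold the report cites; this is an illustration of the burden-shifting mechanism, not a statement of how Section 7 analysis actually works:

```python
# Hypothetical illustration of a share-based structural presumption.
# Market shares are invented; the 30% trigger mirrors the report's
# proposed threshold, and the mechanics here are a deliberate simplification.

PRESUMPTION_THRESHOLD = 0.30  # combined-share trigger (illustrative)

market_shares = {
    "Firm A": 0.22,
    "Firm B": 0.12,
    "Firm C": 0.35,
    "Others": 0.31,
}

def presumptively_unlawful(acquirer, target, shares,
                           threshold=PRESUMPTION_THRESHOLD):
    """Return True if the merged firm's combined share meets the trigger."""
    combined = shares[acquirer] + shares[target]
    return combined >= threshold

# Under the proposal, the burden would shift to the merging parties here,
# regardless of entry conditions, efficiencies, or actual competitive effects.
print(presumptively_unlawful("Firm A", "Firm B", market_shares))  # True (0.34)
```

The point is that the presumption attaches on shares alone; everything that might show the merger is benign arrives only afterward, with the burden resting on the merging parties.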
Changing the burden of proof to be against criminal defendants would lead to more convictions of guilty people, but it would also lead to a lot more false convictions of innocent defendants. Similarly, changing the burden of proof to be against antitrust defendants would certainly lead to more condemnations of anticompetitive mergers, but it would also lead to the deterrence of a significant portion of procompetitive mergers.
So yes, if adopted, plaintiffs would likely win more as a result of these proposed changes, including in cases where mergers are anticompetitive. But this does not necessarily mean it would be to the benefit of larger society.
Antitrust has evolved over time to recognize that concentration alone is not predictive of likely competitive harm in merger analysis. Both the horizontal merger guidelines and the vertical merger guidelines issued by the FTC and DOJ emphasize the importance of fact-specific inquiries into competitive effects, and not just a reliance on concentration statistics. This reflects a long-standing bipartisan consensus. The HJC’s majority report would overturn that consensus by suggesting a return to structural presumptions that have largely been rejected in antitrust law.
The HJC majority report also calls for changes in presumptions when it comes to monopolization claims. For instance, the report calls on Congress to consider creating a statutory presumption of dominance by a seller with a market share of 30% or more and a presumption of dominance by a buyer with a market share of 25% or more. The report then goes on to suggest overturning a number of precedents dealing with monopolization claims which, in the report’s view, restricted claims of tying, predatory pricing, refusals to deal, leveraging, and self-preferencing. In particular, the report calls on Congress to “[c]larify[] that ‘false positives’ (or erroneous enforcement) are not more costly than ‘false negatives’ (erroneous non-enforcement), and that, when relating to conduct or mergers involving dominant firms, ‘false negatives’ are costlier.”
This again turns on their head the ordinary presumptions of innocence and of allowing people to dispose of their property as they see fit. If adopted, defendants would largely have to prove their innocence in monopolization cases whenever their market shares are above a certain threshold.
Moreover, the report calls for Congress to consider making conduct illegal even if it “can be justified as an improvement for consumers.” It is highly likely that the changes proposed will harm consumer welfare in many cases, as the focus changes from economic efficiency to concentration.
Conclusion
The HJC report’s recommendations on changing antitrust presumptions should be rejected. The harms will be felt not only by antitrust defendants, who will be much more likely to lose regardless of whether they have violated the law, but also by consumers, whose welfare would no longer be the focus. The result would be inconsistent with the American traditions of presuming innocence and of allowing people to dispose of their property as they see fit.
One of the great scholars of law & economics turns 90 years old today. In his long and distinguished career, Thomas Sowell has written over 40 books and countless opinion columns. He has been a professor of economics and a long-time Senior Fellow at the Hoover Institution. He received a National Humanities Medal in 2002 for a lifetime of scholarship, which has only continued since then. His ability to look at issues with an international perspective, using the analytical tools of economics to better understand institutions, is an inspiration to us at the International Center for Law & Economics.
Here, as a long-time reader of his works, I want to offer something of a blog-post festschrift and briefly write about how Sowell’s voluminous writings on visions, law, race, and economics could be the basis for a positive agenda to achieve a greater measure of racial justice in the United States.
The Importance of Visions
One of the most important aspects of Sowell’s work is his ability to distill wide-ranging issues into debates involving different mental models, or a “Conflict of Visions.” He calls one vision the “tragic” or “constrained” vision, which sees all humans as inherently limited in knowledge, wisdom, and virtue, and fundamentally self-interested even at their best. The other vision is the “utopian” or “unconstrained” vision, which sees human limitations as artifacts of social arrangements and cultures, and that there are some capable by virtue of superior knowledge and morality that can redesign society to create a better world.
An implication of the constrained vision is that the difference in knowledge and virtue between the best and the worst in society is actually quite small. As a result, no one person or group of people can be trusted with redesigning institutions which have spontaneously evolved. The best we can hope for is institutions that reasonably deter bad conduct and allow people the freedom to solve their own problems.
An important implication of the unconstrained vision, on the other hand, is that some people, by virtue of the superior enlightenment Sowell calls the “Vision of the Anointed,” can redesign institutions to fundamentally change human nature, which is seen as malleable. On this view, institutions are far more often the result of deliberate human design and choice, and the failure to change them to be more just or equal is a result of immorality or a lack of will.
The importance of visions to how we view things like justice and institutions makes all the difference. In the constrained view, institutions like language, culture, and even much of the law result from the “spontaneous ordering” that is the result of human action but not of human design. Limited government, markets, and tradition are all important in helping individuals coordinate action. Markets work because self-interested individuals benefit when they serve others. There are no solutions to difficult societal problems, including racism, only trade-offs.
But in the unconstrained view, limits on government power are seen as impediments to public-spirited experts creating a better society. Markets, traditions, and cultures are to be redesigned from the top down by those who are forward-looking, relying on their articulated reason. There is a belief that solutions could be imposed if only there is sufficient political will and the right people in charge. When it comes to an issue like racism, those who are sufficiently “woke” should be in charge of redesigning institutions to provide for a solution to things like systemic racism.
For Sowell, what he calls “traditional justice” is achieved by processes that hold people accountable for harms to others. Its focus is on flesh-and-blood human beings, not abstractions like all men or blacks versus whites. Differences in outcomes are not, on this view, themselves just or unjust; what matters is that the processes are just. These processes should focus on the institutional incentives of participants. Reforms should be careful not to upset important incentive structures that have evolved over time as the best way for limited human beings to coordinate behavior.
The “Quest for Cosmic Justice,” on the other hand, flows from the unconstrained vision. Cosmic justice sees disparities between abstract groups, like whites and blacks, as unjust and in need of correction. If impartial processes like markets or law produce disparities, those with an unconstrained vision often see those processes as themselves racist. The conclusion is that the law should intervene to create better outcomes. This presumes considerable knowledge and morality on the part of those who are in charge of the interventions.
A large part of Sowell’s research project has been showing that those with the unconstrained vision, in their quest for cosmic justice, often harm the very people they proclaim an intention to help.
A Constrained Vision of Racial Justice
Sowell has written quite a lot on race, culture, intellectuals, economics, and public policy. One of the main thrusts of his argument about race is that attempts at cosmic justice often harm living flesh-and-blood individuals in the name of intertemporal abstractions like “social justice” for black Americans. Sowell nowhere denies that racism is an important component of understanding the history of black Americans. But his constant challenge is that racism can’t be the only variable which explains disparities. Sowell points to the importance of culture and education in building the human capital needed to be successful in market economies. Without taking those other variables into account, there is no way to determine the extent to which racism is the cause of disparities.
This has important implications for achieving racial justice today. When it comes to policies pursued in the name of racial justice, Sowell has argued that many programs often harm not only members of disfavored groups, but also members of the favored groups.
For instance, Sowell has argued that affirmative action actually harms not only flesh-and-blood white and Asian-Americans who are passed over, but also harms those African-Americans who are “mismatched” in their educational endeavors and end up failing or dropping out of schools when they could have been much better served by attending schools where they would have been very successful. Another example Sowell often points to is minimum wage legislation, which is often justified in the name of helping the downtrodden, but has the effect of harming low-skilled workers by increasing unemployment, most especially young African-American males.
Any attempts at achieving racial justice, in terms of correcting historical injustices, must take into account how changes in processes could actually end up hurting flesh-and-blood human beings, especially when those harmed are black Americans.
A Positive Agenda for Policy Reform
In Sowell’s constrained vision, a large part of the equation for African-American improvement is going to be cultural change. However, white Americans should not think that this means they have no responsibility in working towards racial justice. A positive agenda must take into consideration real harms experienced by African-Americans due to government action (and inaction). Thus, traditional justice demands institutional reforms, and in some cases, recompense.
The policy part of this equation, outlined below, is motivated by traditional justice concerns: holding people accountable under the rule of law for violations of constitutional rights and promoting institutional reforms that better align incentives.
What follows are policy proposals aimed at achieving a greater degree of racial justice for black Americans, fundamentally informed by the constrained vision and the traditional justice concerns outlined by Sowell. Most of these proposals concern issues Sowell has not written much about. In fact, some are proposals he might not support, but they are—in my opinion—consistent with the constrained vision and traditional justice.
Reparations for Historical Rights Violations
Sowell once wrote this regarding reparations for black Americans:
Nevertheless, it remains painfully clear that those people who were torn from their homes in Africa in centuries past and forcibly brought across the Atlantic in chains suffered not only horribly, but unjustly. Were they and their captors still alive, the reparations and retribution owed would be staggering. Time and death, however, cheat us of such opportunities for justice, however galling that may be. We can, of course, create new injustices among our flesh-and-blood contemporaries for the sake of symbolic expiation, so that the son or daughter of a black doctor or executive can get into an elite college ahead of the son or daughter of a white factory worker or farmer, but only believers in the vision of cosmic justice are likely to take moral solace from that. We can only make our choices among alternatives actually available, and rectifying the past is not one of those options.
In other words, if the victims and perpetrators of an injustice are no longer alive, it is not just to hold all members of their respective races accountable for crimes they did not commit. However, this would presumably leave open the possibility of applying traditional justice concepts in those cases where death has not cheated us.
For instance, there are still black Americans alive who suffered under Jim Crow, as well as children and family members of those who were lynched. While it is too little, too late, it seems consistent with traditional justice to seek out and criminally prosecute perpetrators who committed heinous acts only a few generations ago against still-living victims. This is not unprecedented: elderly Nazis are still prosecuted for crimes against Jews. Something similar could be done in the United States.
Similarly, civil rights lawsuits for the damages caused by Jim Crow could be another way to recompense those who were harmed. Alternatively, it could be done by legislation. The Civil Liberties Act of 1988 was passed under President Reagan and gave living Japanese Americans who were interned during World War II some limited reparations. A similar system could be set up for living victims of Jim Crow.
Statutes of limitations may need to be changed to facilitate these criminal prosecutions and civil rights lawsuits, but it is quite clearly consistent with the idea of holding flesh-and-blood persons accountable for their unlawful actions.
Holding flesh-and-blood perpetrators accountable for rights violations should not be confused with the cosmic justice idea—that Sowell consistently decries—that says intertemporal abstractions can be held accountable for crimes. In other words, this is not holding “whites” accountable for all historical injustices to “blacks.” This is specifically giving redress to victims and deterring future bad conduct.
End Qualified Immunity
Another way to promote racial justice consistent with the constrained vision is to end one of the Warren Court’s egregious examples of judicial activism: qualified immunity. Qualified immunity is nowhere mentioned in the civil rights statute, 42 U.S.C. § 1983. As Sowell argues in his writings, judges in the constrained vision are supposed to declare what the law is, not what they believe it should be, unlike those in the unconstrained vision who—according to Sowell—believe they have the right to amend the laws through judicial edict. The introduction of qualified immunity into the law by the activist Warren Court should be overturned.
In a civil rights lawsuit, the goal is to make the victim (or their families) of a rights violation whole by monetary damages. From a legal perspective, this is necessary to give the victim justice. From an economic perspective this is necessary to deter future bad conduct and properly align ex ante incentives going forward. Under a well-functioning system, juries would, after hearing all the evidence, make a decision about whether constitutional rights were violated and the extent of damages. A functioning system of settlements would result as a common law develops determining what counts as reasonable or unreasonable uses of force. This doesn’t mean plaintiffs always win, either. Officers may be determined to be acting reasonably under the circumstances once all the evidence is presented to a jury.
However, one of the greatest obstacles to holding police officers accountable in misconduct cases is the doctrine of qualified immunity… courts have widely expanded its scope to the point that qualified immunity is now protecting officers even when their conduct violates the law, as long as the officers weren’t on clear notice from specific judicial precedent that what they did was illegal when they did it… This standard has predictably led to a situation where officer misconduct which judges and juries would likely find egregious never makes it to court. The Cato Institute’s website Unlawful Shield details many cases where federal courts found an officer’s conduct was illegal yet nonetheless protected by qualified immunity.
Immunity of this nature has profound consequences on the incentive structure facing police officers. Police officers, as well as the departments that employ them, are insufficiently accountable when gross misconduct does not get past a motion to dismiss for qualified immunity… The result is to encourage police officers to take insufficient care when making the choice about the level of force to use.
Those with a constrained vision focus on processes and incentives. In this case, it is police officers who have insufficient incentives to take reasonable care when they receive qualified immunity for their conduct.
End the Drug War
While not something he has written a lot on, Sowell has argued for the decriminalization of drugs, comparing the War on Drugs to the earlier attempts at Prohibition of alcohol. This is consistent with the constrained vision, which cares about the institutional incentives created by law.
Interestingly, work by Michelle Alexander in the second chapter of The New Jim Crow is largely consistent with Sowell’s point of view. There she argued the institutional incentives of police departments were systematically changed when the drug war was ramped up.
Alexander asks a question which is right in line with the constrained vision:
[I]t is fair to wonder why the police would choose to arrest such an astonishing percentage of the American public for minor drug crimes. The fact that police are legally allowed to engage in a wholesale roundup of nonviolent drug offenders does not answer the question why they would choose to do so, particularly when most police departments have far more serious crimes to prevent and solve. Why would police prioritize drug-law enforcement? Drug use and abuse is nothing new; in fact, it was on the decline, not on the rise, when the War on Drugs began.
Alexander locates the impetus for ramping up the drug war in federal subsidies:
In 1988, at the behest of the Reagan administration, Congress revised the program that provides federal aid to law enforcement, renaming it the Edward Byrne Memorial State and Local Law Enforcement Assistance Program after a New York City police officer who was shot to death while guarding the home of a drug-case witness. The Byrne program was designed to encourage every federal grant recipient to help fight the War on Drugs. Millions of dollars in federal aid have been offered to state and local law enforcement agencies willing to wage the war. By the late 1990s, the overwhelming majority of state and local police forces in the country had availed themselves of the newly available resources and added a significant military component to buttress their drug-war operations.
On top of that, police departments were benefited by civil asset forfeiture:
As if the free military equipment, training, and cash grants were not enough, the Reagan administration provided law enforcement with yet another financial incentive to devote extraordinary resources to drug law enforcement, rather than more serious crimes: state and local law enforcement agencies were granted the authority to keep, for their own use, the vast majority of cash and assets they seize when waging the drug war. This dramatic change in policy gave state and local police an enormous stake in the War on Drugs—not in its success, but in its perpetual existence. Suddenly, police departments were capable of increasing the size of their budgets, quite substantially, simply by taking the cash, cars, and homes of people suspected of drug use or sales. Because those who were targeted were typically poor or of moderate means, they often lacked the resources to hire an attorney or pay the considerable court costs. As a result, most people who had their cash or property seized did not challenge the government’s action, especially because the government could retaliate by filing criminal charges—baseless or not.
As Alexander notes, black Americans (and other minorities) were the primary targets of this ramped-up War on Drugs, which has disproportionately imprisoned black Americans even though drug usage and sales are relatively similar across races. Police officers have incredible discretion in determining whom to investigate and bring charges against. When it comes to the drug war, this discretion is magnified because the activity is largely consensual, meaning officers can’t rely on victims to come to them to start an investigation. Alexander attributes the criminal justice system’s targeting of black Americans to implicit bias among police officers, prosecutors, and judges, which mirrors the bias shown in media coverage and in larger white American society.
Anyone inspired by Sowell would need to determine whether this is because of racism or some other variable. It is important to note here that Sowell never denies that racism exists or is a real problem in American society. But he does challenge us to determine whether this alone is the cause of disparities. Here, Alexander makes a strong case that it is implicit racism that causes the disparities in enforcement of the War on Drugs. A race-neutral explanation could be as follows, even though it still suggests ending the War on Drugs: the enforcement costs against those unable to afford to challenge the system are lower. And black Americans are disproportionately represented among the poor in this country. As will be discussed below in the section on reforming indigent criminal defense, most prosecutions are initiated against defendants who can’t afford a lawyer. The result could be racially disparate even without a racist motivation.
Regardless of whether racism is the variable that explains the disparate impact of the War on Drugs, it should be ended. This may be an area where traditional and cosmic justice concerns can be united in an effort to reform the criminal justice system.
Reform Indigent Criminal Defense
A related aspect of how the criminal justice system has created a real barrier for far too many black Americans is the often poor quality of indigent criminal defense. Indigent defense is a large part of criminal defense in this country, since roughly 80% of criminal prosecutions are initiated against defendants too poor to afford a lawyer. Since black Americans are disproportionately represented among the indigent and among those in the criminal justice system, it should be no surprise that they are disproportionately represented by public defenders in this country.
According to the constrained vision, it is important to look at the institutional incentives of public defenders. Considering the extremely high societal costs of false convictions, it is important to get these incentives right.
David Friedman and Stephen Schulhofer’s seminal article exploring the law & economics of indigent criminal defense highlighted the conflict of interest inherent in the government choosing who represents criminal defendants when the government is in charge of prosecuting. They analyzed each of the models used in the United States for indigent defense from an economic point of view and found each wanting. On top of that, there is also a calculation problem inherent in government-run public defenders’ offices, whereby defendants may be systematically deprived of viable defense strategies because of a lack of price signals.
An interesting alternative proposed by Friedman and Schulhofer is a voucher system, similar to the voucher system Sowell has often touted for education. The idea would be that indigent criminal defendants pick the lawyer of their choice from among those participating in the voucher program. In this model, the government would subsidize the provision of indigent defense but would not actually pick the lawyer or run the public defender organization. Incentives between defendant and counsel would be more closely aligned.
Conclusion
While much more could be said consistent with the constrained vision that could help flesh-and-blood black Americans, including abolishing occupational licensing, ending wage controls, promoting school choice, and ending counterproductive welfare policies, this is enough for now. Racial justice demands holding rights violators accountable and making victims whole. Racial justice also means reforming institutions to make sure incentives are right to deter conduct which harms black Americans. However, the growing desire to do something to promote racial justice in this country should not fall into the trap of cosmic justice thinking, which often ends up hurting flesh-and-blood people of all races in the present in the name of intertemporal abstractions.
Happy 90th birthday to one of the greatest law & economics scholars ever, Dr. Thomas Sowell.
[TOTM: The following is part of a blog series by TOTM guests and authors on the law, economics, and policy of the ongoing COVID-19 pandemic. The entire series of posts is available here.
This post is authored by Dirk Auer (Senior Researcher, Liege Competition & Innovation Institute; Senior Fellow, ICLE).]
Privacy absolutism is the misguided belief that protecting citizens’ privacy supersedes all other policy goals, especially economic ones. This is a mistake. Privacy is one value among many, not an end in itself. Unfortunately, the absolutist worldview has filtered into policymaking and is beginning to have very real consequences. Readers need look no further than contact tracing applications and the fight against Covid-19.
Covid-19 has presented the world with a privacy conundrum worthy of the big screen. In fact, it’s a plotline we’ve seen before. Moviegoers will recall that, in the wildly popular film “The Dark Knight”, Batman has to decide between preserving the privacy of Gotham’s citizens or resorting to mass surveillance in order to defeat the Joker. Ultimately, the caped crusader begrudgingly chooses the latter. Before the Covid-19 outbreak, this might have seemed like an unrealistic plot twist. Fast forward a couple of months, and it neatly illustrates the difficult decision that most western societies urgently need to make as they consider the use of contact tracing apps to fight Covid-19.
Contact tracing is often cited as one of the most promising tools to safely reopen Covid-19-hit economies. Unfortunately, its adoption has been severely undermined by a barrage of overblown privacy fears.
Take the contact tracing API and App co-developed by Apple and Google. While these firms’ efforts to rapidly introduce contact tracing tools are laudable, it is hard to shake the feeling that they have been holding back slightly.
In an overt attempt to protect users’ privacy, Apple and Google’s joint offering does not collect any location data (a move that has irked some states). Similarly, both firms have repeatedly stressed that users will have to opt-in to their contact tracing solution (as opposed to the API functioning by default). And, of course, all the data will be anonymous – even for healthcare authorities.
This is a missed opportunity. Google and Apple’s networks include billions of devices. That puts them in a unique position to rapidly achieve the scale required to successfully enable the tracing of Covid-19 infections. Contact tracing applications need to reach a critical mass of users to be effective. For instance, some experts have argued that an adoption rate of at least 60% is necessary. Unfortunately, existing apps – notably in Singapore, Australia, Norway and Iceland – have struggled to get anywhere near this number. Making Google and Apple’s solution opt-out rather than opt-in could go a long way towards reversing this trend. Businesses could also boost these numbers by making the apps mandatory for their employees and consumers.
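A back-of-the-envelope way to see why that adoption threshold is so demanding: a contact can only be traced if both people involved have the app, so (assuming, simplistically, that adoption is independent across the two parties) the share of traceable contacts scales roughly with the square of the adoption rate:

```python
# Rough illustration of why contact tracing needs a critical mass of users.
# Assumes app adoption is independent across the two people in a contact,
# which is a simplification of real-world adoption patterns.

def traceable_contact_share(adoption_rate):
    """Approximate share of contacts where both parties have the app."""
    return adoption_rate ** 2

for adoption in (0.20, 0.40, 0.60, 0.80):
    share = traceable_contact_share(adoption)
    print(f"{adoption:.0%} adoption -> ~{share:.0%} of contacts traceable")
# Output: 20% -> ~4%, 40% -> ~16%, 60% -> ~36%, 80% -> ~64%
```

Even at the 60% adoption rate some experts cite, only around a third of contacts would be captured under this simple assumption, which is why apps that struggle to get anywhere near that figure deliver so little.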
However, it is hard to blame Google or Apple for not pushing the envelope a little bit further. For the better part of a decade, they and other firms have repeatedly faced specious accusations of “surveillance capitalism”. This has notably resulted in heavy-handed regulation (including the GDPR, in the EU, and the CCPA, in California), as well as significant fines and settlements.
Those chickens have now come home to roost. The firms that are probably best-placed to implement an effective contact tracing solution simply cannot afford the privacy-related risks. This includes the risk associated with violating existing privacy law, but also potential reputational consequences.
Matters have also been exacerbated by the overly cautious stance of many western governments, as well as their citizens:
The European Data Protection Board cautioned governments and private sector actors to anonymize location data collected via contact tracing apps. The European Parliament made similar pronouncements.
A group of Democratic Senators pushed back against Apple and Google’s contact tracing solution, notably due to privacy considerations.
And public support for contact tracing is also critically low. Surveys in the US show that contact tracing is significantly less popular than more restrictive policies, such as business and school closures. Similarly, polls in the UK suggest that between 52% and 62% of Britons would consider using contact tracing applications.
Belgium’s initial plans for a contact tracing application were struck down by its data protection authority on account that they did not comply with the GDPR.
Finally, across the globe, there has been pushback against so-called “centralized” tracing apps, notably due to privacy fears.
In short, the West’s insistence on maximizing privacy protection is holding back its efforts to combat the joint threats posed by Covid-19 and the unfolding economic recession.
But contrary to the mass surveillance portrayed in the Dark Knight, the privacy risks entailed by contact tracing are for the most part negligible. State surveillance is hardly a prospect in western democracies. And the risk of data breaches is no greater here than with many other apps and services that we all use daily. To wit, password, email, and identity theft are still, by far, the most common targets for cyber attackers. Put differently, cyber criminals appear to be more interested in stealing assets that can be readily monetized, rather than location data that is almost worthless. This suggests that contact tracing applications, whether centralized or not, are unlikely to be an important target for cyberattackers.
The meagre risks entailed by contact tracing – regardless of how it is ultimately implemented – are thus a tiny price to pay if they enable some return to normalcy. At the time of writing, at least 5.8 million human beings have been infected with Covid-19, causing an estimated 358,000 deaths worldwide. Both Covid-19 and the measures intended to combat it have resulted in a collapse of the global economy – what the IMF has called “the worst economic downturn since the Great Depression”. Freedoms that the West had taken for granted have suddenly evaporated: the freedom to work, to travel, to see loved ones, etc. Can anyone honestly claim that it is not worth temporarily sacrificing some privacy to partially regain these liberties?
More generally, it is not just contact tracing applications and the fight against Covid-19 that have suffered because of excessive privacy fears. The European GDPR offers another salient example. Whatever one thinks about the merits of privacy regulation, it is becoming increasingly clear that the EU overstepped the mark. For instance, an early empirical study found that the entry into force of the GDPR markedly decreased venture capital investments in Europe. Michal Gal aptly summarizes the implications of this emerging body of literature:
The price of data protection through the GDPR is much higher than previously recognized. The GDPR creates two main harmful effects on competition and innovation: it limits competition in data markets, creating more concentrated market structures and entrenching the market power of those who are already strong; and it limits data sharing between different data collectors, thereby preventing the realization of some data synergies which may lead to better data-based knowledge. […] The effects on competition and innovation identified may justify a reevaluation of the balance reached to ensure that overall welfare is increased.
In short, just like the Dark Knight, policymakers, firms and citizens around the world need to think carefully about the tradeoff that exists between protecting privacy and other objectives, such as saving lives, promoting competition, and increasing innovation. As things stand, however, it seems that many have veered too far on the privacy end of the scale.
Yet another sad story was caught on camera this week showing a group of police officers killing an unarmed African-American man named George Floyd. While the officers were fired from the police department, there is still much uncertainty about what will happen next to hold those officers accountable as a legal matter.
A well-functioning legal system should protect the constitutional rights of American citizens to be free of unreasonable force from police officers, while also allowing police officers the ability to do their jobs safely and well. In theory, civil rights lawsuits are supposed to strike that balance.
In a civil rights lawsuit, the goal is to make the victim (or their families) of a rights violation whole by monetary damages. From a legal perspective, this is necessary to give the victim justice. From an economic perspective this is necessary to deter future bad conduct and properly align ex ante incentives going forward. Under a well-functioning system, juries would, after hearing all the evidence, make a decision about whether constitutional rights were violated and the extent of damages. A functioning system of settlements would result as a common law develops determining what counts as reasonable or unreasonable uses of force. This doesn’t mean plaintiffs always win, either. Officers may be determined to be acting reasonably under the circumstances once all the evidence is presented to a jury.
However, one of the greatest obstacles to holding police officers accountable in misconduct cases is the doctrine of qualified immunity. Qualified immunity started as a mechanism to protect officers from suit when they acted in “good faith.” Over time, though, the doctrine has evolved away from a subjective test based upon the actor’s good faith to an objective test based upon notice in judicial precedent. As a result, courts have widely expanded its scope to the point that qualified immunity is now protecting officers even when their conduct violates the law, as long as the officers weren’t on clear notice from specific judicial precedent that what they did was illegal when they did it. In the words of the Supreme Court, qualified immunity protects “all but the plainly incompetent or those who knowingly violate the law.”
This standard has predictably led to a situation where officer misconduct which judges and juries would likely find egregious never makes it to court. The Cato Institute’s website Unlawful Shield details many cases where federal courts found an officer’s conduct was illegal yet nonetheless protected by qualified immunity.
Immunity of this nature has profound consequences on the incentive structure facing police officers. Police officers, as well as the departments that employ them, are insufficiently accountable when gross misconduct does not get past a motion to dismiss for qualified immunity. On top of that, the regular practice of governments is to indemnify officers even when there is a settlement or a judgment. The result is to encourage police officers to take insufficient care when making the choice about the level of force to use.
Economics 101 makes a clear prediction: When unreasonable uses of force are not held accountable, you get more unreasonable uses of force. Unfortunately, the news continues to illustrate the accuracy of this prediction.
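A stylized sketch of that prediction, with entirely made-up probabilities and damages figures, shows how qualified immunity and routine indemnification shrink the expected cost an officer faces for using unreasonable force:

```python
# Stylized illustration (made-up numbers) of how qualified immunity can
# dilute the expected cost an officer faces for an unreasonable use of force.

def expected_liability_cost(p_survives_dismissal, p_found_liable,
                            damages, indemnified_share):
    """Expected out-of-pocket cost of a rights violation to the officer."""
    return (p_survives_dismissal * p_found_liable
            * damages * (1 - indemnified_share))

# With qualified immunity, few suits survive a motion to dismiss,
# and governments typically indemnify officers anyway.
with_qi = expected_liability_cost(0.10, 0.5, 100_000, 0.95)

# Without qualified immunity (hypothetical), more suits reach a jury.
without_qi = expected_liability_cost(0.60, 0.5, 100_000, 0.95)

print(f"Expected cost with qualified immunity:    ${with_qi:,.0f}")    # $250
print(f"Expected cost without qualified immunity: ${without_qi:,.0f}") # $1,500
```

When the expected cost of misconduct approaches zero, the standard economic model predicts less care in the choice of force, which is the prediction described above.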
Since the LabMD decision, in which the Eleventh Circuit Court of Appeals told the FTC that its orders were unenforceably vague, the FTC has been put on notice that it needs to reconsider how it develops and substantiates its claims in data security enforcement actions brought under Section 5.
While the new orders do list more specific requirements to help explain what the FTC believes is a “comprehensive data security program”, there is still no legal analysis in either the orders or the complaints that would give companies fair notice of what the law requires. Furthermore, nothing about the underlying FTC process has changed, which means there is still enormous pressure for companies to settle rather than litigate the contours of what “reasonable” data security practices look like. Thus, despite the Commission’s optimism, the recent orders and complaints do little to nothing to remedy the problems that plague the Commission’s data security enforcement program.
The changes
In his blog post, the director of the Bureau of Consumer Protection at the FTC describes how new orders in data security enforcement actions are more specific, with one of the main goals being more guidance to businesses trying to follow the law.
Since the early 2000s, our data security orders had contained fairly standard language. For example, these orders typically required a company to implement a comprehensive information security program subject to a biennial outside assessment. As part of the FTC’s Hearings on Competition and Consumer Protection in the 21st Century, we held a hearing in December 2018 that specifically considered how we might improve our data security orders. We were also mindful of the 11th Circuit’s 2018 LabMD decision, which struck down an FTC data security order as unenforceably vague.
Based on this learning, in 2019 the FTC made significant improvements to its data security orders. These improvements are reflected in seven orders announced this year against an array of diverse companies: ClixSense (pay-to-click survey company), i-Dressup (online games for kids), DealerBuilt (car dealer software provider), D-Link (Internet-connected routers and cameras), Equifax (credit bureau), Retina-X (monitoring app), and Infotrax (service provider for multilevel marketers)…
[T]he orders are more specific. They continue to require that the company implement a comprehensive, process-based data security program, and they require the company to implement specific safeguards to address the problems alleged in the complaint. Examples have included yearly employee training, access controls, monitoring systems for data security incidents, patch management systems, and encryption. These requirements not only make the FTC’s expectations clearer to companies, but also improve order enforceability.
Why the FTC’s data security enforcement regime fails to provide fair notice or develop law (and is not like the common law)
While these changes are long overdue, they are just one step toward a much-needed reform of the FTC’s process for prosecuting cases under its unfairness authority, particularly in the realm of data security. It’s helpful to understand exactly why the historical failures of that process are problematic in order to see why the changes the Commission is undertaking are insufficient.
For instance, Geoffrey Manne and I previously highlighted the various ways the FTC’s data security consent order regime fails in comparison with the common law:
In Lord Mansfield’s characterization, “the common law ‘does not consist of particular cases, but of general principles, which are illustrated and explained by those cases.’” Further, the common law is evolutionary in nature, with the outcome of each particular case depending substantially on the precedent laid down in previous cases. The common law thus emerges through the accretion of marginal glosses on general rules, dictated by new circumstances.
The common law arguably leads to legal rules with at least two substantial benefits—efficiency and predictability or certainty. The repeated adjudication of inefficient or otherwise suboptimal rules results in a system that generally offers marginal improvements to the law. The incentives of parties bringing cases generally means “hard cases,” and thus judicial decisions that have to define both what facts and circumstances violate the law and what facts and circumstances don’t. Thus, a benefit of a “real” common law evolution is that it produces a body of law and analysis that actors can use to determine what conduct they can undertake without risk of liability and what they cannot.
In the abstract, of course, the FTC’s data security process is neither evolutionary in nature nor does it produce such well-defined rules. Rather, it is a succession of wholly independent cases, without any precedent, narrow in scope, and binding only on the parties to each particular case. Moreover it is generally devoid of analysis of the causal link between conduct and liability and entirely devoid of analysis of which facts do not lead to liability. Like all regulation it tends to be static; the FTC is, after all, an enforcement agency, charged with enforcing the strictures of specific and little-changing pieces of legislation and regulation. For better or worse, much of the FTC’s data security adjudication adheres unerringly to the terms of the regulations it enforces with vanishingly little in the way of gloss or evolution. As such (and, we believe, for worse), the FTC’s process in data security cases tends to reject the ever-evolving “local knowledge” of individual actors and substitutes instead the inherently limited legislative and regulatory pronouncements of the past.
By contrast, real common law, as a result of its case-by-case, bottom-up process, adapts to changing attributes of society over time, largely absent the knowledge and rent-seeking problems of legislatures or administrative agencies. The mechanism of constant litigation of inefficient rules allows the common law to retain a generally efficient character unmatched by legislation, regulation, or even administrative enforcement.
Because the common law process depends on the issues selected for litigation and the effects of the decisions resulting from that litigation, both the process by which disputes come to the decision-makers’ attention, as well as (to a lesser extent, because errors will be corrected over time) the incentives and ability of the decision-maker to render welfare-enhancing decisions, determine the value of the common law process. These are decidedly problematic at the FTC.
In our analysis, we found the FTC's process to be wanting compared to the institution of the common law. The incentives of the administrative complaint process put relatively more pressure on companies to settle data security actions brought by the FTC than actions brought by private litigants. This is because the FTC can use its investigatory powers as a public enforcer to bypass the normal discovery process to which private litigants are subject, and over which independent judges have authority.
In a private court action, plaintiffs can't engage in discovery unless their complaint survives a motion to dismiss from the defendant. Because discovery costs remain a major driver of settlements, this important judicial review is necessary to make sure there is actually a harm present before those costs are imposed on defendants.
Furthermore, the FTC can also bring cases in a Part III adjudicatory process, which starts in front of an administrative law judge (ALJ) but is then appealable to the FTC itself. Former Commissioner Joshua Wright noted in 2013 that “in the past nearly twenty years… after the administrative decision was appealed to the Commission, the Commission ruled in favor of FTC staff. In other words, in 100 percent of cases where the ALJ ruled in favor of the FTC, the Commission affirmed; and in 100 percent of the cases in which the ALJ ruled against the FTC, the Commission reversed.” Put differently, the Commission nearly always rules in its own favor on appeal when the ALJ finds there is no case, as the ALJ did in LabMD. The combination of investigation costs incurred before any complaint is even filed and the high likelihood of losing through several stages of litigation makes simply agreeing to a consent decree the intelligent business decision.
The results of this asymmetrical process show the FTC has not really been building a common law. In all but two cases (Wyndham and LabMD), the companies targeted by the FTC for data security enforcement have settled. We also noted how the FTC's data security orders tended to be nearly identical from case to case, reflecting the standards of the FTC's Safeguards Rule. Since the orders imposed nearly identical—and, as LabMD found, vague—remedies in each case, it cannot be said that a common law was developing over time.
What LabMD addressed and what it didn’t
In its decision, the Eleventh Circuit sidestepped the fundamental substantive problems with the FTC's data security practice concerning notice and substantial injury (arguments we have made in both our scholarship and our LabMD amicus brief). Instead, the court decided to assume the FTC had proven its case and focused exclusively on the remedy.
We will assume arguendo that the Commission is correct and that LabMD’s negligent failure to design and maintain a reasonable data-security program invaded consumers’ right of privacy and thus constituted an unfair act or practice.
What the Eleventh Circuit did address, though, was that the remedies the FTC had been routinely applying to businesses through its data security enforcement actions lacked the specificity necessary to be enforceable through injunctions or cease and desist orders.
In the case at hand, the cease and desist order contains no prohibitions. It does not instruct LabMD to stop committing a specific act or practice. Rather, it commands LabMD to overhaul and replace its data-security program to meet an indeterminable standard of reasonableness. This command is unenforceable. Its unenforceability is made clear if we imagine what would take place if the Commission sought the order’s enforcement…
The Commission moves the district court for an order requiring LabMD to show cause why it should not be held in contempt for violating the following injunctive provision:
[T]he respondent shall … establish and implement, and thereafter maintain, a comprehensive information security program that is reasonably designed to protect the security, confidentiality, and integrity of personal information collected from or about consumers…. Such program… shall contain administrative, technical, and physical safeguards appropriate to respondent’s size and complexity, the nature and scope of respondent’s activities, and the sensitivity of the personal information collected from or about consumers….
The Commission’s motion alleges that LabMD’s program failed to implement “x” and is therefore not “reasonably designed.” The court concludes that the Commission’s alleged failure is within the provision’s language and orders LabMD to show cause why it should not be held in contempt.
At the show cause hearing, LabMD calls an expert who testifies that the data-security program LabMD implemented complies with the injunctive provision at issue. The expert testifies that “x” is not a necessary component of a reasonably designed data-security program. The Commission, in response, calls an expert who disagrees. At this point, the district court undertakes to determine which of the two equally qualified experts correctly read the injunctive provision. Nothing in the provision, however, indicates which expert is correct. The provision contains no mention of “x” and is devoid of any meaningful standard informing the court of what constitutes a “reasonably designed” data-security program. The court therefore has no choice but to conclude that the Commission has not proven — and indeed cannot prove — LabMD’s alleged violation by clear and convincing evidence.
In other words, the Eleventh Circuit found that an order requiring a reasonable data security program is not specific enough to make it enforceable. This leaves questions as to whether the FTC’s requirement of a “reasonable data security program” is specific enough to survive a motion to dismiss and/or a fair notice challenge going forward.
Under the Federal Rules of Civil Procedure, a plaintiff must provide “a short and plain statement . . . showing that the pleader is entitled to relief,” Fed. R. Civ. P. 8(a)(2), including “enough facts to state a claim . . . that is plausible on its face.” Bell Atl. Corp. v. Twombly, 550 U.S. 544, 570 (2007). “[T]hreadbare recitals of the elements of a cause of action, supported by mere conclusory statements” will not suffice. Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009). In FTC v. D-Link, for instance, the Northern District of California dismissed the unfairness claims because the FTC did not sufficiently plead injury.
[T]hey make out a mere possibility of injury at best. The FTC does not identify a single incident where a consumer’s financial, medical or other sensitive personal information has been accessed, exposed or misused in any way, or whose IP camera has been compromised by unauthorized parties, or who has suffered any harm or even simple annoyance and inconvenience from the alleged security flaws in the DLS devices. The absence of any concrete facts makes it just as possible that DLS’s devices are not likely to substantially harm consumers, and the FTC cannot rely on wholly conclusory allegations about potential injury to tilt the balance in its favor.
The fair notice question wasn’t reached in LabMD, though it was in FTC v. Wyndham. But the Third Circuit did not analyze the FTC’s data security regime under the “ascertainable certainty” standard applied to agency interpretation of a statute.
Wyndham’s position is unmistakable: the FTC has not yet declared that cybersecurity practices can be unfair; there is no relevant FTC rule, adjudication or document that merits deference; and the FTC is asking the federal courts to interpret § 45(a) in the first instance to decide whether it prohibits the alleged conduct here. The implication of this position is similarly clear: if the federal courts are to decide whether Wyndham’s conduct was unfair in the first instance under the statute without deferring to any FTC interpretation, then this case involves ordinary judicial interpretation of a civil statute, and the ascertainable certainty standard does not apply. The relevant question is not whether Wyndham had fair notice of the FTC’s interpretation of the statute, but whether Wyndham had fair notice of what the statute itself requires.
In other words, Wyndham boxed itself into a corner by arguing that it did not have fair notice that the FTC could bring a data security enforcement action against it under Section 5 unfairness at all. LabMD, on the other hand, argued it did not have fair notice as to how the FTC would enforce its data security standards. Cf. ICLE-Techfreedom Amicus Brief at 19. The Third Circuit even suggested that under an “ascertainable certainty” standard, the FTC failed to provide fair notice: “we agree with Wyndham that the guidebook could not, on its own, provide ‘ascertainable certainty’ of the FTC’s interpretation of what specific cybersecurity practices fail § 45(n).” Wyndham, 799 F.3d at 256 n.21.
Most importantly, the Eleventh Circuit did not reach the issue of whether LabMD actually violated the law under the factual record developed in the case. This means there is still no caselaw (aside from the ALJ decision in this case) that would allow a company to learn what is and what is not reasonable data security, or what counts as a substantial injury for the purposes of Section 5 unfairness in data security cases.
How FTC’s changes fundamentally fail to address its failures of process
The FTC’s new approach to its orders is billed as directly responsive to what the Eleventh Circuit did reach in the LabMD decision, but it leaves so much of what makes the process insufficient in place.
First, it is notable that while the FTC highlights changes to its orders, there is still a lack of legal analysis in the orders that would allow a company to accurately predict whether its data security practices are sufficient under the law. A listing of what specific companies under consent orders are required to do is helpful. But these consent decrees do not require companies to admit liability, nor do they contain anything close to the reasoning that accompanies court opinions or normal agency guidance on complying with the law.
For instance, the general formulation in these 2019 orders is that the company must “establish, implement, and maintain a comprehensive information/software security program that is designed to protect the security, confidentiality, and integrity of such personal information. To satisfy this requirement, Respondent/Defendant must, at a minimum…” (emphasis added), followed by a list of fairly similar requirements that vary depending on the business. Even if a company implements all of the listed requirements and a breach nonetheless occurs, the FTC is not obligated to find that the data security program was legally sufficient. There is no safe harbor or presumptive reasonableness that attaches even for the business subject to the order, let alone for other companies looking for guidance.
While the FTC does now require more specific things, like “yearly employee training, access controls, monitoring systems for data security incidents, patch management systems, and encryption,” there is still no analysis of how to meet the standard of reasonableness the FTC relies upon. In other words, it is not clear that this new approach to orders does anything to increase fair notice to companies as to what the FTC requires under Section 5 unfairness.
Second, nothing about the underlying process has really changed. The FTC can still investigate and prosecute cases through its in-house administrative process, with itself as the initial court of appeal. This makes the FTC the police, prosecutor, and judge in its own case. In the case of LabMD, which ultimately won after many appeals, this process ended in bankruptcy. It is no surprise that since the LabMD decision, each of the FTC's data security enforcement cases has been settled with a consent order, just as they were before the Eleventh Circuit opinion.
If the FTC really wants its data security enforcement to evolve like the common law, it needs to engage in an actual common law process. Without caselaw on the facts necessary to establish substantial injury, “unreasonable” data security practices, and causation, there will continue to be more questions than answers about what the law requires. And without changes to the process, the FTC will continue to be able to strong-arm companies into consent decrees.
A pending case in the U.S. Court of Appeals for the 3rd Circuit has raised several interesting questions about the FTC's enforcement approach and patent litigation in the pharmaceutical industry. The case, FTC v. AbbVie, involves allegations that AbbVie (and Besins) filed sham patent infringement cases against generic manufacturer Teva (and Perrigo) for the purpose of preventing or delaying entry into the testosterone gel market in which AbbVie's AndroGel had a monopoly. The FTC further alleges that AbbVie and Teva settled the testosterone gel litigation in AbbVie's favor while making a large payment to Teva in an unrelated case, behavior that, considered together, amounted to an illegal reverse payment settlement. The district court dismissed the reverse payment claims, but concluded that the patent infringement cases were sham litigation. It ordered disgorgement damages of $448 million against AbbVie and Besins, which represented the profit they gained from maintaining the AndroGel monopoly.
The 3rd Circuit has been asked to review several elements of the district court's decision, including whether the original patent infringement cases amounted to sham litigation, whether the payment to Teva in a separate case amounted to an illegal reverse payment, and whether the FTC has the authority to seek disgorgement damages. The decision will help to clarify outstanding issues relating to patent litigation and the FTC's enforcement abilities, but it also has the potential to chill pro-competitive behavior in the pharmaceutical market encouraged under Hatch-Waxman.
First, the 3rd Circuit will review whether AbbVie's patent infringement case was sham litigation by asking whether the district court applied the right standard and how plaintiffs must prove that lawsuits are baseless. The district court determined that the case was a sham because it was objectively baseless (AbbVie couldn't reasonably expect to win) and subjectively baseless (AbbVie brought the cases solely to delay generic entry into the market). AbbVie argues that the district court erred by not requiring affirmative evidence of bad faith and not requiring the FTC to present clear and convincing evidence that AbbVie and its attorneys believed the lawsuits were baseless.
While sham litigation should be penalized and deterred, especially when it produces anticompetitive effects, the 3rd Circuit's decision, depending on how it comes out, also has the potential to deter brand drug makers from filing patent infringement cases in the first place. This threatens to disrupt the delicate balance that Hatch-Waxman sought to strike between encouraging generic entry and preserving brand drug makers' incentives to innovate.
The 3rd Circuit will also determine whether AbbVie's payment to Teva in a separate case involving cholesterol medicine was an illegal reverse payment, otherwise known as a “pay-for-delay” settlement. The FTC asserts that the actions in the two cases—one involving testosterone gel and the other involving cholesterol medicine—should be considered together, but the district court disagreed and determined there was no illegal reverse payment. True pay-for-delay settlements are anticompetitive and harm consumers by delaying their access to cheaper generic alternatives. However, an overly liberal definition of what constitutes an illegal reverse payment will deter legitimate settlements, thereby increasing expenses for all parties that choose to litigate and possibly dissuading generics from bringing patent challenges in the first place. Moreover, the FTC's argument that it is suspicious for two settlements to occur in separate cases around the same time overlooks the reality that the pharmaceutical industry has become increasingly concentrated and that drug companies often have more than one pending litigation matter against another company involving entirely different products and circumstances.
Finally, the 3rd Circuit will determine whether the FTC has the authority to seek disgorgement damages for past acts like settled patent litigation. AbbVie has argued that the agency has no right to disgorgement because it isn't enumerated in the FTC Act and because courts can't order injunctive relief, including disgorgement, for completed past acts.
The FTC has sought disgorgement damages only sparingly, but the frequency with which the agency seeks disgorgement and the amount of the damages have increased in recent years. Proponents of the FTC’s approach argue that the threat of large disgorgement damages provides a strong deterrent to anticompetitive behavior. While true, FTC-ordered disgorgement (even if permissible) may go too far and end up chilling economic activity by exposing businesses to exorbitant liability without clear guidance on when disgorgement will be awarded. The 3rd Circuit will determine whether the FTC’s enforcement approach is authorized, a decision that has important implications for whether the agency’s enforcement can deter unfair practices without depressing economic activity.
Last week, we posted a piece on TOTM, criticizing the amicus brief written by Mark Lemley, Douglas Melamed and Steven Salop in the ongoing Qualcomm litigation. The authors prepared a thoughtful response to our piece, which we published today on TOTM.
In this post, we highlight the points where we agree with the amici (or at least we think so), as well as those where we differ.
Negotiating in the shadow of FRAND litigation
Let us imagine a hypothetical world where an OEM must source one chipset from Qualcomm (i.e., this segment of the market is non-contestable) and one chipset from either Qualcomm or its rivals (i.e., this segment is contestable). For both of these chipsets, the OEM must also reach a license agreement with Qualcomm.
We use the same numbers as the amici:
The OEM has a reserve price of $20 for each chip/license combination.
Rivals can produce chips at a cost of $11.
The hypothetical FRAND benchmark is $2 per chip.
With these numbers in mind, the critical question is whether there is a realistic threat of litigation to constrain the royalties commanded by Qualcomm (we believe that Lemley et al. agree with us on this point). The following table shows the prices that a hypothetical OEM would be willing to pay in both of these scenarios:
Blue cells are segments where QC can increase its profits if the threat of litigation is removed.
When the threat of litigation is present, Qualcomm obtains a total of $20 for the combination of non-contestable chips and IP. Qualcomm can use its chipset position to evade FRAND and charge the combined monopoly price of $20. At a chipset cost of $11, it would thus make $9 worth of profits. However, it earns only $13 for contestable chips ($2 in profits). This is because competition brings the price of chips down to $11 and Qualcomm does not have a chipset advantage to earn more than the FRAND rate for its IP.
When the threat of litigation is taken off the table, all chipsets effectively become non-contestable. Qualcomm still earns $20 for its previously non-contestable chips. But it can now raise its IP rate above the FRAND benchmark in the previously contestable segment (for example, by charging $10 for the IP). This squeezes its chipset competitors.
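To make the arithmetic concrete, the short sketch below (in Python, purely for illustration) encodes the stylized numbers above (the $20 reserve price, the $11 chip cost, and the $2 FRAND benchmark) and computes what the OEM pays in each segment under the two scenarios. It reflects our reading of the hypothetical, not anything in the case record.

```python
# A minimal sketch of the hypothetical above, using only its stylized numbers.
# These figures are illustrative assumptions from the example, not case data.

RESERVE_PRICE = 20  # OEM's reserve price per chip/license combination
CHIP_COST = 11      # cost of producing a chip (rivals', and Qualcomm's, in the example)
FRAND_RATE = 2      # hypothetical FRAND benchmark per chip

def oem_payments(litigation_threat: bool) -> dict:
    """What the OEM pays for each segment (chip plus IP) in each scenario."""
    # Non-contestable segment: no chip competition, so Qualcomm extracts the
    # OEM's full reserve price whether or not litigation is a credible threat.
    non_contestable = RESERVE_PRICE
    if litigation_threat:
        # Contestable segment: chip competition drives the chip price to cost,
        # and the credible threat of FRAND litigation caps the IP rate.
        contestable = CHIP_COST + FRAND_RATE
    else:
        # Without the litigation threat, Qualcomm can raise its IP rate until
        # the OEM pays its full reserve price here too, squeezing chip rivals.
        contestable = RESERVE_PRICE
    return {
        "non_contestable": non_contestable,
        "contestable": contestable,
        "total": non_contestable + contestable,
    }

print(oem_payments(litigation_threat=True))   # total: 33
print(oem_payments(litigation_threat=False))  # total: 40
```

The totals, $33 when the litigation threat is present and $40 when it is not, are the same figures discussed in the following paragraphs.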
If our understanding of the amici’s response is correct, they argue that the combination of Qualcomm’s strong chipset position and its “No License, No Chips” policy (“NLNC”) effectively nullifies the threat of litigation:
Qualcomm is able to charge more than $2 for the license only because it uses the power of its chip monopoly to coerce the OEMs to give up the option of negotiating in light of the otherwise applicable constraints on the royalties it can charge.
According to the amici, the market thus moves from a state of imperfect competition (where OEMs would pay $33 for two chips and QC’s license) to a world of monopoly (where they pay the full $40).
We beg to differ.
Our points of disagreement
From an economic standpoint, the critical question is the extent to which Qualcomm’s chipset position and its NLNC policy deter OEMs from obtaining closer-to-FRAND rates.
While the case record is mixed and contains some ambiguities, we think it strongly suggests that Qualcomm’s chipset position and its NLNC policy do not preclude OEMs from using litigation to obtain rates that are close to the FRAND benchmark. There is thus no reason to believe that it can exclude its chipset rivals.
We believe the following facts support our assertion:
OEMs have pursued various litigation strategies in order to obtain lower rates on Qualcomm’s IP. As we mentioned in our previous post, this was notably the case for Apple, Samsung and LG. All three companies ultimately reached settlements with Qualcomm (and these settlements were concluded in the shadow of litigation proceedings — indeed, in Apple’s case, on the second day of trial). If anything, this suggests that court proceedings are an integral part of negotiations between Qualcomm and its OEMs.
For the most part, Qualcomm’s threats to cut off chip supplies were just that: threats. In any negotiation, parties will try to convince their counterpart that they have a strong outside option. Qualcomm may have done so by posturing that it would not sell chips to OEMs before they concluded a license agreement.
However, it seems that Qualcomm followed through on its threats to withhold chips only once (against Sony). And even then, the supply cutoff lasted only seven days.
And while many OEMs did take Qualcomm to court in order to obtain more favorable license terms, this never resulted in Qualcomm cutting off their chipset supplies. Other OEMs thus had no reason to believe that litigation would entail disruptions to their chipset supplies.
OEMs also wield powerful threats. These include patent holdout, litigation, vertical integration, and purchasing chips from Qualcomm's rivals. And of course they have aggressively encouraged antitrust authorities around the world to bring this and other litigation — even quite possibly manipulating the record to bolster their cases. Here's how one observer sums up Apple's activity in this regard:
“Although we really only managed to get a small glimpse of Qualcomm’s evidence demonstrating the extent of Apple’s coordinated strategy to manipulate the FRAND license rate, that glimpse was particularly enlightening. It demonstrated a decade-long coordinated effort within Apple to systematically engage in what can only fairly be described as manipulation (if not creation of evidence) and classic holdout.
Qualcomm showed during opening arguments that, dating back to at least 2009, Apple had been laying the foundation for challenging its longstanding relationship with Qualcomm.” (Emphasis added)
Moreover, the holdout and litigation paths have been strengthened by the eBay case, which significantly reduced the financial risks involved in pursuing a holdout and/or litigation strategy. Given all of this, it is far from obvious that it is Qualcomm who enjoys the stronger bargaining position here.
Qualcomm’s chipsets might no longer be “must-buys” in the future. Rivals have gained increasing traction over the past couple of years. And with 5G just around the corner, this momentum could conceivably accelerate. Whether or not one believes that this will ultimately be the case, the trend surely places additional constraints on Qualcomm’s conduct. Aggressive behavior today may spur disgruntled rivals to enter the chipset market or switch suppliers tomorrow.
To summarize, as we understand their response, the delta between supracompetitive and competitive prices is entirely a function of Qualcomm’s ability to charge supra-FRAND prices for its licenses. On this we agree. But, unlike Lemley et al., we do not agree that Qualcomm is in a position to evade its FRAND pledges by using its strong position in the chipset market and its NLNC policy.
Finally, it must be said again: To the extent that that is the problem — the charging of supra-FRAND prices for licenses — the issue is manifestly a contract issue, not an antitrust one. All of the complexity of the case would fall away, and the litigation would be straightforward. But the opponents of Qualcomm’s practices do not really want to ensure that Qualcomm lowers its royalties by this delta; if they did, they would be bringing/supporting FRAND litigation. What the amici and Qualcomm’s contracting partners appear to want is to use antitrust litigation to force Qualcomm to license its technology at even lower rates — to force Qualcomm into a different business model in order to reset the baseline from which FRAND prices are determined (i.e., at the chip level, rather than at the device level). That may be an intelligible business strategy from the perspective of Qualcomm’s competitors, but it certainly isn’t sensible antitrust policy.
In an amicus brief filed last Friday, a diverse group of antitrust scholars joined the Washington Legal Foundation in urging the U.S. Court of Appeals for the Second Circuit to vacate the Federal Trade Commission’s misguided 1-800 Contacts decision. Reasoning that 1-800’s settlements of trademark disputes were “inherently suspect,” the FTC condemned the settlements under a cursory “quick look” analysis. In so doing, it improperly expanded the category of inherently suspect behavior and ignored an obvious procompetitive justification for the challenged settlements. If allowed to stand, the Commission’s decision will impair intellectual property protections that foster innovation.
A number of 1-800’s rivals purchased online ad placements that would appear when customers searched for “1-800 Contacts.” 1-800 sued those rivals for trademark infringement, and the lawsuits settled. As part of each settlement, 1-800 and its rival agreed not to bid on each other’s trademarked terms in search-based keyword advertising. (For example, EZ Contacts could not bid on a placement tied to a search for 1-800 Contacts, and vice-versa). Each party also agreed to employ “negative keywords” to ensure that its ads would not appear in response to a consumer’s online search for the other party’s trademarks. (For example, in bidding on keywords, 1-800 would have to specify that its ad must not appear in response to a search for EZ Contacts, and vice-versa). Notably, the settlement agreements didn’t restrict the parties’ advertisements through other media such as TV, radio, print, or other forms of online advertising. Nor did they restrict paid search advertising in response to any search terms other than the parties’ trademarks.
The FTC concluded that these settlement agreements violated the antitrust laws as unreasonable restraints of trade. Although the agreements were not unreasonable per se, as naked price-fixing is, the Commission didn’t engage in the normally applicable rule of reason analysis to determine whether the settlements passed muster. Instead, the Commission condemned the settlements under the truncated analysis that applies when, in the words of the Supreme Court, “an observer with even a rudimentary understanding of economics could conclude that the arrangements in question would have an anticompetitive effect on customers and markets.” The Commission decided that no more than a quick look was required because the settlements “restrict the ability of lower cost online sellers to show their ads to consumers.”
That was a mistake. First, the restraints in 1-800’s settlements are far less extensive than other restraints that the Supreme Court has said may not be condemned under a cursory quick look analysis. In California Dental, for example, the Supreme Court reversed a Ninth Circuit decision that employed the quick look analysis to condemn a de facto ban on all price and “comfort” advertising by members of a dental association. In light of the possibility that the ban could reduce misleading ads, enhance customer trust, and thereby stimulate demand, the Court held that the restraint must be assessed under the more probing rule of reason. A narrow limit on the placement of search ads is far less restrictive than the all-out ban for which the California Dental Court prescribed full-on rule of reason review.
1-800’s settlements are also less likely to be anticompetitive than are other settlements that the Supreme Court has said must be evaluated under the rule of reason. The Court’s Actavis decision rejected quick look and mandated full rule of reason analysis for reverse payment settlements of pharmaceutical patent litigation. In a reverse payment settlement, the patent holder pays an alleged infringer to stay out of the market for some length of time. 1-800’s settlements, by contrast, did not exclude its rivals from the market, place any restrictions on the content of their advertising, or restrict the placement of their ads except on webpages responding to searches for 1-800’s own trademarks. If the restraints in California Dental and Actavis required rule of reason analysis, then those in 1-800’s settlements surely must as well.
In addition to disregarding Supreme Court precedents that limit when mere quick look is appropriate, the FTC gave short shrift to a key procompetitive benefit of the restrictions in 1-800’s settlements. 1-800 spent millions of dollars convincing people that they could save money by ordering prescribed contact lenses from a third party rather than buying them from prescribing optometrists. It essentially built the online contact lens market in which its rivals now compete. In the process, it created a strong trademark, which undoubtedly boosts its own sales. (Trademarks point buyers to a particular seller and enhance consumer confidence in the seller’s offering, since consumers know that branded sellers will not want to tarnish their brands with shoddy products or service.)
When a rival buys ad space tied to a search for 1-800 Contacts, that rival is taking a free ride on 1-800’s investments in its own brand and in the online contact lens market itself. A rival that has advertised less extensively than 1-800—primarily because 1-800 has taken the lead in convincing consumers to buy their contact lenses online—will incur lower marketing costs than 1-800 and may therefore be able to underprice it. 1-800 may thus find that it loses sales to rivals who are not more efficient than it is but have lower costs because they have relied on 1-800’s own efforts.
If market pioneers like 1-800 cannot stop this sort of free-riding, they will have less incentive to make the investments that create new markets and develop strong trade names. The restrictions in the 1-800 settlements were simply an effort to prevent inefficient free-riding while otherwise preserving the parties' freedom to advertise. They were a narrowly tailored solution to a problem that hurt 1-800 and reduced incentives for future investments in market-developing activities that inure to the benefit of consumers.
Rule of reason analysis would have allowed the FTC to assess the full market effects of 1-800's settlements. The Commission's truncated assessment, which was inconsistent with Supreme Court decisions on when a quick look will suffice, condemned conduct that was likely procompetitive. The Second Circuit should vacate the FTC's order.
Last week the Senate Judiciary Committee held a hearing, Intellectual Property and the Price of Prescription Drugs: Balancing Innovation and Competition, that explored whether changes to the pharmaceutical patent process could help lower drug prices. The committee's goal was to evaluate various legislative proposals that might facilitate the entry of cheaper generic drugs, while also recognizing that strong patent rights for branded drugs are essential to incentivize drug innovation. As Committee Chairman Lindsey Graham explained:
One thing you don’t want to do is kill the goose who laid the golden egg, which is pharmaceutical development. But you also don’t want to have a system that extends unnecessarily beyond the ability to get your money back and make a profit, a patent system that drives up costs for the average consumer.
Several proposals that were discussed at the hearing have the potential to encourage competition in the pharmaceutical industry and help rein in drug prices. Below, I discuss these proposals, plus a few additional reforms. I also point out some of the language in the current draft proposals that goes a bit too far and threatens the ability of drug makers to remain innovative.
1. Prevent brand drug makers from blocking generic companies' access to drug samples. Some brand drug makers have attempted to delay generic entry by restricting generics' access to the drug samples necessary to conduct FDA-required bioequivalence studies. Some brand drug manufacturers have limited the ability of pharmacies or wholesalers to sell samples to generic companies, or have abused the REMS (Risk Evaluation Mitigation Strategy) program to refuse samples to generics under the guise of REMS safety requirements. The Creating and Restoring Equal Access To Equivalent Samples (CREATES) Act of 2019 would allow potential generic competitors to bring an action in federal court for both injunctive relief and damages when brand companies block access to drug samples. It also gives the FDA discretion to approve alternative REMS safety protocols for generic competitors that have been denied samples under the brand companies' REMS protocol. Although the vast majority of brand drug companies do not engage in the delay tactics addressed by CREATES, the Act would prevent the handful that do from thwarting generic competition. Increased generic competition should, in turn, reduce drug prices.
2. Restrict abuses of FDA Citizen Petitions. The citizen petition process was created as a way for individuals and community groups to flag legitimate concerns about drugs awaiting FDA approval. However, critics claim that the process has been misused by some brand drug makers who file petitions about specific generic drugs in the hopes of delaying their approval and market entry. Although the FDA has indicated that citizen petitions rarely delay the approval of generic drugs, there have been a few drug makers, such as Shire ViroPharma, that have clearly abused the process and put unnecessary strain on FDA resources. The Stop The Overuse of Petitions and Get Affordable Medicines to Enter Soon (STOP GAMES) Act is intended to prevent such abuses. The Act reinforces the FDA's and FTC's ability to crack down on petitions meant to lengthen the approval process of a generic competitor, which should deter abuses of the system that can occasionally delay generic entry. However, lawmakers should make sure that adopted legislation doesn't limit the ability of stakeholders (including drug makers that often know more about the safety of drugs than ordinary citizens) to raise serious concerns with the FDA.
3. Curtail Anticompetitive Pay-for-Delay Settlements. The Hatch-Waxman Act incentivizes generic companies to challenge brand drug patents by granting the first successful generic challenger a period of marketing exclusivity. Like all litigation, many of these patent challenges result in settlements instead of trials. The FTC and some courts have concluded that these settlements can be anticompetitive when the brand company agrees to pay the generic challenger in exchange for the generic company's agreement to forestall the launch of its lower-priced drug. Settlements that result in a cash payment are a red flag for anticompetitive behavior, so pay-for-delay settlements have evolved to involve other forms of consideration instead. As a result, the Preserve Access to Affordable Generics and Biosimilars Act aims to make an exchange of anything of value presumptively anticompetitive if the terms include a delay in research, development, manufacturing, or marketing of a generic drug. Deterring obvious pay-for-delay settlements will prevent delays to generic entry, making cheaper drugs available as quickly as possible to patients.
However, the Act’s rigid presumption that an exchange of anything of value is presumptively anticompetitive may also prevent legitimate settlements that ultimately benefit consumers. Brand drug makers should be allowed to compensate generic challengers to eliminate litigation risk and escape litigation expenses, and many settlements result in the generic drug coming to market before the expiration of the brand patent and possibly earlier than if there was prolonged litigation between the generic and brand company. A rigid presumption of anticompetitive behavior will deter these settlements, thereby increasing expenses for all parties that choose to litigate and possibly dissuading generics from bringing patent challenges in the first place. Indeed, the U.S. Supreme Court has declined to define these settlements as per se anticompetitive, and the FTC’s most recent agreement involving such settlements exempts several forms of exchanges of value. Any adopted legislation should follow the FTC’s lead and recognize that some exchanges of value are pro-consumer and pro-competitive.
4. Restore the balance established by Hatch-Waxman between branded drug innovators and generic drug challengers. I have previously discussed how an unbalanced inter partes review (IPR) process for challenging patents threatens to stifle drug innovation. Moreover, current law allows generic challengers to file duplicative claims in both federal court and through the IPR process. And because IPR proceedings do not have a standing requirement, the process has been exploited by entities that would never be granted standing in traditional patent litigation—hedge funds betting against a company by filing an IPR challenge in hopes of crashing the stock and profiting from the bet. The added expense to drug makers of defending both duplicative claims and claims brought by entities exploiting the system increases litigation costs, which may be passed on to consumers in the form of higher prices.
The Hatch-Waxman Integrity Act (HWIA) is designed to restore the balance established by Hatch-Waxman between branded drug innovators and generic drug challengers. It requires generic challengers to choose between Hatch-Waxman litigation (which saves considerable costs by allowing generics to rely on the brand company's safety and efficacy studies for FDA approval) and an IPR proceeding (which is faster and provides certain pro-challenger provisions). The HWIA would also eliminate the ability of hedge funds and similar entities to file IPR claims while shorting the stock. By reducing duplicative litigation and the exploitation of the IPR process, the HWIA will reduce costs and strengthen innovation incentives for drug makers. This will ensure that patent owners achieve clarity on the validity of their patents, which will spur new drug innovation and make sure that consumers continue to have access to life-improving drugs.
5. Curb illegal product hopping and patent thickets. Two drug maker tactics currently garnering a lot of attention are so-called “product hopping” and “patent thickets.” At its worst, product hopping involves brand drug makers making minor changes to a drug nearing the end of its patent so that they get a new patent on the slightly-tweaked drug, and then withdrawing the original drug from the market so that patients shift to the newly patented drug and pharmacists can't substitute a generic version of the original drug. Similarly, at their worst, patent thickets involve brand drug makers obtaining a web of patents on a single drug to extend the life of their exclusivity and make it too costly for other drug makers to challenge all of the patents associated with a drug. The proposed Affordable Prescriptions for Patients Act of 2019 is meant to stop these abuses of the patent system, which would facilitate generic entry and help to lower drug prices.
However, the Act goes too far by also capturing many legitimate activities in its definitions. For example, the bill defines as anticompetitive product hopping the selling of any improved version of a drug during a window that extends to a year after the launch of the first generic competitor. Presently, to acquire a patent and FDA approval, the improved version of the drug must be sufficiently different from and more innovative than the original drug, yet the Act would prevent the drug maker from selling such a product without satisfying a demanding three-pronged test before the FTC or a district court. Similarly, the Act defines as an anticompetitive patent thicket any new patents filed on a drug in the same general family as the original patent, and this presumption can only be rebutted by providing extensive evidence and satisfying demanding standards before the FTC or a district court. As a result, the Act deters innovation activity that is at all related to an initial patent and, in doing so, ignores the fact that most important drug innovation is incremental innovation based on previous inventions. Thus, the proposal should be redrafted to capture truly anticompetitive product hopping and patent thicket activity, while exempting behavior that is critical for drug innovation.
Reforms that close loopholes in the current patent process should facilitate competition in the pharmaceutical industry and help to lower drug prices. However, lawmakers need to be sure that they don’t restrict patent rights to the extent that they deter innovation because a significant body of research predicts that patients’ health outcomes will suffer as a result.
This has been a big year for business in the courts. A U.S. district court approved the AT&T-Time Warner merger, the Supreme Court upheld Amex's agreements with merchants, and a circuit court pushed back on the Federal Trade Commission's vague and heavy-handed policing of companies' consumer data safeguards.
These three decisions mark a new era in the intersection of law and economics.
AT&T-Time Warner
AT&T-Time Warner is a vertical merger, a combination of firms with a buyer-seller relationship. Time Warner creates and broadcasts content via outlets such as HBO, CNN, and TNT. AT&T distributes content via services such as DirecTV.
Economists see little risk to competition from vertical mergers, although there are some idiosyncratic circumstances in which competition could be harmed. Nevertheless, the U.S. Department of Justice went to court to block the merger.
The last time the government sued to block a vertical merger was more than 40 years ago, and the government lost. Since then, the government has relied on the threat of litigation to extract settlements from the merging parties. For example, in the 1996 merger between Time Warner and Turner, the FTC required limits on how the new company could bundle HBO with less desirable channels and eliminated agreements that allowed TCI (a cable company that partially owned Turner) to carry Turner channels at preferential rates.
With AT&T-Time Warner, the government took a big risk, and lost. It was a big risk because (1) it’s a vertical merger, and (2) the case against the merger was weak. The government’s expert argued consumers would face an extra 45 cents a month on their cable bills if the merger went through, but under cross-examination, conceded it might be as little as 13 cents a month. That’s a big difference and raised big questions about the reliability of the expert’s model.
Judge Richard J. Leon's 170+ page ruling agreed that the government's case was weak and its expert was not credible. While it's easy to cheer a victory of big business over big government, the real victory was the judge's heavy reliance on facts, data, and analysis rather than speculation over the potential for consumer harm. That's a big deal and may pave the way for more vertical mergers.
Ohio v. American Express
The Supreme Court’s ruling in Amex may seem obscure. The court backed American Express Co.’s policy of preventing retailers from offering customers incentives to pay with cheaper cards.
Amex charges higher fees to merchants than do other cards, such as Visa, MasterCard, and Discover. Amex cardholders also have higher incomes and tend to spend more at stores than cardholders on other networks. And Amex offers its cardholders better benefits, services, and rewards than the other cards. Merchants don't like Amex because of the higher fees; customers prefer Amex because of the card's perks.
Amex, and other card companies, operate in what is known as a two-sided market. Put simply, they have two sets of customers: merchants who pay swipe fees, and consumers who pay fees and interest.
Part of Amex’s agreement with merchants is an “anti-steering” provision that bars merchants from offering discounts for using non-Amex cards. The U.S. Justice Department and a group of states sued the company, alleging the Amex rules limited merchants’ ability to reduce their costs from accepting credit cards, which meant higher retail prices. Amex argued that the higher prices charged to merchants were kicked back to its cardholders in the form of more and better perks.
The Supreme Court found that the Justice Department and states focused exclusively on one side (merchant fees) of the two-sided market. The Court said the government can't meet its burden by showing some effect on some part of the market. Instead, it must demonstrate an “increased cost of credit card transactions … reduced number of credit card transactions, or otherwise stifled competition.” The government could not prove any of those things.
We live in a world of two-sided markets. Amazon may be the biggest two-sided market in the history of the world, linking buyers and sellers. Smartphones such as iPhones and Android devices are two-sided markets, linking consumers with app developers. The Supreme Court's ruling in Amex sets a standard for how antitrust law should treat the economics of two-sided markets.
LabMD
LabMD is another matter that seems obscure, but could have big impacts on the administrative state.
Since the early 2000s, the FTC has brought charges against more than 150 companies alleging they had bad security or privacy practices. LabMD was one of them; its computer system was compromised by professional hackers in 2008. The FTC claimed that LabMD's failure to adequately protect customer data was an “unfair” business practice.
Challenging the FTC can get very expensive, and the agency used the threat of litigation to secure settlements from dozens of companies. It then pointed to those settlements to convince everyone else that they constituted binding law and enforceable security standards.
Because no one ever forced the FTC to defend what it was doing in court, the FTC's assertion of legal authority became a self-fulfilling prophecy. LabMD, however, chose to challenge the FTC. The fight drove LabMD out of business, but public interest law firm Cause of Action and lawyers at Ropes & Gray took the case on a pro bono basis.
The 11th Circuit Court of Appeals ruled the FTC’s approach to developing security standards violates basic principles of due process. The court said the FTC’s basic approach—in which the FTC tries to improve general security practices by suing companies that experience security breaches—violates the basic legal principle that the government can’t punish someone for conduct that the government hasn’t previously explained is problematic.
My colleague at ICLE observes that the lesson to learn from LabMD isn't that the FTC's approach to internet privacy and security is illegitimate. Instead, it is that the legitimacy of the administrative state is premised on courts placing a check on abusive regulators.
The lessons learned from these three recent cases reflect a profound shift in thinking about the laws governing economic activity:
AT&T-Time Warner indicates that facts matter. Mere speculation of potential harms will not satisfy the court.
Amex highlights the growing role two-sided markets play in our economy and provides a framework for evaluating competition in these markets.
LabMD is a small step in reining in the administrative state. Regulations must be scrutinized before they are imposed and enforced.
In some ways, none of these decisions is revolutionary. Instead, they reflect an evolution toward greater transparency in how the law is to be applied and greater scrutiny of how regulations are imposed.
The populists are on the march, and as the 2018 campaign season gets rolling we’re witnessing more examples of political opportunism bolstered by economic illiteracy aimed at increasingly unpopular big tech firms.
The latest example comes in the form of a new investigation of Google opened by Missouri’s Attorney General, Josh Hawley. Mr. Hawley — a Republican who, not coincidentally, is running for Senate in 2018 — alleges various consumer protection violations and unfair competition practices.
But while Hawley’s investigation may jump start his campaign and help a few vocal Google rivals intent on mobilizing the machinery of the state against the company, it is unlikely to enhance consumer welfare — in Missouri or anywhere else.
According to the press release issued by the AG’s office:
[T]he investigation will seek to determine if Google has violated the Missouri Merchandising Practices Act—Missouri’s principal consumer-protection statute—and Missouri’s antitrust laws.
The business practices in question are Google’s collection, use, and disclosure of information about Google users and their online activities; Google’s alleged misappropriation of online content from the websites of its competitors; and Google’s alleged manipulation of search results to preference websites owned by Google and to demote websites that compete with Google.
Mr. Hawley’s justification for his investigation is a flourish of populist rhetoric:
We should not just accept the word of these corporate giants that they have our best interests at heart. We need to make sure that they are actually following the law, we need to make sure that consumers are protected, and we need to hold them accountable.
But Hawley's “strong” concern is based on tired retreads of the same faulty arguments that Google's competitors (Yelp chief among them) have been plying for the better part of a decade. In fact, all of his apparent grievances against Google were exhaustively scrutinized by the FTC and ultimately rejected or settled in separate federal investigations in 2012 and 2013.
The antitrust issues
To begin with, AG Hawley references the EU antitrust investigation as evidence that
this is not the first time Google's business practices have come into question. In June, the European Union issued Google a record $2.7 billion antitrust fine.
True enough — and yet, misleadingly incomplete. Missing from Hawley’s recitation of Google’s antitrust rap sheet are the following investigations, which were closed without any finding of liability related to Google Search, Android, Google’s advertising practices, etc.:
United States FTC, 2013. The FTC found no basis to pursue a case after a two-year investigation: “Challenging Google’s product design decisions in this case would require the Commission — or a court — to second-guess a firm’s product design decisions where plausible procompetitive justifications have been offered, and where those justifications are supported by ample evidence.” The investigation did result in a consent order regarding patent licensing unrelated in any way to search and a voluntary commitment by Google not to engage in certain search-advertising-related conduct.
South Korea FTC, 2013. The KFTC cleared Google after a two-year investigation. It opened a new investigation in 2016, but, as I have discussed, “[i]f anything, the economic conditions supporting [the KFTC’s 2013] conclusion have only gotten stronger since.”
Canada Competition Bureau, 2016. The CCB closed a three-year long investigation into Google’s search practices without taking any action.
Similar investigations have been closed without findings of liability (or simply lie fallow) in a handful of other countries (e.g., Taiwan and Brazil) and even several states (e.g., Ohio and Texas). In fact, of all the jurisdictions that have investigated Google, only the EU and Russia have actually assessed liability.
As Beth Wilkinson, outside counsel to the FTC during the Google antitrust investigation, noted upon closing the case:
Undoubtedly, Google took aggressive actions to gain advantage over rival search providers. However, the FTC’s mission is to protect competition, and not individual competitors. The evidence did not demonstrate that Google’s actions in this area stifled competition in violation of U.S. law.
The CCB was similarly unequivocal in its dismissal of the very same antitrust claims Missouri’s AG seems intent on pursuing against Google:
The Bureau sought evidence of the harm allegedly caused to market participants in Canada as a result of any alleged preferential treatment of Google’s services. The Bureau did not find adequate evidence to support the conclusion that this conduct has had an exclusionary effect on rivals, or that it has resulted in a substantial lessening or prevention of competition in a market.
Unfortunately, rather than follow the lead of these agencies, Missouri’s investigation appears to have more in common with Russia’s effort to prop up a favored competitor (Yandex) at the expense of consumer welfare.
The Yelp Claim
Take Mr. Hawley’s focus on “Google’s alleged misappropriation of online content from the websites of its competitors,” for example, which cleaves closely to what should become known henceforth as “The Yelp Claim.”
While the sordid history of Yelp’s regulatory crusade against Google is too long to canvas in its entirety here, the primary elements are these:
Once upon a time (in 2005), Google licensed Yelp’s content for inclusion in its local search results. In 2007 Yelp ended the deal. By 2010, and without a license from Yelp (asserting fair use), Google displayed small snippets of Yelp’s reviews that, if clicked on, led to Yelp’s site. Even though Yelp received more user traffic from those links as a result, Yelp complained, and Google removed Yelp snippets from its local results.
In its 2013 agreement with the FTC, Google guaranteed that Yelp could opt out of having even snippets displayed in local search results by committing to:
make available a web-based notice form that provides website owners with the option to opt out from display on Google’s Covered Webpages of content from their website that has been crawled by Google. When a website owner exercises this option, Google will cease displaying crawled content from the domain name designated by the website owner….
The commitments also ensured that websites (like Yelp) that opt out would nevertheless remain in Google’s general index.
Ironically, Yelp now claims in a recent study that Google should show not only snippets of Yelp reviews, but even more of Yelp’s content. (For those interested, my colleagues and I have a paper explaining why the study’s claims are spurious).
The key bit here, of course, is that Google stopped pulling content from Yelp’s pages to use in its local search results, and that it implemented a simple mechanism for any other site wishing to opt out of the practice to do so.
It’s difficult to imagine why Missouri’s citizens might require more than this to redress alleged anticompetitive harms arising from the practice.
Perhaps AG Hawley thinks consumers would be better served by an opt-in mechanism? Of course, this is absurd, particularly if any of Missouri’s citizens — and their businesses — have websites. Most websites want at least some of their content to appear on Google’s search results pages as prominently as possible — see this and this, for example — and making this information more accessible to users is why Google exists.
To be sure, some websites may take issue with how much of their content Google features and where it places that content. But the easy opt-out enables them to prevent Google from showing their content in a manner they disapprove of. Yelp is an outlier in this regard because it views Google as a direct competitor, especially to the extent it enables users to read some of Yelp’s reviews without visiting Yelp’s pages.
For Yelp and a few similarly situated companies, the opt-out suffices. But for almost everyone else the opt-out is presumably rarely exercised, and any more-burdensome requirement would just impose unnecessary costs, harming instead of helping their websites.
The privacy issues
The Missouri investigation also applies to “Google’s collection, use, and disclosure of information about Google users and their online activities.” More pointedly, Hawley claims that “Google may be collecting more information from users than the company was telling consumers….”
Presumably this would come as news to the FTC, which, with a much larger staff and far greater expertise, currently has Google under a 20-year consent order (with some 15 years left to go) governing its privacy disclosures and information-sharing practices, ensuring that the agency engages in continual and well-informed oversight of precisely these issues.
The FTC’s consent order with Google (the result of an investigation into conduct involving Google’s short-lived Buzz social network, allegedly in violation of Google’s privacy policies), requires the company to:
“[N]ot misrepresent in any manner, expressly or by implication… the extent to which respondent maintains and protects the privacy and confidentiality of any [user] information…”;
“Obtain express affirmative consent from” users “prior to any new or additional sharing… of the Google user’s identified information with any third party” if doing so would in any way deviate from previously disclosed practices;
“[E]stablish and implement, and thereafter maintain, a comprehensive privacy program that is reasonably designed to [(1)] address privacy risks related to the development and management of new and existing products and services for consumers, and (2) protect the privacy and confidentiality of [users’] information”; and
Along with a laundry list of other reporting requirements, “[submit] biennial assessments and reports [] from a qualified, objective, independent third-party professional…, approved by the [FTC] Associate Director for Enforcement, Bureau of Consumer Protection… in his or her sole discretion.”
What, beyond the incredibly broad scope of the FTC’s consent order, could the Missouri AG’s office possibly hope to obtain from an investigation?
Google is already expressly required to provide privacy reports to the FTC every two years. It must provide several of the items Hawley demands in his CID to the FTC; others are required to be made available to the FTC upon demand. What materials could the Missouri AG collect beyond those the FTC already receives, or has the authority to demand, under its consent order?
And what manpower and expertise could Hawley apply to those materials that would even begin to equal, let alone exceed, those of the FTC?
The consent order has teeth, too: in 2012, the FTC obtained a $22.5 million civil penalty from Google for violating the order through its handling of Safari users’ cookies. That penalty is of undeniable import, not only for its amount (at the time it was the largest in FTC history) and for stemming from alleged problems completely unrelated to the issue underlying the initial action, but also because it was so easy to obtain. Having put Google under a 20-year consent order, the FTC need only prove (or threaten to prove) contempt of the consent order, rather than the specific elements of a new violation of the FTC Act, to bring the company to heel. The former is far easier to prove, and comes with the ability to impose (significant) damages.
So what’s really going on in Jefferson City?
While states are, of course, free to enforce their own consumer protection laws to protect their citizens, there is little to be gained — other than cold hard cash, perhaps — from pursuing cases that, at best, duplicate enforcement efforts already undertaken by the federal government (to say nothing of innumerable other jurisdictions).
To take just one relevant example, in 2013 — almost a year to the day following the court’s approval of the settlement in the FTC’s case alleging Google’s violation of the Buzz consent order — 37 states plus DC (not including Missouri) settled their own, follow-on litigation against Google on the same facts. Significantly, the terms of the settlement did not impose upon Google any obligation not already a part of the Buzz consent order or the subsequent FTC settlement — but it did require Google to fork over an additional $17 million.
Not only is there little to be gained from yet another ill-conceived antitrust campaign, there is much to be lost. Such massive investigations require substantial resources to conduct, and the opportunity cost of doing so may mean real consumer issues go unaddressed. The Consumer Protection Section of the Missouri AG’s office says it receives some 100,000 consumer complaints a year. How many of those will have to be put on the back burner to accommodate an investigation like this one?
Even when not politically motivated, state enforcement of consumer protection acts (CPAs) is not an unalloyed good. In fact, empirical studies of state consumer protection actions like the one contemplated by Mr. Hawley have shown that such actions tend toward overreach — good for lawyers, perhaps, but expensive for taxpayers and often detrimental to consumers. According to a recent study by economists James Cooper and Joanna Shepherd:
[I]n recent decades, this thoughtful balance [between protecting consumers and preventing the proliferation of lawsuits that harm both consumers and businesses] has yielded to damaging legislative and judicial overcorrections at the state level with a common theoretical mistake: the assumption that more CPA litigation automatically yields more consumer protection…. [C]ourts and legislatures gradually have abolished many of the procedural and remedial protections designed to cabin state CPAs to their original purpose: providing consumers with redress for actual harm in instances where tort and contract law may provide insufficient remedies. The result has been an explosion in consumer protection litigation, which serves no social function and for which consumers pay indirectly through higher prices and reduced innovation.
AG Hawley’s investigation seems almost tailored to duplicate the FTC’s extensive efforts and to score political points. Or perhaps Mr. Hawley is just perturbed that Missouri missed out on its share of the $17 million multistate settlement in 2013.
Which raises the spectre of a further problem with the Missouri case: “rent extraction.”
It’s no coincidence that Mr. Hawley’s investigation follows closely on the heels of Yelp’s recent letter to the FTC and every state AG (as well as four members of Congress and the EU’s chief competition enforcer, for good measure) alleging that Google had re-started scraping Yelp’s content, thus violating the terms of its voluntary commitments to the FTC.
It’s also no coincidence that Yelp “notified” Google of the problem only by lodging a complaint with every regulator who might listen, rather than by actually notifying Google. But an action like the one Missouri is undertaking, not resolution of the issue, is almost certainly exactly what Yelp intended, and AG Hawley is playing right into Yelp’s hands.
Google, for its part, strongly disputes Yelp’s allegation, and, indeed, has — even according to Yelp — complied fully with Yelp’s request to keep its content off Google Local and other “vertical” search pages since 18 months before Google entered into its commitments with the FTC. Google claims that the recent scraping was inadvertent, and that it would happily have rectified the problem if only Yelp had actually bothered to inform Google.
Indeed, Yelp’s allegations don’t really pass the smell test: That Google would suddenly change its practices now, in violation of its commitments to the FTC and at a time of extraordinarily heightened scrutiny by the media, politicians of all stripes, competitors like Yelp, the FTC, the EU, and a host of other antitrust or consumer protection authorities, strains belief.
But, again, identifying and resolving an actual commercial dispute was likely never the goal. As a recent, fawning New York Times article on “Yelp’s Six-Year Grudge Against Google” highlights (focusing in particular on Luther Lowe, now Yelp’s VP of Public Policy and the author of the letter):
Yelp elevated Mr. Lowe to the new position of director of government affairs, a job that more or less entails flying around the world trying to sic antitrust regulators on Google. Over the next few years, Yelp hired its first lobbyist and started a political action committee. Recently, it has started filing complaints in Brazil.
Missouri, in other words, may just be carrying Yelp’s water.
The one clear lesson of the decades-long Microsoft antitrust saga is that companies that struggle to compete in the market can profitably tax their rivals by instigating antitrust actions against them. As Milton Friedman admonished, decrying “the business community’s suicidal impulse” to invite regulation:
As a believer in the pursuit of self-interest in a competitive capitalist system, I can’t blame a businessman who goes to Washington [or is it Jefferson City?] and tries to get special privileges for his company.… Blame the rest of us for being so foolish as to let him get away with it.
Taking a tough line on Silicon Valley firms in the midst of today’s anti-tech-company populist resurgence may aid Mr. Hawley’s electioneering in his upcoming bid for a US Senate seat, and it may serve Yelp, but it offers no clear, actual benefits to Missourians. As I’ve wondered before: “Exactly when will regulators be a little more skeptical of competitors trying to game the antitrust laws for their own advantage?”