
[Cross Posted to the Center for the Protection of Intellectual Property]

In its recent decision in Douglas Dynamics v. Buyers Products Co. (Fed. Cir., May 21, 2013), the Federal Circuit reversed a district court for abusing its discretion: the trial judge had injected an anti-patent bias into the legal test for determining whether a patent-owner should receive a permanent injunction against an infringer. As highlighted in a blog posting, the Federal Circuit explained that district courts should not read the eBay four-factor test in a way that eviscerates “the public’s general interest in the judicial protection of property rights in inventive technology” (to quote the Douglas Dynamics opinion).

A scant two months later, the Federal Circuit again reversed another district court’s denial of an injunction and again had to explain why the equitable test for issuing injunctions should not be applied in a way that undermines the property rights secured in patented innovation.  In Aria Diagnostics, Inc. v. Sequenom, Inc. (Fed. Cir., Aug. 9, 2013), the Federal Circuit reversed a district court’s denial of the patent-owner’s request for a preliminary injunction, holding that the district court improperly balanced the multi-factor test governing issuance of preliminary injunctions. 

In Chief Judge Randall Rader’s opinion for a unanimous panel in Aria Diagnostics, the Federal Circuit criticized the district court’s denial of Sequenom’s request for a preliminary injunction against Aria Diagnostics.  In this case, Sequenom countersued Aria Diagnostics, following Aria Diagnostics’ declaratory judgment lawsuit against it, alleging that Aria Diagnostics infringed its patented diagnostic test for identifying trisomy disorders (U.S. Patent No. 6,258,540).  Trisomy disorders are genetic disorders that can result in a range of complications during and after pregnancy—from death of a fetus to Down syndrome in a newborn.  The evidence submitted to the district court established that Sequenom’s patented tests eliminated the need for risky amniocenteses and “presented fewer risks and a more dependable rate of abnormality detection.”

Following its countersuit for patent infringement, Sequenom requested a preliminary injunction, and the district court denied its request.  In the proceedings below, as the Federal Circuit explained, the district court rejected Sequenom’s request for a preliminary injunction because it simply assumed that Sequenom would not suffer irreparable injury, given that it would easily profit from its radically innovative test.  On the basis of this assumption, the district court concluded that “the erosion to Sequenom’s price and its loss of market share were not irreparable.”

The Federal Circuit pointedly identified the implicit anti-patent bias in the district court’s rarefied reasoning from such misguided and unproven assumptions:

While the facts may show that damages would be reparable, this assumption is not sufficient. In the face of that kind of universal assumption, patents would lose their character as an exclusive right as articulated by the Constitution and become at best a judicially imposed and monitored compulsory license. (original emphases)

In short, district courts should not read the multi-factor tests for injunctions so as to eviscerate the constitutional fact that a patent is a property right, and thus de facto convert a patent into merely a regulatory entitlement to a compulsory license. Property rights secure more than just a “reasonable” rate of profit as determined by either a court or a regulatory agency. As pointed out in Aria Diagnostics, the case law on injunctions has well established that “price erosion, loss of goodwill, damage to reputation, and loss of business opportunities are all valid grounds for finding irreparable harm,” which can and should justify an injunction (after these harms are appropriately balanced against the harms to the alleged infringer) to secure a property right in innovative technology.

The Federal Circuit further criticized the district court because, while finding that a preliminary injunction might put Aria Diagnostics out of business as a justification to deny Sequenom’s request for the injunction, the “district court made no findings on the harm that would accrue to Sequenom’s R&D and investment in the technology, undermining work and money spent developing, validating, and commercializing any covered product.”  The Federal Circuit emphasized that the balance of hardships in the legal test for issuing an injunction requires courts to not only assess harm to alleged infringers, but also to assess the harms to the patent-owner, such as “price erosion, loss of goodwill, damage to reputation, and loss of business opportunities.” 

In short, the district court failed to weigh the relevant harms to both the patent-owner and the alleged infringer, and instead the district court relied solely on the harm to the alleged infringer (Aria Diagnostics) as balanced against its pure conjecture of massive profits for Sequenom in some undetermined future. Thus, the district court denied Sequenom’s request for a preliminary injunction. But this was an entirely inappropriate application of the equitable inquiry required in issuing or denying a preliminary injunction.  In effect, the district court placed a large thumb on the judicial scale in favor of the alleged infringer in its equitable analysis—a violation of the fundamental principles of equity. For this reason, the Federal Circuit reversed and remanded the case back to the district court for it to make the appropriate fact findings under the appropriate application of the multi-factor test for issuing a preliminary injunction.

The significance of Aria Diagnostics is that the Federal Circuit continues to push back against the ongoing misinterpretation of the equitable tests for issuance of injunctions in patent infringement cases, whether by academics, federal officials or district courts.  In doing so, the court is providing some much-needed guidance to district courts on which facts they should consider in assessing the relevant harms to each party when issuing or denying an injunction.

 

[Cross posted at The Center for the Protection of Intellectual Property]

In a prior blog posting, I explained why reports of a so-called “patent litigation explosion” today are just wrong.  As I detailed in another blog posting, the patent litigation rate today is not only consistent with historical patent litigation rates in the nineteenth century; there is actually less litigation today than during some decades in the early nineteenth century. Between 1840 and 1849, for instance, the patent litigation rate was 3.6% — more than twice the patent litigation rate today.

(As an aside, we have to normalize litigation rates by the number of issued patents, because more patents are issued now per year than twice the total population of New York City (NYC) in 1820 — 253,315 patents issued in 2012 compared to 123,706 residents in NYC in 1820.  Yet before someone says that this just means we have too many patents today, as Judge Posner blithely asserts without any empirical evidence, one must also recognize that the NYC population in 2013 is 8.3 million, which is far beyond merely double its 1820 population — NYC’s population has grown by a factor of 67!  A simple comparison to population growth, especially taking into account the explosive growth in the innovation industries in the past several decades, could as easily justify the claim that we haven’t got enough patents issuing today.)
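To make the aside’s arithmetic concrete, here is a quick sketch of the two calculations. The lawsuit count in the rate example is a hypothetical placeholder chosen purely for illustration, not a figure reported in any study; the population and patent figures are the ones quoted above.

```python
# Figures quoted in the post
patents_issued_2012 = 253_315   # patents issued in 2012
nyc_pop_1820 = 123_706          # NYC residents in 1820
nyc_pop_2013 = 8_300_000        # NYC residents in 2013

# The post's population-growth comparison: roughly a 67x increase
growth_factor = nyc_pop_2013 / nyc_pop_1820
print(round(growth_factor))     # -> 67

# A litigation "rate" only means something when normalized by
# issued patents; the lawsuit count here is a made-up placeholder.
hypothetical_lawsuits = 4_500
rate = hypothetical_lawsuits / patents_issued_2012
print(f"{rate:.1%}")            # -> 1.8%
```

The point of the normalization is that raw lawsuit counts rise mechanically as the stock of issued patents grows, so only the ratio is comparable across centuries.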

Unfortunately, the mythical claims about a “patent litigation explosion” have shifted in recent months (perhaps because the original assertion was untenable).  Now the assertion is that there has been an “explosion” in lawsuits brought by patent licensing companies.  I’ll note for the record here that patent licensing companies are often referred to today by the undefined and nonobjective rhetorical epithet of “patent troll.”  In a recent study of patent licensing companies that exposes many of the unsound and unproven claims about these much-maligned companies – such as that patents owned by these companies are of lower quality than those owned by manufacturing entities – Stephen Moore first explained that the “troll” slur is used today by academics, commentators and the public alike “without a universally accepted definition.” So, let’s dispense with nonobjective rhetoric and simply identify these companies factually by their business models: patent licensing.

As with all discriminatory slurs, it’s unsurprising that this new claim about an alleged “explosion” in so-called “patent troll” lawsuits is unproven rubbish.  Similar to the myth about patent litigation generally, this is just another example of overwrought and empirically unsound rhetoric being used to push a policy agenda in Congress and regulatory agencies. (Six bills have been introduced on the Hill so far this year, and FTC Chairwoman Edith Ramirez has announced that the FTC intends to begin a formal § 6(b) investigation of patent licensing companies.)

How do we know that patent licensing companies are not the sole driver of any increases in patent litigation?  Contrary to the much-hyped claim today that patent licensing companies are the primary cause of most patent lawsuits in district courts in 2012, other serious and more careful reviews of the litigation data have shown that the primary culprit is not patent licensing companies, but rather the America Invents Act of 2011 (“AIA”). The AIA created numerous new administrative proceedings for invalidating patents at the Patent & Trademark Office, which created additional incentives to file lawsuits in certain contexts.  Moreover, the AIA expressly prohibited joinder of multiple defendants in a single lawsuit, forcing patent-owners to file separate suits against each alleged infringer.  Both of these significant changes to the patent system have produced the entirely logical and expected result of more lawsuits being filed after the AIA’s statutory provisions went into effect in 2011 and 2012. In basic statistics terms, the effect of these statutory provisions in any study of patent litigation rates that does not take them into account is referred to as a “confounding variable.”

Even more important, when the data used in one of the most-referenced studies asserting a patent litigation explosion by patent licensing companies was tested by a highly respected scholar who specializes in statistical and empirical analyses of the patent system, he reported that he found no statistically significant results. (See Dave Schwartz’s testimony at the DOJ-FTC Workshop (Dec. 10, 2012), starting at approximately 1:58 in this video. Transcript available here.)  At least the scholars of this disputed study made their data available for confirmation, in accordance with basic scientific norms. Other prominently cited studies on patent licensing companies have relied on secret data from companies like RPX, Patent Freedom, and other firms who have a very large dog in the litigation and policy fight, and thus this data has all of the trappings of being unreliable and biased (see here and here).

The important role that the AIA is playing in increasing patent lawsuits by patent licensing companies is ironic, if only because the people misreporting the patent litigation data are the same people who were big proponents of the AIA (some of them even attended the AIA’s signing ceremony with President Obama in September 2011).  Among non-patent scholars, this is called trying to have your cake and eat it, too.  Usually such efforts fail, as any child who has tried to get away with this maneuver can attest.  It shows the depths to which the patent policy debates have sunk that the press, Congress, the President and many others don’t seem to care about this one bit and instead are pushing ahead and repeating – and even drafting legislation based upon – bad “statistics” with serious methodological problems, compiled from secret, unreliable data.

With Congress rushing headlong to enact legislation that discriminates against patent licensing companies, it’s time to step back and start asking serious questions before the legal system that makes possible the innovation industries is changed and we discover too late that it’s for the worse.  It’s time to set aside rhetoric and made-up “statistics” based on secret data and to ask whether there really is a systemic problem.  It’s also time to start asking serious questions about why these myths were created in the first place, what the raw data actually says, who is providing the data and funding these “troll” studies, and who is pushing this rhetoric into the public policy debates to the point that it has become a deafening roar that makes impossible all reasonable and sensible discussion.

[NOTE: minor grammatical and style changes were made after the initial posting]

 

Over at the blog for the Center for the Protection of Intellectual Property, Richard Epstein has posted a lengthy essay that critiques the Obama Administration’s decision this past August 3 to veto the exclusion order issued by the International Trade Commission (ITC) in the Samsung v. Apple dispute filed there (ITC Investigation No. 794).  In his essay, The Dangerous Adventurism of the United States Trade Representative: Lifting the Ban against Apple Products Unnecessarily Opens a Can of Worms in Patent Law, Epstein rightly identifies how the 3-page letter issued to the ITC creates tremendous institutional and legal troubles in the name of an unverified theory about “patent holdup,” invoked in the name of an equally overgeneralized and vague belief in the “public interest.”

Here’s a taste:

The choice in question here thus boils down to whether the low rate of voluntary failure justifies the introduction of an expensive and error-filled judicial process that gives all parties the incentive to posture before a public agency that has more business than it can possibly handle. It is on this matter critical to remember that all standards issues are not the same as this particularly nasty, high-stake dispute between two behemoths whose vital interests make this a highly atypical standard-setting dispute. Yet at no point in the Trade Representative’s report is there any mention of how this mega-dispute might be an outlier. Indeed, without so much as a single reference to its own limited institutional role, the decision uses a short three-page document to set out a dogmatic position on issues on which there is, as I have argued elsewhere, good reason to be suspicious of the overwrought claims of the White House on a point that is, to say the least, fraught with political intrigue.

Ironically, there was, moreover a way to write this opinion that could have narrowed the dispute and exposed for public deliberation a point that does require serious consideration. The thoughtful dissenting opinion of Commissioner Pinkert pointed the way. Commissioner Pinkert contended that the key factor weighing against granting Samsung an exclusion order is that Samsung in its FRAND negotiations demanded from Apple rights to use certain non standard-essential patents as part of the overall deal. In this view, the introduction of nonprice terms on nonstandard patterns represents an abuse of the FRAND standard. Assume for the moment that this contention is indeed correct, and the magnitude of the problem is cut a hundred or a thousand fold. This particular objection is easy to police and companies will know that they cannot introduce collateral matters into their negotiations over standards, at which point the massive and pointless overkill of the Trade Representative’s order is largely eliminated. No longer do we have to treat as gospel truth the highly dubious assertions about the behavior of key parties to standard-setting disputes.

But is Pinkert correct? On the one side, it is possible to invoke a monopoly leverage theory similar to that used in some tie-in cases to block this extension. But those theories are themselves tricky to apply, and the counter argument could well be that the addition of new terms expands the bargaining space and thus increases the likelihood of an agreement. To answer that question to my mind requires some close attention to the actual and customary dynamics of these negotiations, which could easily vary across different standards. I would want to reserve judgment on a question this complex, and I think that the Trade Representative would have done everyone a great service if he had addressed the hard question. But what we have instead is a grand political overgeneralization that reflects a simple-minded and erroneous view of current practices.

You can read the essay at CPIP’s blog here, or you can download a PDF of the white paper version here (please feel free to distribute digitally or in hardcopy).

 

Over at the blog for the Center for the Protection of Intellectual Property, I posted a short essay discussing the Federal Circuit’s recent decision in Douglas Dynamics v. Buyers Products (Fed. Cir. May 21, 2013).  Here’s a small taste:

The Federal Circuit’s recent decision in Douglas Dynamics, LLC, v. Buyers Products Co. (Fed. Cir. May 21, 2013) is very important given the widespread, albeit mistaken, belief today that the Supreme Court’s decision in eBay v. MercExchange (2005) established that damages and not injunctions are the presumptive remedy for patent infringement. ….

On appeal, Chief Judge Randall Rader resoundingly disagreed with Judge Conley’s belief that the “public interest” is always better served by the introduction of a new competitor selling cheaper products, which is what happened in this case, as Douglas Dynamics and Buyers Products Company are competitors in the sale of snowplow blades.  Instead, Chief Judge Rader recognized that Buyers Products Company’s act of infringement is what gave it its market advantage in undercutting Douglas Dynamics’ prices.  Because it did not have to incur Douglas Dynamics’ ex ante expenses in engaging in innovative research and development, Buyers Products Company’s infringement permitted it the economic advantage of being able to undercut Douglas Dynamics’ prices and thus enter the allegedly “untapped market segment” of cheaper snowplow blades. It was precisely this expansion of a consumer market that the district court relied on in denying Douglas Dynamics’ requested injunction. In sum, the district court used an infringement-created expansion of the market to justify denying an injunction and imposing a compulsory license on the patent-owner, which effectively rewarded Buyers Products Company for its act of infringement.

In reversing the district court’s award of a reasonable royalty, Chief Judge Rader explained the basic economic principle of dynamic efficiency that animates the Patent Act in securing property rights to inventors in their patented innovation ….

As bloggers are wont to say, go read the whole thing.

The Federalist Society has started a new program, The Executive Branch Review, which focuses on the myriad fields in which the Executive Branch acts outside of the constitutional and legal limits imposed on it, either by Executive Orders or by the plethora of semi-independent administrative agencies’ regulatory actions.

I recently posted on the Federal Trade Commission’s (FTC) ongoing investigations into the patent licensing business model and the actions (“consent decrees”) taken by the FTC against Bosch and Google.  These “consent decrees” constrain Bosch’s and Google’s rights to enforce patents they have committed to standard setting organizations (these patents are called “standard essential patents”). Here’s a brief taste:

One of the most prominent participants at the FTC-DOJ workshop back in December, former DOJ antitrust official and UC-Berkeley economics professor Carl Shapiro, explained in his opening speech that there was still insufficient data on patent licensing companies and their effects on the market.  This is true; for instance, a prominent study cited by Google et al. in support of their request to the FTC to investigate patent licensing companies has been described as being fundamentally flawed on both substantive and methodological grounds. Even more important, Professor Shapiro expressed skepticism at the workshop that, even with properly acquired, valid data, the FTC would have the legal authority to sanction patent licensing firms for being allegedly anti-competitive.

Commentators have long noted that courts and agencies have a lousy historical track record when it comes to assessing the merits of new innovation, whether in new products or new business models. They maintain that the FTC should not continue such mistakes by letting its decision-making today be driven by rhetoric or by the widespread animus against certain commercial firms. Restraint and fact-gathering, institutional virtues reflected in a government animated by the rule of law and respect for individual rights, are key to preventing regulatory overreach and harm to future innovation.

Go read the whole thing, and, while you’re at it, check out Commissioner Joshua Wright’s similar comments on the FTC’s investigations of patent licensing companies, which the FTC calls “patent assertion entities.”

Over at the blog for the Center for the Protection of Intellectual Property, Wayne Sobon, the Vice President and General Counsel of Inventergy, has posted an important essay that criticizes the slew of bills that have been proposed in Congress in recent months.

In A Line in the Sand on the Calls for New Patent Legislation, Mr. Sobon responds to the heavy-handed rhetoric and emotionalism that dominates the debate today over patent licensing and litigation. He calls for a return to the real first principles of the patent system in discussions about patent licensing, as well as for more measured thinking and analysis about the costs of uncertainty created by never-ending systemic changes from legislation produced by heavy lobbying by interested parties.  Here’s a small taste:

One genius of our patent system has been an implicit recognition that since its underlying subject matter, innovation, remains by definition in constant flux, the scaffolding of our system and the ability of all stakeholders to make reasonably consistent, prudent and socially efficient choices, should remain as stable as possible.  But now these latest moves, demanding yet further significant changes to our patent laws, threaten that stability.  And it is in fact systemic instability, from whatever source, that allows the very parasitic behaviors we have termed “troll”-like, to flourish.

It is silly and blindly ahistoric to lump anyone who seeks to license or enforce a patent right, but who does not themselves make a corresponding product, as a “troll.” 

Read the whole thing here. Mr. Sobon’s essay reflects similar concerns expressed by Commissioner Joshua Wright this past April on the Federal Trade Commission’s investigation of what the FTC identifies as “patent assertion entities.”

[Cross posted at the Center for the Protection of Intellectual Property blog.]

Today’s public policy debates frame copyright policy solely in terms of a “trade off” between the benefits of incentivizing new works and the social deadweight losses imposed by the access restrictions imposed by these (temporary) “monopolies.” I recently posted to SSRN a new research paper, called How Copyright Drives Innovation in Scholarly Publishing, explaining that this is a fundamental mistake that has distorted the policy debates about scholarly publishing.

This policy mistake is important because it has led commentators and decision-makers to dismiss as irrelevant to copyright policy the investments by scholarly publishers of $100s of millions in creating innovative distribution mechanisms in our new digital world. These substantial sunk costs are in addition to the $100s of millions expended annually by publishers in creating, publishing and maintaining reliable, high-quality, standardized articles distributed each year in a wide-ranging variety of academic disciplines and fields of research. The articles now number in the millions themselves; in 2009, for instance, over 2,000 publishers issued almost 1.5 million articles just in the scientific, technical and medical fields, exclusive of the humanities and social sciences.

The mistaken incentive-to-create conventional wisdom in copyright policy is further compounded by widespread misinformation today about the allegedly “zero cost” of digital publication. As a result, many people are simply unaware of the substantial investments in infrastructure, skilled labor and other resources required to create, publish and maintain scholarly articles on the Internet and in other digital platforms.

This is not merely a so-called “academic debate” about copyright policy and publishing.

The policy distortion caused by the narrow, reductionist incentive-to-create conventional wisdom, when combined with the misinformation about the economics of digital business models, has been spurring calls for “open access” mandates for scholarly research, such as at the National Institute of Health and in recently proposed legislation (FASTR Act) and in other proposed regulations. This policy distortion even influenced Justice Breyer’s opinion in the recent decision in Kirtsaeng v. John Wiley & Sons (U.S. Supreme Court, March 19, 2013), as he blithely dismissed commercial incentives as being irrelevant to fundamental copyright policy. These legal initiatives and the Kirtsaeng decision are motivated in various ways by the incentive-to-create conventional wisdom, by the misunderstanding of the economics of scholarly publishing, and by anti-copyright rhetoric on both the left and right, all of which has become more pervasive in recent years.

But, as I explain in my paper, courts and commentators have long recognized that incentivizing authors to produce new works is not the sole justification for copyright—copyright also incentivizes intermediaries like scholarly publishers to invest in and create innovative legal and market mechanisms for publishing and distributing articles that report on scholarly research. These two policies—the incentive to create and the incentive to commercialize—are interrelated, as both are necessary in justifying how copyright law secures the dynamic innovation that makes possible the “progress of science.” In short, if the law does not secure the fruits of labors of publishers who create legal and market mechanisms for disseminating works, then authors’ labors will go unrewarded as well.

As Justice Sandra Day O’Connor famously observed in the 1984 decision in Harper & Row v. Nation Enterprises: “In our haste to disseminate news, it should not be forgotten that the Framers intended copyright itself to be the engine of free expression. By establishing a marketable right to the use of one’s expression, copyright supplies the economic incentive to create and disseminate ideas.” Thus, in Harper & Row, the Supreme Court reached the uncontroversial conclusion that copyright secures the fruits of productive labors “where an author and publisher have invested extensive resources in creating an original work.” (emphases added)

This concern with commercial incentives in copyright law is not just theory; in fact, it is most salient in scholarly publishing because researchers are not motivated by the pecuniary benefits offered to authors in conventional publishing contexts. As a result of the policy distortion caused by the incentive-to-create conventional wisdom, some academics and scholars now view scholarly publishing by commercial firms who own the copyrights in the articles as “a form of censorship.” Yet, as courts have observed: “It is not surprising that [scholarly] authors favor liberal photocopying . . . . But the authors have not risked their capital to achieve dissemination. The publishers have.” As economics professor Mark McCabe observed (somewhat sardonically) in a research paper released last year for the National Academy of Sciences: he and his fellow academic “economists knew the value of their journals, but not their prices.”

The widespread ignorance among the public, academics and commentators about the economics of scholarly publishing in the Internet age is quite profound relative to the actual numbers.  Based on interviews with six different scholarly publishers—Reed Elsevier, Wiley, SAGE, the New England Journal of Medicine, the American Chemical Society, and the American Institute of Physics—my research paper details at great length, for the first time in print, the necessary transaction costs incurred by any successful publishing enterprise in the Internet age.  To take but one small example from my research paper: Reed Elsevier began developing its online publishing platform in 1995, a scant two years after the advent of the World Wide Web, and its sunk costs in creating this first publishing platform and then digitally archiving its previously published content exceeded $75 million. Other scholarly publishers report similarly high costs in both absolute and relative terms.

Given the widespread misunderstandings of the economics of Internet-based business models, it bears noting that such high costs are not unique to scholarly publishers.  Microsoft reportedly spent $10 billion developing Windows Vista before it sold a single copy, and it ultimately did not sell many at all. Google regularly invests $100s of millions, such as $890 million in the first quarter of 2011, in upgrading its data centers.  It is somewhat surprising that such things still have to be pointed out a scant decade after the bursting of the dot-com bubble, a bubble precipitated by exactly the same mistaken view that businesses have somehow been “liberated” from the economic realities of cost by the Internet.

Just as with the extensive infrastructure and staffing costs, the actual costs incurred by publishers in operating the peer review system for their scholarly journals are also widely misunderstood.  Individual publishers now receive hundreds of thousands—the large scholarly publisher, Reed Elsevier, receives more than one million—manuscripts per year. Reed Elsevier’s annual budget for operating its peer review system is over $100 million, which reflects the full scope of staffing, infrastructure, and other transaction costs inherent in operating a quality-control system that rejects 65% of the submitted manuscripts. Reed Elsevier’s budget for its peer review system is consistent with industry-wide studies that have reported that the peer review system costs approximately $2.9 billion annually to operate (translating into dollars the £1.9 billion reported in the study). For those articles accepted for publication, there are additional, extensive production costs, and then there are extensive post-publication costs in updating hypertext links of citations, maintaining cyber security of the websites, and addressing related digital issues.
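The figures above imply a rough per-manuscript cost, which a bit of back-of-the-envelope arithmetic makes vivid. The per-submission and per-accepted-article breakdowns below are my own illustration derived from the numbers quoted in this post, not figures any publisher reports:

```python
# Figures quoted above (approximate, from the post)
budget_usd = 100_000_000      # annual peer-review budget (over $100M)
submissions = 1_000_000       # manuscripts received per year (over 1M)
rejection_rate = 0.65         # 65% of submissions are rejected

# Rough cost of reviewing each submitted manuscript
cost_per_submission = budget_usd / submissions
print(cost_per_submission)            # -> 100.0

# Spread over accepted articles only, the per-article cost is higher
cost_per_accepted = cost_per_submission / (1 - rejection_rate)
print(round(cost_per_accepted))       # -> 286

# Exchange rate implied by the study's figures ($2.9B vs GBP 1.9B)
print(round(2.9e9 / 1.9e9, 2))        # -> 1.53
```

The point is simply that even at a conservative reading of these figures, every peer-reviewed article carries on the order of a few hundred dollars of quality-control cost before production and post-publication costs are even counted.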

In sum, many people mistakenly believe that scholarly publishers are no longer necessary because the Internet has made moot all such intermediaries of traditional brick-and-mortar economies—a viewpoint reinforced by the equally mistaken incentive-to-create conventional wisdom in the copyright policy debates today. But intermediaries like scholarly publishers face the exact same incentive problem that is universally recognized for authors by the incentive-to-create conventional wisdom: no one will make the necessary investments to create or distribute a work if the fruits of their labors are not secured to them. This basic economic fact—that dynamic development of innovative distribution mechanisms requires substantial investment in both people and resources—is what makes commercialization an essential feature of both copyright policy and law (and of all intellectual property doctrines).

It is for this reason that copyright law has long promoted and secured the value that academics and scholars have come to depend on in their journal articles—reliable, high-quality, standardized, networked, and accessible research that meets the differing expectations of readers in a variety of fields of scholarly research. This is the value created by the scholarly publishers. Scholarly publishers thus serve an essential function in copyright law by making the investments in and creating the innovative distribution mechanisms that fulfill the constitutional goal of copyright to advance the “progress of science.”

DISCLOSURE: The paper summarized in this blog posting was supported separately by a Leonardo Da Vinci Fellowship and by the Association of American Publishers (AAP). The author thanks Mark Schultz for very helpful comments on earlier drafts, and the AAP for providing invaluable introductions to the five scholarly publishers who shared their publishing data with him.

NOTE: Some small copy-edits were made to this blog posting.

 

The State of the Patent System: A Discussion with Chief Judge Rader

A teleforum on Thursday, April 11, at 2pm. Hosted by George Mason Law School’s Center for the Protection of Intellectual Property and the Federalist Society‘s Intellectual Property Practice Group.

Today, people read daily complaints about the “broken” patent system, and thus it’s unsurprising that there are numerous and wide-ranging attempts to “reform” the patent system. Legislative reform efforts include the proposed SHIELD Act, which would impose a losing-plaintiff-pays litigation system solely on patent-licensing companies and further revisions to the America Invents Act of 2011. Regulatory agencies also have skin in the patent reform game: the FTC recently reached settlements with Bosch and Google that restricted their rights to enforce their patents in standardized technology, and the FTC is currently considering whether to condemn the patent-licensing business model as “anti-competitive.” The courts are heavily involved as well: in addition to the many patent cases it has decided in recent years, the U.S. Supreme Court has four major patent cases on its docket this year, which suggests that it also agrees that the patent system is in serious need of legal reform. Yet, patents today secure innovation once imagined only as science fiction – tablet computers, smart phones, genetically modified seeds, genetic testing for cancer, personalized medical treatments for debilitating diseases, and many others – and these technological marvels are now a commonplace feature of our lives. This Teleforum with the Honorable Randall Rader, Chief Judge of the Court of Appeals for the Federal Circuit – a digital “fireside chat” – will explore these and other issues in assessing whether the patent system is broken or whether it is fundamentally sound.

Featuring:

Hon. Randall R. Rader, Chief Judge, U.S. Court of Appeals, Federal Circuit
Moderator: Prof. Adam Mossoff, Co-Director, Academic Programs and Senior Scholar, Center for the Protection of Intellectual Property, George Mason Law School

Agenda:

Call begins at 2:00 p.m. Eastern Time, Thursday, April 11, 2013.

More information here.

 


Earlier this month, Representatives Peter DeFazio and Jason Chaffetz picked up the gauntlet from President Obama’s comments on February 14 at a Google-sponsored Internet Q&A on Google+ that “our efforts at patent reform only went about halfway to where we need to go” and that he would like “to see if we can build some additional consensus on smarter patent laws.” So, Reps. DeFazio and Chaffetz introduced on March 1 the Saving High-tech Innovators from Egregious Legal Disputes (SHIELD) Act, which creates a “losing plaintiff patent-owner pays” litigation system for a single type of patent owner—patent licensing companies that purchase and license patents in the marketplace (and who sue infringers when infringers refuse their requests to license). To Google, to Representative DeFazio, and to others, these patent licensing companies are “patent trolls” who are destroyers of all things good—and the SHIELD Act will save us all from these dastardly “trolls” (is a troll anything but dastardly?).

As I and other scholars have pointed out, the “patent troll” moniker is really just a rhetorical epithet that lacks even an agreed-upon definition.  The term is used loosely enough that it sometimes covers and sometimes excludes universities, Thomas Edison, Elias Howe (the inventor of the lockstitch in 1843), Charles Goodyear (the inventor of vulcanized rubber in 1839), and even companies like IBM.  How can we be expected to have a reasonable discussion about patent policy when our basic terms of public discourse shift in meaning from blog to blog, article to article, speaker to speaker?  The same is true of the new term, “Patent Assertion Entities,” which sounds more neutral, but has the same problem in that it also lacks any objective definition or usage.

Setting aside this basic problem of terminology for the moment, the SHIELD Act is anything but a “smarter patent law” (to quote President Obama). Some patent scholars, like Michael Risch, have begun to point out some of the serious problems with the SHIELD Act, such as its selectively discriminatory treatment of certain types of patent-owners.  Moreover, as Professor Risch ably identifies, this legislation was so cleverly drafted to cover only a limited set of a specific type of patent-owner that it ended up being too clever. Unlike the previous version introduced last year, the 2013 SHIELD Act does not even apply to the flavor-of-the-day outrage over patent licensing companies—the owner of the podcast patent. (Although you wouldn’t know this if you read the supporters of the SHIELD Act like the EFF who falsely claim that this law will stop patent-owners like the podcast patent-owning company.)

There are many things wrong with the SHIELD Act, but one thing that I want to highlight here is that it is based on a falsehood: the oft-repeated claim that two Boston University researchers have proven in a study that “patent troll suits cost American technology companies over $29 billion in 2011 alone.”  This is what Rep. DeFazio said when he introduced the SHIELD Act on March 1. The claim was repeated yesterday by House Members during a hearing on “Abusive Patent Litigation.” The claim that patent licensing companies cost American tech companies $29 billion in a single year (2011) has become gospel since this study, The Direct Costs from NPE Disputes, was released last summer on the Internet. (Another name for patent licensing companies is “Non-Practicing Entity” or “NPE.”)  A Google search of “patent troll 29 billion” produces 191,000 hits. A Google search of “NPE 29 billion” produces 605,000 hits. Such is the making of conventional wisdom.

The problem with conventional wisdom is that it is usually incorrect, and the study that produced the claim of “$29 billion imposed by patent trolls” is no different. The $29 billion cost study is deeply and fundamentally flawed, as explained by two noted professors, David Schwartz and Jay Kesan, who are also highly regarded for their empirical and economic work in patent law.  In their essay, Analyzing the Role of Non-Practicing Entities in the Patent System, also released late last summer, they detailed at great length serious methodological and substantive flaws in The Direct Costs from NPE Disputes. Unfortunately, the Schwartz and Kesan essay has gone virtually unnoticed in the patent policy debates, while the $29 billion cost claim has through repetition become truth.

In the hope that at least a few more people might discover the Schwartz and Kesan essay, I will briefly summarize some of their concerns about the study that produced the $29 billion cost figure.  This is not merely an academic exercise.  Since Rep. DeFazio explicitly relied on the $29 billion cost claim to justify the SHIELD Act, and he and others keep repeating it, it’s important to know if it is true, because it’s being used to drive proposed legislation in the real world.  If patent legislation is supposed to secure innovation, then it behooves us to know if this legislation is based on actual facts. Yet, as Schwartz and Kesan explain in their essay, the $29 billion cost claim is based on a study that is fundamentally flawed in both substance and methodology.

In terms of its methodological flaws, the study supporting the $29 billion cost claim employs an incredibly broad definition of “patent troll” that covers almost every person, corporation, or university that sues someone for infringing a patent that is not currently being used to manufacture a product at that moment.  While the meaning of the “patent troll” epithet shifts depending on the commentator, reporter, blogger, or scholar who is using it, one would be extremely hard-pressed to find anyone embracing this expansive usage in patent scholarship or similar commentary today.

There are several reasons why the extremely broad definition of “NPE” or “patent troll” in the study is unusual even compared to uses of this term in other commentary or studies. First, and most absurdly, this definition, by necessity, includes every university in the world that sues someone for infringing one of its patents, as universities don’t manufacture goods.  Second, it includes every individual and start-up company who plans to manufacture a patented invention, but is forced to sue an infringer-competitor who thwarted these business plans by its infringing sales in the marketplace.  Third, it includes commercial firms throughout the wide-ranging innovation industries—from high tech to biotech to traditional manufacturing—that have at least one patent among a portfolio of thousands that is not being used at the moment to manufacture a product because it may be “well outside the area in which they make products” and yet they sue infringers of this patent (the quoted language is from the study). So, according to this study, every manufacturer becomes an “NPE” or “patent troll” if it strays too far from what somebody subjectively defines as its rightful “area” of manufacturing. What company is not branded an “NPE” or “patent troll” under this definition, or will necessarily become one in the future given inevitable changes in one’s business plans or commercial activities? This is particularly true for every person or company whose only current opportunity to reap the benefit of their patented invention is to license the technology or to litigate against the infringers who refuse license offers.

So, when almost every possible patent-owning person, university, or corporation is defined as an “NPE” or “patent troll,” why are we surprised that a study employing this virtually boundless definition concludes that such entities create $29 billion in litigation costs per year?  The only surprising thing is that the number isn’t even higher!

There are many other methodological flaws in the $29 billion cost study, such as its explicit assumption that patent litigation costs are “too high” without providing any comparative baseline for this conclusion.  What are the costs in other areas of litigation, such as standard commercial litigation, tort claims, or disputes over complex regulations?  We are not told.  What are the historical costs of patent litigation?  We are not told.  On what basis then can we conclude that $29 billion is “too high” or even “too low”?  We’re supposed to be impressed by a number that exists in a vacuum and that lacks any empirical context by which to evaluate it.

The $29 billion cost study also assumes that all litigation transaction costs are deadweight losses, which would mean that the entire U.S. court system is a deadweight loss according to the terms of this study.  Every lawsuit, whether a contract, tort, property, regulatory, or constitutional dispute, is, under the study’s assumption, a deadweight loss.  The entire U.S. court system is an inefficient cost imposed on everyone who uses it.  Really?  That’s an assumption that reduces itself to absurdity—a self-imposed reductio ad absurdum!

In addition to the methodological problems, there are also serious concerns about the trustworthiness and quality of the actual data used to reach the $29 billion claim in the study.  All studies rely on data, and in this case, the $29 billion study used data from a secret survey done by RPX of its customers.  For those who don’t know, RPX’s business model is to defend companies against these so-called “patent trolls.”  So, a company whose business model is predicated on hyping the threat of “patent trolls” does a secret survey of its paying customers, and it is now known that RPX informed its customers in the survey that their answers would be used to lobby for changes in the patent laws.

As every reputable economist or statistician will tell you, such conditions encourage exaggeration and bias in a data sample by motivating participation among those who support changes to the patent law.  Such a problem even has a formal name in economic studies: self-selection bias.  But one doesn’t need to be an economist or statistician to be able to see the problems in relying on the RPX data to conclude that NPEs cost $29 billion per year. As the classic adage goes, “Something is rotten in the state of Denmark.”

Even worse, as I noted above, the RPX survey was confidential.  RPX has continued to invoke “client confidences” in refusing to disclose its actual customer survey or the resulting data, which means that the data underlying the $29 billion claim is completely unknown and unverifiable for anyone who reads the study.  Don’t worry, the researchers have told us in a footnote in the study, they looked at the data and confirmed it is good.  Again, it doesn’t take economic or statistical training to know that something is not right here. Another classic cliché comes to mind at this point: “it’s not the crime, it’s the cover-up.”

In fact, keeping data secret in a published study violates well-established and longstanding norms in all scientific research that data should always be made available for testing and verification by third parties.  No peer-reviewed medical or scientific journal would publish a study based on a secret data set in which the researchers have told us that we should simply trust them that the data is accurate.  Its use of secret data probably explains why the $29 billion study has not yet appeared in a peer-reviewed journal, and, if economics has any claim to being an actual science, this study never will.  If a study does not meet basic scientific standards for verifying data, then why are Reps. DeFazio and Chaffetz relying on it to propose national legislation that directly impacts the patent system and future innovation?  If heads-in-the-clouds academics would know to reject such a study as based on unverifiable, likely biased claptrap, then why are our elected officials embracing it to create real-world legal rules?

And, to continue our running theme of classic clichés, there’s the rub. The more one looks at the actual legal requirements of the SHIELD Act, and at the studies and arguments offered in its support, the more one is left, in the words of Professor Risch, “scratching one’s head” in bewilderment.  The more one thinks about the SHIELD Act, the more one realizes what it is—legislation crafted at the behest of the politically powerful (such as an Internet company that can get the President to make a special appearance on its own social media website) to have the government eliminate a smaller, publicly reviled, and less politically connected group.

In short, people may have legitimate complaints about the ways in which the court system in the U.S. generally has problems.  Commentators and Congresspersons could even consider revising the general legal rules governing patent litigation for all plaintiffs and defendants to make the litigation system work better or more efficiently (by some established metric).  Professor Risch has done exactly this in a recent Wired op-ed.  But it’s time to call a spade a spade: the SHIELD Act is a classic example of rent-seeking, discriminatory legislation.

Over at Cato Unbound, there has been a discussion this past month on copyright and copyright reform.  In his recent contribution to this discussion, Mark Schultz posted an excellent essay today, Where are the Creators? Consider Creators in Copyright Reform, that calls out the cramped, reductionist view of copyright policy that leads some libertarians and conservatives to castigate this property right as “regulation” or as “monopoly.”  Here’s a small taste from his essay:

I am genuinely puzzled when copyright discussions treat creative works as if they are a pre-existing resource that the government arbitrarily allocates. They are not. They aren’t an imaginary regulatory entitlement, such as pollution credits. They aren’t leases or mineral rights on public land handed out to political cronies. Creative works are, instead, the productive intellectual labor of private parties. Real people make this stuff.

At this point in the discussion, a common rhetorical move is to reject what some scholars describe as the romantic myth of authorship. Copyright skeptics point out that authors build on the work of others and that many creative works are the work of corporations, not individuals. This argument was provoked by many decades—a couple centuries, really—of rhetoric that put the individual author on a pedestal. Even if one concedes that authors have, perhaps, been idealized, taking them for granted goes too far.

The absence of creators from the critique of copyright is one of many reasons I doubt the political (and moral) appeal of much of the case for copyright reform we have heard from a few libertarians and conservatives. At the risk of dredging up tiresome memories from the recent presidential election, the argument over “you didn’t build that” was very familiar to me as a scholar of copyright. In both instances, there is a divide between those who value (or, even, romanticize) individual achievement and those who emphasize how much that achievement depends on a social context.

This follows Mark’s earlier and equally excellent essay, Copyright Reform through Private Ordering, in which he identifies how defining and securing copyright as a property right is consistent with and advances the private-ordering regimes embraced by advocates of the free market.  Again, here’s a small taste:

Like other forms of property, copyright thus represents an invitation to a transaction and an opportunity to bargain. This opportunity for parties to transact and bargain is one of the key differences between property and regulation. A regulator has a duty to enforce the law—and if a regulator chooses not to enforce, then a court may order him to do so. Copyright owners need not enforce their rights, of course. Moreover, it is perfectly legitimate to offer a property owner money to forgo their right to enforce their copyrights; such commercial transactions are really the whole point of copyright. Make the same offer to a regulator, and you go to jail.

Read these essays in their entirety—both are available here and here—as Mark does a great job, in the very brief and limited space of a blog, of providing both the important data and the principled arguments for how copyright is fundamentally consistent with and advances the aspirations of the free market and limited government.  This follows his earlier, excellent blog posting at the Copyright Alliance that touched on similar themes, Copyright, Economic Freedom, and the RSC Policy Brief.

DISCLOSURE: Mark and I are both on the Academic Advisory Board of the Copyright Alliance.

For loyal readers of Truth on the Market who are in the D.C. area on Tuesday, come check out this fun talk.  Heck, forget the talk, they’re serving tea and cookies!

 
