Archives For scholarship

As Thom previously posted, he and I have a new paper explaining The Case for Doing Nothing About Common Ownership of Small Stakes in Competing Firms. Our paper is a response to cries from the likes of Einer Elhauge and of Eric Posner, Fiona Scott Morton, and Glen Weyl, who have called for various types of antitrust action to rein in what they claim is an “economic blockbuster” and “the major new antitrust challenge of our time,” respectively. This is the first in a series of posts that will unpack some of the issues and arguments we raise in our paper.

At issue is the growth in the incidence of common ownership across firms within various industries. In particular, institutional investors with broad portfolios frequently report owning small stakes in a number of firms within a given industry. Although small, these stakes may still represent large block holdings relative to other investors. This intra-industry diversification, critics claim, changes the managerial objectives of corporate executives from aggressively competing to increase their own firm’s profits to tacitly colluding to increase industry-level profits instead. The reason for this change is that competition by one firm comes at a cost of profits from other firms in the industry. If investors own shares across firms, then any competitive gains in one firm’s stock are offset by competitive losses in the stocks of other firms in the investor’s portfolio. If one assumes corporate executives aim to maximize total value for their largest shareholders, then managers would have an incentive to soften competition against firms with which they share common ownership. Or so the story goes (more on that in a later post).

Elhauge and Posner, et al., draw their motivation for new antitrust offenses from a handful of papers that purport to establish an empirical link between the degree of common ownership among competing firms and various measures of softened competitive behavior, including airline prices, banking fees, executive compensation, and even corporate disclosure patterns. The paper of most note, by José Azar, Martin Schmalz, and Isabel Tecu and forthcoming in the Journal of Finance, claims to identify a causal link between the degree of common ownership among airlines competing on a given route and the fares charged for flights on that route.

Measuring common ownership with MHHI

Azar, et al.’s airline paper uses a metric of industry concentration called a Modified Herfindahl–Hirschman Index, or MHHI, to measure the degree of industry concentration taking into account the cross-ownership of investors’ stakes in competing firms. The original Herfindahl–Hirschman Index (HHI) has long been used as a measure of industry concentration, debuting in the Department of Justice’s Horizontal Merger Guidelines in 1982. The HHI is calculated by squaring the market share of each firm in the industry and summing the resulting numbers.
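As a quick illustration, the HHI calculation is simple enough to write out in a few lines of code (a minimal sketch; the four-firm market and its shares are made up for illustration):

```python
def hhi(market_shares):
    """Herfindahl-Hirschman Index: the sum of squared market shares.

    Shares are expressed in percentage points (0-100), as in the Merger
    Guidelines, so a pure monopoly scores 10,000 and a highly fragmented
    market approaches 0.
    """
    return sum(s ** 2 for s in market_shares)

# Hypothetical four-firm market with shares of 40%, 30%, 20%, and 10%:
# 1600 + 900 + 400 + 100 = 3000.
print(hhi([40, 30, 20, 10]))  # 3000
```

A score of 3,000 would fall in the “highly concentrated” range under the agencies’ thresholds.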

The MHHI is rather more complicated. MHHI is composed of two parts: the HHI measuring product market concentration and the MHHI_Delta measuring the additional concentration due to common ownership. We offer a step-by-step description of the calculations and their economic rationale in an appendix to our paper. For this post, I’ll try to distill that down. The MHHI_Delta essentially has three components, each of which is measured relative to every possible competitive pairing in the market as follows:

  1. A measure of the degree of common ownership between Company A and Company -A (Not A). This is calculated by multiplying the percentage of Company A shares owned by each Investor I with the percentage of shares Investor I owns in Company -A, then summing those values across all investors in Company A. As this value increases, MHHI_Delta goes up.
  2. A measure of the degree of ownership concentration in Company A, calculated by squaring the percentage of shares owned by each Investor I and summing those numbers across investors. As this value increases, MHHI_Delta goes down.
  3. A measure of the degree of product market power exerted by Company A and Company -A, calculated by multiplying the market shares of the two firms. As this value increases, MHHI_Delta goes up.

This process is repeated and aggregated first for every pairing of Company A and each competing Company -A, then repeated again for every other company in the market relative to its competitors (e.g., Companies B and -B, Companies C and -C, etc.). Mathematically, MHHI_Delta takes the form:

$$\mathrm{MHHI\_Delta} = \sum_{A}\,\sum_{-A \neq A} s_{A}\, s_{-A}\, \frac{\sum_{I} \beta_{I,A}\, \beta_{I,-A}}{\sum_{I} \beta_{I,A}^{2}}$$

where the s’s represent the firm market shares of, and the β’s represent the ownership shares of Investor I in, the respective companies A and -A.
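To make the three components concrete, here is a minimal sketch of the calculation under the simple proportional-control assumption (this is illustrative only, not the authors’ code, and the firm shares and investor stakes below are invented):

```python
def mhhi_delta(shares, stakes):
    """MHHI_Delta for a single market, assuming proportional control.

    shares: list of firm market shares (fractions summing to 1).
    stakes: stakes[i][j] is Investor i's fractional ownership of firm j.
    """
    n_firms = len(shares)
    total = 0.0
    for a in range(n_firms):
        # Component 2: ownership concentration in Company A (the denominator).
        concentration = sum(inv[a] ** 2 for inv in stakes)
        for b in range(n_firms):
            if b == a:
                continue
            # Component 1: common ownership between Company A and Company -A.
            common = sum(inv[a] * inv[b] for inv in stakes)
            # Component 3: product market power, via the product of shares.
            total += shares[a] * shares[b] * common / concentration
    return total

# Two firms splitting the market; Investor 1 holds 10% of each firm,
# while Investor 2 holds 20% of firm 0 only.
print(round(mhhi_delta([0.5, 0.5], [[0.10, 0.10], [0.20, 0.00]]), 2))  # 0.3
```

Note that the pairing is not symmetric: firm 1’s reported ownership here consists entirely of the cross-owning investor, so for that firm the ratio on the right is at its maximum and its term contributes far more to MHHI_Delta than firm 0’s does.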

As the relative concentration of cross-owning investors to all investors in Company A increases (i.e., the ratio on the right increases), managers are assumed to be more likely to soften competition with that competitor. As those two firms control more of the market, managers’ ability to tacitly collude and increase joint profits is assumed to be higher. Consequently, the empirical research assumes that as MHHI_Delta increases, we should observe less competitive behavior.

And indeed that is the “blockbuster” evidence giving rise to Elhauge’s and Posner, et al.’s arguments. For example, Azar, et al., calculate HHI and MHHI_Delta for every US airline market (defined either as city-pairs or departure-destination pairs) for each quarter of the 14-year time period in their study. They then regress ticket prices for each route against the HHI and the MHHI_Delta for that route, controlling for a number of other potential factors. They find that airfare prices are 3% to 7% higher due to common ownership. Other papers using the same or similar measures of common ownership concentration have likewise identified positive correlations between MHHI_Delta and their respective measures of anti-competitive behavior.

Problems with the problem and with the measure

We argue that both the theoretical argument underlying the empirical research and the empirical research itself suffer from some serious flaws. On the theoretical side, we have two concerns. First, we argue that there is a tremendous leap of faith (if not logic) in the idea that corporate executives would forgo their own self-interest and the interests of the vast majority of shareholders and soften competition simply because a small number of small stakeholders are intra-industry diversified. Second, we argue that even if managers were so inclined, it clearly is not the case that softening competition would necessarily be desirable for institutional investors that are both intra- and inter-industry diversified, since supra-competitive pricing to increase profits in one industry would decrease profits in related industries that may also be in the investors’ portfolios.

On the empirical side, we have concerns both with the data used to calculate the MHHI_Deltas and with the nature of the MHHI_Delta itself. First, the data on institutional investors’ holdings are taken from Schedule 13 filings, which report aggregate holdings across all the institutional investor’s funds. Using these data masks the actual incentives of the institutional investors with respect to investments in any individual company or industry. Second, the construction of the MHHI_Delta suffers from serious endogeneity concerns, both in investors’ shareholdings and in market shares. Finally, the MHHI_Delta, while seemingly intuitive, is an empirical unknown. While HHI is theoretically bounded in a way that lends itself to interpretation of its calculated value, the same is not true for MHHI_Delta. This makes any inference or policy based on nominal values of MHHI_Delta completely arbitrary at best.

We’ll expand on each of these concerns in upcoming posts. We will then take on the problems with the policy proposals being offered in response to the common ownership ‘problem.’


I’ll be participating in two excellent antitrust/consumer protection events next week in DC, both of which may be of interest to our readers:

5th Annual Public Policy Conference on the Law & Economics of Privacy and Data Security

hosted by the GMU Law & Economics Center’s Program on Economics & Privacy, in partnership with the Future of Privacy Forum and the Journal of Law, Economics & Policy.

Conference Description:

Data flows are central to an increasingly large share of the economy. A wide array of products and business models—from the sharing economy and artificial intelligence to autonomous vehicles and embedded medical devices—rely on personal data. Consequently, privacy regulation leaves a large economic footprint. As with any regulatory enterprise, the key to sound data policy is striking a balance between competing interests and norms that leaves consumers better off; finding an approach that addresses privacy concerns, but also supports the benefits of technology is an increasingly complex challenge. Not only is technology continuously advancing, but individual attitudes, expectations, and participation vary greatly. New ideas and approaches to privacy must be identified and developed at the same pace and with the same focus as the technologies they address.

This year’s symposium will include panels on Unfairness under Section 5: Unpacking “Substantial Injury”, Conceptualizing the Benefits and Costs from Data Flows, and The Law and Economics of Data Security.

I will be presenting a draft paper, co-authored with Kristian Stout, on the FTC’s reasonableness standard in data security cases following the Commission decision in LabMD, entitled, When “Reasonable” Isn’t: The FTC’s Standard-less Data Security Standard.

Conference Details:

  • Thursday, June 8, 2017
  • 8:00 am to 3:40 pm
  • at George Mason University, Founders Hall (next door to the Law School)
    • 3351 Fairfax Drive, Arlington, VA 22201

Register here

View the full agenda here

 

The State of Antitrust Enforcement

hosted by the Federalist Society.

Panel Description:

Antitrust policy during much of the Obama Administration was a continuation of the Bush Administration’s minimal involvement in the market. However, at the end of President Obama’s term, there was a significant pivot to investigations and blocks of high-profile mergers such as Halliburton-Baker Hughes, Comcast-Time Warner Cable, Staples-Office Depot, Sysco-US Foods, and Aetna-Humana and Anthem-Cigna. How will or should the new Administration analyze proposed mergers, including certain high-profile deals like Walgreens-Rite Aid, AT&T-Time Warner, Inc., and DraftKings-FanDuel?

Join us for a lively luncheon panel discussion that will cover these topics and the anticipated future of antitrust enforcement.

Speakers:

  • Albert A. Foer, Founder and Senior Fellow, American Antitrust Institute
  • Professor Geoffrey A. Manne, Executive Director, International Center for Law & Economics
  • Honorable Joshua D. Wright, Professor of Law, George Mason University School of Law
  • Moderator: Honorable Ronald A. Cass, Dean Emeritus, Boston University School of Law and President, Cass & Associates, PC

Panel Details:

  • Friday, June 09, 2017
  • 12:00 pm to 2:00 pm
  • at the National Press Club, MWL Conference Rooms
    • 529 14th Street, NW, Washington, DC 20045

Register here

Hope to see everyone at both events!

TOTM is pleased to welcome guest blogger Nicolas Petit, Professor of Law & Economics at the University of Liege, Belgium.

Nicolas has also recently been named a (non-resident) Senior Scholar at ICLE (joining Joshua Wright, Joanna Shepherd, and Julian Morris).

Nicolas is also (as of March 2017) a Research Professor at the University of South Australia, co-director of the Liege Competition & Innovation Institute and director of the LL.M. program in EU Competition and Intellectual Property Law. He is also a part-time advisor to the Belgian competition authority.

Nicolas is a prolific scholar specializing in competition policy, IP law, and technology regulation. He is the co-author (with Damien Geradin and Anne Layne-Farrar) of EU Competition Law and Economics (Oxford University Press, 2012) and the author of Droit européen de la concurrence (Domat Montchrestien, 2013), a monograph that was awarded the prize for the best law book of the year at the Constitutional Court in France.

One of his most recent papers, Significant Impediment to Industry Innovation: A Novel Theory of Harm in EU Merger Control?, was recently published as an ICLE Competition Research Program White Paper. His scholarship is available on SSRN and he tweets at @CompetitionProf.

Welcome, Nicolas!

Please Join Us For A Conference On Intellectual Property Law

INTELLECTUAL PROPERTY & GLOBAL PROSPERITY

Keynote Speaker: Dean Kamen

October 6-7, 2016

Antonin Scalia Law School
George Mason University
Arlington, Virginia

CLICK HERE TO REGISTER NOW

**9 Hours CLE**

As it begins its hundredth year, the FTC is increasingly becoming the Federal Technology Commission. The agency’s role in regulating data security, privacy, the Internet of Things, high-tech antitrust and patents, among other things, has once again brought to the forefront the question of the agency’s discretion and the sources of the limits on its power. Please join us this Monday, December 16th, for a half-day conference launching the year-long “FTC: Technology & Reform Project,” which will assess both process and substance at the FTC and recommend concrete reforms to help ensure that the FTC continues to make consumers better off.

FTC Commissioner Josh Wright will give a keynote luncheon address titled, “The Need for Limits on Agency Discretion and the Case for Section 5 UMC Guidelines.” Project members will discuss the themes raised in our inaugural report and how they might inform some of the most pressing issues of FTC process and substance confronting the FTC, Congress and the courts. The afternoon will conclude with a Fireside Chat with former FTC Chairmen Tim Muris and Bill Kovacic, followed by a cocktail reception.

Full Agenda:

  • Lunch and Keynote Address (12:00-1:00)
    • FTC Commissioner Joshua Wright
  • Introduction to the Project and the “Questions & Frameworks” Report (1:00-1:15)
    • Gus Hurwitz, Geoffrey Manne and Berin Szoka
  • Panel 1: Limits on FTC Discretion: Institutional Structure & Economics (1:15-2:30)
    • Jeffrey Eisenach (AEI | Former Economist, BE)
    • Todd Zywicki (GMU Law | Former Director, OPP)
    • Tad Lipsky (Latham & Watkins)
    • Geoffrey Manne (ICLE) (moderator)
  • Panel 2: Section 5 and the Future of the FTC (2:45-4:00)
    • Paul Rubin (Emory University Law and Economics | Former Director of Advertising Economics, BE)
    • James Cooper (GMU Law | Former Acting Director, OPP)
    • Gus Hurwitz (University of Nebraska Law)
    • Berin Szoka (TechFreedom) (moderator)
  • A Fireside Chat with Former FTC Chairmen (4:15-5:30)
    • Tim Muris (Former FTC Chairman | George Mason University) & Bill Kovacic (Former FTC Chairman | George Washington University)
  • Reception (5:30-6:30)
Our conference is a “widely-attended event.” Registration is $75 but free for nonprofit, media and government attendees. Space is limited, so RSVP today!

Working Group Members:
Howard Beales
Terry Calvani
James Cooper
Jeffrey Eisenach
Gus Hurwitz
Thom Lambert
Tad Lipsky
Geoffrey Manne
Timothy Muris
Paul Rubin
Joanna Shepherd-Bailey
Joe Sims
Berin Szoka
Sasha Volokh
Todd Zywicki

[Cross posted at the Center for the Protection of Intellectual Property blog.]

Today’s public policy debates frame copyright policy solely in terms of a “trade off” between the benefits of incentivizing new works and the social deadweight losses created by the access restrictions these (temporary) “monopolies” impose. I recently posted to SSRN a new research paper, called How Copyright Drives Innovation in Scholarly Publishing, explaining that this is a fundamental mistake that has distorted the policy debates about scholarly publishing.

This policy mistake is important because it has led commentators and decision-makers to dismiss as irrelevant to copyright policy the investments by scholarly publishers of $100s of millions in creating innovative distribution mechanisms in our new digital world. These substantial sunk costs are in addition to the $100s of millions expended annually by publishers in creating, publishing and maintaining reliable, high-quality, standardized articles distributed each year in a wide-ranging variety of academic disciplines and fields of research. The articles now number in the millions themselves; in 2009, for instance, over 2,000 publishers issued almost 1.5 million articles just in the scientific, technical and medical fields, exclusive of the humanities and social sciences.

The mistaken incentive-to-invent conventional wisdom in copyright policy is further compounded by widespread misinformation today about the allegedly “zero cost” of digital publication. As a result, many people are simply unaware of the substantial investments in infrastructure, skilled labor and other resources required to create, publish and maintain scholarly articles on the Internet and in other digital platforms.

This is not merely a so-called “academic debate” about copyright policy and publishing.

The policy distortion caused by the narrow, reductionist incentive-to-create conventional wisdom, when combined with the misinformation about the economics of digital business models, has been spurring calls for “open access” mandates for scholarly research, such as at the National Institute of Health and in recently proposed legislation (FASTR Act) and in other proposed regulations. This policy distortion even influenced Justice Breyer’s opinion in the recent decision in Kirtsaeng v. John Wiley & Sons (U.S. Supreme Court, March 19, 2013), as he blithely dismissed commercial incentives as being irrelevant to fundamental copyright policy. These legal initiatives and the Kirtsaeng decision are motivated in various ways by the incentive-to-create conventional wisdom, by the misunderstanding of the economics of scholarly publishing, and by anti-copyright rhetoric on both the left and right, all of which has become more pervasive in recent years.

But, as I explain in my paper, courts and commentators have long recognized that incentivizing authors to produce new works is not the sole justification for copyright—copyright also incentivizes intermediaries like scholarly publishers to invest in and create innovative legal and market mechanisms for publishing and distributing articles that report on scholarly research. These two policies—the incentive to create and the incentive to commercialize—are interrelated, as both are necessary in justifying how copyright law secures the dynamic innovation that makes possible the “progress of science.” In short, if the law does not secure the fruits of labors of publishers who create legal and market mechanisms for disseminating works, then authors’ labors will go unrewarded as well.

As Justice Sandra Day O’Connor famously observed in the 1984 decision in Harper & Row v. Nation Enterprises: “In our haste to disseminate news, it should not be forgotten that the Framers intended copyright itself to be the engine of free expression. By establishing a marketable right to the use of one’s expression, copyright supplies the economic incentive to create and disseminate ideas.” Thus, in Harper & Row, the Supreme Court reached the uncontroversial conclusion that copyright secures the fruits of productive labors “where an author and publisher have invested extensive resources in creating an original work.” (emphases added)

This concern with commercial incentives in copyright law is not just theory; in fact, it is most salient in scholarly publishing because researchers are not motivated by the pecuniary benefits offered to authors in conventional publishing contexts. As a result of the policy distortion caused by the incentive-to-create conventional wisdom, some academics and scholars now view scholarly publishing by commercial firms who own the copyrights in the articles as “a form of censorship.” Yet, as courts have observed: “It is not surprising that [scholarly] authors favor liberal photocopying . . . . But the authors have not risked their capital to achieve dissemination. The publishers have.” As economics professor Mark McCabe observed (somewhat sardonically) in a research paper released last year for the National Academy of Sciences: he and his fellow academic “economists knew the value of their journals, but not their prices.”

The widespread ignorance among the public, academics and commentators about the economics of scholarly publishing in the Internet age is quite profound relative to the actual numbers.  Based on interviews with six different scholarly publishers—Reed Elsevier, Wiley, SAGE, the New England Journal of Medicine, the American Chemical Society, and the American Institute of Physics—my research paper details for the first time ever in a publication and at great length the necessary transaction costs incurred by any successful publishing enterprise in the Internet age.  To take but one small example from my research paper: Reed Elsevier began developing its online publishing platform in 1995, a scant two years after the advent of the World Wide Web, and its sunk costs in creating this first publishing platform and then digitally archiving its previously published content was over $75 million. Other scholarly publishers report similarly high costs in both absolute and relative terms.

Given the widespread misunderstandings of the economics of Internet-based business models, it bears noting that such high costs are not unique to scholarly publishers.  Microsoft reportedly spent $10 billion developing Windows Vista before it sold a single copy, and ultimately it did not sell many at all. Google regularly invests $100s of millions, such as $890 million in the first quarter of 2011, in upgrading its data centers.  It is somewhat surprising that such things still have to be pointed out a scant decade after the bursting of the dot-com bubble, a bubble precipitated by exactly the same mistaken view that businesses have somehow been “liberated” from the economic realities of cost by the Internet.

Just as with the extensive infrastructure and staffing costs, the actual costs incurred by publishers in operating the peer review system for their scholarly journals are also widely misunderstood.  Individual publishers now receive hundreds of thousands—the large scholarly publisher, Reed Elsevier, receives more than one million—manuscripts per year. Reed Elsevier’s annual budget for operating its peer review system is over $100 million, which reflects the full scope of staffing, infrastructure, and other transaction costs inherent in operating a quality-control system that rejects 65% of the submitted manuscripts. Reed Elsevier’s budget for its peer review system is consistent with industry-wide studies that have reported that the peer review system costs approximately $2.9 billion annually in operation costs (translating into dollars the £1.9 billion reported in the study). For those articles accepted for publication, there are additional, extensive production costs, and then there are extensive post-publication costs in updating hypertext links of citations, cyber security of the websites, and related digital issues.

In sum, many people mistakenly believe that scholarly publishers are no longer necessary because the Internet has made moot all such intermediaries of traditional brick-and-mortar economies—a viewpoint reinforced by the equally mistaken incentive-to-create conventional wisdom in the copyright policy debates today. But intermediaries like scholarly publishers face the exact same incentive problem that is universally recognized for authors by the incentive-to-create conventional wisdom: no one will make the necessary investments to create a work or to distribute it if the fruits of their labors are not secured to them. This basic economic fact—dynamic development of innovative distribution mechanisms requires substantial investment in both people and resources—is what makes commercialization an essential feature of both copyright policy and law (and of all intellectual property doctrines).

It is for this reason that copyright law has long promoted and secured the value that academics and scholars have come to depend on in their journal articles—reliable, high-quality, standardized, networked, and accessible research that meets the differing expectations of readers in a variety of fields of scholarly research. This is the value created by the scholarly publishers. Scholarly publishers thus serve an essential function in copyright law by making the investments in and creating the innovative distribution mechanisms that fulfill the constitutional goal of copyright to advance the “progress of science.”

DISCLOSURE: The paper summarized in this blog posting was supported separately by a Leonardo Da Vinci Fellowship and by the Association of American Publishers (AAP). The author thanks Mark Schultz for very helpful comments on earlier drafts, and the AAP for providing invaluable introductions to the five scholarly publishers who shared their publishing data with him.

NOTE: Some small copy-edits were made to this blog posting.

 

Available here.  Although not the first article to build on Orin Kerr’s brilliant paper, A Theory of Law (blog post here) (that honor belongs to Josh Blackman’s challenging and thought-provoking paper, My Own Theory of the Law) (blog post here), I think this is an important contribution to this burgeoning field.  It’s still a working paper, though, so comments are welcome.

In a response to my essay, The Trespass Fallacy in Patent Law, in which I explain why patent scholars like Michael Meurer, James Bessen, T.J. Chiang and others are committing the nirvana fallacy in their critiques of the patent system, my colleague, T.J. Chiang writes at PrawfsBlawg:

The Nirvana fallacy, at least as I understand it, is to compare an imperfect existing arrangement (such as the existing patent system) to a hypothetical idealized system. But the people comparing the patent system to real property—and I count myself among them—are not comparing it to an idealized fictional system, whether conceptualized as land boundaries or as estate boundaries. We are saying that, based on our everyday experiences, the real property system seems to work reasonably well because we don’t feel too uncertain about our real property rights and don’t get into too many disputes with our neighbors. This is admittedly a loose intuition, but it is not an idealization in the sense of using a fictional baseline. It is the same as saying that the patent system seems to work reasonably well because we see a lot of new technology in our everyday experience.

I would like to make two quick points in response to T.J.’s attempt at wiggling out from serving as one of the examples I identify in my essay as a patent scholar who uses trespass doctrine in a way that reflects the nirvana fallacy.

First, what T.J. describes himself as doing — comparing an actual institutional system to a “loose intuition” about another institutional system — is exactly what Harold Demsetz identified as the nirvana fallacy (when he coined the term in 1969).  When economists or legal scholars commit the nirvana fallacy, they always justify their idealized counterfactual standard by appeal to some intuition or gestalt sense of the world; in fact, Demsetz’s example of the nirvana fallacy is when economists have a loose intuition that regulation always works perfectly to fix market failures.  These economists do this for the simple reason that they’re social scientists, and so they have to make their critiques seem practical.

It’s like the infamous statement by Pauline Kael in 1972 (quoting from memory): “I can’t believe Nixon won, because I don’t know anyone who voted for him.” Similarly, what patent scholars like T.J. are doing is saying: “I can’t believe that trespass isn’t clear and efficient, because I don’t know anyone who has been involved in a trespass lawsuit or I don’t hear of any serious trespass lawsuits.”  Economists or legal scholars always have some anecdotal evidence — either personal experiences or merely an impressionistic intuition about other people — to offer as support for their counterfactual by which they’re evaluating (and criticizing) the actual facts of the world. The question is whether such an idealized counterfactual is a valid empirical metric or not; of course, it is not.  To do this is exactly what Demsetz criticized as the nirvana fallacy.

Ultimately, no social scientist or legal scholar ever commits the “nirvana fallacy” as T.J. has defined it in his blog posting, and this leads to my second point.  The best way to test T.J.’s definition is to ask: Does anyone know a single lawyer, legal scholar or economist who has committed the “nirvana fallacy” as defined by T.J.?  What economist or lawyer appeals to a completely imaginary “fictional baseline” as the standard for evaluating a real-world institution?

The answer to this question is obvious.  In fact, when I posited this exact question to T.J. in an exchange we had before he made his blog posting, he could not answer it.  The reason he couldn’t answer it is that no one says in legal scholarship or in economic scholarship: “I have a completely made-up, imaginary ‘fictionalized’ world to which I’m going to compare a real-world institution or legal doctrine.”  This certainly is not the meaning of the nirvana fallacy, and I’m fairly sure Demsetz would be surprised to learn that he identified a fallacy that according to T.J. has never been committed by a single economist or legal scholar. Ever.

In sum, what T.J. describes in his blog posting — using a “loose intuition” of an institution as an empirical standard for critiquing the operation of another institution — is the nirvana fallacy. Philosophers may posit completely imaginary and fictionalized baselines — it’s what they call “other worlds” — but that is not what social scientists and legal scholars do.  Demsetz was not talking about philosophers when he identified the nirvana fallacy.  Rather, he was talking about exactly what T.J. admits he does in his blog posting (and which he has done in his scholarship).

HT: Danny Sokol.

TOP 10 Papers for Journal of Antitrust: Antitrust Law & Policy eJournal June 4, 2012 to August 3, 2012.

Rank Downloads Paper Title
1 244 The Antitrust/Consumer Protection Paradox: Two Policies at War with Each Other 
Joshua D. Wright,
George Mason University – School of Law, Faculty,
Date posted to database: May 31, 2012
Last Revised: May 31, 2012
2 237 Cartels, Corporate Compliance and What Practitioners Really Think About Enforcement 
D. Daniel Sokol,
University of Florida – Levin College of Law,
Date posted to database: June 7, 2012
Last Revised: July 16, 2012
3 175 The Implications of Behavioral Antitrust 
Maurice E. Stucke,
University of Tennessee College of Law,
Date posted to database: July 17, 2012
Last Revised: July 17, 2012
4 167 The Oral Hearing in Competition Proceedings Before the European Commission 
Wouter P. J. WilsWouter P. J. Wils,
European Commission, University of London – School of Law,
Date posted to database: May 3, 2012
Last Revised: June 18, 2012
5 141 Citizen Petitions: An Empirical Study 
Michael A. CarrierDaryl Wander,
Rutgers University School of Law – Camden, Unaffiliated Authors – affiliation not provided to SSRN,
Date posted to database: June 4, 2012
Last Revised: June 4, 2012
6 138 The Role of the Hearing Officer in Competition Proceedings Before the European Commission 
Wouter P. J. WilsWouter P. J. Wils,
European Commission, University of London – School of Law,
Date posted to database: May 3, 2012
Last Revised: May 7, 2012
7 90 Google, in the Aftermath of Microsoft and Intel: The Right Approach to Antitrust Enforcement in Innovative High Tech Platform Markets? 
Fernando Diez,
University of Antonio de Nebrija,
Date posted to database: June 12, 2012
Last Revised: June 26, 2012
8 140 Dynamic Analysis and the Limits of Antitrust Institutions 
Douglas H. Ginsburg and Joshua D. Wright,
U.S. Court of Appeals for the District of Columbia, George Mason University – School of Law, Faculty,
Date posted to database: June 14, 2012
Last Revised: June 17, 2012
9 114 Optimal Antitrust Remedies: A Synthesis 
William H. Page,
University of Florida – Fredric G. Levin College of Law,
Date posted to database: May 17, 2012
Last Revised: July 29, 2012
10 111 An Economic Analysis of the AT&T-T-Mobile USA Wireless Merger 
Stanley M. Besen, Stephen Kletter, Serge Moresi, Steven C. Salop, and John Woodbury,
Charles River Associates (CRA), Charles River Associates (CRA), Charles River Associates (CRA), Georgetown University Law Center, Charles River Associates (CRA),
Date posted to database: April 25, 2012
Last Revised: April 25, 2012

An interesting new joint venture among Oxford University Press, Ariel Ezrachi, and Bill Kovacic (GW). It sounds like a fantastic idea with top-notch management, and it should be of interest to many of our readers.

The Journal of Antitrust Enforcement 

Call for Papers – The Journal of Antitrust Enforcement (OUP). Oxford University Press is delighted to announce the launch of a new competition law journal dedicated to antitrust enforcement. The Journal of Antitrust Enforcement is a joint collaboration among OUP, the Oxford University Centre for Competition Law and Policy, and the George Washington University Competition Law Center.

The Journal of Antitrust Enforcement will provide a platform for cutting edge scholarship relating to public and private competition law enforcement, both at the international and domestic levels.

The journal covers a wide range of enforcement related topics, including: public and private competition law enforcement, cooperation between competition agencies, the promotion of worldwide competition law enforcement, optimal design of enforcement policies, performance measurement, empirical analysis of enforcement policies, combination of functions in the mandate of the competition agency, competition agency governance, procedural fairness, competition enforcement and human rights, the role of the judiciary in competition enforcement, leniency, cartel prosecution, effective merger enforcement and the regulation of sectors.

Submission of papers: Original articles that advance the field are published following a peer and editorial review process. The editors welcome submission of papers on all subjects related to antitrust enforcement. Papers should range from 8,000 to 15,000 words (including footnotes) and should be prefaced by an abstract of less than 200 words.

General inquiries may be directed to the editors: Ariel Ezrachi at the Oxford CCLP or William Kovacic at George Washington University. Submission, by email, should be directed to the Managing Editor, Hugh Hollman.

Further information about the journal may be found online: http://www.oxfordjournals.org/our_journals/antitrust/

Along with my colleagues Todd Zywicki and Ilya Somin, I am a co-editor of the Supreme Court Economic Review, a peer-reviewed publication that is one of the country’s top-rated law and economics journals. SCER, together with its publisher, the University of Chicago Press, has put together a new submissions website. If you have a relevant submission, please submit it at the website for our review.

I’ve posted to SSRN an article written for the Antitrust Law Journal symposium on the Neo-Chicago School of Antitrust. The article, entitled “Abandoning Chicago’s Antitrust Obsession: The Case for Evidence-Based Antitrust,” focuses upon what I believe to be a central obstacle to the continued evolution of sensible antitrust rules in the courts and agencies: the dramatic proliferation of economic theories that could be used to explain antitrust-relevant business conduct. That proliferation has created a need for sensible criteria for selecting among these theories, a commitment not present in modern antitrust institutions. I refer to this as the “model selection problem,” describe how reliance upon shorthand labels and descriptions of the various “Chicago Schools” has distracted from the development of solutions to this problem, and identify a number of promising approaches to embedding a more serious commitment to empirical testing within modern antitrust.

Here is the abstract.

The antitrust community retains something of an inconsistent attitude towards evidence-based antitrust.  Commentators, judges, and scholars remain supportive of evidence-based antitrust, even vocally so; nevertheless, antitrust scholarship and policy discourse continues to press forward advocating the use of one theory over another as applied in a specific case, or one school over another with respect to the class of models that should inform the structure of antitrust’s rules and presumptions, without tethering those questions to an empirical benchmark.  This is a fundamental challenge facing modern antitrust institutions, one that I call the “model selection problem.”  The three goals of this article are to describe the model selection problem, to demonstrate that the intense focus upon so-called schools within the antitrust community has exacerbated the problem, and to offer a modest proposal to help solve the model selection problem.  This proposal has two major components: abandonment of terms like “Chicago School,” “Neo-Chicago School,” and “Post-Chicago School,” and replacement of those terms with a commitment to testing economic theories with economic knowledge and empirical data to support those theories with the best predictive power.  I call this approach “evidence-based antitrust.”  I conclude by discussing several promising approaches to embedding an appreciation for empirical testing more deeply within antitrust institutions.

I would refer interested readers to the work of my colleagues Tim Muris and Bruce Kobayashi (also prepared for the Antitrust L.J. symposium) Chicago, Post-Chicago, and Beyond: Time to Let Go of the 20th Century, which also focuses upon similar themes.