Archives For environment

Large portions of the country are expected to face a growing threat of widespread electricity blackouts in the coming years. For example, the Western Electricity Coordinating Council—the regional entity charged with overseeing the Western Interconnection grid that covers most of the Western United States and Canada—estimates that the subregion consisting of Colorado, Utah, Nevada, and portions of southern Wyoming, Idaho, and Oregon will, by 2032, see 650 hours (more than 27 days in total) over the course of the year when available resources may not be sufficient to accommodate peak demand.

Supply and demand provide the simplest explanation for the region’s rising risk of power outages. Demand is expected to continue to rise, while stable supplies are diminishing. Over the next 10 years, electricity demand across the entire Western Interconnection is expected to grow by 11.4%, while scheduled resource retirements are projected to contribute to growing resource-adequacy risk in every subregion of the grid.

The largest decreases in resources are from coal, natural gas, and hydropower. Anticipated additions of highly variable solar and wind resources, as well as battery storage, will not be sufficient to offset the decline from conventional resources. The Wall Street Journal reports that, while 21,000 MW of wind, solar, and battery-storage capacity are anticipated to be added to the grid by 2030, that’s only about half as much as expected fossil-fuel retirements.

In addition to the risk associated with insufficient power generation, many parts of the U.S. are facing another problem: insufficient transmission capacity. The New York Times reports that more than 8,100 energy projects were waiting for permission to connect to electric grids at year-end 2021. That was an increase from the prior year, when 5,600 projects were queued up.

One of the many reasons for the backlog, the Times reports, is the difficulty in determining who will pay for upgrades elsewhere in the system to support the new interconnections. These costs can be huge and unpredictable. Some upgrades that penciled out as profitable when first proposed may become uneconomic in the years it takes to earn regulatory approval, and end up being dropped. According to the Times:

That creates a new problem: When a proposed energy project drops out of the queue, the grid operator often has to redo studies for other pending projects and shift costs to other developers, which can trigger more cancellations and delays.

It also creates perverse incentives, experts said. Some developers will submit multiple proposals for wind and solar farms at different locations without intending to build them all. Instead, they hope that one of their proposals will come after another developer who has to pay for major network upgrades. The rise of this sort of speculative bidding has further jammed up the queue.

“Imagine if we paid for highways this way,” said Rob Gramlich, president of the consulting group Grid Strategies. “If a highway is fully congested, the next car that gets on has to pay for a whole lane expansion. When that driver sees the bill, they drop off. Or, if they do pay for it themselves, everyone else gets to use that infrastructure. It doesn’t make any sense.”

This is not a new problem, nor is it a problem that is unique to the electrical grid. In fact, the Federal Communications Commission (FCC) has been wrestling with this issue for years regarding utility-pole attachments.

Look up at your local electricity pole and you’ll see a bunch of stuff hanging off it. The cable company may be using it to provide cable service and broadband and the telephone company may be using it, too. These companies pay the pole owner to attach their hardware. But sometimes, the poles are at capacity and cannot accommodate new attachments. This raises the question of who should pay for the new, bigger pole: The pole owner, or the company whose attachment is driving the need for a new pole?

It’s not a simple question to answer.

In comments to the FCC, the International Center for Law & Economics (ICLE) notes:

The last-attacher-pays model may encourage both hold-up and hold-out problems that can obscure the economic reasons a pole owner would otherwise have to replace a pole before the end of its useful life. For example, a pole owner may anticipate, after a recent new attachment, that several other companies are also interested in attaching. In this scenario, it may be in the owner’s interest to replace the existing pole with a larger one to accommodate the expected demand. The last-attacher-pays arrangement, however, would diminish the owner’s incentive to do so. The owner could instead simply wait for a new attacher to pay the full cost of replacement, thereby creating a hold-up problem that has been documented in the record. This same dynamic also would create an incentive for some prospective attachers to hold-out before requesting an attachment, in expectation that some other prospective attacher would bear the costs.

This seems to be very similar to the problems facing electricity-transmission markets. In our comments to the FCC, we conclude:

A rule that unilaterally imposes a replacement cost onto an attacher is expedient from an administrative perspective but does not provide an economically optimal outcome. It likely misallocates resources, contributes to hold-outs and holdups, and is likely slowing the deployment of broadband to the regions most in need of expanded deployment. Similarly, depending on the condition of the pole, shifting all or most costs onto the pole owner would not necessarily provide an economically optimal outcome. At the same time, a complex cost-allocation scheme may be more economically efficient, but also may introduce administrative complexity and disputes that could slow broadband deployment. To balance these competing considerations, we recommend the FCC adopt straightforward rules regarding both the allocation of pole-replacement costs and the rates charged to attachers, and that these rules avoid shifting all the costs onto one or another party.

To ensure rapid deployment of new energy and transmission resources, federal, state, and local governments should turn to the lessons the FCC is learning in its pole-attachment rulemaking to develop a system that efficiently and fairly allocates the costs of expanding transmission connections to the electrical grid.

Following is the second in a series of posts on my forthcoming book, How to Regulate: A Guide for Policy Makers (Cambridge Univ. Press 2017).  The initial post is here.

As I mentioned in my first post, How to Regulate examines the market failures (and other private ordering defects) that have traditionally been invoked as grounds for government regulation.  For each such defect, the book details the adverse “symptoms” produced, the underlying “disease” (i.e., why those symptoms emerge), the range of available “remedies,” and the “side effects” each remedy tends to generate.  The first private ordering defect the book addresses is the externality.

I’ll never forget my introduction to the concept of externalities.  P.J. Hill, my much-beloved economics professor at Wheaton College, sauntered into the classroom eating a giant, juicy apple.  As he lectured, he meandered through the rows of seats, continuing to chomp on that enormous piece of fruit.  Every time he took a bite, juice droplets and bits of apple fell onto students’ desks.  Speaking with his mouth full, he propelled fruit flesh onto students’ class notes.  It was disgusting.

It was also quite effective.  Professor Hill was making the point (vividly!) that some activities impose significant effects on bystanders.  We call those effects “externalities,” he explained, because they are experienced by people who are outside the process that creates them.  When the spillover effects are adverse—costs—we call them “negative” externalities.  “Positive” externalities are spillovers of benefits.  Air pollution is a classic example of a negative externality.  Landscaping one’s yard, an activity that benefits one’s neighbors, generates a positive externality.

An obvious adverse effect (“symptom”) of externalities is unfairness.  It’s not fair for a factory owner to capture the benefits of its production while foisting some of the cost onto others.  Nor is it fair for a homeowner’s neighbors to enjoy her spectacular flower beds without contributing to their creation or maintenance.

A graver symptom of externalities is “allocative inefficiency,” a failure to channel productive resources toward the uses that will wring the greatest possible value from them.  When an activity involves negative externalities, people tend to do too much of it—i.e., to devote an inefficiently high level of productive resources to the activity.  That’s because a person deciding how much of the conduct at issue to engage in accounts for all of his conduct’s benefits, which ultimately inure to him, but only a portion of his conduct’s costs, some of which are borne by others.  Conversely, when an activity involves positive externalities, people tend to do too little of it.  In that case, they must bear all of the cost of their conduct but can capture only a portion of the benefit it produces.
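The over- and under-production logic can be made concrete with a small numeric sketch (the demand curve and cost figures below are illustrative assumptions, not numbers from the book):

```python
# Illustrative sketch of why negative externalities lead to "too much" of
# an activity. All numbers are hypothetical.
# The marginal benefit of the q-th unit falls as q rises; the actor bears a
# private marginal cost of 4 per unit, but each unit also imposes an
# external cost of 3 on bystanders.
def marginal_benefit(q):
    return 10 - q

PRIVATE_MC = 4    # cost the actor bears himself
EXTERNAL_MC = 3   # cost spilled onto others

# The actor expands output as long as marginal benefit covers his *private* cost...
private_q = max(q for q in range(11) if marginal_benefit(q) >= PRIVATE_MC)

# ...but the efficient level stops where marginal benefit covers the *full*
# social cost (private plus external).
social_q = max(q for q in range(11) if marginal_benefit(q) >= PRIVATE_MC + EXTERNAL_MC)

print(private_q, social_q)  # → 6 3: the actor produces twice the efficient amount
```

With these made-up numbers, the privately chosen output (6 units) exceeds the socially efficient output (3 units) precisely because three dollars of each unit’s cost never shows up in the actor’s own calculation.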

Because most government interventions addressing externalities have been concerned with negative externalities (and because How to Regulate includes a separate chapter on public goods, which entail positive externalities), the book’s externalities chapter focuses on potential remedies for cost spillovers.  There are three main options, which are discussed below the fold.

There’s some good news on the endangered species front:  Three species of endangered African antelopes — the Scimitar-Horned Oryx, Addax, and Dama Gazelle — are coming back with a vengeance.  At least in Texas, where the population of the three antelope species quadrupled from 2004 to 2010, growing to a combined total of around 17,000.

What’s the secret?  Private property rights and markets.  In 2005, the U.S. Fish and Wildlife Service (FWS), which administers the Endangered Species Act (ESA), created a blanket exception from the ESA’s “taking” prohibition for captive-bred U.S. antelope.  FWS recognized that the rare African antelopes have great value to trophy hunters and, accordingly, to ranchers who are able to set aside ideal habitat for the creatures.  The prospect of hefty bounties — up to $10,000 per antelope — has encouraged the formation of private preserves, much to the benefit of the three endangered species.

Unfortunately, an environmental organization operating under the misnomer “Friends of Animals” sued to stop hunting of the antelopes on private preserves.  “Hunting these antelope is no way to save them or treat them with dignity,” proclaimed the Friends of Animals vice-president (apparently ignoring the data on the antelopes’ population explosion in Texas). 

Today’s WSJ reports that Friends of Animals has procured new rules that will require exotic ranchers to obtain costly individual take permits for every instance of hunting.  Faced with the prospect of having to navigate the costly and time-consuming permit process, many exotic ranchers are considering whether to abandon their antelope operations altogether.  If they do so, we can expect the worldwide population of these antelope species to dwindle.  Yet another consequence of our perverse Endangered Species Act, which renders listed species a liability to landowners (thereby encouraging a “shoot, shovel, and shut up” strategy) and fights all efforts to encourage market-based conservation efforts.

Welcome Baby 7B!

Thom Lambert —  31 October 2011

According to the United Nations, sometime around Halloween a newborn baby will push the world’s population above seven billion people.  Welcome to our spectacular planet, Little One!

I should warn you that not everyone will greet your arrival as enthusiastically as I.  A great many smart folks on our planet—especially highly educated people in rich countries like my own—have fallen under the spell of this fellow named Malthus, who once warned that our planet was “overpopulated.”  Although Mr. Malthus’s ideas have been proven wrong time and again, his smart and influential disciples keep insisting that your arrival spells disaster, that this lonely planet just can’t support you. 

Now my own suspicion is that modern day Malthusians, who are smart enough to know that actual events have discredited their leader’s theories, continue to parrot Mr. Malthus’s ideas because they lend support to all manner of governmental intervention into private affairs.  (These smarty-pants Malthusians, who are well-aware of their own intelligence, tend to think they can arrange things better than the “men and women on the spot” and are constantly looking for reasons to go meddling in others’ business!)  Whatever their motivation, Mr. Malthus’s disciples just won’t shut up about how our planet is overpopulated.

You should know, though, that this simply isn’t true.  The first time you hear one of Mr. Malthus’s followers decrying your very existence by insisting that our planet is overpopulated, you should ask him or her:  “Overpopulated relative to what?”  Modern Malthusians can never give a good answer to that question, though they always try.

Sometimes they say “living space.”  But that’s plain silly.  Our planet is really pretty huge.  Indeed, if all seven billion people on the planet moved to the state of Alaska, each person would have 2,300 square feet of living space!  Now I realize lots of cities get crowded, but that’s because people choose to live in those areas—they’ve decided that the benefits of enhanced economic opportunity in a densely populated area outweigh the costs of close confines.  If they really wanted extra living space, they could easily find it in our planet’s vast uninhabited (or sparsely inhabited) regions.
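The Alaska figure holds up as a back-of-the-envelope calculation (a sketch, assuming the commonly cited land area of roughly 570,641 square miles and a world population of seven billion):

```python
# Back-of-the-envelope check of the Alaska living-space claim.
# Assumes Alaska's land area is about 570,641 square miles (a commonly
# cited figure) and a world population of 7 billion.
SQ_FT_PER_SQ_MILE = 5280 ** 2  # 27,878,400 square feet per square mile
alaska_land_sq_mi = 570_641
world_population = 7_000_000_000

total_sq_ft = alaska_land_sq_mi * SQ_FT_PER_SQ_MILE
sq_ft_per_person = total_sq_ft / world_population
print(round(sq_ft_per_person))  # → 2273, i.e., roughly the 2,300 square feet claimed
```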

Sometimes modern day Malthusians say the planet is overpopulated relative to available food.  Wrong again.  In the nations of the world where institutions have evolved to allow people to profit from coming up with new ideas that enhance welfare, individuals have developed all sorts of ways to get more food from less land.  Accordingly, food production has always outpaced population growth.  Now, modern day Malthusians will probably tell you that food prices have been rising in recent years — a sign that food is getting scarcer relative to people’s demand for it.  But that’s because governments, beholden to powerful agricultural lobbyists, have been requiring that huge portions of agricultural output be diverted to fuel production even though the primary biofuel (ethanol) provides no environmental benefit.  As usual, it’s actually bad government policy, not population growth, that’s creating scarcity.

In recent days, Mr. Malthus’s disciples have insisted that the world is overpopulated relative to available resources.  Nothing new here.  Back in the 1970s, lots of smart folks contended that the earth was quickly running out of resources and that drastic measures were required to constrain continued population growth.  One of those smarty pants was Stanford University biologist Paul Ehrlich, who, along with his wife Anne and President Obama’s science czar John Holdren, asked (in all seriousness): “Why should the law not be able to prevent a person from having more than two children?”  (See Paul R. Ehrlich, Anne H. Ehrlich & John P. Holdren, Ecoscience 838 (1977).)  (Ehrlich also proclaimed, in his 1968 blockbuster The Population Bomb, that “The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate.”)

In 1980, Prof. Ehrlich bet economist Julian Simon (a jolly fellow who would have welcomed your birth!) that the booming population would raise demand for resources so much that prices would skyrocket.  Mr. Simon thought otherwise and therefore allowed Prof. Ehrlich to pick five metals whose price he believed would rise over the next decade.  As it turns out, the five metals Prof. Ehrlich selected — chromium, copper, nickel, tin, and tungsten — fell in price as clever, profit-seeking humans discovered both how to extract more from the earth and how to substitute other, cheaper substances.  Mr. Simon was not at all surprised.  He recognized that the long-term price trend of most resources points downward, indicating that resources are becoming more plentiful, relative to human needs, over time.  (Modern Malthusians may point to some recent price trends showing rising prices for some resources, especially precious metals.  It’s likely, though, that those price increases are due to the fact that central banks all over the world have been creating lots and lots of money, thereby threatening inflation and causing investors to hold their wealth in the form of commodities.)

The fundamental mistake Mr. Malthus’s disciples make, Little One, is to assume that our planet is the ultimate source of resources.  That’s just not true.  Our planet does contain lots of useful “stuff,” but it’s human ingenuity — something only you and those like you can provide — that turns that stuff into “resources.”  Take oil, for instance.  For most of human history, messy crude oil was a source of annoyance for landowners.  It polluted their water and fouled their property.  But when whale oil prices started to rise in response to scarcity (or, put differently, when the world started to look “overpopulated” relative to whale oil), some clever, profit-seeking folks discovered how to turn that annoyance into kerosene, and eventually petroleum.  Voila!  A “resource” was created!

Just as people once worried about overpopulation relative to whale oil supplies, lots of folks now worry about overpopulation relative to crude oil.  Well I’m not that worried, and you shouldn’t be either.  As oil prices rise, more and more clever profit-seekers will turn their energies toward finding new ways to obtain oil (e.g., hydraulic fracturing), new techniques for reducing oil requirements (e.g., enhanced efficiency), and new substitutes for oil (e.g., alternative fuels).  Mr. Malthus’s disciples will continue to fret about the limits to growth, but the historical record is clear on this one:  Human ingenuity — the ultimate resource — always outpaces the diminution in useful “stuff.”

And so, Little Resource, your arrival on our planet should be celebrated, not scorned!  As you and your fellow newborns flex your creative muscle, you’ll develop new sources of wealth for the world.  As you do so, birth rates will plummet, as they typically do when societies become wealthier, and the demand for a cleaner environment, demand that rises with wealth, will grow.  We therefore need not worry about “overpopulation.”

We do, though, need to ensure the survival of those institutions — property rights, free markets, the rule of law — that encourage resource-creating innovation.  I, for one, promise to do my best to defend those institutions so that you and your fellow newborns can add to our planet’s resource base.

Apparently, the detergent industry has entered into what has been described as a “voluntary agreement” to reduce the use of phosphates in detergents (HT: Ted Frank).  A press release from Clean Water Action describes the agreement as follows:

On July 1, 2010 a voluntary ban on phosphates in dishwasher detergents will be implemented by many members of the American Cleaning Council (formerly the Soap and Detergent Association), a manufacturer’s trade group representing most detergent companies. “Industry’s announcement on phosphates in dishwasher detergents is welcome news, indeed, if somewhat overdue,” said Jonathan Scott, a spokesman for Clean Water Action, founded in the early 1970’s to fight for clean, safe water. “Even small amounts of phosphates can wreak havoc when they get into our water,” Scott says, “so it’s the last thing you want as an ingredient in detergents, which are specifically designed to end up in the water by way of household appliances and drain pipes.”

It is also apparent that some are none too pleased with the effects of reducing phosphate levels in detergents — with the primary downside being that the new product doesn’t work too well.  An article in the Weekly Standard describes the impact of the reduction:

The result is detergents that don’t work very well. There have been a handful of stories in the media about consumer complaints. The New York Times noted that on the review section of the website for Cascade—Procter & Gamble’s market-leading brand—ratings plummeted after the switch, with only 11 percent of consumers saying they would recommend the product. One woman in Florida told National Public Radio that she called Procter & Gamble to complain about how its detergent no longer worked. The customer rep told her to consider handwashing the dishes instead.

Some NPR commenters agreed. “Like so many others, I had disassembled my dishwasher, run multiple empty ‘cleaning cycles’ using all kinds of various chemical treatments, all trying to get my dishwasher ‘fixed,’ ” said one. “We assumed that something was wrong with the machine, that it was limed up, and we tried vinegar and other remedies with limited success,” wrote another. “We do wash some dishes by hand now, using more hot water than before, and also have simply lowered our standards for what constitutes ‘clean.’ ” Another commenter complained: “I live in AZ and had the same thing happen last year when it was introduced out here. I thought it was a reaction between the ‘Green’ soap and the hard water. I wrote to the company and they sent me about $30 in coupons—for other items and for their non-green soap. I dumped the 3 unopened bottles plus the one I was using.”

The detergents were so problematic that they caused environmental delinquency even among NPR listeners. One disappointed commenter rationalized his backsliding:

We first heard about the new phosphate-free detergent formulations almost a year ago. Wanting to do the Right Thing we rushed out and bought some and immediately began using it. The results, although not as bad as reported by some, were still pretty underwhelming. Our dishes and glassware were covered by a gritty film and so was the inside of the dishwasher. We are in Southern California and have very hard water. Adding vinegar to the rinse cycle helped *some* but still we found excessive buildup on our dishes. Disgusted with the new detergent, we decided to go back to something with phosphate. We were not able to find phosphate detergent at the supermarket, but some local discount stores sell supplies that are apparently remaindered by the manufacturers. We bought six boxes of old Cascade with phosphate—about a year’s supply. We figured someone would buy it—might as well be us.

When Consumer Reports did laboratory testing on the new nil-phosphate detergents, they concluded that none of them “equaled the excellent (but now discontinued) product that topped our Ratings in August 2009.”

There is, of course, an interesting antitrust angle here.  Thom has posted previously on another voluntary industry agreement in the soda industry to refrain from selling high calorie soda (and limiting the size of even healthy drinks) in schools.  In the comments to that post I suggest that one important issue is whether the soda players actually reached an agreement:

The passing or collective endorsement of a set of “best practices” to which members of the industry can voluntarily choose to adhere or not is not necessarily an actionable antitrust conspiracy. Of course, calling something “voluntary guidelines” won’t immunize an actual agreement if it is there. But it seems that the parties were pretty careful — EXCEPT for in their commercials and in print!!! — to make sure to emphasize that the antitrust-relevant choices were made unilaterally. But I can’t imagine antitrust counsel would have given the thumbs up to the commercials…

So it is here.  I’ve no doubt that such an agreement, if it exists, is reachable under Section 1 of the Sherman Act.  The question is whether the detergent industry has taken some steps to protect itself from an “agreement” finding under Section 1.  I don’t have enough detail to know whether that is the case — but if anybody does, please send it along.

That’s the punchline of a recent paper by Pierre Desrochers (U Toronto). Pierre has written some interesting papers on a range of topics related to economic development, technological innovation, and the intersection of business and the environment.   He argues that it is governmental (regulatory) failures that distort the environmental consequences of corporate behavior, not market failures. Should be an interesting read.

“The environmental responsibility of business is to increase its profits (by creating value within the bounds of private property rights).” Industrial and Corporate Change, vol. 19, no. 1 (February 2010), pp. 161-204.


Proponents of corporate social responsibility (CSR) typically consider “business as usual” unsustainable. Building on historical evidence that long predates the modern environmental movement, the contrary case is made that the interplay of voluntary exchange, private property rights, and self-interest has generally resulted in the so-called “triple bottom line” (economic, social, and environmental) through more efficient use of materials and the continual creation of higher quality resources. However, because market processes continually eliminate less competitive firms and tend to concentrate business activities geographically, political pressure brought to bear by adversely affected vested interests often results in the creation of policies that cause greater environmental harm than would otherwise be evident. Environmental CSR proponents often misinterpret these government failures as market failures, and characteristically advocate policies that further distract firms from their core objective and resulting triple bottom line. The article concludes by arguing that the most promising path toward truly sustainable development lies in the unwavering pursuit of profitability within the bounds of well-defined and enforced private property rights.

Today the SEC voted 3-2 to approve an interpretive release offering guidance to companies on disclosure obligations as they relate to climate change.  Commissioners Casey and Paredes voted to reject the proposed guidance.

Everyone can agree that companies may have an obligation under Regulation S-K to disclose risks arising from, among many other things, climate change laws and regulations.  But this guidance goes further, urging companies to disclose risks arising from “physical effects of climate change.”  This is nonsense on stilts.  Leave aside the underlying inanity of the larger enterprise built on the premise that rationally-ignorant and rationally-passive individual investors should read, assess and make active investment decisions on the basis of massive amounts of regularly-disclosed information.  Here corporations are asked to disclose “information” about risks to the company posed by future, possible environmental conditions about which the firms know nothing, the science is utterly unsettled and speculative, and the actual physical and economic consequences of which are even less certain.  (I put the word information in scare quotes because nothing so speculative and uncertain can reasonably be called information.)  What possible value could there be to investors (assuming there is any value to investors from disclosure of this type of information anyway) from the sorts of disclosures that would reasonably follow?  “There is somewhere between a 0% and 90% chance that the temperature during either the summer or the winter or maybe-but-not-definitely both will be either warmer or colder by somewhere between 0.01 and 5 degrees sometime within the next 300 years.  This may or may not be bad for our business depending on whether it makes our customers richer or poorer, improves or harms our productivity and helps or harms our competitors by more or less than it helps or harms us.”

At least Troy Paredes isn’t buying it and once again stands nearly alone (Commissioner Casey also voted to reject) for common sense in Washington.  An extended quote from his statement at today’s hearing:

Second, the release states that companies “whose businesses may be vulnerable to severe weather or climate related events should consider disclosing material risks of, or consequences from,” the “physical effects of climate change, such as effects on the severity of weather (for example, floods or hurricanes), sea levels, the arability of farmland, and water availability and quality.”

The prospect that this guidance will in fact foster confusion and uncertainty about a company’s required disclosures troubles me. What triggers a “reputational damage” or “physical effects” disclosure is far from certain, as is the scope of any such disclosure if and when required. More to the point, reputational damage and the impact on a company of the physical effects of climate change can be quite speculative. There is a notable risk that the interpretive release will encourage disclosures that are unlikely to improve investor decision making and may actually distract investors from focusing on more important information. Here, it is worth recalling that, in rejecting the view that a fact is “material” if an investor “might” find it important, Justice Marshall, writing for the Supreme Court in TSC Industries, warned that “management’s fear of exposing itself to substantial liability may cause it simply to bury the shareholders in an avalanche of trivial information — a result that is hardly conducive to informed decisionmaking.”

Also problematic are the interpretive release’s introductory and background discussions on climate change and its regulation. To me, the effect of the discussions is to find the Commission joining the ongoing debate over climate change by lending support to a particular view of climate change. Although the release does not expressly take sides, the release emphasizes the “concerns” and potential harms of climate change and discusses a range of regulatory and legislative developments, along with international efforts, aimed at regulating and otherwise remedying causes of climate change. In particular, the release highlights new EPA regulations, proposed “cap-and-trade” legislation, the Kyoto Protocol (which the U.S. has not ratified), the European Union Emissions Trading System, and recent discussions at the United Nations Climate Conference in Copenhagen. While the release stresses the risks of climate change and ongoing efforts to regulate greenhouse gas emissions in the U.S. and abroad, the release fails to recognize that the climate change debate remains unsettled and that many have questioned the appropriateness of the regulatory, legislative, and other initiatives aimed at reducing emissions that the release features. In short, I am troubled that the release does not strike a more neutral and balanced tone when it comes to climate change — an area far outside this agency’s expertise.

Finally, given that there are more pressing priorities before the Commission, now is not the time for this agency to consider climate change disclosure.

For these reasons, I am not able to support the release before us. Nonetheless, I would like to thank the staff for their efforts and professionalism.

On the bright side, maybe this means the SEC is qualified to investigate the fraudulent disclosures in the UN IPCC’s Fourth Assessment Report.  On the other hand, the existence of such . . . misstatements in the authoritative global survey of climate change science suggests that, as Troy suggests, this whole endeavor will have few of the intended–and plenty of unintended–consequences.

In his New York Times column, Thomas Friedman advocates “doing the Cheney-thing on climate — preparing for 1%.” He’s referring to Vice-President Cheney’s reported remark: “If there’s a 1% chance that Pakistani scientists are helping Al Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response.”

This 1% doctrine, Friedman contends, is really just a restatement of the well-known and highly influential precautionary principle. In its most famous recitation, that principle states: “When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically.” In other words, “It’s better to be safe than sorry.”

The problem with the precautionary principle is that it’s literally nonsensical. It says, “If there’s a course of action that involves threats of harm to human health or the environment, take precautions against it.” But precaution-taking itself threatens harm to human health and the environment. When we devote resources to avoiding one risk, we divert those resources from some other welfare-enhancing use. When we turn away from a cheap risk-creating technology to a more expensive technology that creates less direct risk, we raise the price of the technology and of any goods or services the technology produces. Poor people will be even poorer. And poverty creates grave threats to human health and the environment.

Thus, it’s not enough to look only at the benefit side of precaution-taking. Because tradeoffs are inevitable, we should also consider the costs of precautions against anthropogenic global warming. In a series of Wall Street Journal op-eds, Bjorn Lomborg has detailed some of those costs: less money for sea walls, storm warning systems, and solidly constructed homes in India and Bangladesh; less money for food, medical treatment, and HIV drugs in Ethiopia; less money for fighting malaria in Zambia; less money for schooling and transportation in Vanuatu.

Now it may be worth incurring these costs in order to reduce the risks of anthropogenic global warming. But Friedman’s precautionary principle can’t tell us that. In saying, “Err on the side of precaution-taking,” it gives literally no direction. A far more helpful principle would balance the expected benefits of carbon caps against their very substantial costs (including, of course, the higher costs of heating and cooling this vulgar monstrosity).
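The balancing principle gestured at here can be put in simple expected-value terms. This is a minimal sketch with made-up numbers; the probabilities and dollar figures are purely hypothetical, not real climate estimates:

```python
def net_expected_benefit(p_harm, harm_cost, precaution_cost):
    """Expected harm avoided by a precaution, minus what the precaution costs.

    Positive => the precaution is worth taking on expected-value grounds;
    negative => the resources do more good in some other use.
    """
    return p_harm * harm_cost - precaution_cost

# Hypothetical figures, in billions of dollars:
# a 1% chance of a $10,000B harm, against a $50B precaution...
print(net_expected_benefit(0.01, 10_000, 50))   # 50.0 => worth taking
# ...versus a $200B precaution against the same risk:
print(net_expected_benefit(0.01, 10_000, 200))  # -100.0 => not worth taking
```

The point is that this is just the ordinary cost-benefit calculus; the “1% doctrine” short-circuits it by treating any nonzero probability as a certainty, which makes the `p_harm` term irrelevant and the precaution’s cost invisible.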

It’s been interesting to observe the responses to the hacked emails from the Climate Research Unit at the University of East Anglia. The emails seem to show leading global warming scientists massaging data to generate the result they prefer (e.g., “I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years … to hide the decline”), scheming to squelch opposing evidence (e.g., “I can’t see either of these papers being in the next I.P.C.C. report. Kevin and I will keep them out somehow—even if we have to redefine what the peer-review literature is!”), admitting to a need to hide certain data from critics (“If they ever hear there is a Freedom of Information Act now in the UK, I think I’ll delete the file rather than send to anyone”), and even confessing that they were “tempted to beat” up researchers with opposing viewpoints.

The mainstream press and climate policymakers have largely brushed off these emails and have treated anthropogenic global warming as a settled scientific fact. For example, in reporting the story of the climate emails, the New York Times stated that “[t]he evidence pointing to a growing human contribution to global warming is so widely accepted that the hacked material is unlikely to erode the overall argument.” At the White House, Climate Czar Carol Browner dismissed the emails as an irrelevant distraction. And in announcing the EPA’s recent conclusion that carbon dioxide endangers public health and welfare (and is thus subject to extensive regulation under existing environmental statutes), EPA Administrator Lisa Jackson deflected questions about the emails by curtly noting that “[t]he science has been thoroughly evaluated.”

What could explain this apparent lack of concern about these emails, which strongly suggest that the scientific evidence has been doctored? An obvious possibility is that the powers that be realize we’re on the verge of implementing policies they’ve long favored — significant carbon caps, either via legislation or EPA rulemaking — and they don’t want to do or say anything that might prevent that outcome.

But we might be able to explain the absence of alarm on less cynical grounds. As Cass Sunstein has observed, when people have to make complicated risk judgments under uncertainty, they frequently fall prey to psychological and social forces that make them wary of questioning the conventional wisdom about particular risks. In his 2002 book, Risk and Reason, Sunstein explained how informational cascades, reputational cascades, and group polarization can collectively cement — and even strengthen — ill-founded risk judgments:

An informational cascade occurs when people with little personal information about a particular matter base their own beliefs on the apparent beliefs of others. Imagine, for example, that Alan says that abandoned hazardous waste sites are dangerous, or that Alan initiates protest activity because such a site is located nearby. Betty, otherwise skeptical or in equipoise, may go along with Alan; Carl, otherwise an agnostic, may be convinced that if Alan and Betty share the relevant belief, the belief must be true. It will take a confident Deborah to resist the shared judgments of Alan, Betty, and Carl. The result of th[is] set of influences can be social cascades, as hundreds, thousands, or millions of people come to accept a certain belief simply because of what they think other people believe. …

In the case of a reputational cascade, people do not subject themselves to social influences because they think that others are more knowledgeable. Their motivation is simply to earn social approval and avoid disapproval. … If many people are alarmed about some risk, you might not voice your doubts about whether the alarm is merited, simply in order not to seem obtuse, cruel, or indifferent. … Sometimes people take to speaking and acting as if they share, or at least do not reject, what they view as the dominant belief. As in the informational context, the outcome may be the cleansing of public discourse of unusual perceptions, arguments, and actions. … Lawmakers, even more than ordinary citizens, are vulnerable to reputational pressures; that is part of their job. They may even support legislation to control risks that they know to be quite low. …

When like-minded people are talking mostly to one another, especially interesting things are likely to happen. If members of a group tend to think that global warming poses a significant danger, their discussions will move them, not to the middle, but to a more extreme point in line with their original tendencies. … If members of a group tend to believe that for cancer, the serious dietary problem lies in the use of pesticides, those same people will tend, after discussion, to have a heightened fear of pesticide use. All these are examples of group polarization — a process by which people engaged in a process of deliberation end up thinking a more extreme version of what they already thought. Group polarization is central to the cascade-like processes discussed here. If like-minded people are speaking mostly with one another, they can end up with intensely heightened concerns about small risks.

Now I’m not arguing that the risks associated with anthropogenic global warming are small. On that matter, I’m agnostic. I do think it’s appropriate, though, to ask whether our climate policymakers and the members of our media elite are beset by the same social forces that influence the masses. Their lack of concern about the climate emails suggests that they may be. Consider, for example, Climate Czar Browner’s response to questions about the emails:

There has been for a very long time a very small group of people who continue to say this isn’t a real problem, that we don’t need to do anything. On the other hand, we have 2,500 of the world’s foremost scientists who are in absolute agreement that this is a real problem and that we need to do something and we need to do something as soon as possible. What am I going to do, side with the couple of naysayers out there or the 2,500 scientists? I’m sticking with the 2,500 scientists.

Of course, many of the 2,500 scientists “who are in absolute agreement” are Betties, Carls, Deborahs, Edwards, etc. who have formed their beliefs, at least in part, on the basis of evidence presented by a tainted Alan. And the Carol Browners and Lisa Jacksons of the world are certainly worried about preserving their reputations (not to mention, their jobs)! And they and their fellow policymakers have left their nests of like-minded folks in Washington, Geneva, etc. and convened to form an even larger community of like-minded folks in Copenhagen. Informational cascades, reputational cascades, and group polarization — oh my!
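The cascade dynamic Sunstein describes can be sketched as a small simulation. This is a toy version of the standard Bikhchandani-Hirshleifer-Welch model, with illustrative parameters (the 60% signal accuracy is an assumption, not an empirical claim):

```python
import random

def run_cascade(n_agents, signal_accuracy=0.6, seed=42):
    """Toy informational cascade (after Bikhchandani, Hirshleifer & Welch).

    Each agent privately receives a binary signal about the true state,
    correct with probability `signal_accuracy`, and publicly observes every
    earlier agent's choice. Two public choices on the same side outweigh
    any single private signal, so once the running margin reaches +/-2,
    every later agent rationally ignores their own information and herds.
    """
    rng = random.Random(seed)
    true_state = 1
    choices = []
    for _ in range(n_agents):
        margin = sum(1 if c == 1 else -1 for c in choices)
        if margin >= 2:
            choice = 1                      # cascade: follow the herd
        elif margin <= -2:
            choice = 0                      # cascade on the wrong answer
        else:
            # No cascade yet: follow the private signal.
            correct = rng.random() < signal_accuracy
            choice = true_state if correct else 1 - true_state
        choices.append(choice)
    return choices

# A handful of early choices can lock in everyone who follows, right or wrong:
print(run_cascade(20))
```

Note what the model implies for head-counting: once the cascade locks in, each additional agent’s “agreement” carries no new information, so a tally of 2,500 concurring voices may reflect only a few independent signals.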

Apparently the Obama administration is not very confident about getting its climate change agenda passed through Congress. Given that a legislative “solution” is off the table, at least for the foreseeable future, perhaps it is not surprising that today the EPA announced its ruling that greenhouse gases are “a danger to public health and welfare.”

By making its declaration, the EPA has unilaterally claimed authority over regulating greenhouse gas emissions (under the auspices of the Clean Air Act) without worrying about legislative approval. After all, why bother waiting for the elected representatives of the citizenry to grant authority that can be obtained by administrative fiat? Instead, claim the authority and risk Congress mustering enough votes to overturn the decision.  The checks-and-balances version of the old adage: better to ask forgiveness than permission.

To be fair, the US Supreme Court opened the door for the EPA to make this move in its decision in Massachusetts v. EPA (549 U.S. 497 (2007)), a door the Obama administration was all too happy to run through. There is no little irony in the fact that the Court’s majority relied heavily on a report by the UN’s Intergovernmental Panel on Climate Change (IPCC); a report that has been criticized by the scientific community (for example, see here or here–referred to me by my Nobel-winning atmospheric science colleague Tony Lupo), and a report which, according to recently-revealed emails between IPCC proponents, intentionally squelched opposing voices.

Although the EPA’s ruling is immediately aimed at increasing fuel economy standards for new cars and light trucks, the sweeping pronouncement that greenhouse gases (presumably of all sorts and from all sources) are a “danger to public health and welfare” portends a much more expansive role for the EPA in regulating virtually any greenhouse gas-producing activity…a regulatory reach that would seem unprecedented, and all without so much as a debate in Congress.

But breathe easy…the EPA has already stated it is not concerned about the greenhouse gases produced simply by humans breathing.

In a move stupider even than Chicago’s foie gras and trans fat bans (on which see Thom here), California appears to be set to ban . . . wait for it . . . big TVs.  Environmentalists, those growing enemies of freedom and common sense everywhere, are pushing the ban because large-screen TVs use a lot of power.  And by large screen we’re talking 40 inches–not just the giant honkers bigger than most Multiplex screens.  And former-libertarian-leaning Arnold is on board.

Here’s a taste:

Arnold Schwarzenegger, the state’s governor, has supported controversial proposals by California’s energy commission to impose strict energy consumption limits on TVs with screens that are more than 40 inches wide.

* * *

The commission argues that television owners would save around $30 (£18) a year per set in reduced energy consumption. The state itself could benefit by as much as $8.1 billion and could drop plans to build a new natural gas-fired power station.

“We would not propose TV efficiency standards if we thought there was any evidence in the record that they will hurt the economy,” Julia Levin, an energy commissioner, told the Los Angeles Times.

“This will actually save consumers money and help the California economy grow and create new clean, sustainable jobs.”

I never fail to be impressed by the temerity and economic illiteracy of public officials.  Certainly there can be no benefit to anyone in California from watching a large screen TV, so $30 a year in savings is all gain!  And why try to align incentives and let people decide for themselves where they might best cut energy use by raising the price of electricity or–better yet–by using dynamic pricing and smart grid technology, when we can just micromanage their choices for them?  And, gosh, nothing says “economic growth” like beating up on an existing (mostly foreign–hmmmm, I wonder . . . nah!) industry.   Finally–bonus!–by making TV less attractive, we’ll make Californians more productive!

I sure do hope our inevitable health insurance overlords commissioners are as sensible as the California Energy Commissioners.

For a great review of smart grid and its prospects, see this article by Lynne Kiesling in Reason.

(HT:  Seth Weinberger)

I have no intention of wading into the debate over the climate change chapter in Superfreakonomics.  I’m sure you all know the controversy:  Levitt and Dubner had the temerity to suggest that global warming was a huge problem, that we should look hard for really expensive solutions, and we need to do something.  And the outcry was from . . . the global warming alarmists. Curious.

Anyway, Brad DeLong has been among the most vocal and strident (Brad? Strident? Naaaaaaaah) critics of the book.  And one of Brad’s criticisms–couched in terms of “why are other people such idiots when I am so smart?”–appears on Yoram Bauman’s website in response to Yoram’s own critique of the book.  Here’s the main gist of Brad’s comment:

Yoram Bauman: “I have just seen a PDF of the Superfreakonomics chapter on climate change, and it makes basic mistakes when it says things like “When Al Gore urges the citizenry to sacrifice… the agnostics grumble that human activity accounts for just 2 percent of global carbon-dioxide emissions, with the remainder generated by natural processes like plant decay.”… [Y]es, human generation of CO2 is dwarfed by natural processes like plant decay. But it also shows that natural processes balance each other out…. What you’re left with is a completely plausible story in which human activity slowly increases atmospheric concentrations of CO2 from pre-industrial concentrations of about 285ppm (parts per million) to current concentrations of about 385ppm that are going up by about 2ppm per year. This sort of misleading skepticism exists throughout the chapter, and it does a disservice to climate science, to economists like me who work on climate change, to academic work in general, and to the general public that will have to live with the impacts of climate policy down the road…”

Steven Levitt: “I don’t understand…. Why does it matter if natural processes are in balance or not? CO2 is CO2! The source doesn’t matter. If we could cut CO2 emissions a little bit overall, whether through natural sources or others, the effect would be the same. It is not saying that cutting human emissions isn’t the right way to do it, but it is a surprising fact and one worth mentioning…”

Levitt and Dubner are saying that the fact that only 2% of emissions are of human origin is in some sense relevant to and supports the “agnostic” case on global warming. That is grossly, grossly misleading–talking about flows when the relevant variables are the stocks.

Actually, Brad, if you are talking about cutting emissions–i.e., FLOWS–it is perfectly appropriate to talk about where the biggest flows are coming from.  As far as I know, the point of the chapter is to be agnostic about where the solutions are to be found, not necessarily about the cause.  And, frankly, I’m not sure why the historical cause would matter if the only way to reverse the problem is to cut flows or to reduce stocks (and from whence the stocks came is hardly relevant, unless their origin tells you something about how to reduce them, and I don’t think this is true).  So, essentially, Brad has it, as I see it, exactly backward.  But bravo to Brad for having the courage of his convictions to be utterly insulting to others while being so utterly wrong.
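The stock/flow distinction the two sides are arguing over is, at bottom, simple arithmetic. Here is a minimal sketch using only the round numbers from Bauman’s comment (about 285 ppm pre-industrial, 385 ppm now, rising about 2 ppm per year); the constant-net-flow assumption is an illustrative simplification, not a climate model:

```python
def project_ppm(stock_ppm, net_flow_ppm_per_year, years):
    """Atmospheric CO2 stock after `years` of a constant net flow."""
    return stock_ppm + net_flow_ppm_per_year * years

# Gross natural flows dwarf the human flow, but they roughly cancel out;
# what accumulates in the stock is the small *net* flow:
print(project_ppm(285, 2, 50))   # 385: fifty years of +2 ppm/yr
print(project_ppm(385, 2, 50))   # 485: fifty more years at the same net flow
```

Both points are visible in the function: the stock is what matters for warming (DeLong’s point), but the only lever in the equation is the net flow, so it is hardly irrelevant to ask where the flows come from and which ones can be cut.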

UPDATE:  Brad makes the same point even more stridently (and equally wrongly) on his blog.