Unintended Consequences of Government Policies: The Depletion of America’s Wetlands

Private land-use decisions can be affected dramatically by public investments in highways, waterways, flood control, or other infrastructure.  Both the large postwar movement of jobs from central cities to suburbs in the United States and the ongoing destruction of Amazon rain forests were facilitated by major public investments in supporting infrastructure.  As these examples suggest, private land-use decisions can generate major environmental and social externalities – or, in common language, unintended consequences.

In an analysis that appeared in 1990 in the American Economic Review, Adam Jaffe of Brandeis University and I demonstrated that the depletion of forested wetlands in the Mississippi Valley – an important environmental problem and a North American precursor to the loss of South American rain forests – was exacerbated by Federal water-project investments, despite explicit Federal policy to protect wetlands.

Wetland Losses

Forested wetlands are among the world’s most productive ecosystems, providing improved water quality, erosion control, floodwater storage, timber, wildlife habitat, and recreational opportunities.  Their depletion globally is a serious problem; and preservation and protection of wetlands have been major Federal environmental policy goals for forty years.

From the 1950s through the mid-1970s, over one-half million acres of U.S. wetlands were lost each year.  This rate slowed greatly in subsequent years, averaging approximately 60 thousand acres lost per year in the lower 48 states from 1986 through 1997.  And by 2006, the Bush administration’s Secretary of the Interior, Gale Norton, was able to announce a net gain in wetland acreage in the United States, due to restoration and creation activities surpassing wetland losses.

What Caused the Observed Losses?

What were the causes of the huge annual losses of wetlands in the earlier years?  That question and our analysis are as germane today as in 1990, because of lessons that have emerged about the unintended consequences of public investments.

The largest remaining wetland habitat in the continental United States is the bottomland hardwood forest of the Lower Mississippi Alluvial Plain.  Originally covering 26 million acres in seven states, this resource was reduced to about 12 million acres by 1937.  By 1990, another 7 million acres had been cleared, primarily for conversion to cropland.

The owner of a wetland parcel faces an economic decision involving revenues from the parcel in its natural state (primarily from timber), costs of conversion (the cost of clearing the land minus the resulting forestry windfall), and expected revenues from agriculture.  Agricultural revenues depend on prices, yields, and, significantly, the drainage and flooding frequency of the land.  Needless to say, landowners typically do not consider the positive environmental externalities generated by wetlands; thus conversion may occur more often than is socially optimal.
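
As a rough illustration of that private calculus, the following sketch compares the present value of keeping a parcel in forest against converting it to cropland.  Every figure and parameter name is hypothetical, chosen only to show the structure of the decision, not taken from the 1990 study.

```python
# Illustrative only: a landowner's wetland-conversion decision as a simple
# present-value comparison. All figures are hypothetical.

def present_value(annual_net_revenue, discount_rate, years):
    """Discounted value of a constant annual net revenue stream."""
    return sum(annual_net_revenue / (1 + discount_rate) ** t
               for t in range(1, years + 1))

DISCOUNT_RATE = 0.05
HORIZON_YEARS = 30

# Parcel kept in its natural (forested wetland) state: modest timber income.
pv_forest = present_value(annual_net_revenue=15.0,   # $/acre/year from timber
                          discount_rate=DISCOUNT_RATE,
                          years=HORIZON_YEARS)

# Conversion: clearing cost net of the one-time timber "windfall" from the
# cleared trees, followed by expected annual crop revenue, which in practice
# depends on prices, yields, drainage, and flooding frequency.
clearing_cost_net = 250.0        # $/acre, clearing cost minus timber windfall
expected_crop_revenue = 60.0     # $/acre/year, expected net of farming costs
pv_cropland = (present_value(expected_crop_revenue, DISCOUNT_RATE, HORIZON_YEARS)
               - clearing_cost_net)

# The private owner converts when cropland has the higher present value;
# the wetland's external benefits to others never enter this comparison.
print(f"PV forest:   ${pv_forest:,.0f}/acre")
print(f"PV cropland: ${pv_cropland:,.0f}/acre")
print("Convert" if pv_cropland > pv_forest else "Keep as wetland")
```

Because the water-quality, flood-storage, and habitat benefits that flow to others never enter this comparison, conversion can be privately profitable even when it is socially undesirable.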

Such externalities are the motivation for Federal policy aimed at protecting wetlands, as embodied in the Clean Water Act.  Nevertheless, the Federal government engaged in major public investment activities, in the form of U.S. Army Corps of Engineers and U.S. Soil Conservation Service flood-control and drainage projects, which appeared to make agriculture more attractive and thereby encourage wetland depletion.  The significance of this effect had long been disputed by the agencies which construct and maintain these projects; they attributed the extensive conversion exclusively to rising agricultural prices.

In an econometric (statistical) analysis of data from Arkansas, Mississippi, and Louisiana, from 1935 to 1984, Jaffe and I sought to sort out the effects of Federal projects and other economic forces.  We discovered that these public investments were a very substantial factor causing conversion of wetlands to agriculture, with between 30 and 50 percent of the total wetland depletion over those five decades due to the Federal projects.
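
For readers curious about the general mechanics, the sketch below uses entirely synthetic data and invented variable names to show the flavor of such an attribution exercise – a panel regression followed by a no-projects counterfactual.  It is not the specification, data, or results of the 1990 paper.

```python
# Purely illustrative: attributing wetland conversion to flood-control and
# drainage projects vs. crop prices. Synthetic data only; NOT the model or
# data from the 1990 AER analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_counties, n_years = 40, 50
df = pd.DataFrame({
    "county": np.repeat(np.arange(n_counties), n_years),
    "year": np.tile(np.arange(1935, 1935 + n_years), n_counties),
})
df["project_share"] = rng.uniform(0, 1, len(df))   # share of county served by projects
df["crop_price"] = rng.normal(100, 15, len(df))    # index of agricultural prices
# Synthetic "true" process: both projects and prices drive conversion.
df["acres_converted"] = (200 * df["project_share"]
                         + 1.5 * df["crop_price"]
                         + rng.normal(0, 30, len(df)))

model = smf.ols("acres_converted ~ project_share + crop_price", data=df).fit()

# Counterfactual: predicted conversion had there been no Federal projects.
no_projects = df.assign(project_share=0.0)
share_due_to_projects = 1 - model.predict(no_projects).sum() / model.predict(df).sum()
print(f"Share of conversion attributed to projects: {share_due_to_projects:.0%}")
```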

More broadly, four conclusions emerged from our analysis.  First, landowners had responded to economic incentives in their land-use decisions.  Second, construction of Federal flood-control and drainage projects caused a higher rate of conversion of forested wetlands to croplands than would have occurred in the absence of projects, leading to the depletion of an additional 1.25 million acres of wetlands.  Third, Federal projects had this impact because they made agriculture feasible on land where it had previously been infeasible, and because, on average, they improved the quality of feasible land.  Fourth, adjustment of land use to economic conditions was gradual.

Government Working at Cross-Purposes

The analysis highlighted a striking inconsistency in the Federal government’s approach to wetlands.  In articulated policies, laws, and regulations, the government recognized the positive externalities associated with some wetlands, with the George H.W. Bush administration first enunciating a “no net loss of wetlands” policy.  But the government’s own investments affecting these areas – flood-control and drainage projects – had created major incentives to convert them to alternative uses.  The government had been working at cross-purposes.

The conclusion that major public infrastructure investments affect private land-use decisions (thereby often generating negative externalities) may not surprise some readers, but it was the 1990 analysis described here that first provided rigorous evidence, evidence that contrasted sharply with the accepted wisdom among policy makers.

The Ongoing Importance of Induced Land-Use Changes

As wetlands, tropical rain forests, barrier islands, and other sensitive environmental areas become more scarce, their marginal social value rises.  In general, if induced land-use changes are not considered, the country will engage in more public investment programs whose net social benefits are negative.
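
A back-of-the-envelope sketch, with entirely hypothetical numbers, shows how ignoring induced land-use change can reverse the sign of a project’s measured net benefits:

```python
# Hypothetical numbers only: how ignoring induced land-use change can flip
# a public project's apparent net benefit.
flood_control_benefits = 120.0   # $ millions, direct benefits counted in appraisal
construction_costs     = 100.0   # $ millions

# Suppose the project also induces conversion of 50,000 wetland acres whose
# external services (water quality, flood storage, habitat) are worth,
# say, $800 per acre in present value.
induced_acres_converted = 50_000
external_value_per_acre = 800.0
induced_external_cost = induced_acres_converted * external_value_per_acre / 1e6

naive_net_benefit = flood_control_benefits - construction_costs
full_net_benefit  = naive_net_benefit - induced_external_cost

print(f"Net benefit ignoring induced land-use change: ${naive_net_benefit:+.0f}M")
print(f"Net benefit including it:                     ${full_net_benefit:+.0f}M")
```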

Chaos and Uncertainty in Copenhagen?

Earlier today, I was asked by the Financial Times, “who is responsible for the chaos and uncertainty” at COP-15 in Copenhagen?  I’m not sure those are the words I would have chosen to characterize the situation at the climate negotiations in the Danish capital, but here is my response for the FT’s Energy-Source Climate Experts panel — with some elaboration.

There are two aspects to what has been characterized as the “chaotic and uncertain” nature of the COP-15 conference at the Bella Center in Copenhagen.  One is the substantive process and eventual outcome, which remains uncertain as of this hour, and the other is the shocking logistical failure.

An Uncertain Outcome for the Negotiations

It should not be surprising that the outcome remains in doubt, because of some basic economic realities.  First of all, keep in mind that climate change is the ultimate global commons problem, because greenhouse gases uniformly mix in the atmosphere.  Therefore, each country incurs the costs of its emission-reduction actions, but the benefits of its actions are spread worldwide.  Hence, for any individual nation, the benefits it receives from its actions are inevitably less than the costs it incurs, despite the fact that globally the total benefits of appropriate coordinated international action would exceed the total costs (and for many countries the national benefits of coordinated international action would exceed their national costs of action).

This creates a classic free-rider problem, and is the reason why international cooperation – whether through an agreement under the United Nations Framework Convention on Climate Change or through some other multilateral or bilateral arrangements – is necessary.
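
A stylized numerical sketch, with purely hypothetical figures, makes the free-rider logic concrete:

```python
# Stylized illustration of the global-commons / free-rider logic.
# All numbers are hypothetical.
n_countries = 20
cost_of_action_per_country = 10.0         # each country's own abatement cost
global_benefit_per_acting_country = 60.0  # benefit created by one country's action,
                                          # spread uniformly across all countries

benefit_to_self = global_benefit_per_acting_country / n_countries   # = 3.0

# Acting alone: a country pays 10 but captures only 3 of the 60 it creates.
print(f"Unilateral action: cost {cost_of_action_per_country}, own benefit {benefit_to_self}")

# If all countries act, each pays 10 and receives its share of everyone's benefits.
benefit_to_each_if_all_act = n_countries * benefit_to_self           # = 60.0
print(f"Coordinated action: cost {cost_of_action_per_country}, "
      f"own benefit {benefit_to_each_if_all_act}")
# Cooperation pays for every country, yet each is tempted to free-ride.
```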

Second, addressing global climate change will be costly, and it raises profound distributional issues for the countries of the world.  In particular, addressing climate change at minimum cost (i.e., cost-effectively) requires that all countries take responsibility for their emissions going forward, and indeed necessitates that all countries control emissions at the same marginal abatement cost.

On the other hand, addressing climate change in an equitable fashion clearly requires taking account of the dramatically different economic circumstances of the countries of the world, and may also involve looking backward at historical responsibility for the anthropogenic greenhouse gases that have already accumulated in the atmosphere.  These are profound issues of distributional equity.

This classic trade-off between cost-effectiveness (or efficiency), on the one hand, and distributional equity, on the other hand, raises significant obstacles to reaching an agreement.
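
As an aside, the cost-effectiveness condition in the second point – equal marginal abatement costs across countries – can be made concrete with a minimal sketch using hypothetical quadratic abatement-cost functions for two countries:

```python
# Illustrative equimarginal principle with two hypothetical countries.
# Abatement cost: C_i(q_i) = 0.5 * a_i * q_i**2, so marginal cost is a_i * q_i.
a1, a2 = 1.0, 4.0          # country 2 abates at higher cost
target = 100.0             # required total abatement, q1 + q2 = target

def total_cost(q1):
    q2 = target - q1
    return 0.5 * a1 * q1**2 + 0.5 * a2 * q2**2

# Search over allocations of the fixed target between the two countries.
best_q1 = min((q1 / 10 for q1 in range(0, 1001)), key=total_cost)
best_q2 = target - best_q1
print(f"Least-cost split: q1={best_q1:.1f}, q2={best_q2:.1f}")
print(f"Marginal costs:  {a1*best_q1:.1f} vs {a2*best_q2:.1f}")  # equalized at the optimum
# Analytically: a1*q1 = a2*q2  =>  q1 = 80, q2 = 20, both with marginal cost 80.
```

In principle, distributional concerns can then be addressed through how the overall burden is shared – for example, via transfers or the allocation of emission allowances – without sacrificing cost-effectiveness.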

So, I place the fault for the substantive uncertainty in the negotiations neither on the industrialized countries (including the United States, for insisting that China and other key emerging economies participate in meaningful and transparent ways), nor on the developing countries (for insisting that the industrialized world pay much of the bill).

The key question going forward is whether negotiators in Copenhagen today and tonight, or in Bonn several months from now, or in Mexico City a year from now, can identify a policy architecture that is both reasonably cost-effective and sufficiently equitable, and thereby can assemble support from the key countries of the world, and thus do something truly meaningful about the long-term path of global greenhouse gas emissions.  There are promising paths forward, and – if you’ll forgive me – I will remind readers that many have been identified by the Harvard Project on International Climate Agreements.

Rather than pointing fingers at who is to blame for the current uncertainty at this hour, I can attribute credit to a number of countries and institutions for having brought the negotiations to the point where it appears at least possible that a successful outcome will be achieved in Copenhagen or subsequently.

First of all, tremendous credit must be given to the national leaders and the negotiating teams of the seventeen major economies of the world, which together represent about 90% of global emissions, because these countries have worked hard to produce what each considers a sensible outcome over the months and years leading up to COP-15.

This includes not only the European Union, Australia, Japan, New Zealand, and Canada, but also the United States, which at least since January of this year has been an enthusiastic and intelligent participant in this international process.  It also includes many of the key emerging economies of the world – China, India, Brazil, Mexico, Korea, South Africa, and Indonesia, among them – as well as a considerable number of poor, developing countries, which likewise take the problem seriously and have been trying to find an acceptable path forward.

Finally, credit should be given to the Danish government and its leadership, the Secretariat of the United Nations Framework Convention on Climate Change, and UN Secretary-General Ban Ki-moon, who have worked tirelessly for months, indeed years, to prepare for the substance of these negotiations at COP-15 in Copenhagen.

That’s the “good news,” but now I should turn to the other aspect of the “uncertainty and chaos” in Copenhagen.

Chaos at COP-15’s Bella Center

As I noted at the outset, there are two aspects of the “chaos” in Copenhagen, and for the second aspect it is (sadly) possible to identify the apparently responsible parties.  I am referring to the fact that the organizers – the Secretariat of the United Nations Framework Convention on Climate Change (UNFCCC) and the hosts, the Danish government – apparently approved a list of some 40,000 observers from 900 official, accredited organizations around the world, knowing that the Bella Center could accommodate at most 15,000 persons at any one time.  The result is that thousands of people – including not only NGO representatives, but also government negotiators – stood in line outside the Bella Center in the bitter cold on Monday and Tuesday of this week, waiting 8-10 hours to get inside to receive their credentials.  Thousands of others never got inside to receive their credentials, despite having waited up to 8 hours, standing in the cold.  These are not exaggerations.  If no one died in the process, it is remarkable and very fortunate.

Then, on Wednesday through Friday, the Bella Center was essentially closed to all representatives of civil society, despite the fact that side-events had been organized by them months in advance with the approval of the COP-15 organizers.

The result is that thousands of people, who had been informed by the COP-15 organizers many months ago that they were approved to attend, had flown to Copenhagen from all over the world, incurred those costs plus the costs of their accommodations, yet never were able to get inside the Bella Center to carry out any of the work they had planned, and flew back home having wasted their time and resources (and having contributed to the COP-15 carbon footprint in non-trivial ways).

Now, I have never been an enthusiast of what some people have described as the annual “circus” of the COPs, a circus – if it is that – which is largely due to the fact that the actual government negotiators are vastly outnumbered by the civil society representatives (“official observers” in the UNFCCC language) and the press.  However, if the participation of civil society representatives is going to be encouraged (as required under the original UNFCCC agreement), and if the attendance of those representatives is going to be approved in advance, then surely they should not be denied admission when they arrive, nor forced to stand in line outside in the cold for 8 hours waiting to be admitted.

No doubt, both the UNFCCC and the Danish government will point fingers at the other, but ultimately the responsibility must be shared.  In seventeen years of these annual conferences, going back to the 1992 Earth Summit in Rio de Janeiro, there has never been such a disastrous logistical failure.  It could have been anticipated.  And it should have been prevented.

A Final Word

Of course, as of this hour, I — along with millions of others — hope that the negotiators in Copenhagen will achieve agreement on some truly meaningful steps forward in this important process.

Is Benefit-Cost Analysis Helpful for Environmental Regulation?

With the locus of action on Federal climate policy moving this week from the House of Representatives to the Senate, this is a convenient moment to step back from the political fray and reflect on some fundamental questions about U.S. environmental policy.

One such question is whether economic analysis – in particular, the comparison of the benefits and costs of proposed policies – plays a truly useful role in Washington, or whether it is little more than a distraction from more important perspectives on public policy, or – worst of all – whether it is counter-productive, even antithetical, to the development, assessment, and implementation of sound policy in the environmental, resource, and energy realms.  With an exceptionally talented group of thinkers – including scientists, lawyers, and economists – now in key environmental and energy policy positions at the White House, the Environmental Protection Agency, the Department of Energy, and the Department of the Treasury, this question about the usefulness of benefit-cost analysis is of particular importance.

For many years, there have been calls from some quarters for greater reliance on the use of economic analysis in the development and evaluation of environmental regulations.  As I have noted in previous posts on this blog, most economists would argue that economic efficiency — measured as the difference between benefits and costs — ought to be one of the key criteria for evaluating proposed regulations.  (See:  “The Myths of Market Prices and Efficiency,” March 3, 2009; “What Baseball Can Teach Policymakers,” April 20, 2009; “Does Economic Analysis Shortchange the Future?” April 27, 2009)  Because society has limited resources to spend on regulation, such analysis can help illuminate the trade-offs involved in making different kinds of social investments.  In this sense, it would seem irresponsible not to conduct such analyses, since they can inform decisions about how scarce resources can be put to the greatest social good.

In principle, benefit-cost analysis can also help answer questions of how much regulation is enough.  From an efficiency standpoint, the answer to this question is simple — regulate until the incremental benefits from regulation are just offset by the incremental costs.  In practice, however, the problem is much more difficult, in large part because of inherent problems in measuring marginal benefits and costs.  In addition, concerns about fairness and process may be very important economic and non-economic factors.  Regulatory policies inevitably involve winners and losers, even when aggregate benefits exceed aggregate costs.
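
A minimal worked example, with hypothetical functional forms rather than estimates for any real regulation, illustrates the efficiency rule:

```python
# Hypothetical example of the "regulate until marginal benefit equals
# marginal cost" rule for choosing the stringency of a regulation.
# Let q be units of pollution abated.
def marginal_benefit(q):      # declining: each extra unit abated is worth less
    return 100.0 - 2.0 * q

def marginal_cost(q):         # rising: each extra unit abated costs more
    return 20.0 + 2.0 * q

# Efficient stringency: MB(q*) = MC(q*)  =>  100 - 2q = 20 + 2q  =>  q* = 20.
q_star = 20.0
print(marginal_benefit(q_star), marginal_cost(q_star))   # both 60.0

# Net benefits are maximized at q*: pushing past it adds costs greater than
# benefits; stopping short leaves net gains unclaimed.
```

The practical difficulty, as noted above, lies in measuring those marginal benefits and costs, not in the logic of the rule.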

Over the years, policy makers have sent mixed signals regarding the use of benefit-cost analysis in policy evaluation.  Congress has passed several statutes to protect health, safety, and the environment that effectively preclude the consideration of benefits and costs in the development of certain regulations, even though other statutes actually require the use of benefit-cost analysis.  At the same time, Presidents Carter, Reagan, George H.W. Bush, Clinton, and George W. Bush all put in place formal processes for reviewing the economic implications of major environmental, health, and safety regulations.  Apparently the Executive Branch, charged with designing and implementing regulations, has seen a greater need than the Congress to develop a yardstick against which regulatory proposals can be assessed.  Benefit-cost analysis has been the yardstick of choice.

It was in this context that, more than a decade ago, a group of economists from across the political spectrum jointly authored an article in Science magazine, asking whether there is a role for benefit-cost analysis in environmental, health, and safety regulation.  That diverse group consisted of Kenneth Arrow, Maureen Cropper, George Eads, Robert Hahn, Lester Lave, Roger Noll, Paul Portney, Milton Russell, Richard Schmalensee, Kerry Smith, and myself.  That article and its findings are particularly timely, with President Obama considering putting in place a new Executive Order on Regulatory Review.

In the article, we suggested that benefit-cost analysis has a potentially important role to play in helping inform regulatory decision making, though it should not be the sole basis for such decision making.  We offered eight principles.

First, benefit-cost analysis can be useful for comparing the favorable and unfavorable effects of policies, because it can help decision makers better understand the implications of decisions by identifying and, where appropriate, quantifying the favorable and unfavorable consequences of a proposed policy change.  But, in some cases, there is too much uncertainty to use benefit-cost analysis to conclude that the benefits of a decision will exceed or fall short of its costs.

Second, decision makers should not be precluded from considering the economic costs and benefits of different policies in the development of regulations.  Removing statutory prohibitions on the balancing of benefits and costs can help promote more efficient and effective regulation.

Third, benefit-cost analysis should be required for all major regulatory decisions. The scale of a benefit-cost analysis should depend on both the stakes involved and the likelihood that the resulting information will affect the ultimate decision.

Fourth, although agencies should be required to conduct benefit-cost analyses for major decisions, and to explain why they have selected actions for which reliable evidence indicates that expected benefits are significantly less than expected costs, those agencies should not be bound by strict benefit-cost tests.  Factors other than aggregate economic benefits and costs may be important.

Fifth, benefits and costs of proposed policies should be quantified wherever possible.  But not all impacts can be quantified, let alone monetized.  Therefore, care should be taken to assure that quantitative factors do not dominate important qualitative factors in decision making.  If an agency wishes to introduce a “margin of safety” into a decision, it should do so explicitly.

Sixth, the more external review that regulatory analyses receive, the better they are likely to be.  Retrospective assessments should be carried out periodically.

Seventh, a consistent set of economic assumptions should be used in calculating benefits and costs.  Key variables include the social discount rate, the value of reducing risks of premature death and accidents, and the values associated with other improvements in health.  (A brief numerical illustration of the discount rate’s importance follows the final principle below.)

Eighth, while benefit-cost analysis focuses primarily on the overall relationship between benefits and costs, a good analysis will also identify important distributional consequences for important subgroups of the population.
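
As a purely illustrative example of the seventh principle, consider how much the choice of social discount rate alone can matter:

```python
# Illustrative only: the present value of a benefit of $1 billion realized
# 100 years from now, under different social discount rates.
benefit = 1_000_000_000
for rate in (0.01, 0.03, 0.05, 0.07):
    present_value = benefit / (1 + rate) ** 100
    print(f"discount rate {rate:.0%}: present value ${present_value:,.0f}")
# Moving from 1% to 7% shrinks the present value from roughly $370 million
# to just over $1 million, so analyses using different rates are not comparable.
```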

From these eight principles, we concluded that benefit-cost analysis can play an important role in legislative and regulatory policy debates on protecting and improving the natural environment, health, and safety.  Although formal benefit-cost analysis should not be viewed as either necessary or sufficient for designing sensible public policy, it can provide an exceptionally useful framework for consistently organizing disparate information, and in this way, it can greatly improve the process and hence the outcome of policy analysis.

If properly done, benefit-cost analysis can be of great help to agencies participating in the development of environmental regulations, and it can likewise be useful in evaluating agency decision making and in shaping new laws (which brings us full-circle to the climate legislation that will be developed in the U.S. Senate over the weeks and months ahead, and which I hope to discuss in future posts).
