31 January 2012

Economic Models: Caveat Utilitor!

The Australia Institute has released a very nice report by Richard Denniss titled The use and abuse of economic modelling in Australia: Users' guide to Tricks of the Trade (PDF). The essay illustrates its critique with several recent cases involving claims about jobs in the mining industry, in the poker machine industry, and under the carbon tax.

Here is an excerpt:
Economic modelling has, for many people involved in Australian policy debates, become synonymous with the process of serious policy development. Proponents of policy change who are armed with economic modelling are often taken more seriously than those with 20 years' experience working on the same problem. The modelling result that suggests tens of thousands of jobs will be lost or created often trumps logic or experience that suggests such claims are nonsensical.

This is not to suggest that modelling has no role to play in policy debates. It can and does often make a useful contribution, but the fact that it sometimes can should not be confused with the conclusion that it always will. Indeed, in recent times some of the claims based on 'economic modelling' that have been made in debates such as the likely impact of poker machine reform or the introduction of a carbon price can only be described as nonsense.

The problem has become, however, that in an era in which segments of the media no longer have the time or inclination to examine claims before they are reported, bad economic modelling is preferred by many advocacy and industry groups to good economic modelling for three main reasons:

1. it is cheaper
2. it is quicker
3. it is far more likely to yield the result preferred by the client

That said, bad economic modelling is relatively easy to identify if readers are willing to ask themselves, and the modeller, a range of simple questions. Indeed, it is even easier to spot when the modeller can't, or won't, answer such simple questions.
Economic models, like all models, can be very useful. But they can also be used in ways that are misleading or just plain wrong. Denniss provides some good advice for recognizing the difference.

What is a Job?

People familiar with my work, such as in The Honest Broker and The Climate Fix, will also be familiar with my interest in unpacking issues and problems into comprehensible bits. To that end, in the area of innovation I am going to post on some definitional issues, simply to sort through a number of basic propositions as a matter of clarifying my own thinking and writing.

In this post I am exploring the definition of a "job" -- which is a concept that carries considerable political importance and is a variable that we'd like to modulate via policy, but which typically falls into the category of "too obvious to define precisely."

What is a job? Let's start with the following definitions related to employment offered by the US government's Bureau of Labor Statistics (emphasis in the original):
The basic concepts involved in identifying the employed and unemployed are quite simple:
  • People with jobs are employed.
  • People who are jobless, looking for jobs, and available for work are unemployed.
  • People who are neither employed nor unemployed are not in the labor force.
The survey is designed so that each person age 16 and over who is neither in an institution (for example, correctional facilities and residential nursing and mental health care facilities) nor on active duty in the Armed Forces is counted and classified in only one group. The sum of the employed and the unemployed constitutes the civilian labor force. Persons not in the labor force combined with those in the civilian labor force constitute the civilian noninstitutional population 16 years and over.
These definitions, which date to 1942 (source), are extremely useful because they clearly define how the government views employment, which is the variable that policy makers seek to modulate when talking about "jobs." But these definitions don't quite get us to a fundamental definition of "jobs."
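The classification scheme above can be sketched as a small decision function. This is a minimal illustration only: the dict keys are hypothetical stand-ins for survey responses, not the actual CPS variables.

```python
# A minimal sketch of the BLS classification described above. The dict keys
# are hypothetical stand-ins for survey responses, not actual CPS variables.

def classify(person):
    """Place one survey respondent in exactly one category."""
    # Outside the civilian noninstitutional population 16+: not counted at all.
    if person["age"] < 16 or person["institutionalized"] or person["active_duty"]:
        return "not counted"
    # "Any work at all for pay or profit during the survey week" counts.
    if person["worked_for_pay"]:
        return "employed"
    # Jobless, looking for work, and available for work.
    if person["looking_for_work"] and person["available_for_work"]:
        return "unemployed"
    # Neither employed nor unemployed.
    return "not in labor force"

retiree = {"age": 70, "institutionalized": False, "active_duty": False,
           "worked_for_pay": False, "looking_for_work": False,
           "available_for_work": False}
job_seeker = {"age": 34, "institutionalized": False, "active_duty": False,
              "worked_for_pay": False, "looking_for_work": True,
              "available_for_work": True}

print(classify(retiree))     # not in labor force
print(classify(job_seeker))  # unemployed
```

The civilian labor force is then simply the employed plus the unemployed, matching the BLS definitions quoted above.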

Here is what the BLS says about jobs:
Not all of the wide range of job situations in the American economy fit neatly into a given category. For example, people are considered employed if they did any work at all for pay or profit during the survey week.
This leads me to the following description of a job:

The defining characteristic of a job is an exchange, between an employer and an employee, of wages (or some other compensation) for services (economists like to call such services "labor," defined variously in terms of skills, knowledge, capabilities, etc.). Governments regulate such services in many ways (e.g., some services may be disallowed -- think hit men, drug dealing or prostitution), and the terms of employment are also regulated (e.g., minimum wage, occupational safety, etc.).

All jobs are thus service jobs. With that as a starting point, we are in a position to ask ourselves, in what ways should we categorize and classify jobs in order to help realize the various objectives of public policy? The answer to this question is not obvious, and it is not clear to me that the official government categories are necessarily the most useful or helpful for thinking about policies related to jobs -- recent discussions here related to "manufacturing" are one example.

30 January 2012

All Jobs are Service Jobs

I am trying to figure out (a) what a "manufacturing job" is, and (b) why many economists think that such jobs are in some way a special category of jobs.

My emerging view is that all jobs are service jobs and some such jobs involve the manipulation of tangible goods. In our economic accounting we classify some (but far from all) of those jobs that involve the manipulation of tangible goods (for instance, those that can be put into a shipping container) as manufacturing jobs, and others (such as in construction) as services. The distinction seems somewhat (entirely?) arbitrary to me, and as apt to mislead as to clarify our discussions of innovation and the economy.

Here is a specific example to discuss (Thanks AC!):
In this sprawling facility on Route 128, sporty Kia coupes and Volvo trucks are regularly taken apart and reassembled. Caterpillar tractors and Harley-Davidson motorcycles are put through exacting trials that test the latest advances in power steering and antilock brakes. Both Aston Martin Racing and the Penske Racing Team come here to shave seconds off their times.

But the 1,000-plus employees at PTC never touch a wrench or ball-peen hammer. Instead they develop and advance software that allows automakers to design, build, and service the latest automobiles rolling off production lines all over the world.

“The actual making of cars has moved to other parts of the world,’’ said Sin Min Yap, PTC’s vice president for automotive market strategy, “but the digital making of cars is thriving here.’’
Are the jobs at PTC "manufacturing jobs"? Are they "service jobs"? And, most importantly, why does such a distinction matter for our discussion of innovation, the economy and employment? (For initial background, here is how the North American Industry Classification System characterizes manufacturing.)

My view -- all jobs are service jobs. I will follow up on the consequences of such a view in subsequent posts.

26 January 2012

US Emissions Projections Compared to Reduction Targets

Last week the US Energy Information Administration published an "early release" of its 2012 Annual Energy Outlook, which includes the agency's projections for various energy statistics out to 2035, based on a range of assumptions. The report also includes projections of carbon dioxide emissions to 2035, which allows for a comparison with the Obama Administration's commitments to emissions reduction targets for 2020, 2025 and 2030 (the formal commitment made under the UN FCCC is here).

The graph at the top of this post shows the U.S. government's emissions projections (black line) and the emissions reduction targets (red, blue and green). In case you were wondering how big the misses are with respect to the targets in some sort of intuitive way, I've provided a measure of the magnitude of the shortfall (using the same methods described in depth in The Climate Fix) in terms of the number of coal power plants that would have to be replaced with nuclear power plants to meet the targets. (If you'd like to replace gas power plants, the numbers are about 40% more, due to the lower carbon intensity of gas generation. If you'd like to use wind turbines or solar power, well, get out a big calculator;-).
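The shortfall metric amounts to a back-of-envelope division: the annual emissions gap divided by the emissions avoided per plant swapped for zero-carbon generation. Here is a sketch of that arithmetic; every number below is an assumed round illustrative figure, not the EIA's projections or the book's actual values.

```python
# Illustrative version of the "coal plants replaced by nuclear" metric.
# Every number here is an assumed round figure for demonstration only.

COAL_PLANT_MTCO2 = 6.0                      # assumed annual CO2 from one large coal plant
GAS_PLANT_MTCO2 = COAL_PLANT_MTCO2 / 1.4    # gas is less carbon-intensive per plant

def plants_needed(gap_mtco2_per_year, avoided_per_plant):
    """Plants to swap for zero-carbon generation to close an emissions gap."""
    return gap_mtco2_per_year / avoided_per_plant

gap_2020 = 600.0    # assumed shortfall vs. the 2020 target, MtCO2/yr

coal_swaps = plants_needed(gap_2020, COAL_PLANT_MTCO2)
gas_swaps = plants_needed(gap_2020, GAS_PLANT_MTCO2)

print(coal_swaps)                        # 100.0
print(round(gas_swaps / coal_swaps, 2))  # 1.4 -> "about 40% more"
```

Because each gas plant avoids less CO2 than each coal plant, closing the same gap by replacing gas plants requires proportionally more swaps, which is the "about 40% more" point in the text.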

It should be fairly obvious that, under the assumptions of the EIA (such as positive economic growth), the emissions reduction targets are not going to be met. Given President Obama's renewed commitment to an "all of the above" strategy for energy production in the United States, is it finally time to dismiss the charade of emissions reductions targets and adopt a different approach?

Political Identification of American College Freshmen

Courtesy of The Chronicle of Higher Education, the graph above shows a time series of the self-reported political orientation of college freshmen in the United States. There is a subtle hint of an increased polarization at the end of the G. W. Bush presidency, somewhat reversed in the time since. Like the US as a whole, the perspectives are dominated by those who describe their political views as "middle of the road."

25 January 2012

Mike Rowe on Dirty Jobs

A reader (Thanks JZ!) sends in links to Mike Rowe's TED talk (at the bottom of this post) and Congressional testimony from last year (above). He is a champion of skilled labor and a great spokesman for parts of the workforce that are often marginalized as less worthy than college graduates.

Here is an excerpt:
I believe we need a national PR Campaign for Skilled Labor. A big one. Something that addresses the widening skills gap head on, and reconnects the country with the most important part of our workforce.

Right now, American manufacturing is struggling to fill 200,000 vacant positions. There are 450,000 openings in trades, transportation and utilities. The skills gap is real, and it's getting wider. In Alabama, a third of all skilled tradesmen are over 55. They're retiring fast, and no one is there to replace them.

Alabama's not alone. A few months ago in Atlanta I ran into Tom Vilsack, our Secretary of Agriculture. Tom told me about a governor who was unable to move forward on the construction of a power plant. The reason was telling. It wasn't a lack of funds. It wasn't a lack of support. It was a lack of qualified welders.

In general, we're surprised that high unemployment can exist at the same time as a skilled labor shortage. We shouldn't be. We've pretty much guaranteed it.

In high schools, the vocational arts have all but vanished. We've elevated the importance of "higher education" to such a lofty perch that all other forms of knowledge are now labeled "alternative." Millions of parents and kids see apprenticeships and on-the-job-training opportunities as "vocational consolation prizes," best suited for those not cut out for a four-year degree. And still, we talk about millions of "shovel ready" jobs for a society that doesn't encourage people to pick up a shovel.
Here is an entertaining TED talk by Rowe (skip to 15:30 for the bottom line).

Top Global Think Tanks

The Think Tanks and Civil Societies Program at the University of Pennsylvania has released its annual global rankings of think tanks around the world (here in PDF). Here are some top line findings.

Which countries have the most think tanks (global total = 6,545)?
1. United States – 1,815
2. China – 425
3. India – 292
4. United Kingdom – 286
5. Germany – 194
6. France – 176
7. Argentina – 137
8. Russia – 112
9. Japan – 103
10. Canada – 97
What are the top ranked think tanks worldwide?
1. Brookings Institution – United States
2. Chatham House (CH), Royal Institute of International Affairs – United Kingdom
3. Carnegie Endowment for International Peace – United States
4. Council on Foreign Relations (CFR) – United States
5. Center for Strategic and International Studies (CSIS) – United States
6. RAND Corporation – United States
7. Amnesty International – United Kingdom
8. Transparency International – Germany
9. International Crisis Group (ICG) – Belgium
10. Peterson Institute for International Economics – United States
11. German Institute for International and Security Affairs (Stiftung Wissenschaft und Politik, SWP) – Germany
12. International Institute for Strategic Studies (IISS) – United Kingdom
13. Heritage Foundation – United States
14. Cato Institute – United States
15. Woodrow Wilson International Center for Scholars – United States
There is plenty more information on the global think tank "ecosystem" in the full report, here in PDF.

24 January 2012

Upcoming Talk in Sydney at the Lowy Institute

Date:       Tuesday, 7 February 2012
Time:       12:30 pm for 12:45 pm – 1:45 pm
Venue:     Ground floor, 31 Bligh Street, Sydney
RSVP:      Before 5.00pm on Monday, 6 February 2012

Please click here to accept or decline this invitation.

Scientists in Policy and Politics

Scientists, and experts more generally, have choices about the roles that they play in today's political debates on topics such as global warming, genetically modified foods, and food and drug safety, to name just a few. Roger Pielke, Professor in Environmental Studies at the University of Colorado, will discuss how we can understand these choices, their theoretical and empirical bases, what considerations are important to think about when deciding, and the consequences for the individual scientist and the broader scientific enterprise.

Roger A. Pielke, Jr. has been on the faculty of the University of Colorado since 2001 and is a Professor in the Environmental Studies Program and a Fellow of the Cooperative Institute for Research in Environmental Sciences (CIRES) where he served as the Director of the Center for Science and Technology Policy Research from 2001-2007. Roger’s research focuses on the intersection of science and technology and decision making. In 2006, Roger received the Eduard Brückner Prize in Munich, Germany for outstanding achievement in interdisciplinary climate research. From 1993-2001 Roger was a Scientist at the National Center for Atmospheric Research. He holds appointments as a Research Fellow, Risk Frontiers, Macquarie University; Visiting Senior Fellow, Mackinder Programme, London School of Economics; and Senior Visiting Fellow at the Consortium for Science, Policy and Outcomes of Arizona State University. He is a Senior Fellow of the Breakthrough Institute. He is also author, co-author or co-editor of seven books, including The Honest Broker: Making Sense of Science in Policy and Politics published by Cambridge University Press in 2007. His most recent book is The Climate Fix: What Scientists and Politicians Won't Tell you About Global Warming.

Please join us for a lively and thought provoking discussion.

Word Clouds of the 2012 SOTU and Republican Response

Text of both speeches from the New York Times and word clouds courtesy of WordItOut (top 100 words, minimum 5 characters).

State of the Union:
Republican response (Governor Mitch Daniels):

Understanding American Manufacturing

Writing in the current issue of The Atlantic, Adam Davidson has an absolutely brilliant article on the state of American manufacturing. It is a lengthy article that you should read in full. The article clearly explains why it is that manufacturing jobs are going away, even as the manufacturing sector strengthens. It also explores the challenges facing so-called unskilled workers in a big, rich 21st century economy. The article does this by looking at real people in a real factory.

Here is an excerpt from the piece:
Is there a crisis in manufacturing in America? Looking just at the dollar value of manufacturing output, the answer seems to be an emphatic no. Domestic manufacturers make and sell more goods than ever before. Their success has been grounded in incredible increases in productivity, which is a positive way of saying that factories produce more with fewer workers.

Productivity, in and of itself, is a remarkably good thing. Only through productivity growth can the average quality of human life improve. Because of higher agricultural productivity, we don’t all have to work in the fields to make enough food to eat. Because of higher industrial productivity, few of us need to work in factories to make the products we use. In theory, productivity growth should help nearly everyone in a society. When one person can grow as much food or make as many car parts as 100 used to, prices should fall, which gives everyone in that society more purchasing power; we all become a little richer. In the economic models, the benefits of productivity growth should not go just to the rich owners of capital. As workers become more productive, they should be able to demand higher salaries.

Throughout much of the 20th century, simultaneous technological improvements in both agriculture and industry happened to create conditions that were favorable for people with less skill. The development of mass production allowed low-skilled farmers to move to the city, get a job in a factory, and produce remarkably high output. Typically, these workers made more money than they ever had on the farm, and eventually, some of their children were able to get enough education to find less-dreary work. In that period of dramatic change, it was the highly skilled craftsperson who was more likely to suffer a permanent loss of wealth. Economists speak of the middle part of the 20th century as the “Great Compression,” the time when the income of the unskilled came closest to the income of the skilled.

The double shock we’re experiencing now—globalization and computer-aided industrial productivity—happens to have the opposite impact: income inequality is growing, as the rewards for being skilled grow and the opportunities for unskilled Americans diminish.
Looking for significant job growth in a sector that is in the midst of experiencing a revolution in productivity gains is just bad math.

Edward Alden, writing at the new CFR blog, Renewing America, points to business services where future job growth has significant prospects:
[E]ven as the manufacturing sector will continue to grow, the United States will need to look to other industries for robust, higher-wage job growth. My bet is on the business services sector, in fields such as engineering services, movie and software production, and telecommunications where demand for U.S. services is growing rapidly, especially in the emerging markets. Brad Jensen of Georgetown University and the Peterson Institute has laid out the case in his excellent new book. These sectors already employ twice as many people at higher average wages than in manufacturing, and job growth has been strong over the past decade. The United States runs a steadily rising trade surplus in services, compared with a deep, chronic deficit in manufacturing trade. These are sectors in which the United States, along with Europe, has a strong comparative advantage and the potential to sell much more to the world.

No one sector is going to dig the United States out of the jobs hole we currently find ourselves in. But manufacturing is a particularly poor candidate.
With an estimated 40 million unskilled workers (according to Davidson in The Atlantic), the US has a big challenge ahead.

Cat Model Mayhem

Writing at her blog, The Short Run, my superstar grad student Jessica Weinkle looks at recent catastrophe model filings in the state of Florida, as part of her dissertation research:
In America's deep south, a region not so far away, hides a new foe threatening otherwise intelligent people's ability to decide. The Louisiana Insurance Commissioner, Jim Donelon, has rung the alarm putting homeowners on alert of "The looming threat of the new cat model, RMS 11". This is the newest addition in the catastrophe model rogue gallery challenging the gallant efforts of state insurance regulating offices. The kryptonite in their coding is the incredible capacity to produce scientifically supported uncertainty thereby weakening the ability to control rates by politically hopeful insurance commissioners everywhere. A past episode between dueling regulating powers and risk predicting machinery demonstrated the societal cost inflicted by these dastardly foes creating uncertainty whenever plugged into a wall. In 2006, RMS rolled out an arbitrary change to their trusty hurricane catastrophe model in RiskLink 6.0, costing Florida homeowners $82 billion. Stay tuned to state regulating offices for the latest updates on the battle between man and machine...

In the mean time, let's take a closer look at these new trade secret rascals...
Weinkle uncovers some eyebrow-raising factoids, such as the fact that in several models the estimated probability of a Category 5 hurricane hitting Florida has apparently doubled (an increase of 100%) relative to previous filings. She also shows that across five different models, the estimated cost of a Category 5 storm in Florida ranges from $18 billion to $146 billion.

Based on these numbers, Weinkle calls the catastrophe models tools that create uncertainties, and makes the non-obvious point that decisions about risk are actually decisions about modeled risk -- which may or may not be the same thing:
Together, these models create a great deal of uncertainty about the risk being insured against.  In the world of insurance, uncertainty about the risk is risk in and of itself.  If uncertainty increases, then the cost will too and vice versa.  So, a reasonable question to ask would be, "Has the modeled risk changed?"
Not surprisingly, catastrophe models have faced some criticism, such as found in this recent news article from Louisiana:
Catastrophe models are controversial. Proponents say they bring science to underwriting and synthesize the latest understanding of storms and climate change to insurers. Opponents say they're gee-whiz black boxes that manufacture instant justification for high rates for insurers.
The problem with catastrophe models is not that they lack value (they are actually extremely powerful and potentially useful tools); it is just very hard to assess what that value is (e.g., PDF), and their black-box nature makes such assessments extremely difficult. The lack of an industry-wide evaluation capability, strong hints of conflicts of interest, and the defensive nature of some of the cat modelers make the issue a minefield of bad decisions for businesses and governments alike.

Follow Up: 2011 Brisbane Floods

Just over a year ago, Brisbane, Australia experienced its worst flooding since 1974, resulting in billions of dollars in damage. Immediately after the event the focus of attention turned to the management of the Wivenhoe Dam, which was built after the 1974 floods to prevent a repeat.

This past week, as the Queensland flood commission is wrapping up its investigation, The Australian has uncovered evidence that the Wivenhoe managers operated the dam according to an incorrect procedure. Here is a summary from the Sydney Morning Herald:
On Monday and Tuesday The Australian newspaper alleged engineers operating the Wivenhoe Dam used the wrong water-release strategy, breaching its operation manual, in the lead-up to the January 2011 Brisbane flood.

It reported SEQWater engineers, who operate the dam, failed to move to a higher water release strategy early enough, contributing to the Brisbane and Ipswich floods.

The paper used emails between SEQWater and the WaterGrid to back up its claims.

It went on to accuse the commission of overlooking the documents and accepting at face value evidence from engineers who said the manual had been followed correctly.

The commission was in possession of the emails but did not make them publicly available.
The release of the information contained in the emails has prompted a re-opening of the Floods Commission inquiry and a delay in the Queensland state election. Anna Bligh, the Premier of Queensland (pictured at the top of this post with Prime Minister Julia Gillard), is facing electoral defeat based on polling, prompting suggestions that the election is being delayed for politically strategic reasons.

Writing very recently in the open-access journal Water, Robin van den Honert and John McAneney of Macquarie University provide a comprehensive review and assessment of the 2011 floods and their impacts, which is likely to serve as the definitive study of the event for some time to come. Here is the paper's abstract:
The 2011 Brisbane Floods: Causes, Impacts, Implications

On 13th January 2011 major flooding occurred throughout most of the Brisbane River catchment, most severely in Toowoomba and the Lockyer Creek catchment (where 23 people drowned), the Bremer River catchment and in Brisbane, the state capital of Queensland. Some 56,200 claims have been received by insurers with payouts totalling $2.55 billion. This paper backgrounds weather and climatic factors implicated in the flooding and the historical flood experience of Brisbane. We examine the time history of water releases from the Wivenhoe dam, which have been accused of aggravating damage downstream. The dam was built in response to even worse flooding in 1974 and now serves as Brisbane’s main water supply. In our analysis, the dam operators made sub-optimal decisions by neglecting forecasts of further rainfall and assuming a ‘no rainfall’ scenario. Questions have also been raised about the availability of insurance cover for riverine flood, and the Queensland government’s decision not to insure its infrastructure. These and other questions have led to Federal and State government inquiries. We argue that insurance is a form of risk transfer for the residual risk following risk management efforts and cannot in itself be a solution for poor land-use planning. With this in mind, we discuss the need for risk-related insurance premiums to encourage flood risk mitigating behaviours by all actors, and for transparency in the availability of flood maps. Examples of good flood risk management to arise from this flood are described.
Based on the new reporting from The Australian on the possible errors in flood management and the comprehensive analysis in van den Honert and McAneney (2011), it is clear that bad decision making played a major role in the disaster. The bad decisions were the result of mismanagement, a deeply flawed management architecture, or -- what seems increasingly likely -- both.

22 January 2012

Apples and Americans

If you read the New York Times, you might be led to believe that some experts think that Apple represents a lot that is wrong with the modern innovation economy. Here is an excerpt from yesterday's lengthy article on Apple:
Not long ago, Apple boasted that its products were made in America. Today, few are. Almost all of the 70 million iPhones, 30 million iPads and 59 million other products Apple sold last year were manufactured overseas. . .

Apple employs 43,000 people in the United States and 20,000 overseas, a small fraction of the over 400,000 American workers at General Motors in the 1950s, or the hundreds of thousands at General Electric in the 1980s. Many more people work for Apple’s contractors: an additional 700,000 people engineer, build and assemble iPads, iPhones and Apple’s other products. But almost none of them work in the United States. Instead, they work for foreign companies in Asia, Europe and elsewhere, at factories that almost all electronics designers rely upon to build their wares.

“Apple’s an example of why it’s so hard to create middle-class jobs in the U.S. now,” said Jared Bernstein, who until last year was an economic adviser to the White House.
Let me just say – No it's not.

What kind of jobs do Apple and its suppliers have overseas? The NYT investigated and described a facility in China:
The facility has 230,000 employees, many working six days a week, often spending up to 12 hours a day at the plant. Over a quarter of Foxconn’s work force lives in company barracks and many workers earn less than $17 a day. When one Apple executive arrived during a shift change, his car was stuck in a river of employees streaming past. “The scale is unimaginable,” he said.

Foxconn employs nearly 300 guards to direct foot traffic so workers are not crushed in doorway bottlenecks. The facility’s central kitchen cooks an average of three tons of pork and 13 tons of rice a day. While factories are spotless, the air inside nearby teahouses is hazy with the smoke and stench of cigarettes.

Foxconn Technology has dozens of facilities in Asia and Eastern Europe, and in Mexico and Brazil, and it assembles an estimated 40 percent of the world’s consumer electronics for customers like Amazon, Dell, Hewlett-Packard, Motorola, Nintendo, Nokia, Samsung and Sony.

“They could hire 3,000 people overnight,” said Jennifer Rigoni, who was Apple’s worldwide supply demand manager until 2010, but declined to discuss specifics of her work. “What U.S. plant can find 3,000 people overnight and convince them to live in dorms?”
What indeed? These are not “middle class jobs.”

Kraemer et al. (2011, here in PDF), researchers at UC Irvine, UC Berkeley and Syracuse who have studied Apple's supply chains, first for the iPod and then the iPhone and iPad, conclude:
Those who decry the decline of U.S. manufacturing too often point at the offshoring of assembly for electronics goods like the iPhone. Our analysis here and elsewhere makes clear that there is simply little value in electronics assembly. The gradual concentration of electronics manufacturing in Asia over the past 30 years cannot be reversed in the short- to medium-term without undermining the relatively free flow of goods, capital, and people that provides the basis for the global economy. And even if high-volume assembly expands in North America, this will likely take place in Mexico where there is already a relatively low-cost electronics assembly infrastructure.
What has Apple done?

It has captured a significant fraction of the global market for mobile phones, as shown by the figure below from The Economist.
More importantly, Apple has created jobs in the United States. In 2006, before the iPhone was even on the market, Apple as a company had fewer than 18,000 employees. In 2010, according to the NYT, Apple had 63,000 employees, with 43,000 in the United States. Even if we assume that all of its 2006 employees were in the US (they weren't), Apple has created more than 25,000 jobs in the US. To put these numbers into perspective: at a time when the number employed in the US dropped by more than 5%, Apple increased its US-based employment by more than 150%.
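The arithmetic behind that comparison is simple enough to spell out. The 2006 figure below is an assumed round 17,000, consistent with "less than 18,000" but not an exact count:

```python
# Arithmetic behind the Apple employment comparison. The 2006 figure is an
# assumed round number consistent with "less than 18,000 employees".

apple_2006_us = 17_000   # assumption: treat all 2006 employees as US-based
apple_2010_us = 43_000   # NYT figure cited above

jobs_added = apple_2010_us - apple_2006_us
growth = jobs_added / apple_2006_us

print(jobs_added)        # 26000 -> "more than 25,000 jobs"
print(f"{growth:.0%}")   # 153% -> "more than 150%"
```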

Rather than calling out Apple as an example of what is wrong in the innovation economy, we should be pointing to Apple as an example to emulate. The question that we should be asking is not how can we get Apple to hire more Americans, but rather, how do we get America to create more Apples?

Future Tropical Cyclone Damage

Kerry Emanuel of MIT (in WCAS, PDF) and Robert Mendelsohn of Yale University and colleagues (in Nature Climate Change) have new papers out on future hurricane damage. The findings of both papers reinforce the existing literature on the very long time necessary to detect a signal of human-caused climate change in the disaster record under recent projections, and on the relative importance of development over human-caused climate change in future losses from tropical cyclones.

Emanuel (2011, PDF) implemented an alternative methodology to Crompton et al. (2011) to assess, under various scenarios, when the signal of human-caused climate change would be detectable in the damage record of Atlantic hurricanes. He looked at four different models: three of them showed increasing losses and one a small decrease. For the three models that showed increasing losses, the times until detection are 40, 113 and 170 years.

This time to detection is shorter than that which we found in Crompton et al. (2011). Why is that? Emanuel used an older set of model runs (we used Bender et al. 2010 -- I wonder why he used something different), and that probably accounts for the difference. It would have been nice if Emanuel had used the Bender runs, as that would have allowed an apples-to-apples comparison. I'd speculate that, apples-to-apples, our numbers would be quite similar.

Regardless, the two papers are in agreement that the time to detection of a signal of human-caused climate change, assuming that recent projections are correct, is a long, long time -- not in our lifetimes, and certainly not now.

Mendelsohn et al. (2012) examine a range of scenarios for how tropical cyclone damage will increase to 2100. That paper concludes that tropical cyclone damage will decrease as a proportion of global GDP from 0.04% today to 0.01% in 2100 assuming human-caused climate change, using the same four models as used in Emanuel (2011, PDF). That is right, decrease as a portion of GDP. (Apparently, this result was not sexy enough for Nature Climate Change, which headlined its homepage announcement of the paper rather misleadingly as "Tropical Cyclone Damage Set to Double," referring to the expected increase in aggregate damages to 2100.) The paper also explores how damages might increase in regions around the world, though it is important to recognize that both the climate and socio-economic assumptions of the paper are highly speculative.
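The apparent tension between "damage set to double" and "damage falling as a share of GDP" is just growth arithmetic. A minimal sketch, with round numbers assumed for illustration rather than taken from the paper:

```python
# If aggregate damage doubles by 2100 while GDP compounds at ~2%/year,
# damage still shrinks as a share of output. Numbers are illustrative only.
gdp_today = 70e12                   # rough global GDP, $70 trillion
damage_today = 0.0004 * gdp_today   # 0.04% of GDP, as quoted above

years = 88                          # 2012 -> 2100
gdp_2100 = gdp_today * 1.02**years  # ~5.7x today's GDP
damage_2100 = 2 * damage_today      # aggregate damage "set to double"

share_2100 = damage_2100 / gdp_2100
print(round(100 * share_2100, 3))   # 0.014 -- percent of GDP, down from 0.04
```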

Mendelsohn et al. (2012) explain that their findings are consistent with existing work and as such add to our growing understanding that, under a very wide range of scenarios for how climate might change and how society might develop, socio-economic factors will dominate the future damage record (see this paper in PDF for an unrealistically broad range of explored scenarios), independent of a wide range of assumptions and uncertainties.

Anyone claiming that they can see a human-caused climate signal in the hurricane damage record (or even the hurricane record itself) is facing a growing mountain of peer-reviewed research to overcome.

H/T to Revkin and Kloor

We're Number 50


20 January 2012

Another Billionz Update: NOAA Discovers Inflation

[UPDATE 4/11: Here is the Federal Register announcement for the forthcoming NOAA workshop on "billion dollar disasters." Shouldn't the research take place before the press release and fancy website? ;-)]

Yesterday, NOAA issued a press release breathlessly announcing that they (through some creative effort) were able to cobble together 2 additional "billion dollar disasters" for 2011. If you think that identifying 2 additional billion-dollar disasters is big news worthy of a press release, then you might also conclude that locating 19 other billion-dollar disasters would be even bigger news. Well, you'd be wrong.

NOAA has quietly added 19 new disasters to their tally since 1980, apparently having discovered a quantity called inflation. The modification of their tally is (it seems) in response to a blog critique which was followed up by a Washington Post blogger. The new "billion dollar disaster" figure is at the top of this post, with the 19 new additions from NOAA in red.

NOAA has also added a new disclaimer to the webpage that hosts the list of disasters:
Caution should be used in interpreting any trends based on this graphic for a variety of reasons. For example, inflation has affected our ability to compare costs over time. The graphic now shows events that were reported to have less than a billion dollars in damage prior, but after adjusting for Consumer Price Index increases, they now exceed a billion in damages. There are nineteen new events as indicated by the shaded extensions of the bars. Continued assessment of these data are in process, as there are other factors as well that affect any rate of change interpretation. NCDC intends to include academic, federal, and private sector experts in such an assessment this year. Comparison of events for years closest to 2011 are most reliable.
An unvarnished translation would be -- "This graph means virtually nothing."

In addition to inflation, patterns of population growth, the nature of economic development, and the accumulation of wealth all play a role in how extreme events distant in time would lead to economic impacts had they occurred under the same underlying societal conditions. (To understand why inflation matters, but less so than other adjustments, compare the inflation-adjusted hurricane losses in Figure 3 of this paper in PDF with the normalized hurricane losses in Figure 4.) As I wrote in my initial post on this topic, there are certainly 4 (and maybe 5 or more) other events from 1980 that would exceed the billion-dollar threshold had they occurred in 2011. So by adding one event to 1980, NOAA has recognized the general problem, but has not come close to actually dealing with it.
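A minimal sketch of the CPI adjustment at issue, using approximate annual-average CPI-U values (my figures, not NOAA's) and a hypothetical 1980 event:

```python
# CPI-adjusting a 1980 damage figure to 2011 dollars. A sub-billion event
# can cross the billion-dollar threshold on inflation alone.
cpi_1980 = 82.4       # approximate US CPI-U annual average, 1980
cpi_2011 = 224.9      # approximate US CPI-U annual average, 2011

damage_1980 = 450e6   # hypothetical $450 million event, in 1980 dollars

damage_2011_dollars = damage_1980 * (cpi_2011 / cpi_1980)
print(damage_2011_dollars > 1e9)   # True: now a "billion dollar disaster"
```

Note that this adjusts for prices only; a full normalization would also account for population growth, development and wealth, which is why an inflation-only tally still understates the early years of the record.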

In the end, it is nice to see NOAA acknowledge their mistaken methodology and propose an expert assessment of this topic later this year -- a complete reanalysis of normalized disaster costs from 1980 to 2011 is a big job. Ultimately, the time for a federal science agency to get the science right is well before issuing breathless press releases. NOAA has dropped the ball on this one, as have virtually all of the media and bloggers who purport to care about science integrity.

Disclaimer: I am a Fellow of a NOAA cooperative institute here at the University of Colorado and the Center where I work receives NOAA funding.

Invention as the Mother of Necessity

Natural gas prices have plummeted due to an oversupply, and because gas is also a by-product of other commodities, the supply keeps growing, setting up a situation where concerns have been expressed about our ability to store all that gas. The figure above shows US storage of natural gas approaching capacity (data from the US EIA).
Weak demand for heating has raised the prospect that underground storage facilities will bulge with record stocks by the end of winter. Bentek Energy, a market analysis group, last week warned that some swollen storage facilities might be forced to let out gas to maintain operational integrity, causing “extreme downward pressure on prices in March”.

Even a hot summer that would force electric power plants to burn more gas may not keep inventories from straining proved storage capacity of 4.1tn cu ft when the gas “injection season” ends in October or November.

Drillers have endured low prices, pushing output above 60bn cu ft per day, because many wells also pump higher-value liquids and petroleum. “It’s like a guy mining for silver and he keeps running into gold,” said Vikas Dwivedi, energy analyst at Macquarie.
I don't know what it means to "let gas out to maintain operational integrity" but it sounds like both a bad idea and a big opportunity -- environmentally, economically, technologically and politically.  A reader points to this link on the basics of storage.

A New Review of The Climate Fix

ContractingBusiness.com has just published a very positive review of The Climate Fix. Here is how it starts out:
Reading The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming, by Roger Pielke, Jr., (2010, Basic Books) could start a rabid, climate change denier on a road to accepting and understanding the need for a reasonable approach to what will soon become a significant carbon dioxide problem.
They also did a short email interview with me, which you can see here.

18 January 2012

Upcoming Talk in Canberra

Presented by HC Coombs Policy Forum, Crawford School of Economics & Government

The Climate Fix

Thursday 2 February 2012 
5.30pm - 6.30pm, followed by light refreshments

Acton Theatre, Ground Floor, JG Crawford Building #132 Lennox Crossing, ANU

Professor Roger Pielke Jr.
Professor of Environmental Studies, Centre for Science & Technology Policy Research, University of Colorado at Boulder

This lecture is free and open to the public. Registration (required): Please register online

The world’s response to climate change is deeply flawed. The conventional wisdom on how to deal with climate change has failed and it’s time to change course. To date, climate policies have been guided by targets and timetables for emissions reduction derived from various academic exercises. Such methods are both oblivious to and in violation of on-the-ground political and technological realities that serve as practical ‘boundary conditions’ for effective policy making.

Until climate policies are designed with respect for these boundary conditions, failure is certain. Using nothing more than arithmetic and logical explanation, this talk provides a comprehensive exploration of the problem and a proposal for a more effective way forward.

Professor Pielke’s research focuses on the intersection of science and technology and decision making.

Roger A Pielke Jr. has been on the faculty of the University of Colorado since 2001 and is a Professor in the Environmental Studies Program and a Fellow of the Cooperative Institute for Research in Environmental Sciences (CIRES). At CIRES, Roger served as the Director of the Center for Science and Technology Policy Research from 2001-2007. In 2006 Roger received the Eduard Brückner Prize in Munich, Germany for outstanding achievement in interdisciplinary climate research. Before joining the University of Colorado, from 1993-2001 Roger was a Scientist at the National Center for Atmospheric Research. Professor Pielke has appointments as a Research Fellow, Risk Frontiers, Macquarie University; Visiting Senior Fellow, Mackinder Programme, London School of Economics; and Senior Visiting Fellow at the Consortium for Science, Policy and Outcomes of Arizona State University. He is also a Senior Fellow of The Breakthrough Institute, a progressive think tank.

Professor Pielke is author, co-author or co-editor of seven books, including 'The Honest Broker: Making Sense of Science in Policy and Politics' (2007, Cambridge University Press) and his most recent book 'The Climate Fix: What Scientists and Politicians Won't Tell You About Global Warming' (2010, Basic Books).

In line with its mission to foster more vibrant public debate on government policy choices, the HC Coombs Policy Forum has put in place a pilot joint venture with the ABC. One of the fruits of this collaboration is ABC 24’s new Future Forum television series. This series considers and debates hypothetical situations that Australia may face in the future as a result of key policy choices made today.

The Australian National Institute for Public Policy (ANIPP) and the HC Coombs Policy Forum receive Australian Government funding under the Enhancing Public Policy Initiative.

US Energy Independence by 2030?

[Embedded video: The Daily Show, "An Energy-Independent Future"]
My how expectations change. From the FT:
North America will become almost totally self-sufficient in energy in two decades, thanks to a big growth in the production of biofuels, shale gas and unconventional oil, according to projections by BP.

Presenting the oil company’s energy outlook to 2030, BP said North America’s energy deficit would turn into a “small surplus” by that year. That contrasts with Europe, which will have to import some 60 per cent of its natural gas by 2030 as demand grows and domestic production declines.

Spotted at Climate Progress

17 January 2012

Consequences of Innovation and Aversion to Innovation

The Financial Times reports that BASF is moving its plant sciences research team from Europe to the United States, with consequences for employment:
BASF, the German chemical giant, is to pull out of genetically modified plant development in Europe and relocate it to the US, where political and consumer resistance to GM crops is not so entrenched.

The headquarters of BASF Plant Science will move from Limburgerhof in south-west Germany to Raleigh, North Carolina, and two smaller sites in Germany and Sweden will close. The company will transfer some GM crop development to the US but stop work on crops targeted at the European market – four varieties of potato and one of wheat.

The decision, which involves the net loss of 140 highly skilled jobs in Europe, also signals the end of GM crop development for European farmers. Bayer, BASF’s German competitor, is working on GM cotton and rice in Ghent, Belgium – but not for European markets.
The move, according to BASF, is the consequence of aversion to genetically modified crops in Europe. Setting aside whether such aversion is appropriate or justified, it exists and has consequences, just as the aversion to stem cell research by the administration of George W. Bush during the last decade prompted relocation of researchers in that field.

Innovation means change, and change is not always welcomed among the public and their representatives. But the perversity of the innovation economy is such that resisting innovation does not mean that things will stay the same. Innovation has consequences, and so too does aversion to innovation.

16 January 2012

The Apparent Paradox of Productivity Growth

On the one hand, we see that increasing productivity can kill jobs -- in today's manufacturing sector 177 American workers can produce the same output that required 1,000 workers in 1950. On the other hand, productivity is viewed as central to job creation, as this article in today's FT notes:
The rate at which workers are raising their productivity in the world’s advanced economies fell by half in 2011, and is even starting to slow in some emerging economies, according to a report that suggests that unemployment is likely to rise in the months ahead.

According to the Conference Board, the global business organisation, productivity – defined as output per worker – in the most advanced economies fell from 3.1 per cent in 2010 to 1.6 per cent in 2011.
It turns out that decreasing productivity also kills jobs:
Bart van Ark, chief economist at the Conference Board, said that in the short term, the drop in productivity suggested that employers would cut labour to match the drop in overall output. “But in the longer term, productivity gains come from technology innovation and investment,” he said.

Moreover, there are concerns that the focus on austerity by governments may exacerbate the loss of productivity because without expenditure on new technology, any gains will be limited. “With calls for austerity, you have to be cautious not to cut the investments in new technology that increase productivity,” he said.
Read that again. If labor productivity falls, then businesses will eliminate workers, but over the longer term innovation will more than compensate for the short-term losses -- or at least that is the argument.

Even leading economists are of two minds on productivity. Here is Martin Wolf writing in 2005, extolling the virtues of productivity growth:
Productivity determines the wealth of nations. The proportion of the population at work matters, too, and so does the number of hours worked by each person. But neither is as important as productivity.
And here is Martin Wolf writing last summer extolling the virtues of productivity decline:
[I]f one is going to pursue austerity, as the UK government does, it greatly helps to have poor productivity performance. With US productivity, too, the UK would have a jobless rate of over 12 per cent.

On balance, I am grateful that the UK job market has responded to this recession in this curiously continental way.
By a "continental" response to the financial crisis Wolf means "a market that adjusts to shocks via hours worked per person rather than via jobs." So from this perspective, poor productivity performance is a consequence of decisions about how to spread the pain of an economic crisis.

But labor productivity is only one element of total productivity -- other factors matter as well. An economy can weather declines in labor productivity -- even those that are self-imposed -- if total factor productivity continues to increase. Wolf's apparently contradictory statements can be reconciled if we understand that in 2005 he was referring to total productivity and in 2011 he was referring to labor productivity. On this blog I am going to strive to be very clear about what I mean by "productivity" when I use the term.
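A toy model makes the distinction concrete. The numbers and the functional form here are mine, purely for illustration:

```python
# Measured labor productivity is output per worker. If firms respond to a
# downturn by cutting hours rather than jobs -- Wolf's "continental"
# adjustment -- output per worker falls even though the underlying
# technology (total factor productivity) is unchanged.
tfp = 1.0            # technology level, held fixed throughout
workers = 100
hours_before = 40.0
hours_after = 36.0   # hours cut 10% in the downturn, no layoffs

def total_output(tfp, workers, hours_per_worker):
    # Toy assumption: output proportional to total hours worked.
    return tfp * workers * hours_per_worker

prod_before = total_output(tfp, workers, hours_before) / workers
prod_after = total_output(tfp, workers, hours_after) / workers

print(prod_before, prod_after)   # 40.0 36.0 -- labor productivity falls,
                                 # total factor productivity does not
```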

Understanding the modern economy requires making sense of the easily confusing concept of productivity and how it relates to jobs and economic growth. [Total and especially labor] productivity growth does indeed kill jobs, but it creates jobs as well. Creating a virtuous cycle of total productivity growth is a key challenge of managing the 21st century economy.

EU Decarbonization 1980 to 2010 and Non-Carbon Forcings

Welcome, if you have arrived at this blog after reading John Tierney's column in the New York Times on the recent Shindell et al. paper in Science on addressing climate forcings other than carbon dioxide. Shindell et al. was first discussed here early last year when it was released as a UNEP report.

In Tierney's column he references The Climate Fix and some updated data analysis that I did for him in response to a query about Europe's rate of decarbonization. When I wrote The Climate Fix data were available through 2006. I now have data through 2010 -- from BP (for carbon dioxide), Maddison (for GDP through 2008), and Eurostat (for GDP in 2009 and 2010).

Here is the relevant passage from Tierney's column that relies on the updated analysis:
Ever since the Kyoto Protocol imposed restrictions in industrial countries, the first priority of environmentalists has been to further limit the emission of carbon dioxide. Burning fewer fossil fuels is the most obvious way to counteract the greenhouse effect, and the notion has always had a wonderfully virtuous political appeal — as long as it’s being done by someone else.

But as soon as people are asked to do it themselves, they follow a principle identified by Roger Pielke Jr. in his book “The Climate Fix.” Dr. Pielke, a political scientist at the University of Colorado, calls it the iron law of climate policy: When there’s a conflict between policies promoting economic growth and policies restricting carbon dioxide, economic growth wins every time.

The law holds even in the most ecologically correct countries of Europe, as Dr. Pielke found by looking at carbon reductions from 1990 until 2010.

The Kyoto Protocol was supposed to put Europe on a new energy path, but it contained so many loopholes that the rate of “decarbonization” in Europe did not improve in the years after 1998, when the protocol was signed, or after 2002, when it was ratified. In fact, Europe’s economy became more carbon-intensive in 2010, he says — a trend that seems likely to continue as nuclear power plants are shut down in Germany and replaced by coal-burning ones.

“People will make trade-offs, but the one thing that won’t be traded off is keeping the lights on at reasonable cost,” he says.
Here is information that I sent to Tierney in response to a question about whether Europe's decarbonization data looked any different if one used its ratification of Kyoto as a break point, rather than the signing of the treaty:
On European decarbonization, Europe was an original signatory of the Kyoto Protocol (1998) and among the first to ratify (2002). To look into this further, I have just now looked at the data using 2002 (ratification) as a break point instead of 1998, and the average rate of decarbonization for the eight years prior to 2002 is identical to that for the eight years after (a decrease of 1.8% per year in the ratio of CO2/GDP in both cases). (Note that I now have data through 2010.) In fact, the long-term rate from 1989-2010 is also 1.8% per year! So there is still no evidence that Europe has fundamentally increased its decarbonization rate in the post-Kyoto era, whether that break point is 1998 (signing) or 2002 (ratifying).

FYI, as I suspected, 2010 data show a re-carbonization of the EU-15, at a rate of 0.3% from 2009 (there were only two years with higher rates since at least 1980).
You can see the decarbonization rates of the EU-15 from 1980 to 2010 in the graph at the top of this post.  As I mention above, there is no trend in the data since the late 1980s, but there is a decreasing rate of decarbonization if you start the analysis in 1980 (see the red trend line on the graph). A decreasing rate of decarbonization means that the European economy is becoming more carbon intensive than required to hit ambitious emissions targets. If you are interested in understanding how it is that Germany has come to rely more on coal, this paper is a good place to start.
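For readers who want to check the arithmetic, a decarbonization rate is just the average annual change in the CO2/GDP ratio. A sketch with hypothetical ratio values (the 1.8% figure above comes from the actual BP/Maddison/Eurostat data, not from these numbers):

```python
def decarbonization_rate(ratio_start, ratio_end, years):
    """Average annual rate of change of CO2/GDP (negative = decarbonizing)."""
    return (ratio_end / ratio_start) ** (1 / years) - 1

# Hypothetical: CO2/GDP falls from 0.500 to 0.433 (arbitrary units) over 8 years.
rate = decarbonization_rate(0.500, 0.433, 8)
print(round(100 * rate, 1))   # -1.8, i.e. decarbonization of 1.8% per year
```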

Tierney's conclusion quotes Shindell sounding quite Hartwellian:
“But I also worry that carbon dioxide will go up even if we do focus on it,” he says. “We’re at a complete deadlock on carbon dioxide. Dealing with the short-lived pollutants might really be a way to bridge some of the differences, both between the two sides in the United States and between the developed and the developing world.”

No matter what people think about global warming, there aren’t a lot of fans of dirty snow, poor crops and diseased lungs.
Amen to that.

We are Not as Smart as We'd Like to Think

Writing at VoxEu.org Victor Ginsburgh, of the Université Libre de Bruxelles, notes that experts aren't so expert in many situations:
A paper by Fritz et al (2012) published last week in the Proceedings of the National Academy of Sciences shows that professional musicians are unable to distinguish the tonal superiority of a violin built by Stradivari (which would cost up to $4 million) from that of a new American instrument (a couple of thousand). . .

Likewise, Ashenfelter and Quandt (1999) [here in DOC] show that there is lack of concordance between wine judges. Hodgson’s (2008) [here in PDF] result is even stronger, since he finds that only about 10% of the judges are able to replicate their score within a single wine medal group.

In artistic skating, evaluation depends on the incentives and the monitoring faced by judges. Lee (2004) points out that they face an “outlier aversion bias” because they may be excluded from further competitions if they cannot explain why their rating is at odds with the mean of other judges. Therefore, they manipulate their ratings to achieve “a targeted level of agreement with the other judges,” which essentially implies that their judgement is based on previous achievements, and not on the one that is unfolding, since they have to cast their votes a couple of seconds after the performance of each skater.
In related news the Federal Reserve has released transcripts of their deliberations in 2006 on the eve of the financial crisis. NPR provides a nice round-up of coverage:
Here's how the Los Angeles Times frames the story: The transcripts, released Thursday after the usual five-year wait, "reveal in painfully embarrassing detail the high degree of overconfidence and lack of foresight just ahead of the real estate collapse and financial crisis that engulfed the nation" . . .

Perhaps The Wall Street Journal's Real Time Economics blog found the best bit. During the March 27-28 meeting, the Fed's chief economist, David Stockton, described a dire situation, which Bernanke acknowledged but quickly dismissed:
"'Right now, it feels a bit like riding a roller coaster with one's eyes shut,' when discussing his forecast for a modest slowdown in housing. 'We sense that we're going over the top, but we just don't know what lies below.' Later, he notes that housing is 'the most salient risk' to the economy. 'I just don't know how to forecast those prices,' he says of housing prices.

"'Again, I think we are unlikely to see growth being derailed by the housing market, but I do want us to be prepared for some quarter-to-quarter fluctuations,' Bernanke says. He identifies housing as a crucial issue, but adds that he agrees 'with most of the commentary that the strong fundamentals support a relatively soft landing in housing.'"
We are not as smart as we'd like to think we are. Trust me on this, I'm an expert.

Follow Up: Nigeria Fuel Subsidies

Following the removal of fuel subsidies that led to a dramatic and instant increase in the price of fuel and food, Nigeria has seen protests (pictured above), strikes and violence. President Goodluck Jonathan, boxed into a corner, has partially reinstated the fuel subsidies:
Nigerian unions have suspended their crippling week-long strike, news agencies report, after Goodluck Jonathan, Nigeria’s president, cut petrol prices by 31 per cent on Monday and promised to investigate oil sector corruption.

Unions had called the strike after the removal of fuel subsidies. “In the past eight days through strikes, mass rallies, shutdown, debates and street protests, Nigerians demonstrated clearly that they cannot be taken for granted and that sovereignty belongs to them,” Abdulwahed Omar, president of the Nigeria Labour Congress (NLC), said during a press conference. “Labour and its allies formally announce the suspension of strikes, mass rallies and protests across the country.” It remains unclear whether the suspension of the strike is conditional.
It is unclear if the partial reinstatement of the subsidy will quell protests.

15 January 2012

Manufacturing, Services, Resources: One is Different

The decline of jobs in manufacturing is the result of innovations that have led to dramatic growth in productivity, as shown in the graph above from a 2011 presentation by William Strauss, Senior Economist and Economic Advisor, Federal Reserve Bank of Chicago. A consequence of productivity growth in manufacturing is that this sector has become smaller as a proportion of the overall economy, even as the absolute magnitude of manufacturing output has increased.

These dynamics should tell you that the simple mathematics of President Obama's new "insourcing" proposal can only influence jobs at the very thin margin, if at all. The president explained:
“I don’t want America to be a nation that’s primarily known for financial speculation, and racking up debt and buying stuff from other nations,” the president said. “I want us to be known for making and selling products all over the world stamped with three proud words, ‘Made in America.’ ”
But here is the problem: Due to advances in manufacturing productivity that have exceeded the growth of the economy, manufacturing is a shrinking part of the global and national economy. If productivity growth were to slow or reverse, then jobs would be lost overseas anyway. Efforts to expand employment in manufacturing are simply swimming upstream against a strong current.

The Obama Administration does not seem to recognize, at least publicly, the implications of these various trends. The White House explains:
[C]ontinued productivity growth has – as several outside analysts have noted – made the United States more competitive in attracting businesses to invest and create jobs by reducing the relative cost of doing business compared to other countries.
Of course, one of the consequences of productivity growth is lower unit labor costs!

Seeking to keep manufacturing as an important element of the US economy is not the same thing as increasing employment in manufacturing. The Administration will have far more success supporting job creation by focusing on areas that are expanding parts of the economy, meaning that policy will be going with rather than against the current. Specifically, two other areas of the economy are expanding in both absolute and relative terms: natural resources and services.

While manufacturing seems to garner a lot of attention, particularly in politically important states, that does not mean that the administration is neglecting natural resources, particularly energy (I'll have more to say on services in subsequent posts).

The White House explains the importance of energy resources with a lot less fanfare than its focus on manufacturing (PDF):
Of the major fossil fuels, natural gas is the cleanest and least carbon‐intensive for electric power generation. By keeping domestic energy costs relatively low, this resource also supports energy intensive manufacturing in the United States. In fact, companies like Dow Chemical and Westlake Chemical have announced intentions to make major investments in new facilities over the next several years. In addition, firms that provide equipment for shale gas production have announced major investments in the U.S., including Vallourec’s $650 million plant for steel pipes in Ohio.

An abundant local supply will translate into relatively low costs for the industries that use natural gas as an input. Expansion in these industries, including industrial chemicals and fertilizers, will boost investment and exports in the coming years, generating new jobs. In the longer run, the scale of America's natural gas endowment appears to be sufficiently large that exports of natural gas to other major markets could be economically viable.
The politics of resources, especially energy resources, are problematic for the Obama Administration, with the Keystone XL pipeline Exhibit A. But if expanding jobs means paying attention to those parts of the economy that are expanding rather than shrinking, then the Obama Administration won't be able to keep the ongoing energy revolution on a back burner for too long.

The Partisan Divide on the Tebow Question

Something to keep in mind when considering ongoing battles over public opinion on various questions related to science -- evolution, climate change, genetically modified foods and so on: about 4 in 10 Democrats and 5 in 10 Republicans think that Tim Tebow's success on the football field (pictured below) can be attributed to "divine intervention." Data here in PDF from a polling firm called Poll Position.
Political debates that involve science will be far more productive -- for both policy and the health of the scientific enterprise -- if the focus of debate is on policy options rather than on what people happen to think about this or that. As Walter Lippmann once said, democracy is about getting people who think differently to act alike, not getting them to think alike.

13 January 2012

Friday's Links and Music Video

Too few hours and too much of interest. Below are a few items worth noting that I did not get to this week, and above is a music video for your viewing pleasure. Thanks for reading!

12 January 2012

Does Basic Research Drive out Applied?

In the seminal 1945 report on science policy, Science -- The Endless Frontier, Vannevar Bush expressed a view of the relationship of basic and applied research in the federal government:
. . . under pressure for immediate results, and unless deliberate policies are set up to guard against it, applied science invariably drives out pure.
Writing in last week's Nature, Dan Sarewitz wonders if this relationship has now been reversed based on recent budget numbers:
[O]ver the past 15 years, mission agencies such as the USGS that seek principally to serve public goals rather than to advance science have experienced minimal budgetary growth, in some cases not even keeping up with inflation. Since 1996, research funds at the USGS have risen by a mere 16%; at the National Oceanic and Atmospheric Administration (NOAA), 11%; the Environmental Protection Agency, 33%; the National Institute of Standards and Technology (NIST), 38%; and the Centers for Disease Control and Prevention, 45%. Even Department of Defense research has grown relatively modestly, by 60% in 15 years.

Yet, over this same period, government funding for research doubled. Most of the increase went to the National Institutes of Health (NIH) and the National Science Foundation (NSF). The NIH's budget has tripled; the NSF's more than doubled. Together, they captured three-quarters of all the spending increases for federal science. (Although the NIH is in some respects a mission agency, its priorities, its work force and the image it has cultivated focus on fundamental science, a reality acknowledged in director Francis Collins's efforts to create an institute to translate research into useful technology).
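To see why "a mere 16%" nominal growth over 15 years amounts to a cut in real terms, deflate by the CPI. A sketch using approximate annual-average CPI-U values (my figures, for illustration only):

```python
# Nominal budget growth vs inflation, 1996-2011. A 16% nominal increase
# is a real-terms decline when prices rose ~43% over the same period.
cpi_1996 = 156.9   # approximate US CPI-U annual average, 1996
cpi_2011 = 224.9   # approximate US CPI-U annual average, 2011

nominal_growth = 1.16              # USGS research funds: +16% since 1996
inflation = cpi_2011 / cpi_1996    # ~1.43

real_growth = nominal_growth / inflation - 1
print(round(100 * real_growth, 1))  # -19.1, i.e. a ~19% cut in real terms
```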
Sarewitz explains that it is in the self-interests of the scientific community to support blue-sky research at the expense of that conducted by the mission agencies:
One important reason may be that the leading public voices speaking on behalf of research funding come mostly from the high-prestige frontiers of science, and from the institutions associated with such research — universities, the National Academies, the professional scientific societies, and so on.

Last November, for example, the head of the American Association for the Advancement of Science called for “rethinking the science system” to make the funding of university researchers more efficient (A. I. Leshner Science 334, 738; 2011). This is a worthy goal, but nowhere in his editorial, or in the many similar examples of hand-wringing, is it acknowledged that the main goal of rethinking science should be to ensure that the scientific enterprise continues to meet existing and future challenges to public well-being, not simply to protect science for its own sake.

Defending science for its own sake disproportionately benefits the fundamental-science agencies, which can claim to be doing the most prestigious and therefore the most apparently worthwhile science. In the face of the new budgetary reality, advocacy for science must take a new, strategic approach — one that insists on balance between the fundamental-science agencies and the mission agencies that link science to the public good. Otherwise, the value of the public investment in science will decline right along with the budget.
So-called "basic research" is important, but so too is research conducted by mission agencies. Nowadays it seems that basic drives out the applied.