| Jason Furman served as Chairman of the Council of Economic Advisers under former U.S. President Barack Obama. He earned his Ph.D. in economics from Harvard University in 2003, studying under the noted economist N. Gregory Mankiw. In this article, he shares seven valuable lessons for macroeconomic data analysis distilled from his day-to-day work. Although the article deals with U.S. economic data, it is highly instructive for economic research elsewhere.
|
Extracting the Signal from the Noise
Oct 19, 2016, by Jason Furman
In the early stages of the Great Depression, policymakers in Washington were faced with a profound gap in their understanding of the state of the U.S. economy: no one actually knew how many Americans were out of work. Aside from some attempts in the decennial census, before the 1930s no government agency had regularly measured the number of individuals seeking work in the United States.
This inability to measure basic economic conditions may seem shocking to observers of the U.S. economy today. The federal statistical agencies – the Bureau of Economic Analysis, the Bureau of Labor Statistics (BLS) and the Census Bureau – and a variety of other public and private entities now provide a wealth of economic data on an annual, quarterly, monthly and sometimes even weekly or daily basis.
Yet while we no longer face the void that confronted policymakers in the 1930s, the mountain of available data creates problems of its own. Perhaps chief among them is that we sometimes ask too much of the data while doing too little to put them in context. The competing demands of timely reporting on the one hand and accuracy and completeness on the other make it necessary to interpret the numbers with caution.
For example, the BLS originally reported that the economy added 38,000 jobs last May, which could have led an observer to believe the economy was slowing markedly since job growth had averaged over 200,000 a month in both 2014 and 2015. But then in June, according to the Bureau's initial estimate, the economy added 287,000 jobs – a boom.
The truth is that, at a monthly frequency, it is difficult to accurately measure the vitals of the economy, and placing much weight on monthly data when they are first released can lead one seriously astray in assessing what's happening.
Underlying economic reality, as well as our attempts to measure it, exhibits substantial volatility. Hence, as important as it can be to gauge turning points in prices, employment, output and the like on a frequent basis, it is also important not to lean heavily on any single data point – or even on a combination of data points – because our measures are nowhere close to perfect. This is not the fault of the statistical agencies, but simply due to the inherent complexity of a vast and rapidly changing economy like that of the United States. More data over longer periods make it easier to disentangle underlying trends from transitory noise. While there are a variety of sophisticated statistical techniques to smooth economic data, a simple moving average that weights past as well as current numbers equally offers a reasonable way to assess trends.
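The equal-weighted trailing moving average described above is simple to compute. The sketch below illustrates it with made-up monthly job-growth figures (the function name and the sample numbers are illustrative, not actual BLS data or any official methodology):

```python
# A minimal sketch of smoothing a noisy monthly series with an
# equal-weighted trailing moving average. The job-growth readings
# below are hypothetical, chosen only to illustrate the technique.

def trailing_moving_average(series, window):
    """Average of the current value and up to window-1 preceding values.

    Early points, which lack a full window of history, are averaged
    over whatever observations exist so far.
    """
    smoothed = []
    for i in range(len(series)):
        start = max(0, i - window + 1)
        chunk = series[start:i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# Hypothetical monthly job growth (thousands): noisy around ~200.
monthly = [38, 287, 190, 210, 160, 250, 205, 175]
print(trailing_moving_average(monthly, 3))
```

Note how the three-month average of the first two readings (38 and 287) lands near 160, already a steadier signal than either raw month; a longer window, such as the 12-month average discussed below, smooths further at the cost of responding more slowly to genuine turning points.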
Take labor productivity, a measure of how much output is produced by an average hour of labor. Measured productivity growth is extremely noisy – that is, full of spurious volatility – at a quarterly frequency, and we largely look to it to answer longer-run questions about the economy. Moreover, there is some evidence that the best predictor of productivity growth is a long-term average of past productivity growth. All of this suggests that, at a minimum, productivity growth should be assessed with something like a trailing 10-year moving average as shown in Figure 1.
It is not just that numbers bounce around from month to month; seemingly comparable measures can offer divergent readings even for the same period. The United States measures economic output in two different ways that in theory provide different routes to the same destination: Gross Domestic Product and Gross Domestic Income. In the second quarter of 2015, the economy grew a disappointing 0.6 percent, according to one, and a solid 2.6 percent according to the other. In this case the difference between these two numbers was simply statistical noise, a reminder that these statistics are an imprecise way to measure the economy's temperature at a quarterly frequency.
These are not just academic issues. How individuals and institutions (and in some cases, computer algorithms) interpret and react to economic data influences economic policy as well as private consumption and investment decisions. In the midst of the economic crisis, for example, economic growth for the fourth quarter of 2008 was initially estimated at –3.8 percent and job losses in November 2008 were originally estimated at 598,000. These data points affected perceptions in Washington of what constituted an appropriate fiscal policy response. However, the estimates would later be revised to much grimmer numbers (–8.2 percent growth and 791,000 jobs lost), which, had they been known earlier, might well have led to a proposal for more stimulus.
Here, I offer seven lessons to help guide those trying to make sense of the wealth of economic data available today, many of them drawing on analytical work by the Council of Economic Advisers. I also provide some applications of these lessons that have proved most valuable in understanding the economy. But all of this has a simple bottom line: when assessing the overall health of the economy, never read too much into a single data snapshot. Rely, instead, on data series over substantial periods and in the context of what other data suggest is happening.
Payroll job growth is less volatile than productivity growth and thus can be examined in the context of a shorter window like the 12-month trailing moving average shown in Figure 2. From 2012 to the end of 2015, the 12-month moving average of private-sector job growth held steady at about 200,000 per month – a much more accurate picture of the economy than the excessive optimism suggested in the many months when job growth came in above that average or the excessive pessimism of news reports in the many months when job growth fell well short of that average.
Averaging over time is essential with higher-frequency data, like initial claims for unemployment insurance. Initial claims are compiled weekly from administrative data from state offices, so they are not subject to the same measurement error as data derived from sample surveys, such as estimates of job growth. But the series bounces around from week to week, with dramatic movements in both directions that can mislead anyone trying to get a clear picture of how many Americans are involuntarily out of work. Last May, for example, initial claims spiked for exactly one week entirely because an unusual law in New York permits many public school employees to claim benefits for their time off during spring break. Using a four-week moving average helps avoid some of the zigzags, as shown in Figure 3, giving a more stable picture of recent trends.
It's standard practice for the statistics agencies to issue revised estimates of economic data that incorporate new information as it becomes available – a fact that is easy to miss, given that these revisions can occur months or even years after the initial reporting. For example, with each month's release of employment data, the Bureau of Labor Statistics also revises the prior two months' estimates of job growth. These revisions are often large and economically meaningful, especially around economic turning points. Estimates of monthly job growth are then revised once a year for the next five years. For example, in September 2011, the Bureau of Labor Statistics reported that job growth in August had been zero, a striking number that fueled concerns the economy was headed into a double-dip recession. But the latest revised estimate for job growth in that month is a far-less-concerning 107,000.
Some of the clearest instances of revisions changing the economic narrative come from the Bureau of Economic Analysis' corrections to quarterly GDP data. When the advance estimate for GDP growth in a given quarter is published, the bureau does not yet have all of the timely data on international trade, business inventories and spending on services; thus, the agency must use projections based on statistical modeling to pencil in more than half of the data. Even nearly three months after the quarter ends, it still has to use trends or indirect indicators to estimate components that make up about one-third of GDP.