Why Nominal GDP Targeting?

Nominal GDP (in logs) and approximate 6% trend since 1982

Probably the best example that blogging can have a significant influence on economic thinking has been Scott Sumner’s advocacy of nominal GDP (NGDP) targeting at his blog The Money Illusion (and more recently EconLog). Scott has almost singlehandedly transformed NGDP targeting from an obscure idea to one that is widely known and increasingly supported. Before getting to Scott’s proposals, let me give a very brief history of monetary policy in the United States over the last 30 years.

From 1979 to 1982, the influence of Milton Friedman led the Federal Reserve to attempt a policy of constant monetary growth. In other words, rather than attempt to react to economic fluctuations, the central bank would simply increase the money supply by the same rate each year. Friedman’s so-called “k-percent” rule failed to promote economic stability, but the principle behind it continued to influence policy. In particular, it ushered in an era of rules rather than discretion. Under Alan Greenspan, the Fed adjusted policy to changes in inflation and output in a formulaic fashion. Studying these policies, John Taylor developed a simple mathematical equation (now called the “Taylor Rule”) that accurately predicted Fed behavior based on the Fed’s responses to macro variables. The stability of the economy from 1982 to 2007 led to the period being called “the Great Moderation” and to Greenspan being dubbed “the Maestro.” Monetary policy appeared to have been solved. Of course, everybody knows what happened next.

On Milton Friedman’s 90th birthday, former Federal Reserve chairman Ben Bernanke gave a speech where, on the topic of the Great Depression, he said to Friedman, “You’re right, we did it. We’re very sorry. But thanks to you, we won’t do it again.” According to Scott Sumner, however, that’s exactly what happened. Many explanations have been given for the Great Recession, but Scott believes most of the blame can be placed on the Fed for failing to keep nominal GDP on trend.

To defend his NGDP targeting proposal, Scott outlines a simple “musical chairs” model of unemployment. He describes that model here and here. Essentially, the idea is that the total number of jobs in the economy can be approximated by dividing NGDP by the average wage (this would be exactly true if we replaced NGDP with labor income, but the two are highly correlated). If NGDP falls and wages are slow to adjust, the total number of jobs must decrease, leaving some workers unemployed (like taking away chairs in musical chairs). Roger Farmer has demonstrated a close connection between NGDP/W and unemployment in the data, which appears to support this intuition.
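As a minimal sketch, the musical chairs logic can be written out directly. All of the numbers below are hypothetical; the only structure taken from the model is the approximation jobs ≈ NGDP / average wage, with the wage held fixed when NGDP falls.

```python
# Hypothetical illustration of the "musical chairs" approximation:
# jobs ≈ NGDP / average wage. All numbers below are made up.

def implied_jobs(ngdp, avg_wage):
    """Approximate the total number of jobs as nominal spending over the average wage."""
    return ngdp / avg_wage

ngdp = 18_000e9      # nominal GDP in dollars (hypothetical)
avg_wage = 50_000.0  # average annual wage in dollars (hypothetical, sticky)

jobs_before = implied_jobs(ngdp, avg_wage)
# NGDP falls 5% but the sticky wage does not move: 5% of the "chairs" disappear.
jobs_after = implied_jobs(ngdp * 0.95, avg_wage)

job_loss_share = 1 - jobs_after / jobs_before  # = 0.05
```

The same 5% drop in NGDP would cause no job losses at all if the average wage fell by 5% instead, which is why wage stickiness carries all the weight in the story.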

[Figure: NGDP divided by the average wage plotted against unemployment]
Source: Farmer (2012) – The stock market crash of 2008 caused the Great Recession: Theory and evidence

We can also think about the differences in a central bank’s response to shocks under inflation and NGDP targeting. Imagine a negative supply shock (for example an oil shock) hits the economy. In a simple AS-AD model, this shock would be expected to increase prices and reduce output. Under an inflation targeting regime, the central bank would see only the higher prices and tighten monetary policy, deepening the drop in output. However, since output and inflation move in opposite directions, the change in NGDP (which is just the price level times output) would be much smaller, meaning the central bank would respond less to a supply shock. Conversely, a negative demand shock would cause both prices and output to drop, leading to a loosening of policy under both inflation and NGDP targeting. In other words, the central bank would respond strongly to demand shocks while allowing the market to deal with the effects of supply shocks. Since a supply shock actually reduces the productive capacity of an economy, we should not expect a central bank to be able to effectively combat supply shocks. An NGDP target ensures that it will not try to.
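To make the contrast concrete, here is a stylized numerical sketch (the price and output changes are invented, not calibrated): after a supply shock, measured inflation jumps while NGDP growth barely moves.

```python
# Stylized negative supply shock (all numbers hypothetical): prices rise
# while real output falls, so NGDP = P * Y is nearly unchanged.

P0, Y0 = 1.00, 100.0  # initial price level and real output
P1, Y1 = 1.04, 96.0   # after the shock: prices +4%, output -4%

# An inflation targeter sees a 4% price jump and tightens policy.
inflation = P1 / P0 - 1

# An NGDP targeter sees nominal spending fall by only about 0.2%
# and therefore responds very little.
ngdp_growth = (P1 * Y1) / (P0 * Y0) - 1
```

A demand shock, by contrast, moves P and Y in the same direction, so it shows up strongly in NGDP and both regimes would loosen policy.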

Nick Rowe offers another simple defense of NGDP targeting. He shows that Canadian inflation data gives no indication of any changes occurring in 2008. Following a 2% inflation target, it looks like the central bank did everything it should to keep the economy on track.

[Figure: Canadian CPI]

Looking at a graph of NGDP makes the recession much more obvious. Had the central bank instead been following a policy of targeting NGDP, they would have needed to be much more active to stay on target. Nick therefore calls inflation “the dog that didn’t bark.” Since inflation failed to warn us a recession was imminent, perhaps NGDP is a better indicator of monetary tightness.

[Figure: Canadian NGDP]

Of course, we also need to think about whether a central bank would even be able to hit its NGDP target if it decided to adopt one. Scott has a very interesting idea on the best way to stay on target. I will soon have another post detailing that proposal.

Quote of the Day 8-23-16

All ownership derives from occupation and violence. When we consider the natural components of goods, apart from the labour components they contain, and when we follow the legal title back, we must necessarily arrive at a point where this title originated in the appropriation of goods accessible to all. Before that we may encounter a forcible expropriation from a predecessor whose ownership we can in its turn trace to earlier appropriation or robbery. That all rights derive from violence, all ownership from appropriation or robbery, we may freely admit to those who oppose ownership on considerations of natural law. But this offers not the slightest proof that the abolition of ownership is necessary, advisable, or morally justified.
Ludwig von Mises, Socialism, 1922

What’s Wrong With Modern Macro? Part 2: The Death of Keynesian Economics – The Lucas Critique, Microfoundations, and Rational Expectations

Part 2 in a series of posts on modern macroeconomics. Part 1 covered Keynesian economics, which dominated macroeconomic thinking for around thirty years following World War II. This post will deal with the reasons for the demise of the Keynesian consensus and introduce some of the key components of modern macro.


The Death of Keynesian Economics

Milton Friedman – Wikimedia Commons

Although Keynes had contemporary critics (most notably Hayek, if you haven’t seen the Keynes vs Hayek rap videos, stop reading now and go watch them here and here), these criticisms generally remained outside of the mainstream. However, a more powerful challenger to Keynesian economics arose in the 1950s and 60s: Milton Friedman. Keynesian theory offered policymakers a set of tools that they could use to reduce unemployment. A key empirical and theoretical result was the existence of a “Phillips Curve,” which posited a tradeoff between unemployment and inflation. Keeping unemployment under control simply meant dealing with slightly higher levels of inflation.

Friedman (along with Edmund Phelps) argued that this tradeoff was an illusion. Instead, he developed the Natural Rate Hypothesis. In his own words, the natural rate of unemployment is

the level that would be ground out by the Walrasian system of general equilibrium equations, provided there is embedded in them the actual structural characteristics of the labour and commodity markets, including market imperfections, stochastic variability in demands and supplies, the costs of gathering information about job vacancies, and labor availabilities, the costs of mobility, and so on.
Milton Friedman, “The Role of Monetary Policy,” 1968

If you don’t speak economics, the above essentially means that the rate of unemployment is determined by fundamental economic factors. Governments and central banks cannot do anything to influence this natural rate. Trying to exploit the Phillips curve relationship by increasing inflation would fail because eventually workers would simply factor the inflation into their decisions and restore the previous rate of employment at a higher level of prices. In the long run, printing money could do nothing to improve unemployment.

Friedman’s theories couldn’t have come at a better time. In the 1970s, the United States experienced stagflation, high levels of both inflation and unemployment, an impossibility in the Keynesian models, but easily explained by Friedman’s theory. The first blow to Keynesian economics had been dealt. It would not survive the fight that was still to come.

The Lucas Critique

While Friedman may have provided the first blow, Robert Lucas landed the knockout punch on Keynesian economics. In a 1976 article, Lucas noted that standard economic models of the time assumed people were excessively naive. The rules that were supposed to describe their behavior were invariant to policy changes. Even when they knew about a policy in advance, they were unable to use the information to improve their forecasts, ensuring that those forecasts would be wrong. In Lucas’s words, they made “correctibly incorrect” forecasts.

The Lucas Critique was devastating for the Keynesian modeling paradigm of the time. By estimating the relationships between aggregate variables based on past data, these models could not hope to capture the changes in individual actions that occur when the economy changes. In reality, people form their own theories about the economy (however simple they may be) and use those theories to form expectations about the future. Keynesian models could not allow for this possibility.

Microfoundations

At the heart of the problem with Keynesian models prior to the Lucas critique was their failure to incorporate individual decisions. In microeconomics, economic models almost always begin with a utility maximization problem for an individual or a profit maximization problem for a firm. Keynesian economics attempted to skip these features and jump straight to explaining output, inflation, and other aggregate features of the economy. The Lucas critique demonstrated that this strategy produced some dubious results.

The obvious answer to the Lucas critique was to try to explicitly build up a macroeconomic model from microeconomic fundamentals. Individual consumers and firms returned once again to macroeconomics. Part 3 will explore some specific microfounded models in a bit more detail. But first, I need to explain one other idea that is essential to Lucas’s analysis and to almost all of modern macroeconomics: rational expectations.

Rational Expectations

Think about a very simple economic story where a firm has to decide how much of a good to produce before they learn the price (such a model is called a “cobweb model”). Clearly, in order to make this decision a firm needs to form expectations about what the price will be (if they expect a higher price they will want to produce more). In most models before the 1960s, firms were assumed to form these expectations by looking at the past (called adaptive expectations). The problem with this assumption is that it creates predictable forecast errors.

For example, assume that all firms simply believe that today’s price will be the same as yesterday’s. Then when the price yesterday was high, firms produce a lot, pushing down the price today. Tomorrow, firms expect a low price and so don’t produce very much, which pushes the price back up. A smart businessman would quickly realize that these expectations are always wrong and try to find a new way to predict future prices.
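A few lines of code make the point. This is a minimal cobweb simulation with naive expectations; the demand and supply slopes are hypothetical choices, not estimates. The forecast error flips sign every single period, so anyone watching the market could profitably correct it.

```python
# Minimal cobweb model with naive expectations (expected price = last price).
# Demand: q = a - b*p.  Supply: q = c * expected price.  Slopes are hypothetical.

def cobweb(p0, periods=8, a=10.0, b=1.0, c=0.8):
    prices = [p0]
    for _ in range(periods):
        expected = prices[-1]              # firms naively expect yesterday's price
        quantity = c * expected            # output is committed before the price is known
        prices.append((a - quantity) / b)  # price that clears demand
    return prices

path = cobweb(p0=8.0)
errors = [path[t + 1] - path[t] for t in range(len(path) - 1)]
# The price overshoots and undershoots in alternating periods, so the
# forecast errors alternate in sign -- a fully predictable pattern that
# naive expectations never learn to exploit.
```

With these slopes the oscillation dampens toward the equilibrium price a/(b+c), but the systematic, sign-alternating errors along the way are exactly what rational expectations rules out.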

A similar analysis led John Muth to introduce the idea of rational expectations in a 1961 paper. Essentially, Muth argued that if economic theory predicted a difference between expectation and result, the people in the model should be able to do the same. As he describes, “if the prediction of the theory were substantially better than the expectations of the firms, then there would be opportunities for ‘the insider’ to profit from the knowledge.” Since only the best firms would survive, in the end expectations will be “essentially the same as the predictions of the relevant economic theory.”

Lucas took Muth’s idea and applied it to microfounded macroeconomic models. Agents in these new models were aware of the structure of the economy and therefore could not be fooled by policy changes. When a new policy is announced, these agents could anticipate the economic changes that would occur and adjust their own behavior. Microfoundations and rational expectations formed the foundation of a new kind of macroeconomics. Part 3 will discuss the development of Dynamic Stochastic General Equilibrium (DSGE) models, which have dominated macroeconomics for the last 30 years (and then in part 4 I promise I will actually get to what’s wrong with modern macro).

Links 8-21-16

“In the land of the free, where home ownership is a national dream, borrowing to buy a house is a government business for which taxpayers are on the hook.”

“If you think that there has never been a better time to be alive — that humanity has never been safer, healthier, more prosperous or less unequal — then you’re in the minority. But that is what the evidence incontrovertibly shows.”

Noah Smith criticizes heterodox macro

And then addresses some of the responses

The Solution to High Rents: Build More Houses. Who would have thought?

Which Party is For Small Government Again?

Democrats want to expand the role of government. Republicans want to shrink it. At least, that’s what their rhetoric says. The story becomes a bit harder to believe when looking at government spending statistics. Consider two presidents as an example. President A increased spending by $357 billion over the first seven years of his presidency. Over the same period, President B increased spending by around half as much, $160 billion. Who is President A? The legendary champion of small government, Ronald Reagan. President B? None other than the evil Kenyan dictator, Barack Obama himself.

Let’s look at a graph of total spending per capita over time (data from the BEA). I don’t see any clear party breaks. The one big slowdown in spending in the 1990s coincides with Clinton (a Democrat).

[Figure: Total government spending per capita over time]

Breaking the data down by president makes the point even clearer. The table below shows how much spending per capita increased over each president’s tenure.

[Table: Increase in government spending per capita by president]

Note that Obama’s numbers are skewed by stimulus spending. Measuring from 2009 Q2 drops the increase to $153.

Overall, Republicans have held office for 36 years since 1953 and increased government spending per capita by $5310 during that time ($148 per year on average). Democrats were in power for 27 years and increased spending per capita by $3167 ($117 per year on average). Not a single president in either party has actually reduced the size of government by this measure. A natural question is whether control of the senate or house is more important than the president. I didn’t calculate the numbers, but I doubt it would help the Republicans, who had control of the senate during the expansion in spending under both Reagan and Bush.
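The per-year averages are straightforward to check, using the dollar totals and years in office as stated above:

```python
# Per-year averages implied by the totals quoted in the text.
rep_total, rep_years = 5310, 36  # Republicans: per-capita spending increase, years in office
dem_total, dem_years = 3167, 27  # Democrats

rep_per_year = rep_total / rep_years  # ≈ 147.5, i.e. about $148 per year
dem_per_year = dem_total / dem_years  # ≈ 117.3, i.e. about $117 per year
```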

Bill McBride at Calculated Risk keeps a tally of public and private sector jobs added by president. By those numbers, Obama is the only president since Carter to decrease the total number of public sector jobs. Again, there is no clear relationship between party affiliation and number of jobs added.

On taxes, the picture looks strikingly different. Performing the same exercise shows that Republicans reduced taxes by $23 per capita per year, while Democrats increased taxes by $278 per capita per year. So maybe you could argue that the Republicans have kept half of their small government promise. But both parties clearly like to spend. At least the Democrats seem to care about paying for it.

What’s Wrong With Modern Macro? Part 1: Before Modern Macro – Keynesian Economics

Part 1 in a series of posts on modern macroeconomics. This post focuses on Keynesian economics in order to set the stage for my explanation of modern macro, which will begin in part 2. 


John Maynard Keynes – Wikimedia Commons

If you’ve never taken a macroeconomics class, you almost certainly have no idea what macroeconomists do. Even if you have an undergraduate degree in economics, your odds of understanding modern macro probably don’t improve much (they didn’t for me at least. I had no idea what I was getting into when I entered grad school). The gap between what is taught in undergraduate macroeconomics classes and the research that is actually done by professional macroeconomists is perhaps larger than in any other field. Therefore, for those of you who made the excellent choice not to subject yourself to the horrors of a first year graduate macroeconomics sequence, I will attempt to explain in plain English (as much as possible), what modern macro is and why I think it could be better.

But before getting to modern macro itself, it is important to understand what came before. Keep in mind throughout these posts that the pretense of knowledge is quite strong here. For a much better exposition that is still somewhat readable for anyone with a basic economic background, Michael De Vroey has a comprehensive book on the history of macroeconomics. I’m working through it now and it’s very good. I highly recommend it to anyone who is interested in what I say in this series of posts.

Keynesian Economics

Although Keynes was not the first to think about business cycles, unemployment, and other macroeconomic topics, it wouldn’t be too much of an exaggeration to say that macroeconomics as a field didn’t truly appear until Keynes published his General Theory in 1936. I admit I have not read the original book (but it’s on my list). My summary here will therefore be based on my undergraduate macro courses, which I think capture the spirit (but probably not the nuance) of Keynes.

Keynesian economics begins by breaking aggregate spending (GDP) into four pieces. Private spending consists of consumption (spending by households on goods and services) and investment (spending by firms on capital). Government spending on goods and services makes up the rest of domestic spending. Finally, net exports (exports minus imports) are added to account for foreign expenditures. In a Keynesian equilibrium, spending is equal to income. Consumption is assumed to be a fraction of total income, which means that any increase in spending (like an increase in government spending) will cause an increase in consumption as well.

An important implication of this setup is that increases in spending increase total income by more than the initial increase (called the multiplier effect). Assume that the government decides to build a new road that costs $1 million. This increase in expenditure immediately increases GDP by $1 million, but it also adds $1 million to the income of the people involved in building the road. Let’s say that all of these people spend 3/4 of their income and save the rest. Then consumption also increases by $750,000, which then becomes other people’s incomes, adding another $562,500, and the process continues. Some algebra shows that the initial increase of $1 million leads to an increase in GDP of $4 million. Similar results occur if the initial change comes from investment or changes in taxes.
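The road example is just a geometric series, which a few lines of Python can verify. The 3/4 spending rate and $1 million project are the hypothetical numbers from the text.

```python
# The multiplier as a geometric series: each round of spending is
# MPC times the previous round, so the total is initial / (1 - MPC).

mpc = 0.75           # households spend 3/4 of each extra dollar of income
initial = 1_000_000  # the government's road project

# Sum the rounds explicitly: $1M, $750k, $562.5k, ...
rounds = [initial * mpc**k for k in range(200)]
total = sum(rounds)

# The closed form gives the same answer: 1 / (1 - 0.75) = 4,
# so a $1M project raises GDP by about $4M.
multiplier = 1 / (1 - mpc)
```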

The multiplier effect also works in the other direction. If businesses start to feel pessimistic about the future, they might cut back on investment. Their beliefs then become self-fulfilling as the reduction in investment causes a reduction in consumption and aggregate spending. Although the productive resources in the economy have not changed, output falls and some of these resources become underutilized. A recession occurs not because of a change in economic fundamentals, but because people’s perceptions changed for some unknown reason – Keynes’s famous “animal spirits.” Through this mechanism, workers may not be able to find a job even if they would be willing to work at the prevailing wage rate, a phenomenon known as involuntary unemployment. In most theories prior to Keynes, involuntary unemployment was impossible because the wage rate would simply adjust to clear the market.

Keynes’s theory also opened the door for government intervention in the economy. If investment falls and causes unemployment, the government can replace the lost spending by increasing its own expenditure. By increasing spending during recessions and decreasing it during booms, the government can theoretically smooth the business cycle.

IS-LM

The above description is Keynes at its most basic. I haven’t said anything about monetary policy or interest rates yet, but both of these were essential to Keynes’s analysis. Unfortunately, although The General Theory was a monumental achievement for its time and probably the most rigorous analysis of the economy that had been written, it is not exactly the most readable or even coherent theory. To capture Keynes’s ideas in a more tractable framework, J.R. Hicks and other economists developed the IS-LM model.

I don’t want to give a full derivation of the IS-LM model here, but the basic idea is to model the relationship between interest rates and income. The IS (Investment-Savings) curve plots all of the points where the goods market is in equilibrium. Here we assume that investment depends negatively on interest rates (if interest rates are high, firms would rather put their money in a bank than invest in new projects). A higher interest rate then lowers investment and decreases total income through the same multiplier effect outlined above. Therefore we end up with a negative relationship between interest rates and income.

The LM (Liquidity Preference-Money) curve plots all of the points where the money market is in equilibrium. Here we assume that the money supply is fixed. Money demand depends negatively on interest rates (since a higher interest rate means you would rather keep money in the bank than in your wallet) and positively on income (more cash is needed to buy stuff). Together these imply that a higher level of income requires a higher interest rate to clear the money market. An equilibrium in the IS-LM model comes when both the money market and the goods market are in equilibrium (the point where the two lines cross).

The above probably doesn’t make much sense if you haven’t seen it before. All you really need to know is that an increase in government spending or investment shifts the IS curve right, which increases both income and interest rates. If the central bank increases the money supply, the LM curve shifts right, increasing income and decreasing interest rates. Policymakers then have two powerful options to combat economic downturns.
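For readers who prefer equations to curves, here is a minimal linear IS-LM sketch. Every coefficient is a hypothetical choice made for illustration, but solving the two equilibrium conditions reproduces the comparative statics just described: fiscal expansion raises both income and the interest rate, while monetary expansion raises income and lowers the interest rate.

```python
# A linear IS-LM sketch (all coefficients hypothetical).
# IS (goods market):  Y = c0 + c1*Y + i0 - i1*r + G
# LM (money market):  M = k*Y - h*r

def is_lm(G, M, c0=200.0, c1=0.75, i0=300.0, i1=20.0, k=0.5, h=40.0):
    """Solve the two linear equilibrium conditions for income Y and rate r."""
    # Substitute r = (k*Y - M)/h from LM into IS and solve for Y.
    Y = (c0 + i0 + G + i1 * M / h) / ((1 - c1) + i1 * k / h)
    r = (k * Y - M) / h
    return Y, r

Y0, r0 = is_lm(G=500.0, M=600.0)  # baseline
Y1, r1 = is_lm(G=600.0, M=600.0)  # fiscal expansion: Y and r both rise
Y2, r2 = is_lm(G=500.0, M=700.0)  # monetary expansion: Y rises, r falls
```

Shifting G traces out movements of the IS curve along a fixed LM curve, and shifting M does the reverse, which is all the graphical apparatus is doing.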

In the decades following Keynes and Hicks, the IS-LM model grew to include hundreds or thousands of equations that economists attempted to estimate econometrically, but the basic features remained in place. However, in the 1970s, the Keynesian model came under attack due to both empirical and theoretical failures. Part 2 will deal with these failures and the attempts to solve them.

Links 8-7-2016

Scott Sumner is Feeling the Johnson,

Vox says Republicans should be too,

But Bob Murphy’s Not Convinced.

Maybe Denmark isn’t so Great After All

“when it comes to government support of grants from the National Science Foundation (NSF) for economic research, our sense is that many economists avoid critical questions, skimp on analysis, and move straight to advocacy.”

About that Productivity Gap

In my post yesterday, I put up this graph, which has been used by many as evidence of major economic problems.

The graph shows a growing gap between the production of an average worker and the compensation they receive for that production. The Economic Policy Institute (which is the source of the graph above), claims that this gap represents a failure of wealth to “trickle down” to workers. The economy is growing, but workers don’t see any of the benefits. I agree that this story, if true, would be deeply concerning. But I’m not so sure it’s true.

First, I wanted to make sure I could recreate the gap on my own. I used FRED to plot real compensation per hour against real gross value added divided by total hours (to proxy for productivity). I’m not sure if these are exactly the series used by EPI above, but the pictures look pretty close.

[Figure: Real compensation per hour vs. real gross value added per hour]

Notice, however, that in order to get these data into real values, I deflated by two different price indexes – CPI for compensation and an implicit price deflator for productivity (if you aren’t sure what the difference is, here’s the BLS explanation). FRED also provides real values for each of these variables, and they also use different inflation measures for each. I assume other people who have made this graph have done the same. Although it might make sense to deflate these series using different measures of inflation in some situations, it does not allow for an easy direct comparison between them. Looking at the difference between the two measures over time reveals a familiar-looking gap.

[Figure: Gap between the CPI and the implicit price deflator over time]

So what happens if instead of trying to look at real values using different (and imperfect) measures of inflation, we just put everything in nominal terms? Bye-bye productivity gap:

[Figure: Nominal compensation per hour vs. nominal gross value added per hour]

Now, maybe there’s still something to be concerned about here. The CPI is supposed to measure the goods actually purchased by consumers. If the prices of these goods are growing faster than average, then the value of a worker’s compensation is lower. I’m not sure I have enough faith in either measure to even grant that point, but at the very least can we start calling it an inflation gap rather than a productivity gap?
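The mechanics of the argument can be sketched with made-up index numbers (these are not the FRED data): two series that are identical in nominal terms will show a growing "gap" whenever they are deflated by different price indexes.

```python
# Toy illustration (hypothetical numbers): the same nominal path, deflated
# by two different price indexes, produces an artificial divergence.

nominal = [100.0, 110.0, 121.0, 133.1]  # identical nominal path for both series
cpi = [1.00, 1.05, 1.10, 1.16]          # consumer price index (hypothetical)
deflator = [1.00, 1.03, 1.06, 1.09]     # output deflator (hypothetical, grows slower)

real_compensation = [n / p for n, p in zip(nominal, cpi)]
real_productivity = [n / p for n, p in zip(nominal, deflator)]

# The "gap" is entirely the ratio of the two price indexes: it grows over
# time even though the underlying nominal series are identical.
gap = [prod / comp for prod, comp in zip(real_productivity, real_compensation)]
```

Any divergence between the CPI and the deflator shows up in the gap series one-for-one, which is why it may be more honest to call the divergence an inflation gap.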