Friday, March 18, 2016

The June 1978 conference at the Federal Reserve Bank of Boston has attained legendary status in the economics profession. The revolution that transformed macroeconomics into an area of research built solidly on microeconomic foundations – with optimising agents making intertemporally consistent decisions – is said to have taken place there. Certainly, the language in the paper presented by Robert Lucas and Thomas Sargent, and in the response by Robert Solow, brings to mind a coup.

While Lucas and Sargent represented a new broom, the revolution turned out to be a slow burner. Finn Kydland and Edward Prescott published their seminal work on aggregate fluctuations in 1982. This built on earlier work by Lucas in its emphasis on microfoundations, but introduced for the first time a method – calibration – whereby empirical flesh could be put on the theoretical bones. Calibration met with some initial scepticism within the profession; in its crudest forms, it has a back-of-the-envelope quality about it. But the methods became increasingly refined over time, and the early real business cycle models developed into modern dynamic stochastic general equilibrium (DSGE) models.

These models initially had a market-clearing orientation – economic theories often do, because it’s easier to model things that way. Inevitably, however, as the research agenda matured, mechanisms were found to incorporate more realistic scenarios. One of these, due to Jordi Gali, is an imperfectly competitive setting in which endogenous fluctuations in market power lead to sustained variation in prices and output over time. Other models have built upon developments in the theory of pricing by Guillermo Calvo and Julio Rotemberg. An example is provided by Michael Woodford (in chapter 3 of his classic book). The variety of DSGE models that incorporate rigidities and other market imperfections provides a suite of New Keynesian (NK) approaches to modern macroeconomic modelling.
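
To give a flavour of the rigidity these pricing theories embed, here is a stylised textbook statement; the notation (theta for the probability that a firm leaves its price unchanged, epsilon for the elasticity of substitution across goods) is chosen for illustration rather than taken from any particular paper. In the Calvo scheme only a random fraction of firms reset their price each period, so the aggregate price level evolves as

\[
P_t^{\,1-\epsilon} \;=\; \theta\, P_{t-1}^{\,1-\epsilon} \;+\; (1-\theta)\,\bigl(P_t^{*}\bigr)^{1-\epsilon},
\]

where \(P_t^{*}\) is the price chosen by the firms that do get to adjust in period \(t\). It is this staggering of price changes that gives monetary policy real effects in the NK framework.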

These models are at the ‘rocket science’ end of economics. Consumers (who are also workers) and firms are modelled in as simple a way as possible by focusing on ‘representative agents’ – the optimising behaviour of one typical consumer and one typical firm is modelled, rather than taking into account the heterogeneity we see around us in the world. Nevertheless, the optimisation problems that the analyst needs to solve on behalf of these agents are complex: each agent must choose the actions that maximise its returns over a time horizon, rationally forming expectations of future prices and outcomes, where these are determined both by its own path of decisions and by those of the other actors in the model.
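
To sketch what that optimisation problem looks like (a generic textbook formulation, not any particular author’s model; the functional forms and constraint below are illustrative): the representative household chooses consumption \(C_t\), hours \(N_t\) and bond holdings \(B_t\) to solve

\[
\max \; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^{t}\, u(C_t, N_t)
\quad\text{subject to}\quad
P_t C_t + B_t \;\le\; W_t N_t + (1+i_{t-1})\,B_{t-1} + \Pi_t ,
\]

where \(\beta\) is the discount factor, \(W_t\) the nominal wage, \(i_t\) the nominal interest rate and \(\Pi_t\) profits remitted by firms. The expectations in \(\mathbb{E}_0\) must be formed rationally, that is, consistently with the model itself, which is part of what makes solving the problem hard.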

These theories have emerged alongside massive improvements in computational capacity. The development of open-source software - Dynare - has made DSGE models accessible to large numbers of macroeconomists, allowing standard models to be tweaked and empirically evaluated, and surely accounts in large measure for the popularity of the new techniques. DSGE models came to be widely used in central banks during the late 1990s, initially alongside traditional macroeconomic forecasting models, and arguably before the models were sufficiently mature to be used in a policy-making context. Indeed, it was only in the early part of the last decade that it became clear amongst academic economists that the methods introduced by the revolution had become the new core of macroeconomics.
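
To illustrate what such software ultimately delivers (this is a hand-rolled sketch in Python, not Dynare syntax or output, and the matrices below are made-up placeholder numbers rather than a calibration): a first-order, log-linearised DSGE solution reduces to a linear law of motion that can be simulated in a few lines.

import numpy as np

# Illustrative first-order (log-linearised) decision rules for a small model:
#   states   s_t = [capital, technology], controls x_t = [output, consumption]
#   s_t = P s_{t-1} + Q e_t,   x_t = R s_{t-1} + S e_t
# The coefficient matrices are placeholders, NOT a calibrated model.
P = np.array([[0.95, 0.10],
              [0.00, 0.90]])
Q = np.array([[0.05],
              [1.00]])
R = np.array([[0.30, 0.60],
              [0.50, 0.40]])
S = np.array([[0.70],
              [0.30]])

rng = np.random.default_rng(0)
T = 200                      # length of the simulated sample
s = np.zeros(2)              # start at the (log-deviation) steady state
paths = []
for _ in range(T):
    e = rng.normal(scale=0.01, size=1)   # technology shock
    x = R @ s + S @ e                    # controls respond to lagged state and shock
    s = P @ s + Q @ e                    # state evolves
    paths.append(x)

paths = np.array(paths)
print("std. dev. of simulated (log) output and consumption:", paths.std(axis=0))

Dynare’s contribution is to derive decision rules of this kind automatically from a model’s first-order conditions and calibration, and to layer estimation and diagnostic tools on top.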

While their use has become the norm in the context of macroeconomic analysis, DSGE models are not very useful in other settings. As a labour economist, I am interested in unemployment, discrimination and inequality. Representative agent models are not equipped for analysing any of these, since they assume all workers to be identical. So ‘unemployment’ in these models takes the form of (all) workers voluntarily choosing to work fewer hours in some periods rather than others as they substitute effort intertemporally in order to take advantage of changing incentives. As a definition of unemployment, that is, frankly and obviously, ridiculous. Indeed, the representative agent model has been widely criticised for this. Recent developments with heterogeneous agents might help, but DSGE still looks far from being the tool of choice to analyse labour markets.
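
For concreteness (again a stylised, perfect-foresight textbook condition rather than a claim about any specific paper): with period utility separable as \(u(C_t) - v(N_t)\), combining the labour supply condition \(v'(N_t) = u'(C_t)\,W_t/P_t\) with the consumption Euler equation \(u'(C_t) = \beta (1+r_t)\, u'(C_{t+1})\) gives

\[
\frac{v'(N_t)}{v'(N_{t+1})} \;=\; \beta\,(1+r_t)\,\frac{W_t/P_t}{W_{t+1}/P_{t+1}} ,
\]

so the representative worker supplies fewer hours in periods when the real wage is temporarily low relative to the future, or when the real interest rate is low. In that world, this voluntary reshuffling of hours is all the ‘unemployment’ there is.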

As these models have grown in sophistication, it has become clear that confronting them with the empirics is essential. Complicated models may have the advantage of taking more features of the world in which we live into account, but without empirical discipline they tend to support no firm conclusion – anything goes, and there are circumstances in which counterintuitive results are possible. At some point, indeed, such models lose sight of what it is to be a model – a model should be simple, it should be unrealistic, and it should exist to help us understand, not to obfuscate.

Given these changes in how we do economics, teaching beginning students of macroeconomics has become a considerable challenge. Modern macroeconomic models are not things that beginning students are equipped to face – they draw on difficult concepts of dynamic optimisation and expectations formation, exercised by numerous agents simultaneously. While they can be expressed in the form of a ‘dynamic IS curve’ and a ‘Phillips curve’, it is only with some hesitation and reservation that one would draw an analogy between these and the IS curves and Phillips curves of pre-revolution theory. Perhaps an understanding of the old approaches to macroeconomics is a prerequisite for understanding the new. But it seems perverse, and potentially confusing to students, to teach economics using methods that we then criticise later in the curriculum – building the new on what might be the shaky foundations of the old. On the other hand, beginning with the traditional models of ISLM and aggregate supply / aggregate demand would have the advantage of offering a ready reckoner – one that, for all the arguments of the revolutionaries, still has considerable merit.
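
For reference, the two curves mentioned above, in their standard log-linearised form (generic textbook notation, chosen here for illustration: \(x_t\) is the output gap, \(\pi_t\) inflation, \(i_t\) the nominal interest rate, \(r^{n}_t\) the natural real rate, and \(\sigma\), \(\kappa\), \(\beta\) are parameters):

\[
x_t \;=\; \mathbb{E}_t x_{t+1} \;-\; \tfrac{1}{\sigma}\bigl(i_t - \mathbb{E}_t \pi_{t+1} - r^{n}_t\bigr)
\qquad\text{(dynamic IS curve)}
\]
\[
\pi_t \;=\; \beta\,\mathbb{E}_t \pi_{t+1} \;+\; \kappa\, x_t
\qquad\text{(New Keynesian Phillips curve)}
\]

Unlike their older namesakes, both are forward-looking: current outcomes depend on expectations of future outcomes, which is precisely what makes them hard to present to beginning students.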

I have as yet no answers to this dilemma, and, from my examination of new initiatives in economic pedagogy (such as CORE), I’m not persuaded that anyone else does either. Here is a space to be filled.

Wednesday, March 09, 2016

Output in the production industries bounced back in January, growing by 0.2% over its level a year earlier. This follows a 0.1% fall a month earlier. The forecast shown below suggests that the series is likely to return to negative territory soon, however.

Further evidence that the production sector has entered, and is likely to remain in, a period of slowdown comes from examination of the baseline data. In 2015, the value of the production index in most months over the spring and early summer was somewhat higher than it was in January of that year. Year-on-year comparisons in the first half of 2016 will therefore be made against that higher base, and it will be difficult for industrial production over that period to match what was observed in 2015.

While the performance of the production industries - and in particular manufacturing - has been muted over recent months, the service sector - which of course accounts for around 80% of the UK's Gross Domestic Product - has been growing at a reasonably healthy pace. We are nevertheless likely to be nearer the next recession than the last one, and sustaining the current rate of growth into 2017 and beyond is a challenge. The Chancellor of the Exchequer should give this serious consideration in framing the fiscal stance for next week's Budget.