Wednesday, May 11, 2016

Industrial production rose slightly from February to March, but the higher baseline figure from a year ago means that output in this sector has nevertheless fallen over the past year, by some 0.3%. The March data confirm that, for the second successive quarter, production has declined. This means, in effect, that the industrial sector is in recession. My forecasting model (see graph below) suggests that the slowdown is likely to continue for some time.

Manufacturing comprises most of the output in the production industries, and has been in the doldrums for some considerable time. Output in this sector is now some 2.2% below its post-recession peak at the beginning of 2012 - and is indeed no higher than it was in April 1989. The recovery remains dangerously unbalanced.

Monday, April 25, 2016

The rapid increase in life expectancy has led to ageing being regarded as one of the major policy challenges facing governments. Policies to encourage saving include Individual Savings Accounts (ISAs), and the new Lifetime ISA and Help to Save initiatives. At the same time, auto-enrolment in pension schemes is being rolled out so that, by 2019, pension contributions amounting to 8% of salary will be made on behalf of all employees (unless they opt out). This 8% is widely believed to be too low a figure.
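As a rough illustration of what the 8% minimum implies, consider the following back-of-the-envelope calculation. (This is a sketch only: contributions are in practice levied on a band of qualifying earnings rather than the whole salary, and the band thresholds used below are illustrative assumptions, not official figures.)

```python
# Back-of-the-envelope auto-enrolment contribution, for illustration only.
# The band thresholds below are assumed values, not official figures.
LOWER_THRESHOLD = 6_000    # assumed lower bound of qualifying earnings (GBP)
UPPER_THRESHOLD = 45_000   # assumed upper bound of qualifying earnings (GBP)
TOTAL_RATE = 0.08          # 8% minimum combined contribution from 2019

def annual_contribution(salary: float) -> float:
    """Minimum annual pension contribution on the qualifying-earnings band."""
    qualifying = max(0.0, min(salary, UPPER_THRESHOLD) - LOWER_THRESHOLD)
    return TOTAL_RATE * qualifying

print(annual_contribution(27_000))  # about 1,680 per year on these assumptions
```

On these assumed numbers, a worker on a middling salary accumulates well under £2,000 a year - which gives some sense of why the 8% figure is thought too low.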

As well as providing incentives to save more, policy has been aimed at lengthening working lives. The age at which workers qualify for the state pension is being raised gradually, and in 2011 the concept of a default retirement age was scrapped.

These two solutions - saving more and working longer - are often seen as the only options available for meeting the increased burden of an ageing population. But there may be a third.

If productivity were to rise, the additional output produced by those in work could be used to pay for some of the extra cost burden - in terms of both care and pensions - implied by ageing. In the long run, this could be done through private savings and insurance. In the short and medium term, however, it would involve some redistribution from the generation of working age people to that of retirees. That would need to be done through the tax system. (That is a transitory and technical requirement, not an expression of a preference for solutions from the political left.)
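The arithmetic behind this can be set out in one line. Writing Y for output, L for employment and N for population (a minimal accounting identity, not a behavioural model):

\[ \frac{Y}{N} \;=\; \frac{Y}{L} \times \frac{L}{N} \]

Ageing lowers the employment rate L/N; productivity growth raises output per worker Y/L. Output per head Y/N - the pool of resources available to support workers and retirees alike - can be maintained so long as productivity grows at least as fast as the employment rate falls.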

Suggesting increased productivity as a solution may appear curious at a time when the economy is still suffering from a stagnation in productivity - the so-called productivity puzzle. But there is reason to believe that productivity growth might come back.

If ageing is one challenge that society has to learn to face, another is the rapid advance of technology. Some researchers predict that almost half of all jobs are at risk of becoming defunct owing to the steady march of the robots. If true, adapting to that is certainly a challenge - though past experience suggests that jobs morph as technology advances. What is certain is that rapid technological advance should bring about massive changes in productivity - why else would the technology be adopted?

We certainly need to plan for longer lives through saving more and working longer. But, faced by the two challenges of ageing and rapid technological change, we might just find that one can become a solution to the other. And - to reiterate - fully to take advantage of this opportunity when it comes, we must be lithe in our thinking around what is acceptable in the arena of taxation and redistribution.

Monday, April 18, 2016

The Treasury has published an analysis of the economic costs associated with Brexit. A range of estimates is given, with different values attached to different assumptions about the nature of trading arrangements made with other countries after the United Kingdom leaves the European Union. The central estimate that has attracted most media interest is a prediction that the costs will amount to 6.2% of Gross Domestic Product. This is a substantial figure - huge in relation to other analyses such as those undertaken by Open Europe which estimated a loss of 0.8% of GDP on the most plausible scenario. This warrants further interrogation.

The Treasury investigates three options for the post-Brexit world. One is for the UK to negotiate a Norway-style arrangement whereby free trade with the EU is maintained but, while no longer being a member state, the UK continues to make payments to the EU. The second is for the UK to negotiate a Canada-style Comprehensive Economic and Trade Agreement with the EU. This does not cover services - which represent some 80% of UK output - and would not allow the benefits of free movement of labour. The third option considered is a 'clean break' where no special arrangement is made between the UK and EU, but trade continues under World Trade Organisation rules. The estimated losses, as a percentage of GDP, for these three options are 3.8%, 6.2% and 7.5% respectively. It is the 3.8% estimate that most closely corresponds to Open Europe's 0.8% figure - and clearly the gap between the two is large.

The key question that emerges from consideration of these figures, then, is this: why are the figures, particularly those for the first option in which free trade with the EU in both goods and services is maintained, so much larger than other estimates in the literature? The Open Europe analysis is based on a Computable General Equilibrium (CGE) model. This type of modelling approach examines comparative statics - in essence it compares an equilibrium position 'before' and 'after' a change - and is calibrated using estimates of key parameters that the received literature suggests are plausible. This is good for providing a long run comparison, but it is likely to understate costs that are due to adjustment between the 'before' and 'after' positions. By way of contrast, the Treasury approach employs a statistical analysis based on a gravity model. In this framework, pairs of countries that are both members of a trading partnership such as the EU can be identified using dummy variables, and the importance of such joint membership vis-a-vis, say, European Economic Area (EEA) membership can be assessed. Since there are few observations for countries that are in the EEA but not the EU, there is likely to be some imprecision in the estimates of the costs associated with the 'Norway option'.
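To make the gravity approach concrete, a stylised version of the regression might look as follows. (This is a sketch only: the data are synthetic, the variable names hypothetical, and the Treasury's actual specification is considerably richer.)

```python
# Stylised gravity model of bilateral trade with membership dummies.
# Synthetic data; illustrates the identification strategy, not the
# Treasury's actual specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # hypothetical country pairs

both_eu = rng.integers(0, 2, n)                   # 1 if both countries are EU members
both_eea = rng.integers(0, 2, n) * (1 - both_eu)  # 1 if both in the EEA but not the EU

df = pd.DataFrame({
    "log_gdp_i": rng.normal(6, 1, n),    # log GDP of exporter
    "log_gdp_j": rng.normal(6, 1, n),    # log GDP of importer
    "log_dist":  rng.normal(8, 0.5, n),  # log distance between the pair
    "both_eu":   both_eu,
    "both_eea":  both_eea,
})
df["log_trade"] = (df.log_gdp_i + df.log_gdp_j - df.log_dist
                   + 0.5 * df.both_eu + 0.3 * df.both_eea
                   + rng.normal(0, 1, n))

# The dummy coefficients estimate the trade boost from joint membership;
# the gap between both_eu and both_eea is precisely the margin at stake in
# the 'Norway option' - and both_eea is estimated from few observations.
model = smf.ols("log_trade ~ log_gdp_i + log_gdp_j + log_dist + both_eu + both_eea",
                data=df).fit()
print(model.params[["both_eu", "both_eea"]])
```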

The uncertainties surrounding any predictions of this kind are substantial. But the main message to come out of the various exercises should not be minimised. The long run costs of Brexit, even under the kindest assumptions about what regime will follow, are large albeit not prohibitive. The costs of getting from here to there are larger. Those advocating that the UK should leave the EU need to be aware that the years following Brexit would be a bumpy ride, that even in the long term there are non-negligible costs, and that they are therefore asking people to place a very high value on whatever gains they are seeking - just how high a value, they need to spell out. At this stage, it is not clear that the Leave camp is offering much beyond bluster.

Friday, April 08, 2016

The latest data on industrial production show a drop of 0.5% over the year to February. This is further evidence of an economic slowdown. While overall growth is still positive, this is entirely due to expansion in the services sector - and we know that confidence in that sector too has fallen sharply in recent months.

I have, for several years, used the industrial production data in a neural network forecasting model. The latest forecasts of this model, updated to include the data released today, appear below. They continue to indicate that the next year will be challenging for the production sector, with continued falls in output.
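For readers curious about the mechanics, a model of this general type can be sketched in a few lines. (A minimal illustration only, and not my actual specification: the lag length, network architecture and data handling here are all assumptions, and the series below is randomly generated as a stand-in.)

```python
# Minimal autoregressive neural-network forecaster for a monthly series.
# A generic illustration of the approach, not the model used in this post.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(series, n_lags=12):
    """Build (X, y) pairs where each row of X holds the previous n_lags values."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    return X, series[n_lags:]

growth = np.random.default_rng(1).normal(0, 1, 200)  # stand-in for real data
X, y = make_lagged(growth)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X, y)

# Iterate one-step-ahead predictions to produce a 12-month-ahead forecast
window = list(growth[-12:])
for _ in range(12):
    window.append(net.predict(np.asarray(window[-12:]).reshape(1, -1))[0])
print(window[-12:])
```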

Thursday, April 07, 2016

Two news stories have made reference to a trend for people increasingly to shop 'little and often'. One is the rising sales figures of the Co-op supermarkets, and the other is the decision by Sainsbury's to scrap its brand match scheme (which requires customers to purchase at least 10 items in order to claim the refund).

The 'little and often' trend is evidenced by the rapid growth of convenience stores vis-a-vis larger supermarkets in recent years. This may be partly demand-led, following changes in consumer habits during and after the recession, but issues on the supply side are likely to have played a part too. Smaller convenience stores with less than 280 m² of floorspace face less stringent Sunday trading restrictions, for example, and the proliferation of such stores has much to do with legal considerations of this kind.

Whether the demand for 'little and often' reflects a permanent shift in customer behaviour remains to be seen. If new technology lowers further the fixed costs of shopping (with automated tills reducing queueing time, for example), 'little and often' becomes more appealing. Likewise, the squeeze on space imposed by high property prices (especially in London) makes 'little and often' an appealing strategy, particularly in connection with bulky low-value items (toilet rolls, for example).
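The logic here is essentially that of the classic inventory problem. Suppose a household needs quantity D of goods over a period, pays a fixed cost F per shopping trip, and incurs a storage cost h per unit held; then, as a rough sketch, the cost of shopping in batches of size q is

\[ C(q) \;=\; F\,\frac{D}{q} \;+\; h\,\frac{q}{2}, \qquad \text{minimised at} \qquad q^{*} = \sqrt{\frac{2FD}{h}}, \]

so the number of trips, \( D/q^{*} = \sqrt{hD/(2F)} \), rises as the fixed cost per trip F falls (automated tills) or the storage cost h rises (cramped London flats). Both forces push towards 'little and often'.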

Is there, then, a future for the larger stores? Certainly if trading restrictions were eventually to be lifted, they would be able to compete on a more level playing field. And, particularly in parts of the country where space is less of a consideration for consumers, they should remain well placed. The 'little and often' trend is not necessarily a permanent feature - and, like so much of what we see around us, it has its underpinnings in the economic forces of supply and demand.

Friday, March 18, 2016

The June 1978 conference at the Federal Reserve Bank of Boston has attained legendary status in the economics profession. The revolution that has transformed macroeconomics into an area of research that is solidly built on microeconomic foundations – with optimising agents making decisions that are consistent in the context of an intertemporal setting – is said to have taken place then. Certainly, the language in the paper presented by Robert Lucas and Thomas Sargent, and in the response by Robert Solow, brings to mind a coup.

While Lucas and Sargent represented a new broom, the revolution turned out to be a slow burner. Finn Kydland and Edward Prescott published their seminal work on aggregate fluctuations in 1982. This built on earlier work by Lucas in its emphasis on microfoundations, but introduced for the first time a method – calibration – whereby empirical flesh could be built upon the theoretical bones. Calibration met with some initial scepticism within the profession. In its crudest forms, it has a back-of-the-envelope quality about it. But, as a slow burner, the methods became increasingly refined over time. The early real business cycle models developed into modern dynamic stochastic general equilibrium (DSGE) models.
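To give a flavour of what calibration involves in its crudest form, consider pinning down the parameters of a basic growth model from long-run averages. (A back-of-the-envelope sketch: the target values below are stylised assumptions, not carefully measured moments.)

```python
# Crude calibration of a basic neoclassical growth model, for illustration.
# Target values are stylised long-run averages, not measured moments.
labour_share = 0.64    # long-run labour share of income
real_rate = 0.04       # long-run annual real interest rate
depreciation = 0.10    # annual depreciation rate

alpha = 1 - labour_share       # capital share under Cobb-Douglas production
beta = 1 / (1 + real_rate)     # discount factor implied by the Euler equation
# Steady state: r + delta = alpha * Y/K, so K/Y = alpha / (r + delta)
capital_output_ratio = alpha / (real_rate + depreciation)

print(f"alpha = {alpha:.2f}, beta = {beta:.3f}, K/Y = {capital_output_ratio:.2f}")
```

The model's deep parameters are thus read off from a handful of observable ratios – which is exactly what lends the method, in this form, its back-of-the-envelope quality.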

These models initially had a market clearing orientation – economic theories often do because it’s easier to model things that way. Inevitably, however, as the research agenda matured, mechanisms were found to incorporate more realistic scenarios. One of these, due to Jordi Gali, is an imperfectly competitive setting in which endogenous fluctuations in market power lead to sustained variation in prices and output over time. Other models have built upon developments in the theory of pricing by Guillermo Calvo and Julio Rotemberg. An example is provided by Michael Woodford (in chapter 3 of his classic book). The variety of DSGE models that incorporate rigidities and other market imperfections provides a suite of New Keynesian (NK) approaches to modern macroeconomic modelling.

These models are at the ‘rocket science’ end of economics. Consumers (who are also workers) and firms are modelled in as simple a way as possible by focusing on ‘representative agents’ – the optimising behaviour of one typical consumer and one typical firm is modelled rather than taking into account the heterogeneity that we see around us in the world. Nevertheless the optimising problems that the analyst needs to solve on behalf of these agents are complex – they must choose the actions that maximise their dynamic returns over a time horizon, rationally forming their expectations of future prices and outcomes, where these are determined both by their own time paths of decisions and those of the other actors in the model.
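In its canonical form, the representative household's problem reads (a stock textbook statement, stripped of model-specific detail):

\[ \max_{\{c_t,\, n_t\}} \; E_0 \sum_{t=0}^{\infty} \beta^{t}\, u(c_t, n_t) \]

subject to a sequence of budget constraints, with the first-order conditions delivering the familiar Euler equation

\[ u_c(c_t, n_t) \;=\; \beta\, E_t\!\left[ (1 + r_{t+1})\, u_c(c_{t+1}, n_{t+1}) \right], \]

which ties today's consumption choice to rationally formed expectations of tomorrow's prices and returns.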

These theories have emerged alongside massive improvements in computational capacity. The development of open-source software – Dynare – has helped make DSGE models accessible to large numbers of macroeconomists, allowing standard models to be tweaked and empirically evaluated, and surely accounts in large measure for the popularity of the new techniques. DSGE models came to be widely used in central banks during the late 1990s, initially alongside traditional macroeconomic forecasting models, and arguably before they were sufficiently mature to be used in a policy-making context. Indeed, it was only in the early part of the last decade that it became clear amongst academic economists that the methods introduced by the revolution had become the new core of macroeconomics.

While their use has become the norm in the context of macroeconomic analysis, DSGE models are not very useful in other settings. As a labour economist, I am interested in unemployment, discrimination and inequality. Representative agent models are not equipped for analysing any of these, since they assume all workers to be identical. So ‘unemployment’ in these models takes the form of (all) workers voluntarily choosing to work fewer hours in some periods rather than others as they substitute effort intertemporally in order to take advantage of changing incentives. As a definition of unemployment, that is, frankly and obviously, ridiculous. Indeed, the representative agent model has been widely criticised for this. Recent developments with heterogeneous agents might help, but DSGE still looks far from being the tool of choice to analyse labour markets.

As these models have grown in sophistication, it has become clear that confronting them with the empirics is essential. Complicated models may have the advantage of taking more of the features of the world in which we live into account, but they tend to lead to no conclusion without empirics – anything goes, and there are circumstances where counterintuitive results are possible. At some point, indeed, these models lose sight of what it is to be a model – a model should be simple, it should be unrealistic, it should exist to help us understand, not to obfuscate.

Given these changes in how we do economics, teaching beginning students of macroeconomics has become a considerable challenge. Modern macroeconomic models are not things that beginning students are equipped to face – they draw on difficult concepts of dynamic optimisation and expectations formation, where these are exercised by numerous agents simultaneously. While they can be expressed in the form of a ‘dynamic IS curve’ and a ‘Phillips curve’ (sketched below), it is only with some hesitation and reservation that one would draw an analogy between these and the IS curves and Phillips curves of the theory pre-revolution. Perhaps an understanding of the old approaches to macroeconomics is a prerequisite for understanding the new. But it seems perverse, and potentially confusing to students, to teach economics using methods that, later in the curriculum, we then criticise – yet to build the new on what might be the shaky foundations of the old. On the other hand, beginning with the traditional models of ISLM and aggregate supply / aggregate demand would have the advantage of offering a ready reckoner – one that, for all the arguments of the revolutionaries, still has considerable merit.
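For concreteness, the two curves take the following shape in their simplest log-linearised form, with x_t the output gap, π_t inflation, i_t the nominal interest rate and r^n_t the natural rate of interest:

\[ x_t = E_t x_{t+1} - \frac{1}{\sigma}\left( i_t - E_t \pi_{t+1} - r_t^{n} \right) \qquad \text{(dynamic IS curve)} \]

\[ \pi_t = \beta E_t \pi_{t+1} + \kappa x_t \qquad \text{(NK Phillips curve)} \]

The superficial resemblance to the old IS and Phillips curves is what invites the analogy; the expectations terms on the right-hand side, derived from optimising behaviour, are what make it hazardous.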

I have as yet no answers to this dilemma, and, from my examination of new initiatives in economic pedagogy (such as CORE) I’m not persuaded that anyone else does either. Here is a space to be filled.

Wednesday, March 09, 2016

Output in the production industries bounced back in January, growing by 0.2% over its level a year earlier. This follows a 0.1% fall a month earlier. The forecast shown below suggests that the series is likely to return to negative territory soon, however.

Further evidence that the production sector has entered, and is likely to remain in, a period of slowdown comes from examination of the baseline data. In 2015, the value of the production index for most months over the spring and early summer was somewhat higher than in January of that year, and this means that it will be difficult for industrial production in the first half of 2016 to match that observed in 2015.
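The base effect at work here is simple arithmetic: year-on-year growth in month t is

\[ g_t \;=\; \frac{I_t}{I_{t-12}} - 1, \]

so if the index stood at 100 in January 2015 but averaged, say, 102 over the spring (an illustrative figure), the spring 2016 readings must exceed 102 merely to register positive annual growth – even if output holds steady relative to January 2016.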

While the performance of the production industries - and in particular manufacturing - has been muted over recent months, the service sector - which of course accounts for around 80% of the UK's Gross Domestic Product - has been growing at a reasonably healthy pace. We are nevertheless likely to be nearer the next recession than the last one, and sustaining the current rate of growth into 2017 and beyond is a challenge. The Chancellor of the Exchequer should give this serious consideration in framing the fiscal stance for next week's Budget.