This is an expanded version of my Irish Examiner column for 27 July 2013.
Economics and statistics are intimately entwined. Many economists spend their time devising ever more intricate ways to estimate, forecast, decompose and otherwise analyse relations between various data series. Much of what passes for economic commentary is reportage and discussion of new releases of economic data or revisions to previously released statistics. Think of all the times we have heard and seen headlines on unemployment, stock market indices, PMI indices, leading and lagging indicators and so on. Many confuse economic statistics with actual economics.
And yet, this data diving is neither economics nor even good reportage for the most part. Economics is not about data. It's about how one thinks. Data are inputs into economic thinking. And data are more than figures from the CSO; data should include personal experience, anecdotes and other qualitative evidence. Merely because it is not quantitative does not make data any less useful, nor does it imply that it cannot be analysed and interrogated in a rigorous fashion. Economics is (for the most part) a human societal phenomenon, and thus the entire gamut of data we process as humans should be used in analysing it. Much economic statistical reporting falls into the trap of being precisely wrong rather than approximately right.
Even allowing for the fact that quantitative data are easier to analyse, we need to be careful about the quality of the analysis. Modern econometric techniques are mind-bogglingly complicated. But we don't need complex techniques to deal sensibly with data. We need common sense and some basic statistics.
Consider some series that get a lot of airplay: unemployment, the quarterly national accounts and the purchasing managers' indices. All three are important data series and are widely, and rightly, followed. All are released regularly and, while subject to some revision, the reality is that the initial release is the one on which the press and the politicians pounce to prognosticate. All are also very volatile, but this is usually mentioned only in passing.
Basic statistics tells us that when a series is volatile there is a chance that it will show a rise, a chance that it will show a fall, and a chance that it will show no change. We can and should take the volatility of the data into account when we analyse it, and more to the point the reportage of the data should do so. This doesn't happen. We will not soon see a headline "GDP rises (but it's statistically equal to zero and might actually be a fall) in Quarter 2". A rise in GDP will be greeted as corner-turning stuff no matter how statistically meaningless it is. If we look at the Irish data we see significant volatility. This is best indicated by the standard 'measure' of recession, two successive quarters of declining GDP. By that measure we have been in recession four times (and out of it three times) since 2007. The reality is that it has been one long recession. The GDP and GNP data are extremely volatile. Quarter-to-quarter changes in GDP are slightly less volatile than those in GNP, contrary to what many think and despite the GDP data being subject to distortion from the MNC sector. The quarter-to-quarter changes average about 1.4%. Knowing how volatile the data are and how large the average change is, we can construct some 'bounds' within which, statistically, the data should lie. This is based on the 'standard error'. There are other measures that give such confidence intervals, but all are in essence variations on a theme. Using them gives us a sense of whether or not we can be confident that a small rise or fall is in fact one, given the way the data jump about.
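For the curious, the arithmetic behind such bounds really is half-an-hour stuff. Here is a minimal sketch in Python; the quarter-to-quarter changes below are invented for illustration, not the actual CSO series:

```python
# Rough confidence bounds on the average quarterly change.
# The figures are made up to illustrate the method, not real GDP data.
import math

changes = [1.8, -2.1, 0.9, -1.5, 2.3, -0.4, 1.1, -1.9]  # hypothetical q/q % changes

n = len(changes)
mean = sum(changes) / n
# sample standard deviation of the changes
sd = math.sqrt(sum((x - mean) ** 2 for x in changes) / (n - 1))
# standard error of the mean
se = sd / math.sqrt(n)
# rough 95% interval: mean plus or minus two standard errors
lo, hi = mean - 2 * se, mean + 2 * se

print(f"mean change: {mean:.2f}, 95% interval: ({lo:.2f}, {hi:.2f})")
# If the interval straddles zero, a small quarterly rise or fall
# tells us nothing reliable about direction.
```

With numbers this jumpy, the interval comfortably straddles zero, which is exactly the point: the latest wiggle is noise until it clears those bounds.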
Similarly for the PMI data, we can only despair at times at the commentary. A PMI reading below 50 indicates contraction. This does not stop people from commenting along the following lines: "PMI rose from 47 to 48, indicating a rebound in managers' expectations." That is not the case. What is the case is that, first, the mean monthly change in the composite PMI in Ireland is almost indistinguishable from zero, and second, the volatility of the data suggests that any month-to-month interpretation is an exercise in futility. That doesn't stop commentators breathlessly proclaiming doom or salvation depending on the direction of this essentially random series.
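Checking whether an average monthly change is distinguishable from zero is a one-line t-statistic. A sketch, again using invented PMI readings rather than the actual Irish series:

```python
# Is the average month-to-month PMI change distinguishable from zero?
# The PMI readings below are invented for illustration.
import math

pmi = [47.0, 48.2, 46.5, 49.1, 47.8, 48.4, 47.2, 48.9, 47.5, 48.1]
diffs = [b - a for a, b in zip(pmi, pmi[1:])]  # month-to-month changes

n = len(diffs)
mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
t = mean / (sd / math.sqrt(n))  # t-statistic against a true mean of zero

print(f"mean monthly change {mean:.2f}, t-statistic {t:.2f}")
# A |t| well below 2 means the average change is statistically
# indistinguishable from zero, whatever the headlines say.
```

On series that bounce around like this, the t-statistic sits nowhere near the conventional threshold of 2, so the "rebound" in any one month is just noise.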
For unemployment, the average change in the rate (not the numbers) over the last 13 years has been a very small decline. Since 2010 the average month-to-month change has again been very close to zero. Again, this does nothing to stop minor blips up and down being hailed as proof of austerity failing or working. The trend is the key, not the volatile month-to-month changes.
It doesn't have to be this way. There is nothing to stop the CSO, or even the newspapers themselves, from taking half an hour to do some simple statistics. Instead of reporting on monthly or quarterly changes that are volatile and inherently meaningless, it would be nice to see reportage looking at three- or six-month averages, at confidence intervals, at some statistically meaningful measures. This would make for more boring but much more accurate reporting. If journalists, or more likely editors, wish to go beyond surface reportage then some skull sweat is required, of both readers and writers.
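The suggested fix is equally simple. A three-month moving average can be computed in a few lines; the unemployment rates below are, once more, invented for illustration:

```python
# A three-month moving average smooths out the monthly blips.
# The unemployment rates below are hypothetical, not CSO figures.
monthly_rate = [14.2, 14.0, 14.3, 14.1, 13.9, 14.2, 13.8, 13.9]

window = 3
smoothed = [
    sum(monthly_rate[i - window + 1 : i + 1]) / window
    for i in range(window - 1, len(monthly_rate))
]
print([round(x, 2) for x in smoothed])
# The smoothed series moves far less month to month, so small
# blips no longer masquerade as turning points.
```

The smoothed series is duller, which is the point: it shows the trend rather than the noise.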