In my chats with national statisticians about the troublesome output statistics, I sense it was the word “baloney”, used by Kier chief Paul Sheffield, that irritated more than most other comments on the figures.

The word troubled me, probably for different reasons. I take cursing seriously, and the use of out-of-vogue slang emanating from the Italian-Irish-American inter-war gangster era for some reason just jarred.

Still, the Office for National Statistics decided to hit back at the growing criticism of – let’s not forget – the official historical record of activity in the GB construction industry.

Last week ONS invited a bunch of journalists to a briefing to present its defence. Oddly, I think I was the only one present who might be considered a specialist in construction. Anyway…

This followed a similar presentation to the government-industry statistics group hosted by the Business Department, BIS.

From where I was sitting, the defence was forceful, justified, but not complete. Questions are still left hanging.

For those not familiar with the saga, in January 2010 the collection of construction output data changed from a quarterly survey to a new monthly survey.

Naturally there were, and will be, teething problems. It takes time to fine-tune the process. This led to some fairly sizeable revisions, which caused some concern.

Also, simply because the ONS is presenting data more finely, each month rather than each quarter, potential oddities that might have been hidden in the past were likely to be highlighted. And so it proved.

It’s worth considering what the problems are.

It’s probably wise to leave out some of the more complex stuff that is troubling the economists and forecasters – such as the deflators, issues with imputation and outliers, constraining sector series volume back data to fit in with the back projection of total construction output, or any systematic bias in the new series – and focus on three main problems with the figures overall.

To make myself sound scientific I’ll describe these as a phase problem, a problem with recent volatility and a problem of quantum in relation to the historic data.

The phase problem relates to concerns over a potential lag in the data which makes it unrepresentative of what is actually happening on the ground. Normally this wouldn’t be such an issue. But with a lot of volatility and with the timing of the bad weather late last year, this presented a potential problem for the figures for national output, GDP, in the final quarter of last year and the first quarter of this year.

The volatility issue mainly centres on concern over the spectacular growth rates in the middle of last year, which saw a rise of 7.2% in quarter two and a further 3.7% in quarter three.
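To put those two figures in context, a quick back-of-the-envelope calculation shows how they compound over the half year. Only the 7.2% and 3.7% rates come from the figures discussed above; the rest is simple arithmetic.

```python
# Illustrative only: compounding the reported quarterly growth rates.
# The 7.2% (Q2) and 3.7% (Q3) rates are from the ONS output figures
# discussed in the text; the base index of 1.0 is arbitrary.
q2_growth = 0.072
q3_growth = 0.037

# Output index at the end of Q3, relative to the start of Q2 (base = 1.0)
index = 1.0 * (1 + q2_growth) * (1 + q3_growth)
cumulative_rise = index - 1

print(f"Cumulative rise over the two quarters: {cumulative_rise:.1%}")
# Roughly an 11.2% rise in six months, which shows why the rates raised eyebrows
```

Growth rates multiply rather than add, so the two quarters together imply a rise of about 11.2%, a little more than the 10.9% you would get by simply summing them.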

And the problem over the relationship with historic data concerns, among other things, the need to accept that in the third quarter of last year the level of work being done was not far shy of the level of work being done in 2006 and 2007, despite the industry workforce having shrunk by 10%. This issue is well illustrated by the graph above.

I hope to go into these three issues in some depth in later blogs. But for now, here are some of the defences made by ONS.

The ONS construction output survey is the most comprehensive survey done of industry activity. Yes, that’s true. It attempts to cover the whole industry and does a pretty good job. Most other surveys cover bits of the industry, so are partial and are often contradictory.

Its methodology is more robust than other surveys. That, I think, is also spot on. Frankly there are surveys that are given prominence that are (how can I put it?) seemingly ropey on the methodological front.

It has not introduced systematic bias with the changes. Without dipping our hands into the data and getting down and dirty with it, we’ll just have to accept that, and I have no reason to suspect any wool is being pulled over our eyes.

That said, and this is my failing, I keep forgetting to ask about the RMI sector now being smaller than before and how this will influence the future pattern of output in comparison with the old series. Or maybe I have asked and forgotten.

Its feedback on the questionnaire seems to have been reasonably positive and there is no reason to doubt this.

Also, some of the constraints that have dogged the connection of the old data to the new data will be resolved in the relatively near future. The back data will be recast in the release of 2011 Q2 data on 12 August.

This will iron out some anomalies within the sector series data. This process will also lead to the back data being lifted, which will make recent data look a bit more realistic when compared with pre-recession workloads.

So the ONS does have a strong case, even if I (and I suspect others) still have nagging doubts about the data series.

But at the briefing the answer to one of my questions caused me discomfort rather than providing comfort, as I think it was intended to.

I have been asking for some while now, well over a year: “What actually is being measured?”

For me this is always the starting point when looking to understand quirks in the data.

I’ll be honest, I was a bit miffed that the question wasn’t taken that seriously a year or so ago, partly, I suspect, because if there was a problem with the new data it was also present in the old data.

What the figures are supposed to represent is work on the ground. But I’ve long felt it highly unlikely that this was what was actually being measured; rather, it was likely being measured through some proxy, such as invoices or payments made.

If firms were using invoices or payments as a measure for workload done in the month, this may well create lags in the data of varying length.

More recently statisticians at ONS have started to look at this more closely.

And the answer I received last week was that it seems some firms are putting in figures for actual work done, some are putting in figures for work done taken from invoices and some are putting in figures from records of orders.

Now this was presented as comfort, and in some ways it is, if you take the swings and roundabouts view of statistics – which can be fair enough. Some figures are ahead (data from orders), some timely (data from measured work) and some late (data from invoices).

There will be an averaging effect.
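That averaging effect can be sketched with a toy model. To be clear, this is purely illustrative and not the ONS method: the monthly workload series is made up, the three reporting bases are given equal weight, and the leads and lags are assumed to be exactly one month.

```python
# Toy sketch of the "averaging effect" when firms report on different bases.
# Assumptions (mine, not ONS's): equal weights, leads/lags of exactly one month.
true_work = [100, 100, 90, 70, 75, 85, 95, 100]  # made-up monthly workload index

def shift(series, k):
    """Shift a series by k months, padding the edges with the nearest value."""
    n = len(series)
    return [series[min(max(i + k, 0), n - 1)] for i in range(n)]

orders   = shift(true_work, +1)  # ahead: this month's return reflects next month's work
measured = true_work             # timely: actual work done in the month
invoices = shift(true_work, -1)  # late: this month's return reflects last month's work

blended = [(o + m + i) / 3 for o, m, i in zip(orders, measured, invoices)]
print([round(x, 1) for x in blended])
```

Under these assumptions the blended series is just a three-month centred moving average of the true workload: it stays roughly on time, but peaks and troughs are flattened. That is the comfort on offer – and also, arguably, the worry, since the published figure would then be a smoothed mixture rather than a clean measure of work on the ground.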

That’s great, but for me the introduction of orders data is an even bigger worry, which I need to resolve in my head.

Sadly, having tediously hogged the question time at the briefing, I held back on this one, partly because I was a bit gobsmacked and didn’t know what to make of it at the time.

But I intend to get a clearer picture.

It may not be as disturbing as I fear, but I must admit to being a bit old fashioned. When I count apples I like them to look like apples and behave like apples.

Introducing orders figures into the output equation presents a whole new can of worms.