Knowing what benchmarking really means and collating data correctly are essential if it is to become a tool of any worth to the facilities manager.

Everyone in facilities, it seems, is desperate for benchmarking data, particularly on the costs of facilities services. Facilities costs typically absorb 15 per cent of total revenue expenditure, making them the second largest cost after people.

Excluding IT hardware and software, which rarely shelter under the same umbrella as premises and support services, conventional facilities make up about 10 per cent of an organisation’s annual budget.

Of the 5 per cent or so of revenue that premises costs absorb, perhaps only 30 per cent – say 1.5 per cent of turnover – is down to the operating costs: maintenance, cleaning, security, utilities and minor projects.

Nevertheless, these costs can add up to a considerable amount – around 100 companies in the UK each spend more than £10 million a year on maintaining mechanical and electrical services alone.

However, company expenditure on rent and rates absorbs more than two-thirds of the premises budget, which is why so much time is spent focussing on optimising space use (usually to the detriment of productivity). In the worst cases, this ‘saving space at all costs’ syndrome is accompanied by a desire to keep the cost of operating space as low as possible – covering the finance director in glory but damaging workplace productivity still further.

In the face of this, the facilities manager has two choices: keep slashing costs regardless, or find out what the ‘best in class’ are doing and try to convince the ‘lost souls’ of the finance department that facilities management can actually add to corporate productivity.

The most seriously committed facilities managers are beginning to address the benchmarking option. It is a defence mechanism by a fledgling profession against the increasing attention that company boards are paying to facilities costs that were once taken as read in the annual accounts.

Faced with an almost universal lack of justification for expenditure, facilities managers around the world have been getting together to exchange information about costs – mainly with a view to presenting an argument to the ‘powers that be’ in support of their expenditure levels. What they want is an expenditure ‘fix’ for a whole host of premises and support services including cleaning, maintenance, security, utilities, reprographics, archiving, travel management, catering and all the other critical services upon which organisations depend for successful operation.

Unfortunately, benchmarking does not come cheap. Neither does ‘valid cost comparison’, which is often confused with benchmarking. In both, the quality of the data is of paramount importance to the success of the venture.

The meaning of benchmarking

Benchmarking is the process of comparing a product, service, process – indeed any activity or object – with other samples from a peer group, with a view to identifying ‘best buy’ or ‘best practice’ and trying to emulate it. The peer group refers not to your closest buddies but to any organisation that is carrying out an activity with similar characteristics or end product to the one you wish to benchmark.

There is a common fallacy (usually invoked by those afraid of being exposed by the process) that no two organisations are similar, thereby rendering comparison meaningless. Nothing could be further from the truth. It can be useful to benchmark against your closest peers – and internal benchmarking between different parts of an organisation has a lot of merit – but it is quite possible that someone in an entirely different sphere is doing the same sort of thing that you do – and doing it better.

For example, a while back some bright sparks looking to improve the quality of building maintenance had a flash of inspiration: they realised that whereas people always seemed to be getting stuck in lifts, it was quite unusual for an aeroplane to fall out of the sky. So they talked to the aircraft industry about maintenance management and learned that aircraft are designed and built specifically to minimise the risk of failure, and that maintenance is treated as a critical part of the business activity.

Although the aircraft and construction industries seem miles apart in terms of their activities, they do in fact have similar processes and you could argue that building services failure is just as serious to a typical business as aircraft failure is to BA or the Royal Air Force.

So similarity of business function is not an essential factor of a peer group, although organisations are naturally anxious to know where they stand in their own division of the league. This comparison is not, however, the main purpose of benchmarking, for best practice will almost certainly be found in some other unrelated sector.

It is the pursuit of best practice – ‘best of breed’ some call it – that is at the root of the benchmarker’s philosophy. Having found it, the key to success is to set about emulating it in terms of cost, quality, speed or risk management.

Gathering data

There are two types of ‘benchmarking’ operations:

  • collection of data from questionnaires – published or produced privately for benchmarking ‘clubs’ – and

  • databases compiled individually by consultants specialising in the field.

For the benchmarking process to identify the ‘best of breed’ performance in terms of cost and/or quality, it must be possible to interrogate the providers of data – be they the source provider or the consultants who have worked with them to produce the data. This is where many of the data-gathering processes fall down.

Cost data gathered by questionnaire is fraught with danger, although it remains a common approach. One problem is that many such surveys show enormous differences in costs.

For instance, when examining maintenance costs, the issues that need to be considered include national and local taxes (VAT, for example) and cost drivers such as the age of the facility, whether or not it is air-conditioned, whether costs include lifecycle replacement, and relative quality. Without this information it is difficult to find a suitable comparator in such data.
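To make the point concrete, the sketch below is a minimal illustration – written in Python, with field names and the comparability test invented for the example rather than drawn from any particular survey – of the normalisation that a valid cost comparison implies: strip out VAT, express the figure per square metre, and keep the main cost drivers alongside it so that like is only compared with like.

```python
from dataclasses import dataclass

@dataclass
class MaintenanceRecord:
    """One site's annual maintenance figure plus the cost drivers
    a valid comparison needs to take into account."""
    site: str
    annual_cost: float               # as reported; may include VAT
    vat_rate: float                  # e.g. 0.20 for 20% VAT, 0.0 if already net
    floor_area_m2: float             # area used as the denominator
    building_age_years: int
    air_conditioned: bool
    includes_lifecycle_replacement: bool

def net_cost_per_m2(rec: MaintenanceRecord) -> float:
    """Strip VAT and express the cost per square metre so that sites
    of different sizes can be set side by side."""
    net = rec.annual_cost / (1 + rec.vat_rate)
    return net / rec.floor_area_m2

def comparable(a: MaintenanceRecord, b: MaintenanceRecord) -> bool:
    """Treat two sites as direct comparators only if the main cost
    drivers match; otherwise the difference in cost tells you little.
    The ten-year age tolerance is an arbitrary illustrative choice."""
    return (a.air_conditioned == b.air_conditioned
            and a.includes_lifecycle_replacement == b.includes_lifecycle_replacement
            and abs(a.building_age_years - b.building_age_years) <= 10)
```

The code itself matters less than the discipline it represents: if the cost drivers do not travel with the figure, the figure cannot sensibly be compared.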

Pepco study

A pan-European benchmarking study carried out for a large company – dubbed Pepco – illustrates these problems further.

Pepco’s facilities departments in 16 countries were sent a comprehensive questionnaire by the head of corporate real estate. It spelled out succinctly what was to be included in each cost centre and how the areas were to be measured. However, the answers were clearly flawed, and Pepco sent in consultants to extract the data.

Typical problems of data analysis include:

  • poor posting of invoices

  • failure to analyse out components of invoices belonging to different categories, eg cleaning included on a maintenance invoice

  • omissions, eg catering costs leaving out the costs of consumables, equipment maintenance and vending

  • difficulty in apportioning the time worked by multi-functional operatives, eg cleaning staff helping out with the catering

  • mis-classification, eg moving and alterations included in maintenance

  • difficulty in allocating the time of non-dedicated staff on smaller sites, eg secretaries collecting and delivering the mail

  • confusion between dedicated facilities management and general management responsible for some facilities services, eg responsibility for efficiency of the photocopiers.
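Several of these anomalies lend themselves to simple automated checks before any comparison is attempted. The sketch below is purely illustrative – the category names and the threshold are invented for the example, not taken from the Pepco exercise – but it shows the kind of sanity check that can flag a suspect return for follow-up questioning rather than letting it slip straight into the comparison.

```python
# Illustrative sanity checks on a single site's cost return before it
# enters a benchmarking exercise. Categories and thresholds are invented.
EXPECTED_CATEGORIES = {
    "maintenance", "cleaning", "security", "utilities",
    "catering", "moves_and_alterations",
}

def flag_anomalies(cost_return: dict[str, float]) -> list[str]:
    """Return a list of warnings about one site's cost return."""
    warnings = []

    # Omissions: a category that is missing or zero usually means the
    # cost is hiding in another line, not that the service was free.
    for category in EXPECTED_CATEGORIES:
        if cost_return.get(category, 0.0) <= 0.0:
            warnings.append(f"{category}: missing or zero - possible omission")

    # Mis-classification: moves and alterations or cleaning posted to
    # maintenance tend to inflate it, so an unusually large maintenance
    # share is worth a question even if it turns out to be legitimate.
    total = sum(cost_return.values())
    if total > 0 and cost_return.get("maintenance", 0.0) / total > 0.5:
        warnings.append("maintenance: over half of total spend - check for "
                        "moves/alterations or cleaning posted to it")

    return warnings

# Example: a return with no catering figure and a suspiciously heavy
# maintenance line would be queried rather than compared as-is.
example = {"maintenance": 600_000, "cleaning": 150_000, "security": 90_000,
           "utilities": 120_000, "moves_and_alterations": 0}
print(flag_anomalies(example))
```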

The benchmarking specialists were alert to the potential existence of such anomalies and the usual ‘bolt-holes’ for absent data. Eventually nearly all the costs were teased out on a fairly consistent basis, and the consultants were comfortable in advising that the final figures were almost all credible.

The Pepco case study draws attention to the almost universal difficulties that facilities managers have in providing correct data about the size, population and facilities costs of the buildings under their control. This is a commentary, not a criticism. The facilities may be state-of-the-art, but the financial management systems that might deliver data of the required quality are simply not in place. This scenario is not an isolated instance.

Measuring quality

Comparing your costs with a peer group’s without taking comparative performance into account cannot conceivably produce a benchmarking conclusion.

In the best benchmarking clubs you can compare the results of user satisfaction surveys and contractor response times, and even visit other members’ facilities to ‘feel the quality’ at first hand. There are now a number of benchmarking clubs in existence around the world, and some are trying to compare notes.

Euro FM, which is a loose association of institutions in the facilities management field in Europe, has brought together the activities of a number of benchmarking clubs in various member countries, each with its own agenda, classification protocol and timescale.

It is already apparent from the ongoing review that the problems Pepco faced in getting good data from its sites are not unique. The approaches some of these clubs are taking to overcome this problem will be the subject of a future article.