If the latest government ratings are to be believed, nine out of 10 English council housing departments can't do their jobs properly. But these figures ignore complex issues about resources, skills and the value of Whitehall's obsession with assessment.
Comprehensive performance assessments, housing investment programme ratings, performance indicators, voluntary best practice inspections – the government and its predecessors have been obsessed with the annual assessment of councils' performance for the best part of a decade. Now, yet another rating – the new "fit for purpose" assessment under the housing investment programme – has found that only one in 10 English housing authorities has managed to draw up a housing business plan or strategy judged "fit for purpose".

At first glance, this sounds like a damning indictment of the sector, but the truth is less straightforward. Behind the headline figure are issues about resources, ways of working and the pressure of operating under a government obsessed with targets and tests (see "The ratings game", page 18). All of which makes you think again about the accuracy of the 10% "fit for purpose" assessment.

The "fit for purpose" rating tests a council on wider housing issues, not just its landlord service. There are no inspections, but councils must send detailed three- to five-year housing business plans and strategies to government regional offices. Successful strategies are judged against 10 targets, such as how councils will house tenants with specific needs – for example the elderly, disabled or asylum seekers – and how well they consult (see "Fit for purpose? The ODPM's criteria", right).

The "fit for purpose" assessment directly affects a council's housing investment programme rating. And although having a good HIP rating does not guarantee a massive increase in funding – top HIP performers only win about 5% extra for their efforts – there is not a council in the UK that would turn up its nose at some extra cash.

HIP ratings also have an impact on councils' comprehensive performance assessment scores, doled out by the Audit Commission.

A high housing CPA score can bring inspection "holidays", and a good score overall will bring extra cash and borrowing freedoms as proposed in the Local Government Bill.

And there is another advantage to passing the "fit for purpose" test: the ODPM rewards councils that produce good strategies by allowing them to stop submitting such detailed plans. The best councils will only have to submit progress reports next time round, easing their burden of bureaucracy.

We know that just 10% of the 354 English councils with responsibility for housing passed the "fit for purpose" test (HT 9 January, page 9), but the ODPM has yet to publish a detailed breakdown of who passed and who failed. In the absence of detailed results, however, HIP ratings give a strong indication of the likely outcome. This is because "fit for purpose" status is all about judging a council's business plan and strategy, and business plans are a key part of what determines HIP ratings.

This year, 36 councils gained "well above average" HIP ratings, among them Derby – one of the councils passed as "fit for purpose", although this has not yet been publicly announced. As well as having the top HIP rating, Derby has a high CPA score. And now its "fit for purpose" accolade means it will get the maximum possible capital allocation of £3.9m for the coming year.

Derby was successful, says a spokesman, because its housing strategy "clearly lays out its objectives, its options for where to concentrate resources, its priority areas, an assessment of current performance and the reasoning behind the strategy" (see "Get fit: Write a winning housing strategy", page 18).

Broxbourne council in Hertfordshire is also hoping to pass the "fit for purpose" test. Broxbourne's housing director, John Giesen, says drawing up strategies is a core part of the council's business. Eight people are involved in preparing the housing strategy, from the housing, environmental health and planning departments. Giesen says: "The people doing it are also involved directly in service issues – so the points in the strategy are more real."

The council doesn't have a specialist policy team, but Giesen says this can be a strength.

Size does matter
Smaller authorities have to submit the same standard of strategy as their larger counterparts, but usually without the same level of skills or resources. Smaller district councils, for example, often do not even have a full-time policy officer, whereas their metropolitan neighbours may have a whole team working on strategies and policies.

The figures bear this out: only 7% of district councils were able to grab the top "well above average" HIP ratings, compared to 28% of metropolitan councils. Heriot-Watt University researcher Hal Pawson, who has carried out analysis on the issue for the Housing Quality Network, says: "It seems likely that the larger average size of metropolitan and unitary councils as compared with districts may give these councils an advantage in terms of the staffing resources they are able to devote to strategic policymaking."

Local Government Association programme manager Gwyneth Taylor agrees. She says: "In smaller districts, resources and staff aren't there to research strategy. They're struggling in the newer housing environment of corporate and holistic approaches." She adds that metropolitan councils also have housing, education and social services all under one roof, easing the process of joint planning and cooperative working.

Smaller councils must use more imaginative approaches if they are to succeed, says Northern Housing Consortium chief executive John Moralee: "Councils need to cooperate more, cut duplication and make exercises, such as joint tendering, the norm instead of a rarity." Unless districts prove they are up to the job, they risk losing their independence with the introduction of regional assemblies, he adds, asking: "Is there a wider agenda to stack the evidence against districts? It would make it easier to amalgamate them under a new system."

Too little guidance, too little time
Another reason so many councils failed the rating is that the government did not issue final guidance on what it expected until last April, just three months before plans had to be submitted.

Councils nursing "unfit" housing strategies must submit revised attempts this July. It is difficult to judge the precise impact that reworking the plans will have, because the ODPM has yet to issue detailed feedback to the unsuccessful councils. But it is easy to imagine how hard managers will find it to motivate staff who have already suffered a series of inspection failures.

There's no doubt councils are struggling to make sense of the proliferation of inspection regimes – and their sometimes conflicting results. Bristol council, for instance, has held talks with the Government Office South-west over the whole issue of business plans, HIP ratings and Audit Commission assessments, concerned about how the different ratings fit together. Its HIP rating is "above average", yet it scored only two for housing in the Audit Commission's CPA and was rated "weak" overall.

The discrepancies between the HIP ratings and housing CPA scores arise because the HIP looks at strategy and management only. Housing CPA is scored partly on HIP, but also on performance indicators and inspections.

Claire Cook, executive member for neighbourhood and housing, says: "We're surprised by some of the outcomes nationally, particularly the decision that most plans are supposedly not 'fit for purpose'. And we are disappointed with our own rating, given that our service planning had previously been ranked well above average and was used by the government as a model for other authorities."

The inspections backlash
Other councils also clearly think the government's appetite for condensing an entire service into a single word or number has gone too far: four have launched, or are considering, legal action over the Audit Commission's CPA exercise, citing the methodology and scoring as unfair.

John Seddon, managing director of Vanguard Consulting, which advises public and private organisations on improving performance, has a radical answer: he wants the whole regime of government targets dismantled. "The only point of the exercise is to demoralise people," he says. "As soon as you tell people they will be measured against arbitrary conditions, they focus on using their ingenuity to pass, not on improving the service. The regime is a major cause of waste."

Seddon proposes instead a system where councils must simply satisfy the government that they are capable of doing the job in hand, rather than produce slick business plans and target-meeting strategies.

In the government's defence, it has, to some extent, recognised the problem. A 2001 white paper acknowledged the burden of measuring performance, general performance indicators for top-tier councils were reduced from 123 to 97, and CPAs were intended to streamline the inspection regime.

This will be little comfort for the councils that have failed "fit for purpose". Most are unlikely to be as bad in reality as the failure tag suggests, but the label leaves housing managers and directors with the job of rebuilding staff morale after an assessment they perhaps didn't deserve.

No one would claim that all councils' housing performance is good enough, but it is perhaps time for the government to let councils get on with the work of improving their services, instead of putting such huge effort into lining up the numbers.

Fit for purpose? The ODPM’s criteria

  1. Corporate context. How well does a council’s housing strategy fit with wider community objectives like the decent homes target, Supporting People or regeneration work?
  2. Wider priorities. To what extent have national and regional housing priorities, like low demand or the need for key-worker housing, been taken into account in the strategy?
  3. Partnership working. How well does the strategy show effective consultation and joint working with councillors, tenants and residents?
  4. Needs analysis. Is there enough analysis of current and future problems on all aspects of housing need and service performance? Is the council prepared for demographic change?
  5. Resources. Is there a realistic view of future resources from the Approved Development Programme, regeneration schemes and contributions from partners?
  6. Priorities. How well are areas that are deemed “priority for action” justified and linked to current and future needs? For example, if elderly tenants were previously judged as a priority, does the housing strategy recognise that?
  7. Options. Does the strategy consider alternative ways of addressing priority areas for action? Should housing associations or the private sector be more involved?
  8. Action plan. Is there a clear action plan with “specific, measurable, agreed, realistic and time-bound” objectives? Any targets set must be realistic.
  9. Progress to date. To what extent does the strategy report on progress against previous targets and objectives? How far has the council come?
  10. Accessibility. Could a non-specialist reader understand the strategy’s key messages?

The ratings game

Comprehensive performance assessment
The new Audit Commission regime introduced last year, which sits at the top of the inspection hierarchy. CPA measures holistic performance, from housing to education to social services, and judges services as well as councillor and officer leadership. Commission inspectors give councils scores on a set of services and judge their overall ratings from “excellent” to “poor”. Those with excellent ratings win a three-year inspection “holiday”. The housing CPA score is also influenced by performance indicators and housing investment programme ratings (see below).

Performance indicators
Published by the Audit Commission and overseen by the ODPM, best value annual performance indicators are league tables on councils’ individual services. There are no inspections – councils send in information on services ranging from how frequently bins are emptied to how many tenants pay their rent on time. The commission studies this to help work out a council’s housing CPA score. There are nine indicators for council housing, such as how quickly repairs are done.

Housing investment programme
Run by the government regional offices, this annual score is used by the ODPM to calculate councils’ capital spending allowances. Ratings, based on information sent to government offices on housing management and strategy, run from “well above average” to “well below average”, judged against councils in the same region. Scores are influenced by the new “fit for purpose” rating.

Fit for purpose
A new annual measure of councils’ housing strategies and business plans, introduced in July as part of the housing investment programme process. It tests wider three- to five-year housing strategies on issues such as how well housing fits with wider regeneration plans, not just the landlord service.

Peer review
Voluntary best practice inspections of councils carried out by the independent local government consultancy the Improvement and Development Agency. They offer independent evaluation of services, an exchange of ideas and help in preparing for formal inspection.

Get fit: write a winning housing strategy

Derby’s housing strategy, which outlines how arm’s-length management organisation Derby Homes will maintain and improve the 15,500 homes it took over from the council last April, was rated “fit for purpose”. Jonathan Geall, manager of the council’s community and housing strategy unit, says the first step to a good plan is to take a wide view: “Look at the [housing] objectives, ensure they are tied in with other council objectives and those of partners.”

Next, he says, strategy objectives must fit the main priorities and show evidence of alternatives, like the outcome of non-intervention. Derby’s 68-page strategy sets out in detail the most pressing issues of the next three to five years and draws up options for action on each one. “We’ve made detailed option appraisals on improvements to our own stock, the private sector and new affordable housing,” Geall says.

He also stresses that consultation with tenants is vital. Derby succeeded where others failed, Geall says, because it broke the city into 13 market areas and identified the action needed in each.