In yesterday's post I explained the importance of distinguishing between inputs and outputs if you are to objectively measure achievements in advancing equality for areas such as health and social care.
In a future blog I'll explain how we researched whether any of the NHS organisations in our region were using any kind of performance framework.
If you want to read ahead, then you can find the research results and our strategic conclusions here, in the report A Landscape of the Region.
Little or no performance measurement
Suffice to say, what we found was that most organisations weren't measuring E&D performance at all .. and those that were simply analysed the effort their teams were putting in. They weren't establishing baselines and progress measures for outcomes.
I'll also explain in future blogs how we arrived at our regional strategy for Equality and Diversity, Narrowing the Gaps - Better Outcomes for All. It was a strategy which defined the outcomes we wanted to see at the end of five years of strategic influence.
What's not measured doesn't get done
Yet there's no point in having such a strategy if you don't put processes around it to drive the necessary change, and to objectively measure progress.
That's the climate in which my colleague Shahnaz Ali, the Associate Director for Equality, Diversity and Human Rights at NHS North West, came up with the vision for what we named the Equality Performance and Improvement Toolkit - EPIT.
The best way to understand what EPIT is about is to read the operating manual for the process. In it we explain both the rationale for measuring results in this way and the kind of evidence we wanted NHS organisations to demonstrate.
You can also visit the online results 'dashboard' for the process as we've operated it over the last 18 months. You'll find it at http://www.epit.northwest.nhs.uk/
A living, working process
If you visit the EPIT web site then you'll find that the assessments from all 24 PCTs and 39 NHS provider trusts in the North West region are publicly accessible. This is something we introduced so that people could see and compare organisations.
We encourage NHS organisations to look for teams who are achieving better than themselves in each of EPIT's 13 deliverable areas, and to share best practice with each other. After the first completed round of operation we also organised a 'Master Class' for commissioners, where some of the best performing teams gave presentations to their peers.
Note that because organisations are currently engaged in updating their assessments and accompanying evidence some of the previous results have been cleared down to enable this. A lot of this kind of movement will occur over the next few months as organisations work towards the submission deadline for 'EPIT 2' .. our re-run of the whole process to establish how much progress everyone has made.
Anyone who works in local government will recognise EPIT's parentage. The idea of having several holistically-linked 'Goals' and related 'Deliverables', plus the idea of assessing at 'Developing', 'Achieving' and 'Excellent' levels closely mirrors the local government framework developed by the Local Government Improvement and Development Agency (IDeA).
There are two main differences between EPIT and the IDeA framework.
Firstly, of course, the goals and deliverables being measured are directly related to our published five year strategy for outcome improvements in the regional health economy. EPIT measures the achievement of that strategy.
The second difference is more fundamental. The IDeA framework relied entirely on self assessment and didn't involve any independent verification of evidence to support those self assessments. The weakness of that is the tendency, which I've already described, for people to overrate their achievements by mistaking inputs for outcomes.
EPIT's success as an objective and consistent measurement framework is due to the way we tackled that. As you'll see from both the manual and the web site, we developed an entire governance system, following what we describe as the 'lines of accountability' in the NHS economy.
EPIT describes exactly what kind of evidence an organisation needs to provide in order to support a self-assessment of any of the 13 deliverables at each level - Developing, Achieving, Excellent.
As the Strategic Health Authority, we then took it upon ourselves to be the scrutineers of the evidence from each of our 24 Primary Care Trusts. These are the organisations who commission health services, and who are performance managed in every other field by the SHA. So, introducing performance management in this area was nothing radical. It was simply overdue.
We then also expected the PCTs to apply the same scrutiny to the NHS provider organisations whom they commission to deliver services in their area. Each NHS provider may be contracted by several PCTs. However, they each usually have one PCT who takes the lead responsibility for contract monitoring.
And we instructed the PCTs to make EPIT a part of their contract monitoring arrangements - i.e. to make the achievement of improved equality outcomes as much a measure of contractual compliance as the cleanliness and safety of services.
EPIT's fully-realised approach to E&D outcomes measurement provided a massive jolt to the entire system. Some managers hated it at first, because what we were asking for was truly challenging ... especially if they were confusing effort and processes with results.
Other managers (including, notably, some of the better hospital trusts) welcomed the framework enthusiastically, saying that it provided them at last with a way to show the big improvements they were making, whether or not the commissioners had asked for them.
What was most encouraging was the way that some of the sceptics were turned around in the end. Many managers came to us afterwards to say how the process had really made their organisations examine what they were doing.
NHS North West will, of course, be one of the ten SHAs to be abolished in April 2012, as part of the present government's restructuring of the NHS. However, although EPIT will cease to be operated by us, the process will continue under a new guise.
Enter the Equality Delivery System
The Department of Health's Equality and Diversity Council (EDC) were so impressed by the effectiveness of EPIT that they decided to have a national framework based on the key principles which we developed. Indeed my colleagues and I spent considerable time last summer drafting the body of this. Again, the credit really belongs to Shahnaz Ali, for whom I work.
The plan to launch the Equality Delivery System (EDS) is already an open secret. The details were consulted on widely with NHS and equality stakeholders last autumn, and the EDS is already mentioned in key documents about managing the NHS's structural transition.
There are some differences, reflecting the new shape of the NHS and the new government's preferred language. For instance, the wording of the goals reflects some of the key themes of the NHS White Paper. Also, as there will be no 'top down' management of the NHS in future, the governance process for the EDS can't rely on the 'lines of accountability' review that we employed in EPIT.
In fact, the change of governance in the EDS presents one of the biggest challenges for its implementation. The GP Commissioning Consortia and provider organisations within the new NHS will have to accept having their EDS self-assessments challenged by the only stakeholders positioned officially within the new landscape -- local Health Watch.
This is a topic which I'll return to in future. In the meantime, though, readers might like to think about the capacity, capability and representative credentials of their existing Local Involvement Networks (LINks), which will morph into these 'Local Health Watch' bodies. Ask yourself how those organisations will need to get their act together if they are to successfully challenge the self-view of all those GP Consortia and providers who may be tempted to rate themselves as 'Excellent' .. mistaking performance measurement for a PR exercise.