Can QA be made to work?

In this post I want to explore the whole concept of quality assurance (QA) with particular reference to adult social care providers, i.e. those providing residential and home care services (although much of what I’m going to say is applicable to the broader health and social care sector as a whole).   One of the key things that always struck me when I was working for the CQC, and continues to strike me in my consultancy work, is that many providers appear to have no clear understanding of what QA actually means in the context of their own service.

Furthermore, there often seems to be some misunderstanding regarding how the CQC fits into the wider QA picture.  Providers are fully aware that the CQC’s role is to ensure that they are providing a good (and preferably outstanding) service, but I’m not sure it’s always clear to them that when the CQC carry out their inspections they are actually using a fully fledged quality assurance system of their own; and more to the point, this is a system that individual providers could easily use for their own QA purposes.

Perhaps one of the problems is that under the previous CQC inspection regime, QA was included as only one specific outcome (number sixteen) amongst all the others.  And even under the current framework there is still some potential for misunderstanding because although QA is not explicitly referred to, even as part of the ‘Well-led’ domain, one of the prompts for the key line of enquiry (KLOE) for this domain (W3)  does invite the inspection team to ask the question:   ‘Are quality assurance and (where appropriate) governance and clinical governance systems effective, and are they used to drive continuous improvement?’  Furthermore, I have come across examples of providers who have been told by the CQC that they need to improve their QA systems.

Now, this is all well and good, but I think it encourages the idea that ‘quality assurance’ is a specific management practice and set of processes, rather than being at the heart of the service itself.  Furthermore, I think there is a danger that it encourages the idea that the kind of quality assurance that the CQC is engaged in is somehow different from the quality assurance that individual providers are undertaking on a day-to-day basis.  However, before we go any further, perhaps we need to have a clearer understanding of the term ‘quality assurance’ as applied in the health and social care sector.

The publication Driving up Quality in Adult Social Care by Think Local Act Personal (TLAP) provides a very readable overview of quality in adult social care.[1]  It points out that a care and support service can only be considered high quality if:

  • it places the person receiving the care at its centre
  • it enables personal outcomes to be achieved
  • the relationship between the person who is using the service and the people who deliver it is based on dignity and respect (p.3)

The publication goes on to elaborate upon these key points by summarising the key themes that emerged from its research into quality, which drew upon the experiences of service users, organisations and practitioners:

High quality care and support exists where people who use social care:

  • are enabled to live independent lives as defined by them, with informed choice and control through access to appropriate services and as much involvement in decisions about care and support as they want to have
  • have opportunities to participate in community life, engage in activities that match their interests, skills and abilities, and maintain good relationships
  • feel safe, secure and empowered because their human rights are safeguarded while they are supported to manage informed risks
  • have a positive experience of care provided through relationships based on mutual respect and consideration, where care is designed around their needs and is consistent and coordinated. (p.4)

The Local Government Association’s (LGA) publication Health and Care Quality Systems in practice[2] makes reference to the TLAP document but also to the NHS Constitution’s definition of quality, which has three dimensions:

  • Individual experience
  • Effectiveness
  • Safety

This last definition is particularly interesting because it links closely to the CQC’s own quality framework and its five domains of safe, effective, caring, responsive and well-led.  The LGA’s publication goes on to outline the key elements for ensuring that quality is at the heart of any service, and these include leadership, culture, accountability and workforce development.

However, it’s one thing to be able to define quality (even if such definitions are by no means definitive), but quite another to monitor or measure quality.  But of course, this is precisely what the CQC aims to do with its inspections and other forms of monitoring.  Strictly speaking, though, one could argue that the CQC’s model of quality assurance does not actually ‘measure’ quality at all, but rather it maps-out various aspects of a service within a particular set of parameters, i.e. the five domains of safe, effective, caring, responsive and well-led.  The key lines of enquiry (KLOEs) are essentially an aid to ‘fine tuning’ this mapping-out process.

Take, for example, the five ‘Safe’ KLOEs.  These are all phrased as questions; for example, KLOE S1 asks the question: How are people protected from bullying, harassment, avoidable harm and abuse that may breach their human rights?  and KLOE S4 asks: How are people’s medicines managed so that they receive them safely?

For each question, there is a range of evidence which demonstrates how the service is addressing the particular issues.  There is no ‘right’ or ‘wrong’ evidence, although there are a number of prompts which suggest what would be useful evidence.  So, for example, one piece of evidence which would address KLOE S4 might be that all medicines are stored, administered and disposed of in line with current regulations and good practice.  Linked to this would be the fact that the service had a comprehensive medicines policy which had been read by all staff who were responsible for administering medication.  In terms of hard evidence, this would be a matter of locating and reading the policy in question and, hopefully, finding a list of staff signatures which indicated that they had all read the policy.

In adult social care there are twenty-one KLOEs, and for each one there can be numerous pieces of evidence.  So although this process of mapping-out the evidence, based on the KLOEs, can appear to be (and is!) quite a time-consuming and laborious undertaking, it does enable the inspection team to build a comprehensive picture of each service and to assess its level of quality, as defined by the five domains.
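To make the mapping-out idea a little more concrete, here is a minimal sketch in Python of how evidence might be recorded against each KLOE.  The KLOE codes (S1, S4) are real, but the evidence entries and the structure itself are purely illustrative, not an official CQC data format:

```python
# A minimal sketch of mapping evidence against KLOEs.
# The KLOE codes are real, but the evidence entries are
# illustrative examples only, not an official CQC structure.

evidence_map = {
    "S1": [  # Protection from abuse, harassment and avoidable harm
        "Safeguarding policy in place, reviewed annually",
        "All staff completed safeguarding training this year",
    ],
    "S4": [  # Safe management of medicines
        "Medicines stored, administered and disposed of per current guidance",
        "Comprehensive medicines policy signed as read by all relevant staff",
    ],
}

# A simple overview: how much evidence supports each KLOE?
for kloe, items in sorted(evidence_map.items()):
    print(f"{kloe}: {len(items)} piece(s) of evidence")
```

Even a simple structure like this gives a manager an at-a-glance view of which KLOEs are well evidenced and which are looking thin.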

Note that I’ve made a number of references to the CQC inspection team carrying out this process of mapping-out evidence; but what about the providers themselves?  More specifically, what about the service management?  Surely it would be a useful exercise for them to carry out a similar mapping-out process themselves for their particular services?  In fact, rather than reinventing the wheel, why not just replicate, as far as possible,  what the CQC inspection team does?

And let’s be clear: this is not simply about being ‘CQC ready’, i.e. carrying out a ‘mock inspection’ so there are no nasty surprises when the CQC do turn up.  Rather, it is about managers having a clear picture of how their service is operating in terms of the five quality domains.   The fact that such a process is essentially ‘aligned’ to the CQC model of quality assurance is obviously an advantage when it comes to CQC inspections and their ongoing monitoring, but really it should be an end in itself.

But if it’s such a good idea, why aren’t more managers doing it?  To be fair, some are, and there are a number of ‘tool kits’ on the market which facilitate the process of evidence gathering using the CQC’s domains and KLOEs.  However, a lot of managers that I’ve spoken to seem somewhat overwhelmed when it comes to quality assurance, and part of the reason goes back to what I was saying earlier about there being a lack of understanding about what QA actually is.

I would suggest, however, that there is another reason why so many managers have a rather fractured relationship with QA, and this is because they simply don’t have the time to do it, or even the time to get to grips with what it is they are supposed to be doing in the first place.  As in many parts of the health and social care sector, a great deal of residential and home care service management is essentially ‘fire-fighting’ and crisis management, and finding the time to systematically map-out the service in the manner I described earlier is a luxury most managers simply do not have.

The irony is, having a decent QA system in place can actually reduce the need for such fire-fighting and crisis management.  It can give managers a clear overview of the service and act as an ‘early warning system’ regarding any potential problems that may be on the horizon.   And, of course, it can help take the sting out of any future CQC inspection, and indeed, any ongoing monitoring of the service by the CQC or other external agencies that may have an interest in the service.  It’s also worth noting at this point that as part of the CQC’s new strategy, it is encouraging all providers (right across the health and social care sector) to be more proactive in their quality assurance management, and to ‘align’ their own QA systems with that of the CQC.[3]

So where does this leave overworked managers with no time to focus seriously on quality assurance?  I suspect the majority of service managers would readily admit that not only is quality assurance important but that they should be spending more time on it, but… they simply haven’t got time to drop everything in order to spend hours gathering evidence in order to satisfy the demands of the CQC, who may not turn up for another couple of years anyway.

But again, I can only re-emphasise that this is not simply about the CQC.  Rather, it is about good management practice with regards to quality assurance, and if the CQC have already devised a comprehensive QA model and methodology, why reinvent the wheel?

And as for the question of QA being a time-consuming activity, there are ways to take some of the pain out of the process.  As I mentioned earlier, there are a number of organisations that can provide off-the-shelf QA audit tools, and some of these are based on the CQC’s own model.  In fact, some of these organisations will also provide a raft of policy documents and other information aimed at helping service managers.

However, I think it’s worth pointing out that there are certain drawbacks to many of these ‘tools’.  To start with, some are still essentially paper-based systems, which means there are even more bits of paper to add to the existing piles on managers’ desks.  This also makes sharing information, for example with the CQC, a lot more difficult.  Other systems are software-based, which, in theory at least, makes the sharing of data a lot easier, although it’s not always clear how this is achieved in practice.

A lot of these systems will also attempt to ‘rate’ the service according to each of the five domains, although it is by no means clear what ratings methodology they are actually using.   In reality, the CQC ratings (‘outstanding’, ‘good’, ‘requires improvement’ and ‘inadequate’) are derived by a process of discussion amongst the inspection team, and there is no computer algorithm that can reliably automate the process.   In other words, the ratings for a particular service are for the CQC inspection team to agree upon and award, not for the service itself.  Any form of ‘self-rating’ is at best meaningless, and at worst harmful to the service because it may lull management into a false sense of security.

Having said all this, I believe there is still a place for a simple-to-use and cost-effective ‘tool’ or ‘app’ that can facilitate the process of mapping-out quality across individual services.  And it would seem to make sense for any such tool to use the same methodology and parameters as the CQC, but with the proviso that such a tool can be quickly ‘recalibrated’ should the CQC model change or, indeed, should another regulatory body replace the CQC.  Furthermore, it is essential that information, i.e. evidence of quality, can be easily stored, collated, analysed, presented and shared between agencies, which means that it has to be in a digital format that can be easily shared between different systems.
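As a sketch of what ‘easily stored and shared’ might mean in practice, here is a short Python example using JSON as the interchange format.  The field names are my own illustrative assumptions, not a prescribed standard, but the principle holds: a plain-text, structured format survives being passed between different systems and agencies:

```python
import json

# An illustrative record of one piece of quality evidence.
# The field names are assumptions, not a prescribed standard.
record = {
    "kloe": "S4",
    "domain": "Safe",
    "evidence": "Medicines policy signed as read by all staff who administer medication",
    "date_recorded": "2023-05-10",
}

# Serialising to JSON gives a plain-text format that any other
# system (or agency) can read back without special software.
serialised = json.dumps(record, indent=2)
restored = json.loads(serialised)
assert restored == record
```

The point is not this particular format, but that the evidence lives as structured, machine-readable data rather than on loose bits of paper, so it can be collated, analysed and handed over without re-keying.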

  1. Think Local Act Personal, Driving up Quality in Adult Social Care
  2. Local Government Association, Health and Care Quality Systems in Practice
  3. Care Quality Commission, new strategy