Measuring practice quality: A new approach in a Corrections setting

Giles Sullivan
Service Development Adviser, Department of Corrections

Author biography:

Giles Sullivan has over 13 years of operations performance experience in the telecommunications and engineering sectors, where he focused on operational and outcome improvement.


Introduction

The New Zealand Department of Corrections is a dynamic and complex operating environment, working with some of the country’s most challenging people. Our workforce of nearly 9,000 employees comes from a broad range of professional disciplines, as diverse as custodial, health and education.

Given the challenge presented by this professional diversity, the department is building an assurance system to support practice quality. Rather than adopt a large number of distinct tools for each discipline, the department is focusing on implementing a single, centralised quality tool as the first step. The intention is to provide a common language and perspective of quality, and to establish a platform upon which tools for the specific needs of each discipline can later be built.

Attempts to identify any similar systems currently used by overseas jurisdictions were largely unsuccessful. The majority do not operate a practice quality model, necessitating the development of a new tool to suit the unique Aotearoa/New Zealand Corrections environment. Some work has previously been undertaken to measure the quality of the Integrated Drug Treatment System in UK prisons (Sondhi and Day, 2012); however, that tool was not assessed for use across general prison operations.

A quality framework for Corrections

Corrections has recently developed an over-arching model for assessing quality, which uses a structured collection of tools to support continuous improvement. The model is known as Te Panekiretanga Integrated Quality Framework; te panekiretanga translates from Te Reo Māori as pinnacle or excellence.

Still under development, Te Panekiretanga encompasses four different types of measurement to provide a broad overview of quality. Building upon a base of compliance measurement, the model also includes practice quality, performance, and other specialist professional practice tools (refer Figure 1 below, The Quality Puzzle).

Figure 1: The Quality Puzzle

The department already has a range of compliance and performance tools, which are complemented by other tools that assist staff with ad hoc quality reviews. However, additional tools were required to monitor practice quality across the organisation on a regular basis.

Designing a tool to measure practice quality

What is quality?

At first glance, it is easy to think that the concept of quality is widely understood. As individuals we use the term quality in general discussion and believe that we can recognise quality when we see it. However, for such a common concept, many people find it hard to define what it means. This introduces a secondary problem: if we can’t define what quality is, how can we measure it?

For some industries, quality can be easily measured, for example, how closely does a manufactured product match the physical design specification? In these circumstances, it generally does not matter who makes the assessment, the result is the same. The prescribed standard either was, or was not, met when measured.

However, things change when measuring the quality of professional practice. The product of a practice-based industry is often intangible, or experiential, and the measures used can differ from one assessor to another. Take hotel reviews as an example; two customers may provide very different reviews despite receiving essentially the same service. Why?

In this context, the perception of quality is often influenced by the personal values, priorities and beliefs of the assessor at the time of the review. Hence, the quality of service, or practice, can be said to be in the eye of the beholder. Parasuraman, Zeithaml and Berry (1988) and Olshavsky (1985) made similar observations, noting that the concept of quality was a “form of overall evaluation of a product”, and that the perception of quality was similar to attitude. Both reflect a global value judgement.

Standardising quality

A central aim for many quality assurance systems is to make quality issues visible so they may be addressed in the future. However, measuring quality of practice can be a challenge due to the reliance upon personal perceptions, biases, priorities and values. The problem is exacerbated in sizeable organisations, where a larger number of people making quality judgements, multiple services and numerous sites create significant scope for variation in what is assessed, how it is assessed and against what criteria.

With high levels of input variability, the data gathered has a potentially low level of accuracy, and arguably low value too. It can therefore be difficult for the business to draw reliable conclusions from the information or to have confidence that any outcomes can positively influence continuous improvement.

Whilst it may not be possible to eliminate all personnel-based influences, adoption of a standardised quality tool can be useful to provide a consistent approach and reference criteria to increase data reliability. With improved data the business can have greater confidence in any recommendations that rely on the information gathered.

Bridging the gap across multiple professional practices

Corrections has a complex operational environment with multiple practice environments that run in parallel: custody, probation, health, psychology, employment, rehabilitation programmes, and education, amongst many others. However, with our “customer” hat on, we should note that the people we manage are likely to consider us a single service, albeit one drawing upon multiple practices. Accordingly, our quality assessment tools should reflect the “customer” perception of service quality by spanning our various practice disciplines. Preliminary thinking was that building a single tool to serve multiple disciplines would be complicated, but similar tools have been developed before. Specifically, De Landre (2007) noted that the Incident Cause Analysis Model (ICAM) had been used successfully in a diverse range of countries and industries, and observed that ICAM had been found to be both practical and easy to apply across the board.

The challenge is to find a way to consider quality of practice which can be used across a range of different disciplines and that also permits aggregated reporting across the whole service against generic attributes.

As a first step towards a mature quality system, this approach provides the department with an initial practice quality assessment capability and consistent baseline reporting upon which to build. In time, the capability can be expanded to include additional tools that specialise in specific practice disciplines.

The resulting quality tool provides the required structure to standardise our view of practice quality and provides a common quality-centric vocabulary across the work streams. Additionally, adoption of a single measurement system allows quality to be measured longitudinally across the whole service, helping to dismantle any silos that might hinder overall delivery.

Reverse engineering

The purpose of measuring quality is to support continuous improvement and the notion of doing things better, smarter and more efficiently in the future. However, from time to time, things do go wrong, and inevitably this leads to reviews, which in turn provide findings and recommendations to help steer further practice developments. Reviews are considered necessary as they support continuous improvement and have a natural synergy with quality.

The design of the proposed quality tool seeks to leverage the close relationship between reviews and continuous improvement. By mirroring the post-event structure of an investigation model, a tool can be developed to proactively improve practice prior to an event. The logic suggests that if our continuous improvement system considers the same aspects as an investigation review, the organisation can identify potential issues before events arise to mitigate incidents and reduce associated costs and consequences.

Many review models1 seek to apply logic to often chaotic situations by grouping contributory factors according to a conceptual structure. Reason (1990, 1997) noted that errors can be viewed in two ways: the person approach and the system approach. Reason considered that the system approach accepted the existence of latent conditions which contribute to organisational accidents. These latent conditions are the existing organisational structures, policies and systems of the company within which staff are required to operate. Reason (2000) also observed that whilst human error is inevitable, organisations can change the prevailing conditions people are required to work under, thereby reducing the frequency and consequence of error. Whilst there is some variety between the models, their high-level groupings are largely similar in how they define and label the factors that constitute an organisational structure.

The department has developed a simplified model, known internally as the Systems Approach. This approach is based upon the aforementioned industrial models, but uses language that better reflects our practice environment. The four system groups (listed below) are:

  • People
  • Tools and Resources
  • Policies and Practice Frameworks
  • Environment.

Where the department model breaks from tradition is the application of this normally reactive approach to a proactive quality function. By routinely applying the same diagnostic methods, the department is better placed to identify opportunities to lift practice quality before unfavourable events occur. By using a single systems approach, the department can align proactive preventative initiatives with reactive post-event reviews, whilst sharing a common vocabulary for all continuous improvement activities. The four groups are sufficiently broad that they can be applied to any event or industry, including our own.

From a quality perspective, all four system groups are of equal importance in that to achieve optimum levels of performance, all parts of the system must be strong. The implication is that where any one group is weak, performance can be compromised.
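To make this “weakest link” implication concrete, the short Python sketch below scores each system group on a notional 1-4 scale; the scale and the example scores are assumptions for illustration only, not part of the department’s model.

```python
# Illustrative only: achievement scores on a notional 1-4 scale,
# one per system group in the Systems Approach.
group_scores = {
    "People": 4,
    "Tools and Resources": 4,
    "Policies and Practice Frameworks": 2,  # the weak group...
    "Environment": 3,
}

# ...bounds the performance of the whole system ("weakest link").
overall = min(group_scores.values())
weakest = min(group_scores, key=group_scores.get)
print(f"Overall quality limited to {overall} by '{weakest}'")
```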

It is important to note that practice quality is a product of the wider system that extends beyond the actions of an individual practitioner. In simple terms, quality is a system issue, not just a people problem. This relationship is recognised in Māori culture, as illustrated by the following proverb:

“Mā whero mā pango ka oti ai te mahi.”
When red and black work together the work will be complete.

Ara Poutama tool

When considering the performance of an organisation, it is no longer enough to simply ask “was it done?” (compliance), or “was it done on time, on budget and within specification?” (performance). It is also necessary to ask “how well was it done?” (practice quality). The Ara Poutama tool has been developed to enable this level of organisational discovery and is included as a component of the Te Panekiretanga framework, illustrated via the jigsaw puzzle model (Figure 1).

Generic in nature, the tool has been designed with consideration given to the principles of the systems approach. The result is a tool that can accommodate the wide range of internal practice environments across the department, and which has the potential to be adapted by other service/practice-centric organisations to their own environments.

The name Ara Poutama2 translates as a pathway, or progression, towards excellence and improvement, mirroring the basic principles of both continuous improvement and quality assurance. Notably, Ara Poutama is also the Māori name for the Department of Corrections, reflecting the offender journey of self-improvement and rehabilitation.

The Ara Poutama tool is based upon the systems approach and considers quality as an outcome or function of the whole system. The intent is to appreciate the role played by practitioners as part of the system, helping the organisation to better understand improvement opportunities and which parts of the business are best placed to influence any required changes.

In this context, underlying issues should initially be considered as symptoms of problems embedded within the wider system, which may also be impacting other areas of practice, rather than treating the observed issue as the whole problem in itself. The risk of not adopting a systems approach is that root causes are not addressed and continue to impact other areas on an on-going basis.

Whilst the system approach is understood at the governance level, the language of the model (people, tools and resources, policies and practice frameworks, and environment) is not familiar to practitioners of other disciplines. A degree of translation is required to make the tool more accessible across the organisation. Therefore, practitioners are encouraged to consider quality via five thematic conversations, using language that is more familiar to their practice:

  • Engage, communicate and respond
  • Preparation and planning
  • Consistent practice
  • Informed practice
  • Working together.

The above thematic conversations are not dissimilar to the dimensions of quality identified by Parasuraman, Zeithaml and Berry (1985, 1988) as part of the SERVQUAL tool: reliability, assurance, tangibles, empathy and responsiveness. Each provides a structure to describe the attributes of a service that commonly influence the perception of quality.

Ara Poutama uses a suite of questions to support quality-centric conversations between managers and individual team members, and to guide dialogue in a manner that covers each of the system groups.

Placing the four system groups across the top of a table and the five conversation themes down the side forms a 5x4 matrix of 20 questions (refer Figure 2: Ara Poutama Question Matrix).

The questions have been written in a generic format so that they are transferable across multiple departmental practice disciplines. In principle, the same matrix should also function across other professional practice-centric industries. However, it is still possible to tailor each question further to better represent distinct practices.

Figure 2: Ara Poutama Question Matrix

Engage, Communicate, Respond
  • People: My engagement, whanaungatanga and communication are purposeful and contribute positively to strong practice.
  • Tools and Resources: Our tools for engaging and communicating with clients are fit for the purpose and tasks I am required to perform.
  • Policies and Practice Frameworks: The Department has a wide range of practice frameworks that address the needs of my clients.
  • Environment: The Department engages positively with staff, clients and the community to support our objectives.

Preparation and Planning
  • People: I actively prepare to promote effective and responsive engagement to ensure safe and purposeful activity.
  • Tools and Resources: Our tools are reliable and dependable, providing predictable practice outcomes.
  • Policies and Practice Frameworks: The practice frameworks, guidance documents and local advice are useful when preparing for a client engagement.
  • Environment: I feel supported by the organisation when making decisions based upon safety.

Consistent Practice
  • People: I consistently strive to increase the likelihood of achieving the goals and objectives of my clients.
  • Tools and Resources: The training I have received enables me to use the correct tools at the right time.
  • Policies and Practice Frameworks: Our operational processes and policies support my work, helping me to consistently deliver positive outcomes.
  • Environment: The objectives and aims of the Department are clear, consistent and do not vary over time.

Informed Practice
  • People: I make full use of the information available to me to ensure my practice is safe for both me and others.
  • Tools and Resources: The information required for my practice is readily available whenever I need it.
  • Policies and Practice Frameworks: Our policies and practice frameworks provide the guidance I need for my role.
  • Environment: Other teams that I work with readily share information to support practice and safety.

Working Together
  • People: I use a wide network of strong relationships and manaakitanga to enhance my practice.
  • Tools and Resources: A wide range of tools is available to support my practice, safety and cultural responsivity.
  • Policies and Practice Frameworks: The policies and practice frameworks in my part of the business join up and work seamlessly with those in other parts of the business to support my practice.
  • Environment: My working environment promotes respectful relationships and is conducive to positive, constructive and safe engagement with clients and colleagues.
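For illustration only, the matrix above can be thought of as a simple two-dimensional lookup from (theme, group) to question text. The following Python sketch shows one possible representation; it is an assumption for explanatory purposes, not part of the tool itself.

```python
# The five conversation themes (rows) and four system groups (columns)
# from Figure 2.
THEMES = [
    "Engage, Communicate, Respond",
    "Preparation and Planning",
    "Consistent Practice",
    "Informed Practice",
    "Working Together",
]
GROUPS = [
    "People",
    "Tools and Resources",
    "Policies and Practice Frameworks",
    "Environment",
]

# Each (theme, group) pair maps to exactly one question; one entry from
# Figure 2 is shown, the remaining 19 follow the same pattern.
questions = {
    ("Working Together", "People"):
        "I use a wide network of strong relationships and manaakitanga "
        "to enhance my practice.",
}

assert len(THEMES) * len(GROUPS) == 20  # the 5x4, 20-question matrix
```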

Using the Ara Poutama tool to support continuous improvement

Ara Poutama is more than just 20 questions. It is about promoting quality-centric conversations that support the practice of individuals and identify systemic improvement opportunities where they exist. The tool works on two levels: tactically, as a first line of assurance tool, and strategically, at the second line of assurance, providing regional and national perspectives of the resulting data.

At the first line of assurance, the tool is intended to guide structured conversations between managers and employees about practice quality and the role that each part of the system plays in delivering quality. These conversations are intended to support both development and continuous improvement, and should be centred on observed practice, other sources of evidence (e.g. documentation, feedback, results, etc.) and constructive dialogue. In this manner, Ara Poutama provides a tailored experience of continuous improvement that directly addresses the practice needs of the individual, using practice-centric language (the conversation themes) they are familiar with. Whilst the question structure guides the conversation across the different system areas, the system language is hidden from users to improve usability at the frontline.

The Ara Poutama tool captures the data from each conversation using a four-point scale (refer Figure 3 below) to record an achievement level against each of the 20 questions. Held in a database, the accumulated data can be analysed by the second line of assurance. By using various filters (role, practice discipline, region, site, etc.) high-level patterns and trends can be identified.

Figure 3: Layers of Quality, Ara Poutama (highest to lowest)

  • Te taiao: high performance layer
  • Ara namunamu: focused attention layer
  • Ara tauwhaiti: development layer
  • Ara tika: foundational layer
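To illustrate how the second line of assurance might work with this data, the Python sketch below encodes the four Layers of Quality as an ordered numeric scale and aggregates some hypothetical conversation records by region. The record layout and example data are assumptions for illustration, not the department’s actual schema.

```python
from collections import defaultdict
from statistics import mean

# The four Layers of Quality from Figure 3, ordered lowest to highest,
# mapped onto a numeric 1-4 achievement scale.
LAYERS = ["Ara tika", "Ara tauwhaiti", "Ara namunamu", "Te taiao"]
SCORE = {layer: rank + 1 for rank, layer in enumerate(LAYERS)}

# Hypothetical conversation records: (region, question_id, layer).
records = [
    ("Region A", "Q1", "Te taiao"),
    ("Region A", "Q1", "Ara namunamu"),
    ("Region B", "Q1", "Ara tika"),
]

# Aggregate mean achievement per region to surface high-level patterns;
# the same grouping could use role, practice discipline or site instead.
by_region = defaultdict(list)
for region, _question, layer in records:
    by_region[region].append(SCORE[layer])

for region, scores in sorted(by_region.items()):
    print(f"{region}: mean achievement {mean(scores):.2f}")
```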

The data is not sufficiently detailed to pinpoint the causes and consequences of a specific issue, but it will allow the business to identify potential issues or areas of concern. The focus provided by Ara Poutama allows issues and resources to be prioritised to enable strategic planning, ensuring greater value is achieved.

Strategically, Ara Poutama offers the ability to identify, and address, potential issues before they become problems. The availability of this information will empower the department to foster a proactive operational environment, and reduce any existing reliance upon reactive, post-event operations. To do so is both more cost effective and serves to increase the overall quality of the service.

Links across Te Panekiretanga and the department

As part of an integrated quality framework, Ara Poutama is aligned with other parts of both Te Panekiretanga and the department. The systems approach (people, tools and resources, policies and practice frameworks, environment) has been adopted as the recognised methodology for undertaking event reviews, and is also to be used in a forthcoming revision of the Well Functioning Service (WFS) toolset. In both instances the systems approach will be used to align all of our proactive and reactive review tools to provide a consistent internal suite of tools.

Similarly, the Layers of Quality, shown in Figure 3, will replace the previous internal assessment system of “needs development” through to “exceptional”. Layers of Quality will be applied across all quality tools, including WFS, to describe the observed levels of quality performance. The effect is to create a single assessment mechanism, forming a core vocabulary and a commonly understood set of descriptors.

Next steps

Corrections began piloting the Ara Poutama tool in September 2017. The pilot involves over 150 frontline personnel from a number of sites, covering a broad range of professional practice disciplines, including custodial, community, programmes, health, psychology, and case management.

Following a training roadshow, the pilot will run for two months, with each participant using the tool on at least two occasions. The pilot will seek feedback on the usability of the tool, the suitability of its format, questions and areas of practice, the time required to use the tool, and the value added at the first and second lines of assurance.

The pilot is also trialling an electronic data capture system. It is expected that this system will reduce the time taken to record information, improve the accuracy of data entry and automatically prepare the data for analysis without significant additional effort.
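As a sketch of what such capture might look like, each conversation could be stored as a structured record that is immediately analysable. The field names below are illustrative assumptions, not the pilot system’s actual data model.

```python
import json
from datetime import date

# Hypothetical record for a single Ara Poutama conversation; the field
# names are illustrative assumptions, not the pilot system's schema.
record = {
    "date": date.today().isoformat(),
    "site": "Example Site",
    "discipline": "case management",
    "responses": {"Q1": "Ara namunamu", "Q2": "Te taiao"},  # question: layer
}

# A structured record like this can be analysed directly, with no
# manual transcription or re-keying step.
print(json.dumps(record, indent=2))
```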


References

De Landre, J. (2007). Learning from Accidents and Incidents. In J. M. Anca (Ed.), Multi-modal Safety Management and Human Factors: Crossing the Borders of Medical, Aviation, Road and Rail Industries. Swinburne University, Australia, pp. 131-142.

Olshavsky, R. W. (1985). Perceived Quality in Consumer Decision Making: An Integrated Theoretical Perspective. In J. Jacoby & J. Olson (Eds.), Perceived Quality. Lexington Books, Massachusetts.

Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1985). A Conceptual Model of Service Quality and its Implications for Future Research. Journal of Marketing, 49(4), 41-50.

Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1988). SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality. Journal of Retailing, 64(1), 12-40.

Reason, J. (1990). Human Error. New York: Cambridge University Press.

Reason, J. (1997). Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate.

Reason, J. (2000). Human error: models and management. British Medical Journal, 320(7237), 768-770.

Sondhi, A., & Day, E. (2012). An assessment of SERVQUAL as a measure of service quality in English male prisons: Perceptions and expectations of the Integrated Drug Treatment System (IDTS). Drugs: Education, Prevention and Policy, 19(2), 171-180.


1 There are many systems and models that are used to undertake reviews and investigations following incident/accident events. Frequently used in both industrial and transport environments, the models commonly offer a structure, or anatomy, of an event through a methodology. Whilst the vocabulary used in each model may vary, a common structure is to differentiate between the actions of the individual and the prevailing environmental factors. Examples include: MTO-analysis (Man, Technology, Organisation); STAMP (Systems-Theoretic Accident Model and Processes); Incident Cause Analysis Model (ICAM: People, Environment, Equipment, Process, Organisation); HPES (Human Performance Evaluation System); AcciMap (Work, Staff, Management, Company, Regulators, Government); FRAM (Functional Resonance Accident Model: input; output; preconditions; resources; time; control).

2 Ara Poutama and the Māori legend of Tāne and the Baskets of Knowledge: In Māori legend, Ara Poutama describes the journey of Tāne from earth to the twelve heavens as he searched for the baskets of knowledge. When Tāne decided to climb up to the heavens to seek the baskets of knowledge for mankind, his brother Whiro was angry. Whiro thought he had more right to the baskets than Tāne, because he was the elder brother. The two brothers struggled for power, but it was Tāne who was favoured by Io, the supreme power, so Tāne was allowed to ascend the twelve heavens. His task was made more difficult by Whiro who sent plagues of insects, reptiles and carrion-eating birds to attack Tāne. But Tāne, with the aid of the winds, was able to proceed until he reached the summit of all the heavens. Here, at Toi-ō-ngā-rangi, he was welcomed by Io and received the three baskets of knowledge and the two sacred stones.
Source: http://www.knowledge-basket.co.nz/about/knowledge-basket-legend/ (Abridged)