Safety Performance Measures (continued from page 51)

variance is normal—and use that as a baseline. Industry statistics are also valuable information for establishing targets, though it is important to approach aggregated data with the same caution you would another operator's information: it may be that neither is an accurate representation of your operation. Regardless of how you set your initial target, be prepared to revise it once you have experience. That is not only acceptable; it is part of the design of the process and a great indication of a maturing system.

Defining SPIs: A Balancing Act

You've probably already heard the terms "leading" and "lagging" used to describe performance indicators—a reference to whether those indications are early signs or after-the-fact data. As we design our own organizational SPIs and SPTs, balance should be a key focus. Much has been said about the shortcomings of lagging indicators, but they aren't all bad. The problem arises when we focus on backward-looking data to the exclusion of other sources. Accident data is a traditional—and pervasive—example of a lagging indicator, and it is easy to see how chasing accident information leaves few real options to positively change outcomes that have already happened. Investigating metrics that tie into lower-consequence events—but are predictive of incidents or accidents—means we have time to react.

So, until we have access to time travel, balancing lagging data with early warning systems—leading indicators—is critical to developing SPIs that are useful in the wild. Likewise, selecting both qualitative and quantitative measures can help ensure balance. Quantitative indicators are the ones we often gravitate to, because they are seemingly concrete: things like hard landing occurrence rate, overspeed values, or findings per audit section.
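The quantitative indicators above are all rate-style measures, so a brief sketch may help show how one could be computed and tracked against an SPT. This is a minimal Python illustration only; the hard landing counts, the per-1,000-operations basis, and the target value are assumptions chosen for the example, not figures from the article or from ICAO Doc 9859.

```python
# A minimal, illustrative sketch (not from the article): computing a
# rate-based quantitative SPI and checking it against an SPT.

from dataclasses import dataclass


@dataclass
class MonthlyCounts:
    month: str
    hard_landings: int  # lagging data: events that have already occurred
    operations: int     # exposure used to normalize the rate


def spi_rate(events: int, operations: int, basis: int = 1000) -> float:
    """Occurrences per `basis` operations; 0.0 when there was no activity."""
    return (events / operations) * basis if operations else 0.0


def meets_target(rate: float, target: float) -> bool:
    """True when the observed rate is at or below the safety performance target."""
    return rate <= target


history = [
    MonthlyCounts("2020-04", hard_landings=2, operations=1800),
    MonthlyCounts("2020-05", hard_landings=1, operations=2100),
    MonthlyCounts("2020-06", hard_landings=3, operations=2300),
]
TARGET = 1.0  # assumed SPT: at most 1 hard landing per 1,000 operations

for m in history:
    rate = spi_rate(m.hard_landings, m.operations)
    status = "meets SPT" if meets_target(rate, TARGET) else "exceeds SPT"
    print(f"{m.month}: {rate:.2f} hard landings per 1,000 operations ({status})")
```

The same shape works for any count-over-exposure SPI; only the event definition and the normalizing denominator change.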
Qualitative data can be a bit more difficult to manage—things like raw safety report narratives, organizational culture assessment, or some types of observational reports—but because stories are embedded in the information, we gain an understanding of context and intent, something quantitative data struggle to provide as efficiently.

Often overlooked is the need to ensure SPIs reflect some diversity in perspective. Depending on who develops safety performance measures, you may find that metrics are skewed toward only one or two operational areas (often the largest, or those with more data). Adopting the viewpoint of a customer, a tenant, a contractor, or even a regulator can help generate some terrific insight into safety performance. Bob Schick, Director of Safety and Risk Management for TAC Air, suggests that utilizing your existing safety committee or reaching out to customer safety leaders is a great way to ensure several points of view on how risks are considered. That process can even improve safety buy-in, because it creates clear connections to other areas like customer service and financial performance.

To see how one FBO chain approaches balancing SPIs and SPTs, turn to the last page of this article for a look at one way safety performance metrics are used at Sheltair, courtesy of their Director of Safety & Training, Stuart Ochs. Of course, there is more to the story than just a list of measures and targets, but seeing how one organization approaches its safety performance monitoring is a valuable tool. Ochs would tell you that, like any SPIs, these are subject to frequent review and update—and they are an imperfect product of their team's learning over time, just as yours will be.

(Continued on page 54)

Sidebar: Safety Management Manual (SMM), from ICAO Document 9859

[Figure labels: accident/incident; lagging indicators; deviation/degraded condition; normal condition]

4.3.2.5 It is important to select SPIs that relate to the organization's safety objectives. Having SPIs that are well defined and aligned will make it easier to identify SPTs, which will show the progress being made towards the attainment of safety objectives. This allows the organization to assign resources for greatest safety effect by knowing precisely what is required, and when and how to act to achieve the planned safety performance.

4.3.2.6 For example, a State has a safety objective of "reduce the number of runway excursions by 50 per cent in three years" and an associated, well-aligned SPI of "number of runway excursions per million departures across all aerodromes". If the number of excursions drops initially when monitoring commences, but starts to climb again after twelve months, the State could choose to reallocate resources away from an area where, according to the SPIs, the safety objective is being easily achieved and towards the reduction of runway excursions to alleviate the undesirable trend.

The contents of each SPI should include:
a) a description of what the SPI measures;
b) the purpose of the SPI (what it is intended to manage and who it is intended to inform);
c) the units of measurement and any requirements for its calculation;
d) who is responsible for collecting, validating, monitoring, reporting and acting on the SPI (these may be staff from different parts of the organization);

Figure 4-3. Examples of links between lagging and leading indicators:
  Lagging indicator: number of runway excursions/1000 landings
  Precursor event: number of unstabilized (or non-compliance) approaches/1000 landings
  Leading indicator: percentage of pilots who have received training in stabilized approach procedures
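The SMM excerpt lists what a documented SPI should include. As a rough sketch of how those four attributes might be captured in practice, here is one possible record structure, filled in with the runway excursion example from paragraph 4.3.2.6; the field names, the named roles, and the reporting details are illustrative assumptions, not an ICAO-prescribed format.

```python
# An illustrative sketch only: one way to record the SPI attributes listed in
# the SMM excerpt above (description, purpose, units/calculation, responsible
# parties). Field names, roles, and reporting details are assumptions.

from dataclasses import dataclass, field
from typing import List


@dataclass
class SafetyPerformanceIndicator:
    description: str   # a) what the SPI measures
    purpose: str       # b) what it is intended to manage and who it informs
    units: str         # c) units of measurement and calculation requirements
    responsible: List[str] = field(default_factory=list)
    # d) who collects, validates, monitors, reports and acts on the SPI


# Filled in with the runway excursion example from paragraph 4.3.2.6; the
# roles and reporting cadence below are invented for illustration.
runway_excursion_spi = SafetyPerformanceIndicator(
    description="Number of runway excursions per million departures across all aerodromes",
    purpose=("Track progress toward the objective of reducing runway excursions "
             "by 50 per cent in three years; informs safety management and leadership"),
    units="Events per 1,000,000 departures, recalculated monthly",
    responsible=[
        "Data analyst: collection and validation",
        "Safety manager: monitoring and reporting",
        "Accountable executive: acting on the SPI",
    ],
)

print(runway_excursion_spi.description)
```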