Preventing Challenging Behaviour: Blackboxing the Data

No-one much cares to admit they work in an industry where ‘adverse’ events occur. Violent incidents and episodes of challenging or disruptive behaviour are just one such manifestation. When it happens within their own organisation, there is a tendency for people to deny its existence, or to brush the fallout under the corporate carpet. It’s understandable; people don’t like to co-exist with ‘bad’ things that may sully the brand. The irony, however, is that unless you expose the reality of such incidents to critical sunlight and careful examination, no lessons will be learnt and the costly cycle of preventable violence will continue. Airlines understand how strong leadership in the area of risk management pays dividends.

Airline safety through managing training

OK. So send 300 tons of metal aircraft into the sky at 500 mph, after first filling it with 100 tons of highly combustible fuel and packing it full of holidaymakers and business travellers, and then ask yourself a question: what could possibly go wrong? Apart from engine failure, structural fatigue, lightning strike, ice build-up, pilot error, electrical malfunction, bird strike, terrorist attack and air rage, probably very little. All that having been said, 21st-century flight is an incredibly safe proposition.

According to the US National Transportation Safety Board, around 46,000 people a year die in road traffic accidents, while fewer than 100 die on aircraft. Forbes magazine published an article asserting that to reach a 99% near-certainty of boarding an aircraft that would go down, you would need to take a flight every day for 67,833 years. This fantastic safety record is down in part to the famous ‘blackbox’, which has become the symbol most closely associated with airline safety and sits at the heart of a safety culture driven by evangelical safety managers.

The net safety margin that the black box bestows upon the airline industry comes down to its collection of data, or more specifically to careful data analysis. Other industries are getting in on the act. The Health and Safety Executive publishes annual figures on work-related violence, which largely describe its incidence. The NHS’s security arm also publishes the numbers relating to assaults on a trust-by-trust basis. Where you have cultures that support recording and reporting, some useful detail on the incident narrative is collected along the way. The social care sector has now upped the ante and introduced an active obligation to reduce physical restraints on service users. One part of the Department of Health’s ‘Positive and Proactive Care’ initiative is a duty to report annual figures to the board; another is to learn from behavioural events.

The idea of learning from the data, of actively extracting meaningful information from the jumble of numbers and observations and applying it to problem solving, is where the airline industry has forged ahead. From the perspective of those charged with designing, developing and implementing highly technological safety systems, such evidence is invaluable. The leaders in this field are largely engineers who champion the scientific method, individuals who by nature tend to trust measurements more than mere speculation, supposition or informed guesswork. After all, real safety is no accident.
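To make that idea a little more concrete, here is a minimal, purely illustrative sketch in Python. The records and field names are hypothetical, not drawn from any of the datasets mentioned above; the point is simply that even a basic tally of incident reports can turn a jumble of observations into a starting point for questions: which triggers recur, and where restraint is used most often.

```python
# Illustrative sketch only: hypothetical incident records are tallied
# to surface recurring patterns worth investigating further.
from collections import Counter

# Hypothetical incident log entries; field names are illustrative, not a real schema.
incidents = [
    {"location": "Ward B", "trigger": "refused request", "restraint_used": True},
    {"location": "Ward B", "trigger": "noise and overcrowding", "restraint_used": False},
    {"location": "Reception", "trigger": "long wait", "restraint_used": False},
    {"location": "Ward B", "trigger": "refused request", "restraint_used": True},
]

# Count how often each trigger appears, and where physical restraint was used.
trigger_counts = Counter(record["trigger"] for record in incidents)
restraints_by_location = Counter(
    record["location"] for record in incidents if record["restraint_used"]
)

print("Most common triggers:", trigger_counts.most_common(3))
print("Restraints by location:", restraints_by_location)
```

A real organisation would of course want far richer narrative detail and proper governance around such data, but the principle is the same: record consistently, then count and compare before you theorise.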

Elsewhere, in what might loosely be called ‘human’ services, there is not always the same scientific mindset. In the past there has been a tendency for sectors such as social care, retail and leisure to be soft on data gathering. The reflex has been to make excuses for the behaviour (“it’s just one of those things”), to apportion blame externally (“It was his fault! We did nothing wrong!”), or ultimately to be reluctant to put the reality of a complex and problematic issue down in black and white for fear of it being tantamount to admitting failure, the prevailing perception being that such ‘admissions’ can be costly. But, as someone eminently sensible once said, ‘if you always do what you’ve always done, you’ll always get what you’ve always got’.

The ‘blackbox’ (or ‘Flight Data Recorder’) was invented by Dr David Warren, whose own father was killed in a plane crash when Warren was only nine. Dr Warren conceived of a device that could record flight data and flight-deck conversations in order to help analysts piece together the events that led to an accident. In 1956 he produced a prototype called the ‘ARL Flight Memory Unit’. His invention did not gain immediate traction; it took around five years before units were eventually manufactured in the UK and US. Australia, however, was the first country to make the technology compulsory. It has been saving lives ever since.

The ‘blackbox’ was designed to gather information that would tell investigators exactly ‘what’ had happened, so that they might then investigate further and analyse the information to establish the ‘why’. Once you discover this, you can set about identifying the appropriate range of preventative measures. It was Sherlock Holmes who once said, “It is a capital mistake to theorise before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.” The ability of an organisation to take ownership of this risk management process, to remain objective and to apply scientific rigour, is the first step towards addressing the issue. An organisation that addresses challenging, disruptive and/or violent behaviour is one to be impressed by. One to trust. And ultimately one to do business with.

Click for the HSE Work Related Violence Statistics
