I gave up the world of ‘publish or perish’, peer-reviewed journals and university club validation when I witnessed a dozen academics create their own journal, club and association and publish hundreds of papers together. They then all used the volume of publications for advancement and as a pathway to power.
The best way to win a game is to ‘gamify’ the rules and use them for a desired outcome.
The idea that constructed measures are neutral, objective and valid hinges on the acceptance of a group that validates its own bias. The project draws its validation from the assumptions of its own measures.
I’m not going to get into the sordid tales of the way academia works but simply say it has its own validation process (game) to guarantee its own assumptions. Publishing the same information across 10 journals under different titles is not a demonstration of intelligence, creativity, innovation or expertise. And that’s how the system works. If that’s the sector you wish to play in, good luck to you, but don’t tell me my research is invalid because it doesn’t match your mythology of peer-reviewed journals.
Similarly, in the schooling system we know that standardised testing doesn’t work and is, indeed, anti-learning, anti-creative and anti-education,
and there are hundreds of voices in Education that confirm this (https://www.humandymensions.com/product/tackling-risk/). BTW, no one in the profession of Education or Teaching calls this ‘schools bashing’.
The truth is, political initiatives like NAPLAN (https://www.nap.edu.au/) develop their own circular proofs of testing methods that have very little to do with learning, education or intelligence (https://theconversation.com/lets-abandon-naplan-we-can-do-better-95363).
Whenever a regime of measures is thrown about, the first thing that should be done is to question its assumptions and ask what ideology is advantaged by it.
All measures carry hidden assumptions, a political agenda and an undeclared ideology. The same is true for safety.
Nothing is clearer than the fact that injury rates are NOT a measure of safety.
Yet the belief that they are swamps the safety industry like a religious belief, a mythology concocted as if set in concrete.
Any measure that is thrown at us requires questioning and critical thinking. Just ask some critical questions of the measure and see where (telos) it takes you. So:
· If injury rates go up and down in an organisation does that mean safety improves and declines at the same rate?
· What is selected as a measure and what is excluded?
· What is defined as an injury? E.g. psychological or social harm? Suicide? Trauma? PTSI?
· What group validates the measure? What is their political and ethical interest?
· Does the measure deliver a moral outcome?
· What does the measure do to persons?
· Can the measure be ‘doctored’?
· What does it NOT measure?
In NAPLAN, every critical skill and attribute for becoming a responsible, mature and developing person is NOT measured!
Having a PhD in teaching doesn’t mean you are a good educator!
Keeping injury rates to zero doesn’t mean you are on a safe worksite!
All a measure does is affirm the assumptions of the measure. Most often it doesn’t affirm what is attributed to it. Such is the safety code (https://safetyrisk.net/deciphering-safety-code/).
Just because someone talks and writes about ‘safety culture’ or has a PhD, doesn’t mean they have expertise in culture (https://www.youtube.com/watch?v=xMvYgJMH7tA). Most of what gets paraded about as expertise in safety culture is just more Scientism, Engineering and Behaviourism.
The key is to not listen for the noise but for the silences (https://safetyrisk.net/if-you-want-to-know-about-culture-dont-ask-safety/).
If you know how to think critically and how to listen to the silences, you will soon realise very little of what is projected is about culture (https://safetyrisk.net/category/safety-culture-silences/ ).
Just because the global world of safety sets its mantra as zero, affirms zero as a religious belief and worships the gods of metrics/numerics, doesn’t mean any of it is true or helpful. Even when Safety condemns zero it still seeks out other measures for validation, which is still an addiction to measurement (https://safetyrisk.net/the-measurement-mindset-in-safety/).
All this silly talk of lead and lag indicators is still anchored to the addiction of attributed measures.
A wonderful method for avoiding the deep need for critical thinking, and for interrogating the attributions embedded in such thinking, is to hide the ideology of metrics. Safety remains an industry of quanta (https://safetydifferently.com/todd-conklin-quanta-risk-and-safety-conference-2019/) with precious little skill or expertise in qualia. If your love is quanta, then there’s nothing ‘different’.
I don’t care what your measure is for safety: if the culture lacks trust (https://safetyrisk.net/speak-up-reporting-and-trust-in-safety/) you have an unsafe workplace. And trust can’t be measured. The same goes for hope, respect, helping, listening, happiness, contentment, wisdom, care etc.
All of this focus on ‘quanta’ has been created and endorsed by an industry addicted to Engineering. This is the foundation of the safety curriculum and body of knowledge (https://safetyrisk.net/why-is-safety-an-easy-target/).
The assumption of the safety curriculum is that safety is about the management of objects and the reporting of metrics. All the measures in the world that follow simply endorse the engineering assumptions of the curriculum, yet have precious little to do with safety. Then, with these measures in mind, out comes all the commentary on culture from a mindset that declares culture confusing and all too hard. Most of the commentators on culture come from this safety mindset and don’t even realise they continue to carry all the safety baggage when talking about culture.
In SPoR, we work with all of this and tackle the challenges of measures in risk here:
Once again, SPoR provides practical tools, methods and strategies for tackling risk and all these resources are free.