Wearable Technology and Safety

It is interesting to watch Safety grapple for new saviours, Techniques (Ellul), to get to zero. The latest in the ‘faith in technology’ discourse is wearable devices that give massive data and, as we know, big data is also a new saviour. Didn’t you know, big data can now make the actions of fallible human persons predictable (https://safetymanagement.eku.edu; https://www.thedigitaltransformationpeople.com).

I call this a ‘faith’ because most promotion of wearable technologies for safety rarely discusses the by-products, trade-offs or dangers (harms) of the technology. When Safety goes on a ‘wishing crusade’ you can be sure of one thing: it does so without any ethical compass. When Safety shows up to a project with its AIHS BoK Chapter on Ethics invoking the assumptions of deontological duty, you can be sure of one thing – human persons will come off as dehumanized objects.

Nothing is more dangerous to persons than Safety on a wishing crusade. When you deify data, out goes any moral compass.

When an industry cannot define its ethic, and is seduced by the ideology of zero, you can be sure that when it uses the language of ‘controls’ it means manipulation, surveillance and policing. As long as Safety gets to zero, the method doesn’t matter.

Let’s have a look at a recent example: ‘iCare finds wearable technologies can help safeguard workers’.

The alarm bells should ring early on this one. This from an organization infused with corruption (icare-workers-compensation-insider-speaks-out; icare-workers-reported-to-nsw-police-icac-over-recruitment-scam), toxicity (scathing-icare-review-finds-a-need-for-cultural-change) and bullying (gutted-destroyed-betrayed-icare-whistleblower-victimised-after-speaking-up).

It seems that whenever Safety speaks, the meaning is always the opposite (https://safetyrisk.net/deciphering-safety-code/). When we see iCare it really means ‘I don’t care’. When I see Safety use the word ‘learning’ it doesn’t mean learning. When it talks about ‘resilience’ it’s not about resilience and, when it promotes solutions, it never considers by-products or trade-offs (e.g. the creation of greater fragility – Taleb).

Foundations, ethics and worldview are critical big-picture items about which Safety is most silent, and this article is typical. Typically the focus is on measurable injury rates, not the culture or climate that debilitates persons and ruins lives. iCare is typically interested in a narrow view of ergonomics: measurables like muscle strains and unsafe working postures. Whilst such things are important, they are only a small aspect of ergonomics. Without a holistic approach to ergonomics (https://cllr.com.au/product/holistic-ergonomics-unit-6/) Safety tinkers around the edges of the problem and nothing changes.

Technology is such a wonderful distraction from the essential need to humanize risk. When you don’t know how to communicate effectively with people, don’t know how to listen, observe and converse, the natural trajectory is wearable technology. This way no conversation or people skills are needed; data rules! This seems quite typical of the discourse in safety about technology, yearning for the machine that goes ‘bing’ (https://www.youtube.com/watch?v=NcHdF1eHhgc). So here are a few challenges:

  • Just because a form is transferred to an app, it doesn’t cease to be ‘paperwork’.
  • What does the desire for wearable technology do to the way persons are conceptualized (defined)? Most often technology-centric approaches to knowing devalue persons and transform them into objects and ‘data’.
  • Wearable technologies change the way we think about ourselves; they make us think in ‘instrumental’ terms.
  • Most often the admiration of technologies as ‘the new saviour’ redefines how we think about health and our bodies. E.g. is technology that prolongs life an ethical good? Length of life and quality of life are confused. How on earth can technologies measure the quality of life? So, when it comes to ending life, faith in technology is useless.
  • In order to understand the ethics of technology and the insidious nature of Technique (Ellul) one needs a good understanding of ethics, not something Safety is much interested in.
  • Most often the discourse about safety by technology defines the human as having a disability; fallibility is deemed the enemy.
  • The biggest ethical issues with wearable technologies are consent, transparency and personhood. How often does safety-as-zero justify the ‘means’ by the ‘ends’? Because injury rates are deified as the goal and a greater good, most often persons come off second best in the ‘faith in data wishing crusade’.
  • The costs associated with using a range of technologies discriminate most often against the poor and vulnerable.
  • Similarly, having access to masses of data about persons, particularly health data, makes users obsessive and defines wellness by loaded safety terminology that is never defined.
  • We already know of surveillance technology used on construction sites justified by safety (https://lp.siteguard.net.au/construction-nsw/B.HTML; https://evercam.com.au/), interpreted by engineers who somehow know how to decipher the meaning of behaviours. Without an ethic in the desire for low injury rates, secrecy seems to be a dominant safety value.
  • The problem of what to do with data and how it is interpreted is a massive one with wearable technologies. All data is interpreted; there is no such thing as neutral data. How data is used is a massive ethical question, again something Safety doesn’t seem to care about. When Safety uses the language of ‘ethical responsibility’ it seems critical that such language never be defined.
  • Of course, once personal data becomes the property of an organization it also requires security systems to protect it, but no one can foresee changes in the organization or what future management might do with such data. The last thing I would want to do is give an organization like iCare my information under the excuse of safety.

At a deep level Ellul (1964) shows us the archetypal nature of Technique (https://monoskop.org/images/5/55/Ellul_Jacques_The_Technological_Society.pdf). The desire for technology is not neutral; technology and its use are not neutral and demand ethical consideration. Not much help in an industry that defines ethics as ‘do the right thing and check your gut’! (https://safetyrisk.net/the-aihs-bok-and-ethics-check-your-gut/).


