Variability: Threat or Curse?
Abstract
In the philosophy of Safety-I, variability is seen as a threat, because it brings with it the possibility of an unwanted outcome. Variability of hardware is curtailed by, among other things, precise specifications; variability of human behavior is curtailed by, inter alia, regulations and protocols. In the philosophy of Safety-II, variability is seen as an asset: humans are regarded as able to cope with the variability of technology and circumstances to keep systems working. This capacity for coping has been designated resilience. Recently the meaning of resilience has been stretched to include the ability to restore the operational state after an excursion into the realm of inoperability. The belief that humans will cope should an unexpected situation arise reduces the emphasis on preventive measures that limit the probability of the system behaving in an unsafe manner. The stretched meaning of resilience exacerbates this problem, because there is no real limit to what systems, or the society using them, may be expected to bounce back from. The philosophies behind resilience engineering promote safety by exploiting the ingenuity of humans to keep systems within the desired operating envelope. Unfortunately, the errors introduced by over-relying on humans to assess situations correctly may also be fatal or catastrophic: perhaps not for society as a whole, but surely for an individual, a group of individuals, or a company.