Last week was devoted to reliability in two respects. Niklas Vareman successfully defended his thesis on Epistemic Risk, which, in one of his formulations, is the risk of “missing out on knowledge”. Another version of epistemic risk is the risk of being wrong. A statistician would call it the risk of getting erroneous results by not doing your stats properly.
But it is more than the probability of committing errors of type I and II, or the inefficient decision making that follows from making too many simplifying assumptions in our analyses. How you do your stats and build your models is a moral problem. We as scientists must ask ourselves whether we are producing knowledge in a way that is reliable. A problem is that we don’t always know what a reliable production process is. Appropriate guidance is needed. But such guidance must be constructed in a fruitful way – otherwise it will be a nightmare where everyone has their own opinion of what reliable knowledge production means.
It is not only scientists or philosophers who are concerned about moral issues in knowledge production. This week the European Safety and Reliability Association held its annual conference in Wrocław, Poland. Here reliability refers to the safe and well-performing maintenance of technical systems. These systems can be critical infrastructures providing us with electricity or gas. The focus is on the performance of a system or the risks associated with a system. However, the discussion revolves not only around how to maintain these systems, but also around how to treat our knowledge (or lack thereof, i.e. uncertainty) about these systems and the processes influencing them. Again, the issue of how to treat uncertainty in a good way becomes a moral problem. What ways are there to quantify uncertainty in assessments? Which principles for considering uncertainty when making decisions are good enough?
Thus reliability enters our discussions both in terms of system performance and in terms of our knowledge of these systems. Nice.