This post is the third in a series on the application of AI and ChatGPT to the process safety profession. The previous posts are ChatGPT: A Process Safety Expert and Further Thoughts Regarding ChatGPT and Process Safety.
The Retired Superintendent
An electrical power plant used coal-fired boilers to generate high-pressure steam. The boiler house had been managed by a superintendent with many years of experience. But time, in its usual manner, caught up with him, so he decided to retire.
A few weeks after his departure, the piping to one of the boilers started vibrating uncontrollably. No one could figure out what was going on. Management called in boiler consultants, the process safety coordinator organized a hazards analysis, and the inspection group conducted extensive surveys — all to no avail. The vibration continued.
So management called the retired superintendent and asked him to return to his old haunts to see if he could solve the problem. He agreed and showed up the next day. After catching up with some of his friends, he walked over to an obscure section of the boiler piping and hit that piping hard a couple of times with a large hammer.
The vibration stopped.
Management was, of course, delighted and asked the retired superintendent to send them his invoice. He did so — the invoice was for $10,000.
This led to outrage. How, management asked, could he justify such a large amount of money for just a few minutes of work? So he sent them a revised invoice; it read:
» For hitting the pipe: $100
» For knowing where to hit the pipe: $9,900
This story is, of course, an old chestnut. However, it prompts some thoughts with regard to Artificial Intelligence and Expert Systems.
Capturing Human Expertise
First, the superintendent was not exhibiting ‘intelligence’ as such; he was behaving as an expert. He had years of accumulated experience (much of which was probably subconscious), and he was able to apply that experience to the situation at hand. (This is not to say that the superintendent was unintelligent, just that in this situation he was using expert knowledge, not creative intelligence.)
Second, ChatGPT has shown itself to be adept at capturing information from the internet and applying that information to new situations. A much more difficult challenge is capturing human knowledge. How can ChatGPT interview our superintendent?
Third, my guess is that, in the world of process safety, an important role for expert systems could be to help managers and incident commanders make real-time decisions based on the information that is available to them right now.
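To make the idea a little more concrete, here is a minimal sketch of what such an expert system might look like: the expert’s knowledge is written down as explicit condition-and-recommendation rules, which are then checked against whatever plant readings are available at the moment. Everything in it (the tag names, the thresholds, the recommendations) is invented for this post; it is not a description of any real boiler or any real decision tool.

```python
# A minimal, hypothetical sketch of a rule-based expert system for
# real-time decision support. The tag names, thresholds, and
# recommendations are invented for illustration only; a real system
# would encode the judgment of people such as the retired superintendent.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Rule:
    description: str                               # what the expert knows
    condition: Callable[[Dict[str, float]], bool]  # when the rule applies
    recommendation: str                            # what the expert would do


RULES = [
    Rule(
        description="High vibration at reduced steam flow suggests a support problem",
        condition=lambda r: r["vibration_mm_s"] > 10 and r["steam_flow_pct"] < 90,
        recommendation="Inspect the pipe supports and hangers on the affected run",
    ),
    Rule(
        description="High vibration at full load suggests flow-induced resonance",
        condition=lambda r: r["vibration_mm_s"] > 10 and r["steam_flow_pct"] >= 90,
        recommendation="Reduce load and check the affected run for resonance",
    ),
]


def advise(readings: Dict[str, float]) -> List[str]:
    """Return the recommendation of every rule whose condition matches."""
    return [rule.recommendation for rule in RULES if rule.condition(readings)]


if __name__ == "__main__":
    # The information that is available to the incident commander right now.
    current = {"vibration_mm_s": 14.0, "steam_flow_pct": 95.0}
    for advice in advise(current):
        print(advice)
```

The point, of course, is the second line of the superintendent’s invoice: the value lies not in the few lines of code that check the rules, but in the rules themselves (knowing where to hit the pipe), and eliciting those rules from experienced people before they retire is the hard part.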
Irony and Misdirection
The story of the superintendent — feeble though it may be — immediately raises a challenge for artificial intelligence and expert systems. How do they differentiate between plain factual statements and irony or hyperbole? For example, let us say that our superintendent had said,
I don’t know why you bothered calling me; everyone knows what to do when this pipe vibrates (wink, wink).
Any human being listening to this comment knows that the superintendent was joking and making fun of the so-called experts. But an AI would presumably take the comment seriously (unless it is trained to understand body language).