By Gill Grady
This blog is the second in our series on improving situation awareness. The first, "Training for Situation Awareness: 5 Ways to Use the EnVision Platform," defined situational awareness (SA) and showed how the features of EnVision simulation can help train operators to overcome some common types of SA errors. This time we delve deeper into the way we think and how the thinking process affects how we react to situations at work. By understanding how we jump to conclusions, we can challenge ourselves to evaluate situations more critically.
While automation has had a significant impact on the industrial workforce, it has not necessarily reduced the number of operators. It HAS, however, changed their role – from performing many routine tasks to troubleshooting the process. Plant Services has published the Operator's Guide to Successful Troubleshooting, which defines troubleshooting in three steps:
1. Define the problem (e.g., use alarm information to localize and identify possible causes).
2. Collect data and turn it into useful information.
3. Analyze the information to find the root cause.
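The three steps above can be pictured as a simple pipeline. The sketch below is illustrative only: the tags, limits, and cause lists are invented for the example and are not taken from EnVision or any real DCS.

```python
# Hypothetical sketch of the three troubleshooting steps.
# All tag names, limits, and causes are illustrative inventions.

def define_problem(alarms):
    """Step 1: use alarm information to localize possible problem areas."""
    return [tag for tag, active in alarms.items() if active]

def collect_data(problem_tags, readings):
    """Step 2: turn raw readings into information scoped to the problem."""
    return {tag: readings[tag] for tag in problem_tags if tag in readings}

def analyze(data, cause_map):
    """Step 3: list candidate root causes for every out-of-range reading."""
    causes = []
    for tag, (value, low, high) in data.items():
        if not (low <= value <= high):
            causes.extend(cause_map.get(tag, []))
    return causes

alarms = {"FI-101": True, "TI-202": False}       # low-flow alarm active
readings = {"FI-101": (40.0, 50.0, 120.0)}       # (value, low limit, high limit)
cause_map = {"FI-101": ["pump degradation", "fouling", "leak", "transmitter error"]}

data = collect_data(define_problem(alarms), readings)
print(analyze(data, cause_map))
```

Note that step 3 deliberately returns every candidate cause rather than a single answer – the rest of this post is about why stopping at the first plausible answer is risky.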
A few concepts are important for this journey:
- Priming Thoughts and Their Effect on Decision Making
- Fast vs Slow Thinking
- Using your simulator to overcome Priming Thoughts and Fast Thinking
What are Priming Thoughts?
People evaluate situations by recognizing similarities to previous situations, and they evaluate ideas serially, settling on the first idea that suffices.1 Priming occurs when exposure to one stimulus influences our response to a later stimulus, without any conscious guidance or intention. Layer that with neuroscience research indicating that we do not see with our eyes, but with our brain: our eyes and brain are more likely to see what we expect to see than what is really there.2 To overcome Priming Thoughts, we need to expand the brain's experience model and expose it to possible conflicts that force us into what Nobel laureate Daniel Kahneman calls Slow Thinking.
What is Slow Thinking, or System 2?
In his book Thinking, Fast and Slow, Kahneman described how the brain processes information in two vastly different ways, and the challenges those processes create. Kahneman calls them System 1 (fast) thinking and System 2 (slow) thinking. System 1 thinking is linked to our survival instincts, as certain situations require an automatic response; it happens with cognitive ease and without conscious effort. But it is important to note that not all first impressions are correct. Without explicit context, the brain creates its own context out of previous experiences and associations.
Here is an example Kahneman uses to illustrate fast thinking and context:
Fill in the missing letter in this word: SO_P
The answer will depend upon whether the context is about food or washing.
One of the big problems with fast thinking is it seeks to quickly create a coherent, plausible story or explanation for what is happening, relying on associations and memories, pattern matching, and assumptions. System 1 will default to that plausible, convenient story, even if that story is based upon incorrect information.
For a plant operator, incorrect first impressions can lead to faulty diagnosis and troubleshooting. One way of overcoming that is through exposure to a vast array of “experiences” that force the mind into System 2 or slow, effortful, and deliberative thinking. In the world of process troubleshooting, that is where we want to spend more of our time.
Here is an entertaining video that demonstrates fast and slow thinking that will trick your brain: https://www.youtube.com/watch?v=JiTz2i4VHFw
Most of the time we go with our System 1 recommendations because of cognitive ease. Sometimes we evoke System 2 (analysis, problem solving, deeper evaluation) when we see something unexpected, or we make a conscious effort to slow down our thinking to take a critical view. The challenge is how to develop enough experience such that our System 1 responses have a higher probability of being correct, or we are aware enough to transition into System 2 thinking.
Recommendations on Simulator Use
So how do we train our brains to overcome these biases – the tendency to jump to conclusions, the illusion of competence – and to evoke System 2 evaluation skills? Here are a few recommendations on how to use your EnVision simulator to build up the experience needed to overcome them.
- Build up a broader experience bank for the brain to draw from, e.g., by introducing malfunctions that present themselves in a similar manner but have different root causes. Take a low flow indication as an example; it could stem from a variety of root causes, each requiring different actions:
- Pump degradation
- Equipment fouling
- Leak in the system
- Transmitter error
Similarly, changing battery limit conditions teaches operators how system dynamics change even when the plant and equipment are operating normally. Common battery limit changes include:
- Ambient temperature
- Cooling water supply temperature and pressure
- Fuel gas composition
- Feed pressure, temperature, and composition
Exposing operators to many different failure types and operating conditions will build their bank of experience. This encourages a more critical evaluation of the information presented by the DCS and field operators, to prevent jumping to conclusions.
Finally, expose your operators to systems and unit operations that may not be representative of their daily tasks. By learning a new system, they store core concepts and fundamental operating theories in the bank, rather than memorizing a single system.
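As a rough illustration of why that broader experience bank matters, here is a minimal sketch contrasting a System 1 response – taking the first idea that suffices – with a System 2 response that keeps every hypothesis the evidence cannot rule out. The cause names and the "evidence" are invented for the example.

```python
# Illustrative only: cause names and evidence are invented, not real plant data.
LOW_FLOW_CAUSES = ["pump degradation", "equipment fouling",
                   "leak in the system", "transmitter error"]

def fast_diagnosis(causes):
    # System 1 style: settle on the first idea that suffices.
    return causes[0]

def slow_diagnosis(causes, ruled_out):
    # System 2 style: keep every cause the evidence cannot rule out.
    return [c for c in causes if c not in ruled_out]

# Say a field check of a local gauge has cleared the transmitter.
print(fast_diagnosis(LOW_FLOW_CAUSES))
print(slow_diagnosis(LOW_FLOW_CAUSES, {"transmitter error"}))
```

The broader the operator's experience bank, the longer the candidate list System 2 has to work through – and the less likely the first convenient story is accepted unchallenged.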
- Challenge the Time Horizon
- Manipulating malfunction severity and ramp time builds awareness of how the event horizon – the time available for correction – is affected. This allows an operator to evaluate the available information more fully so that actions are both timely and well considered.
- Create a cognitive overload environment for more seasoned operators. Introduce multiple failures, or a failure during operationally intensive exercises such as a start-up.
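The effect of ramp time on the time available for correction can be made concrete with a back-of-the-envelope calculation; the readings, trip limit, and ramp rates below are made up for illustration.

```python
# Illustrative numbers only: same 30-unit margin, two malfunction ramp rates.
def minutes_to_limit(current, trip_limit, ramp_rate_per_min):
    """How long until a drifting variable reaches its trip limit."""
    return (trip_limit - current) / ramp_rate_per_min

print(minutes_to_limit(100.0, 130.0, 2.0))    # slow drift: 15.0 minutes to act
print(minutes_to_limit(100.0, 130.0, 10.0))   # fast failure: 3.0 minutes to act
```

A five-fold change in ramp rate cuts the decision window five-fold – exactly the kind of pressure that pushes an unprepared operator back into System 1 responses.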
- Experiment with failure
- Simulators are the perfect tool to learn from mistakes, and to correct an operator’s mental model of plant performance. The snapshot function allows a student to reset the simulation to a critical time just before an operating error was made. Trainees can then try a different approach and compare results, confirm expectations, and optimize performance.
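The snapshot-and-retry workflow can be pictured with a toy state store. The class and method names here are invented for illustration and are not the EnVision API.

```python
# Hypothetical snapshot/restore pattern; not the actual EnVision interface.
import copy

class ToySimulator:
    """Minimal stand-in for a simulator with snapshot/restore."""
    def __init__(self):
        self.state = {"valve_pct": 50, "flow_m3h": 100.0}
        self._snapshots = {}

    def snapshot(self, name):
        self._snapshots[name] = copy.deepcopy(self.state)

    def restore(self, name):
        self.state = copy.deepcopy(self._snapshots[name])

sim = ToySimulator()
sim.snapshot("before-error")       # taken just before the critical action
sim.state["valve_pct"] = 0         # the operating error: valve slammed shut
sim.restore("before-error")        # reset and try a different approach
print(sim.state["valve_pct"])      # back to 50
```

Deep copies matter here: snapshotting a reference to mutable state would let the later "error" corrupt the saved checkpoint.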
Contact GSE and let our experts help you develop a curriculum that will improve the critical thinking skills of your plant operators with our EnVision simulation and tutorial platform.
1. Klein, G. A., and R. Calderwood (1991); "Situation Awareness Systems, States and Processes: A Holistic Framework," Jonas Lundberg, Linköping University.
2. Susan L. Koen, PhD, "Safety Leadership: Neuroscience and Human Error Reduction."