Expectation bias on the AOA: When we hear what we want to hear

By Lonna Whiting, writer, Alder Airfield Services

A pilot flying one of her regular routes prepares to descend toward a runway she’s landed on hundreds, maybe thousands, of times before.

She expects all to be clear. After all, she’s landed in that very spot time and again without incident. 

Then one day, the runway appears normal, but it’s actually barricaded for sudden minor repairs. The control tower instructs the pilot to change course at the last minute. Instead of heeding the tower’s instructions, she proceeds as usual.

She lands on the closed runway.

Expectation bias—a safety risk on the AOA

What happens next could be a near miss (hopefully) or a total catastrophe, and it all comes down to what our brains do when we expect one thing to happen and something else does. That’s called expectation bias.

Expectation bias occurs when our existing beliefs, perceptions and experiences subconsciously influence the views and the decisions we make. Consequently, we don’t always see disruptions in routine situations, even when they’re right in front of us.

As in this example, pilots sometimes experience expectation bias when they listen to the little controllers in their heads instead of the outside instructions they’re given. It’s no different for your teams working directly on the AOA who diligently keep your runways clean, clear and compliant.

While it can wreak havoc in any situation, expectation bias is also an opportunity for teams to work together toward situational awareness on the AOA, where safety is everything. And that’s a very good thing, because even though your crews on the ground aren’t the ones flying the planes, they are the ones building the infrastructure that makes flying possible.

Who is affected by expectation bias?

Organizational and Human Performance Consultant Peter Furst says expectation bias tends to fall into one of two categories: our own biases and the biases of others.

And we all have them.

“We have one brain but two minds,” Furst writes. “One mind makes conscious choices based on careful consideration, self-reflection, insight, and observation—the mind of self-control. The other mind makes automatic choices based on past experience, habit, and/or instinct—the mind of impulse and habit.” 

Expectation bias exists in many situations. Medical research participants, for example, sometimes report health improvements even though they were given placebo medications. That kind of expectation bias, while helpful in drug trials, is less than helpful when we’re tasked with making quick decisions in high-stakes workplaces like the AOA.

Especially stressful ones.

How stress affects cognitive bias 

While we don’t actually notice it, our brains mull over ten, twenty, even a hundred scenarios in which a choice must be made in the span of seconds or minutes. The more critical or dangerous the scenario, the more cognitively challenged our brains become.

“Stress or pressure plays a role in activating the automatic neural circuitry while suppressing the conscious circuitry, thus making biases unconscious,” Furst writes in his research. “The mind of self-control (conscious choices) takes more time than the one that makes automatic choices. … To avoid becoming overwhelmed, as well as to be as efficient as possible, the automatic process becomes the method of choice.”

If we dig even deeper, as Furst does, we uncover many additional cognitive biases that can result in unsafe situations on the AOA.

Here’s what Furst has to say about these other kinds of mental biases and how they might impact your teams and projects:

Perception bias: the tendency to harbor predetermined assumptions about certain types or groups of people. “In the case of the safety practitioner, it may possibly hamper getting the safety message effectively across to everyone in the crew,” Furst writes.

Halo effect: the tendency to let a single trait, good or bad, color our entire judgment of a person, such as believing a risk-taker’s shortcuts improve efficiency. Furst uses the example of a foreman who disregards unsafe or noncompliant work standards to meet project deadlines. “The halo effect may color the foreman’s perception of the worker as inventive or an ‘outside-the-box thinker’ rather than a risk-taker,” Furst writes.

Groupthink: “This is a situation where people wanting to be accepted by a group mimic the beliefs and positions taken by that group as a whole,” Furst writes. “In safety, if the crew performs its work in an unsafe manner, a person wanting to be accepted by that group may perform the work in the ‘unsafe’ manner even if that person knows it is not the way the organization expects them to perform.” 

Bystander effect: Perhaps the most recognizable of the cognitive biases, the bystander effect occurs when someone witnessing a dangerous situation assumes another person will come to the rescue. The bystander effect can and does happen on the AOA, especially when crews aren’t fully equipped to manage the critical safety aspects of a project on their own.

Creating human safety with neutral third-party AOA construction support

Furst says managing expectation bias on the AOA means understanding it, being persistent about recognizing when it occurs in ourselves and others, and practicing deep awareness of our environment and our role in keeping those surroundings safe.

“Leaders who are aware of their own biases and encourage others to do the same stand a good chance of improving the quality of safety outcomes on the job,” Furst writes. “While understanding that this may not revolutionize project processes, practices, or procedures, it will greatly improve outcomes.”

Training, processes and structures can and do help mitigate the risk of cognitive biases, but the best way to reduce human error on the AOA is through independent, objective support. Here’s how third-party safety support from companies like Alder reduces the risk of expectation bias:

Independent Resident Project Representative (RPR) services: We hire specialized consultants who provide a daily on-site presence.

Unbiased oversight: As external consultants for AOA construction projects, Alder offers an objective perspective free from the internal performance pressures or blind spots that in-house staff may have. This impartiality is critical for ensuring total compliance with OSHA and FAA standards.

Specialized safety staffing: Outsourcing quality control managers ensures that the project is overseen by professionals with specific aviation construction credentials.

Customized training and education: Our human experts offer real safety drills for diverse groups—subcontractors, supervisors, and laborers—ensuring that safety protocols are understood and applied correctly on the active airfield.

Human support for high-tech environments: AI models are often trained on datasets that contain the results of human expectation bias. What’s more, in airport construction, a human supervisor might ignore a physical warning sign because the “safe” AI dashboard didn’t flag it. We reduce this risk by bringing the best in AOA technology together with the best in human situational awareness.

Knowledge is power—and safety

We can do a lot of things by the book, but all the safety protocols in the world don’t replace a human team’s ability to make split-second decisions for safety. Expectation bias is just one of many factors that can create safety risk on the AOA.

“Cognitive bias won’t change every decision a leader makes, but knowledge of its effects can inform, and more importantly, this effort will set the stage for a fundamental change in identifying, evaluating, and modifying risks in the project operational systems, resulting in improved safety outcomes,” Furst writes.

Next time you’re trying to figure out how to manage safety protocols on the AOA, raise your expectations with trained, third-party support from experts like Alder.

Sources

Furst, P. G. (2020, August 7). Unconscious biases and construction safety. IRMI.

Woods, S. (2020). “Just a bit biased.” FAA Safety Briefing, July/August 2020, 1–4.

Lonna Whiting writes for purpose-driven companies like Alder, providing them with comprehensive, research-driven communications that support their industry expertise.