Human Biases in Cloud-Native Transformation

The groundbreaking work in psychology by Amos Tversky and Daniel Kahneman transformed our understanding of human behaviour. Their research opened a new academic discipline, behavioural economics, which later extended into science and engineering. In recognition, Kahneman received the Nobel Memorial Prize in Economics in 2002 (sadly, Tversky had died in 1996).

Human cognition, according to Kahneman, has two modes of operation: System-1 (fast thinking) and System-2 (slow thinking). System-1 is instinctive, while System-2 is rational.

Humans are mostly System-1 beings, owing to our long evolutionary history. The foundation of System-1 is biases: mental shortcuts that cut corners to reach a quick solution to any given problem. System-2, on the other hand, deploys analytical cognition, which is significantly more energy-intensive.

Now, what do cognitive biases have to do with cloud-native transformation?

The cloud-native transformation has two aspects: technology and culture. The technology aspect is straightforward; either the technology exists or it doesn't. Luckily, we know that technology is no longer the bottleneck. The culture, on the other hand, is a different story.

In their book Cloud Native Transformation: Practical Patterns for Innovation, Pini Reznik et al. highlight that many cloud-native transformations stall or are even cancelled because the organization fails to change its culture.

Why does an organization fail to alter its culture to adopt a transformation?

An organization is made of people, and the root of all cultural transition failures is human cognitive bias. The details of these biases can be found in the influential book Thinking, Fast and Slow by Daniel Kahneman. Pini Reznik et al. identified the following cognitive biases and their impact on an organization's cloud-native journey:

Ambiguity Effect is the tendency to avoid options for which missing information makes the probability of the outcome seem “unknown.” An example of the ambiguity effect is that most people would choose a regular paycheck over the unknown payoff of a business venture. 

People tend to do what they have always done, because they know exactly how that worked, even when it no longer applies to the new context. This bias is the main reason to run experiments early in a transformation, to understand the project’s scope and fill in the missing information gaps. 

Authority Bias is the tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion. 

In traditional hierarchical organizations, authority figures are presumed to know more than those below them in the hierarchy. In cloud-native, this bias is even more dangerous, as managers have less understanding of what’s going on in a highly complex distributed architecture based on new technologies. They need to be careful to avoid giving orders or even offering opinions, as these will automatically be received as “correct.” 

Availability Bias is the tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be. 

“Everyone is talking about Kubernetes so suddenly it must be the right thing to do!” 

Bandwagon Effect is the tendency to do (or believe) things because many other people do (or believe) the same thing. Related to groupthink and herd-behaviour. 

When Gartner puts particular tech on their chart, everyone decides to adopt it even without understanding how it relates to their use-case. 

The Bystander Effect is the tendency to think that others will act in an emergency situation. 

This is very relevant for overlapping responsibilities. In a hierarchical organization with its many specialized teams, for example, when a task doesn’t officially belong to anyone, no one will volunteer to pick it up. In cloud-native, where teams are responsible for independent execution, there needs to be clear communication, so all necessary tasks get covered. 

Confirmation Bias is the tendency to search for, interpret, focus on, and remember information in a way that confirms one’s preconceptions. 

Ignore all those inconvenient facts and embrace the information that supports your opinion. There is always plenty of data to choose from, so it’s easy to cherry-pick. If you have a system administrator with 20 years of traditional IT experience, he’ll be very creative in finding lots of very important reasons not to move to the cloud. If an engineer is dead set on using some cool tool he heard about at a conference, he will ignore all information showing that another tool might actually be a better fit. 

Congruence Bias is the tendency to test hypotheses exclusively through single direct testing, instead of testing multiple hypotheses for possible alternatives. 

When you run an experiment to prove your point rather than to find new information. So, you would run only one proof of concept (POC), and if it works, you’ll automatically dismiss all other alternatives even without evaluating them. 

Curse of Knowledge is when better-informed people find it extremely difficult to think about problems from the perspective of less well-informed people. 

Engineers frequently struggle to sell cloud-native ideas to their managers due to this bias. They see so clearly why this is the right thing to do that they forget the managers have no background knowledge that enables them to understand it with equal clarity. 

Default Effect is the tendency, when given a choice between several options, to favour the default one. 

If you set up a cloud-native platform, most people are probably going to use it exactly as you gave it to them. This is true both for the tools built into cloud platforms like Amazon Web Services or Azure as well as internal tools provided to employees. 

Dunning-Kruger Effect is the tendency for unskilled individuals to overestimate their own knowledge/ability, and for experts to underestimate their own knowledge/ability. 

This bias leads to overestimating your competency at things you’re not intimately familiar with. We see this with managers who try to dictate which tools or methods will be used as part of a transformation. They have no actual cloud-native experience or knowledge, but they are accustomed to calling all the shots. 

Hostile Attribution Bias is the tendency to interpret others’ behaviours as having hostile intent, even when the behaviour is ambiguous or benign. 

It is a typical human bias arising from the fact that massive change in an organization creates anxiety in those poised to undergo it. People think that cloud transformation will destroy their work and wreck their company. 

IKEA Effect is the tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result. 

This one can be used positively: if everyone in the organization gets involved at every stage of planning and executing a transformation, whoever is engaged is biased to like and support the solution. 

The Illusion of Control is the tendency to overestimate one’s degree of influence over external events. 

This is especially common in the uncertain circumstances of a cloud-native transformation. Engineers think that they know how to build microservices, and managers believe that they know what it takes to do DevOps. But in reality, it is only an illusion of control. Many complex and emergent processes are very difficult to even steer, much less control.

Information Bias is the tendency to seek information even when it cannot affect action. 

Analysis paralysis. This is very common in hierarchical organizations, which try to find more and more answers to more and more questions, even though there are only two or three possible actions to take and the benefits of one of them are very clear. 

Irrational Escalation (also known as a sunk-cost fallacy) is the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. 

If you’ve spent six months setting up OpenShift, it’s unlikely that you’re going to switch to another tool, even if it’s proven to be superior. People routinely push forward with projects that are obviously not going to bring any value. 

Law of the Instrument is an over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. “If all you have is a hammer, everything looks like a nail.” 

Moving to cloud-native while using old techniques/processes. 

Ostrich Effect is ignoring an obvious (negative) situation. 

When we move to cloud-native, we need to make significant cultural shifts, not just change the tech. However, many companies choose to either ignore this altogether or make only small cosmetic changes meant to signal they’ve transformed their culture — and, either way, try to work in the new paradigm using old processes that no longer apply. 

Parkinson’s Law Of Triviality is the tendency to give disproportionate weight to trivial issues. This bias explains why an organization may avoid specialized or complex subjects and instead focus on something easy to grasp or rewarding to the average participant.

When people get together for three days of discussion and planning their cloud-native migration, and then talk about tiny trivial things like which machine Kubernetes will run on or how to schedule a specific microservice. All this while avoiding large challenges, like changing organizational culture or overall architecture.

Planning Fallacy is the tendency to underestimate task-completion times. It is closely related to the well-travelled road effect: people tend to underestimate the time needed to traverse oft-travelled routes and overestimate it for less familiar paths. 

Some people estimate cloud transformation as a few weeks of work, maybe a couple of months at most. Reality: often a year or longer. Basically, if you don’t have baselines from previous experience, any estimation is totally worthless. 

Pro-Innovation Bias is the tendency toward excessive optimism about an invention's or innovation's usefulness while failing to recognize its limitations and weaknesses. 

“Let’s do Kubernetes, regardless of whether it’s a good fit or even necessary, just because it is new and cool.” 

Pseudocertainty Effect is the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes. 

Successful teams will avoid investing in improvements while everything is going OK. But once a crisis erupts, they will jump on any crazy new tool or process to save themselves.

Shared Information Bias is the tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of. 

The whole team went to a Docker training, so they spend a lot of time talking about Docker — and no time at all about Kubernetes, which is equally necessary but they don’t know much about it.

Status Quo Bias is the tendency to prefer that things stay relatively the same. 

Entire companies, and/or people within the organization, will resist moving to cloud-native due to this bias. In essence, everyone wants to remain comfortably right where they are right now, which is known and understood. 

Zero-Risk Bias is the preference for reducing a small risk to zero over a greater reduction in a larger risk. 

Companies want to reach 99.9999% availability, which is very difficult, yet they have no CI/CD to deliver the changes in the first place.

System-1 is biologically hardwired in the human brain; there is no way around it. But knowing these biases can help us invoke our System-2 cognition. Pini Reznik et al. proposed patterns to counter many of these biases, and their work is highly recommended.