The Semmelweis Reflex bias, and why people continue to believe things which have been proven wrong
How would you react to a piece of information which proves that something you believe is wrong?
While most people think they would react logically, you might be surprised to learn that your brain may tell you to ignore the new information and continue to believe what is wrong.
This is an unconscious bias known as the Semmelweis Reflex.
It is the story of one of the most important medical findings of its time, and of how many of the people whom it would have benefited most chose to simply ignore it.
It is named after the 19th-century Hungarian doctor Ignaz Semmelweis, who was one of the first scientists to show a link between hygiene and hospital infections, well before Louis Pasteur popularised germ theory.
Semmelweis was working at two clinics in the same hospital in Vienna which had drastically different mortality rates for mothers giving birth. Clinic 1 had an average mortality rate of over 10%, often caused by puerperal fever (its reputation was so bad that women would rather give birth in the street in front of the clinic than enter it), whereas Clinic 2 had rates below 4%. Semmelweis struggled for years to find a difference between the two which would explain why Clinic 1 was so much deadlier than Clinic 2. The two clinics used almost the same techniques and facilities. Semmelweis investigated a multitude of factors, even going so far as to analyse the different religious practices at the two clinics.
The main difference between the two was who worked at each clinic. Clinic 1 was designated as a teaching service for medical students, whereas Clinic 2 was predominantly for the training of midwives.
In 1847, Semmelweis’ close friend Jakob Kolletschka died after being cut by a scalpel which a medical student had used to dissect a dead body. In Kolletschka’s autopsy, Semmelweis saw the same characteristics of death he had seen in the women who died from puerperal fever.
This was the evidence Semmelweis had been looking for. Students and doctors in Clinic 1 would often dissect dead bodies before going immediately on their rounds in the maternity ward, frequently with blood still on their hands, tools and clothing. Somehow, they were transmitting something from the dead bodies to the mothers, causing the infection.
Remember, this was before people knew about bacteria or viruses and their role in disease.
Semmelweis instituted a hand-washing protocol for all staff with a chlorine solution (chosen because it was the best at removing the putrid smell of autopsy tissue). In the following months, both clinics saw their maternal mortality rate fall significantly, by over 90%.
The hand-washing system was working, even though no one could fully explain why it worked.
Still, Semmelweis expected this news to spread quickly amongst all hospitals and doctors. After all, it was their duty to help protect their patients, right?
Unfortunately, not every doctor took this new information seriously.
The medical field at the time believed that many diseases were due to the “four humours” (liquids) in the body being out of balance, a theory dating back to ancient Greece. Semmelweis’ new theory was not in line with the prevailing one, and thus was ignored by many doctors.
In some extreme cases, doctors thought that the blood on their hands and clothes was a symbol of their work, and thus refused to believe that these should be washed.
Others even believed that the simple fact that they were respected men, high up in society, was proof that their hands could not possibly be unclean.
As a result, it took years for hand-washing to catch on around Europe, resulting in thousands of deaths which could have been prevented.
Explaining the Semmelweis Reflex bias
The Semmelweis Reflex is therefore the reflex-like tendency to reject new information when it contradicts established norms or paradigms.
It is one form of Belief Perseverance bias, where people hold on to their beliefs despite new information which directly contradicts them.
A classic example is the cult whose members believed the world would end on December 21, 1954. When researchers asked them after this date what had happened (hint: the world did not end and they survived), most of them put little weight on the new evidence, and some even said they believed in the prophecy more strongly than before.
In more modern examples, we can see it in people who believed QAnon conspiracy theories claiming that Donald Trump had won the 2020 election and would be inaugurated instead of Joe Biden. When this didn’t happen, some simply ignored the new information, even though they could not articulate why, or what they believed instead.
But it is not just cultists or conspiracy theorists who suffer from the Semmelweis Reflex and Belief Perseverance biases.
It can happen to anyone, including you.
It happened to the French wine industry when it did not want to accept evidence that American wines had outperformed its best vintages in a blind tasting.
It happens to millions of people around the world every day, especially in companies.
Ask yourself why so many companies go out of business because they fail to innovate and adapt to changes in the market. At a base level, it is because some of the decision makers would prefer to believe that there is in fact not a change happening, or that it is not a big deal.
Your brain craves stability. Therefore, it would prefer that new information which challenges that stability simply not be true.
But the good news is that another bias can be used to counteract the Semmelweis Reflex: The Mere Exposure Effect.
This bias indicates that the more often we are exposed to a new idea, the more positively we perceive it.
So if you are trying to get buy-in for your new innovation, it might be rejected the first time, and perhaps the times after that. But the more often decision makers are exposed to it, the more likely it becomes that they will eventually begin to accept it.