Thinking and the Enneagram 5 – A Potpourri of Logical Fallacies



This fifth article in the series on critical thinking and the Enneagram is a bit of a potpourri and will touch briefly on a few common logical fallacies I see people (including myself) commit when talking about the Enneagram.

The “Argument from Authority” is the mistake people make when they assume something must be true because an “authority” said it. I once read a study describing how a region of our brain that supports critical thinking “goes to sleep” when we are in the presence of a charismatic lecturer. The authors suggested this is a remnant of our evolutionary legacy: it helped our ancestors survive because the alpha male of the tribe usually did know what he was talking about, and there was a reproductive advantage in listening to him.

So, we may be biologically predisposed to falling victim to the argument from authority fallacy, but it is a predisposition that has outlived its utility. History, in fact, is filled with countless examples of experts being wrong—Einstein was initially wrong about quantum physics, for example, and he was Einstein!

The facts make a claim right or wrong, not the person who is speaking them.

Does this mean that we shouldn’t listen to experts? Is a car mechanic’s advice on heart health as valid as a cardiologist’s? Of course not. It is prudent to give more credence to an expert than a non-expert as a starting point, but then to weigh claims based on evidence. The more implausible or fantastic a claim seems, the more evidence we should demand.

We should be particularly wary when a guru crosses domains of expertise: for example, when an Enneagram teacher starts giving advice on health, finance, business, etc.; or when an expert on health, finance, or business starts pontificating about the Enneagram. The “Halo Effect,” which compounds this fallacy, is the tendency to see a person as good at everything just because they are good at one thing (or because they are attractive). It can make us gullible, blinding us to the fact that people who cross domains often lack expertise in the area they are crossing into.

When it comes to experts, trust but verify.

The “Argument from Antiquity” is the belief that something must be “better” or “true” because it is old. I once challenged a specific claim by an Enneagram teacher who was applying a particular ancient practice to the Enneagram. He essentially responded with, “People have practiced this for thousands of years; I don’t understand why you would question it.”

Yes, some ancient wisdom stands the test of time, but an awful lot of old ideas deserve their place in the dustbin. People believed the sun revolved around the earth for a long time. People have long believed that planetary motions affect human affairs despite the lack of any evidence or mechanistic plausibility. People long ago believed that “bleeding” a sick patient was effective treatment (tell it to George Washington, one of countless people who died from excessive bloodletting…). “Ancient” does not mean “true”; it means “old.” Old is not inherently bad, but it is not necessarily a signifier of accuracy, either.

There seems to be great interest in the Enneagram world in tracing the roots of the system deeper and deeper into antiquity. This may be worthwhile from a historical perspective, but it is important to remember that, even if these claims are true, it doesn’t mean the Enneagram is therefore more valid. Our claims about the Enneagram are valid not because they may be the same claims long-dead people made; they are valid if they are logically coherent, internally and externally consistent, and based on empirical evidence.

The “No True Scotsman” fallacy is one of my favorites, if for the name alone. The fallacy, a type of confirmation bias*, goes like this:

McGregor: “All true Scotsmen love haggis.”

Burns: “Connery is a Scotsman, and he doesn’t like haggis at all…”

McGregor: “Well, then, Connery is not a ‘true’ Scotsman!”

Similarly, I once had an Enneagram teacher tell me that all Eights, Nines, and Ones had large bellies. He then pointed to a number of people in the room to support his claim. I pointed to a particularly thin and fit man across the room whom I knew to be an Ennea-type One and said, “Chris is a One, and he’s thin as a rail.” The response was “Well, he must not really be a One.”

It is not uncommon to see people make universal claims about all people of a particular type based on experience with a few people of that type. This is a variation of the “correspondence bias,” which is the tendency to assume that a particular action or characteristic reflects the person’s broader personality rather than the situation. (For example: We think “Harry got mad when that person took his parking spot; Harry must be an angry person” rather than “Harry got mad when that person took his parking spot; I wonder what caused Harry to react that way…”). Once we fall victim to the correspondence bias it is a short leap to, “Harry is an Ennea-type ‘X’ and he is an angry person and he has really nice hair; therefore, Ennea-type ‘Xs’ must all have anger issues and nice hair.” If someone points out that Jane is an Ennea-type ‘X’ but she has unremarkable hair, we say, well, she must not be a true ‘X’…

The correspondence bias and the No True Scotsman fallacy can lead to what I call “Enneagram contortionism.” This happens when we become so in love with a premise that we contort our understanding of the system, adding more and more complexity to justify an assumption (“Yes, I know Jane is not like other ‘Xs,’ she has a ‘Y’ wing, and a ‘Z’ subtype, and her trifix is ‘XAB’ and she is at level ‘C’….”). Any or all of these qualifiers may be true, but the more qualifiers we add the less useful the system becomes, and there is a significant danger of finding some rationalization to explain away a flawed first principle.

We would do well to work from a few highly refined first principles and then invoke Occam’s razor: if the evidence contradicts our premise, we should not add complexity; we should question our premise and seek a simpler, more parsimonious explanation.** In short, a bird that coos rather than quacks may be an odd duck, but chances are it is a pigeon.

 

Mario Sikora is an executive coach and consultant who advises leaders in large multinational organizations and conducts Enneagram-based certification programs and workshops across the globe. He is the co-author of Awareness to Action: The Enneagram, Emotional Intelligence, and Change and past president of the Board of Directors of the IEA. He continues to serve on the board of directors and oversees international affairs for the IEA. He can be reached via his website: www.awarenesstoaction.com.

 

*Confirmation bias is the tendency we all have to see evidence that supports our beliefs and overlook evidence that contradicts them.

**I wrote more about this at http://ninepointsmagazine.org/simple-mario-sikora/.