Part Two (Part One was published last week in NPM)
Continuing on to the second epistemic domain:
Philosophy literally means “love of wisdom.” It is the way we “know” the truths that we have to think about or reason our way to—truths that don’t fall into the domains of math or science. It is of course a very broad category, but for the purposes of this discussion we can think of philosophy as the way we evaluate assumptions about the world around us that we cannot test by science, and the way we decide how we should live in it. In a sense, it is the “head” part of our spirituality; it is the domain of logic- and fact-based reasoning.
(The immediate reaction of some spiritual seekers here is to say, “But what about the ‘heart’? There are other ways of knowing!” Yes, there are; that is the point of this whole conversation, and we will get to them shortly.)
The demarcation between science and philosophy is not always clear, and the more-scientistic public apologists for science sometimes err in claiming that questions of values and ethics can be evaluated scientifically. There are good arguments against this view. Hume, again, said it best when he pointed out that you can’t draw conclusions about the way things should be from the way they are; in other words, you can’t derive an “ought” from an “is.” Science is the domain of “is”; philosophy is the domain of “ought.”
As with science, rigor, logic, and skepticism are foundational components of philosophy. So is integrity—when we arrive at a conclusion that we have rigorously reasoned our way to, we must have the integrity to embrace it. This is not to be taken lightly; the implications of changing one’s worldview can be severe. They may include losing our friends or social group, losing our identity as an expert, and losing our sense of certainty about the world. Philosophical integrity requires courage and strength of will.
Logic is the cornerstone of philosophy, so we will focus on that for now.
Everyone thinks they are logical, just as everyone thinks they have common sense. But again, the wisdom traditions (and modern cognitive psychology, to which we will return shortly) make it very clear that not only do we often fool ourselves, but that we are easy to fool. The study of logic is such an integral part of philosophy precisely because we are not inherently logical—we tend to make assumptions based on naïve intuition and emotion and then rationalize those assumptions with often-flawed logic.
The best way to recognize when we do this is to familiarize ourselves with logical fallacies and ruthlessly hunt them down and excise them from our thinking.
Logical fallacies fall into two broad categories: formal and informal.
A formal fallacy is an error in logic that can be seen in the argument’s form. An example of this is provided on the Wikipedia page on formal logical fallacies:
- If Bill Gates owns Fort Knox, then he is rich.
- Bill Gates is rich.
- Therefore, Bill Gates owns Fort Knox.
The first two statements are both true, but since there are many ways to be rich, it is fallacious (and thus illogical) to conclude that Bill Gates owns Fort Knox. (Logicians call this form “affirming the consequent.”)
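The invalidity of this argument form—“if P then Q; Q; therefore P”—can even be checked mechanically. A small truth-table sketch in Python (the function and variable names here are mine, purely for illustration) enumerates every truth assignment and finds the one where both premises hold but the conclusion fails:

```python
from itertools import product

def implies(p, q):
    # Material implication: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# P = "Bill Gates owns Fort Knox", Q = "Bill Gates is rich"
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and q and not p  # both premises true, conclusion false
]

print(counterexamples)  # [(False, True)]: rich without owning Fort Knox
```

The single counterexample row—Q true, P false—is exactly the “many ways to be rich” point: the premises can all be true while the conclusion is false, which is what makes the form invalid.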
Formal fallacies tend to be jarring and obvious. Informal fallacies can be harder to identify because they are subtler and less visible in the structure of the argument. Still, they are errors in reasoning that make the conclusion suspect, though not necessarily wrong.
Logical fallacies run rampant in the Enneagram community. Some examples follow (note that in logic, P stands for “proposition,” a statement that can be either true or false):
- Argument from authority: “(Famous Enneagram Teacher) says P, therefore P must be true.” No, (Famous Enneagram Teacher) may be wrong.
- Argument from popularity (This one was actually said to me at the conference session): “If 30 people report P to be true, it must be true.” No, lots of people can be wrong about the same thing. And, what if 31 people report P to be untrue?
- Argument from antiquity: “People have taught P for thousands of years, therefore, P must be true.” Need we go through the list of ancient ideas that have ended up in the dustbin of history?
- Argument from incredulity: “I can’t believe Jane is an Eight; she is too nice!” Just because you can’t believe it doesn’t make it untrue.
- The “No True Scotsman” fallacy: This is a form of circular reasoning that goes like this: Burns—“No true Scotsman dislikes whiskey.” McGregor—“Connery is a Scotsman and he dislikes whiskey.” Burns—“Connery is no true Scotsman.” This fallacy allows us to exclude any data that might contradict our assumptions. I once had an Enneagram teacher tell me that all Eights, Nines, and Ones had big bellies. When I pointed across the room and said, “Well, look at Rob; he’s thin as a rail and he’s a One,” the teacher said, “He can’t be a One; he doesn’t have a big belly.”
I could go on, but I’ll encourage the reader to familiarize themselves with logical fallacies at these great sites: www.yourlogicalfallacyis.com, http://www.logicalfallacies.info/, and http://www.nizkor.org/features/fallacies/.
It is important to note that logic and the identification of logical fallacies are not in any way in conflict with spiritual practice; in fact, they enhance it by helping us see through our illusions more quickly. Every wisdom tradition has an element of, or a branch devoted to, intellectual and logical rigor; they are there for a reason.
Closely related to logical fallacies but worth special mention are unsupported leaps of inference. An “inferential chain” is the series of assumptions we make based on previous facts or assumptions.
Here’s an example: My son Alec tends to eat Cheerios for breakfast each morning, and he tends to leave his bowl on the kitchen table when he is finished. Imagine that I come downstairs in the morning, everyone but Alec is asleep, and there is a bowl with a little milk and some uneaten Cheerios in it on the table. Following the chain of inference—the milk and cereal, Alec’s tendencies, the lack of other people around—it would be reasonable for me to conclude that Alec left it there (though I should confirm the assumption before accepting it as true). It would be unreasonable, however, to make a huge leap of inference and conclude, “Goldilocks must have been here!”
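One way to see why the reasonable inference beats the unreasonable one is a toy Bayesian update. The sketch below is mine, not the author’s, and every number in it is invented purely for illustration: each hypothesis starts with a prior probability, the evidence (a half-finished bowl of Cheerios) adjusts those probabilities, and a hypothesis with a vanishingly small prior stays negligible even when it fits the evidence well.

```python
# Toy Bayesian reading of the cereal-bowl inference.
# All numbers are invented for illustration only.

priors = {"Alec": 0.6, "someone_else": 0.399, "Goldilocks": 0.001}

# P(evidence | hypothesis): how likely a half-finished bowl of Cheerios
# on the table would be under each hypothesis.
likelihoods = {"Alec": 0.8, "someone_else": 0.1, "Goldilocks": 0.9}

# Bayes' rule: posterior is proportional to prior * likelihood.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: round(unnormalized[h] / total, 3) for h in unnormalized}

print(posterior)  # "Alec" dominates; "Goldilocks" stays negligible
```

Even though the Goldilocks story “explains” the evidence nicely (a high likelihood), its absurdly low prior keeps its posterior tiny—which is exactly what an unsupported leap of inference ignores.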
This example seems obvious and silly, but I see leaps of inference made all the time.
Common inferential leaps include the aforementioned quantum physics (“The ‘observer effect’ proves that there is non-local consciousness!” No, it doesn’t.) and distortions of evolution (“Gaps in the fossil record prove there must be an intelligent agent acting on evolution!” No, they don’t.) I also often see huge leaps of inference in claims by some in the Enneagram community regarding the “heart center” based on the research of organizations like HeartMath, despite the fact that the original research does not actually support the claims.
Logical fallacies and leaps of inference do not necessarily mean that the conclusions derived from them are wrong. There may be non-local consciousness; there may be an intelligent agent acting on evolution; Goldilocks may have been in my kitchen. However, if we are going to assert these conclusions as anything other than bald statements of faith, we have to come up with a more rigorous way of supporting our claims.
This may be a good point to leap-frog over subjective experience for a moment and talk about belief as an epistemic category. “Belief” is typically defined as the feeling of being certain that something is true. In epistemology, belief can be justified when it is based on evidence and corresponds to the facts; but here I am talking about belief based on the kind of faith described in the book of Hebrews: “the substance of things hoped for, the evidence of things not seen.” This kind of belief is certainty irrespective of evidence or logic. These matters are purely personal. There is no value in arguing against them; there is no value in arguing to support them. They are usually (but not exclusively) in the realm of religion. They certainly inform our spirituality, but they are not necessary for spirituality.
I want to be clear that I am not anti-religion. I have studied comparative religion since my days in seminary in the early 1980s; a quick glance at my Facebook page will show that I tend to visit cathedrals, mosques, and other houses of worship whenever I travel to a new city. Faith-based beliefs have value—they can comfort us in loss, provide a sense of meaning, and create social connection.
That said, I’m not comfortable seeing belief as the source of our ethics, and I join Plato in being leery of revelation-based morality.* For me, the study of ethics and morality fit comfortably into the domain of philosophy.
I also want to be clear that I am adamant about not making claims that overstep epistemic domains. If we claim that some philosophical or scientific argument supports a belief claim, it is imperative that we rigorously follow the rules of those domains. Each epistemic domain has rules or conventions, and when you enter a particular domain you must stick to those standards if you wish to maintain any kind of intellectual integrity. You can’t simply change the rules of science when the facts prove inconvenient; you can’t abandon logic when reasonable conclusions bump up against your opinions.
Now we come to the domain of the “heart” and the “gut.” I put those words in quotation marks because they are metaphors that too many people take literally. Our heart—the organ in our chest that pumps blood—does not have an intelligence of its own on par with the brain in our “head.” Nor does our gut—whatever we even mean by that (our liver? small intestine? large intestine? stomach? all of the above?). Yes, there are autonomous, non-conscious nervous-system functions in the chest and abdomen, but it is a false equivalence to talk about them as if they were three equal intelligences.
When we discuss subjective experience we are discussing assumptions drawn from an experience that is personal to us and may not apply to someone else. It is cognitive (formed by the brain/central nervous system) but usually not something we can reason our way to or fully explain. It is based on feelings, not logic or fact-based reasoning.
It is conceivable that there are some who would say there is no value in this way of knowing, though I have never met anyone who makes that claim—even among the most rigorous, skeptical, and combative scientists.
I think there is great value in this domain. We learn much from our life experiences, our transcendental practices, and the arts—each of which teaches us things that can’t be learned or described in any other way.
Falling in love or having one’s heart broken teaches us more about what it means to be at the mercy of our emotions and the limits of reason than we could possibly learn in a textbook.
Meditation practices teach us to still our mind and pay attention to our experience.
Bruce Springsteen’s “Living Proof” and Loudon Wainwright’s “Daughter” teach us more about what it’s like to be a father than any study of dopamine. (Springsteen wasn’t exaggerating when he sang, “I learned more from a three-minute record than I ever learned in school.”)
Magritte’s painting “The Treachery of Images” teaches us more about the paradox of labels than a million sophomoric semantic debates.
Standing in Paris’s Sacré-Coeur at sunrise or dusk teaches us more about both the insignificance and majesty of humanity than any sermon ever could.
Coltrane’s “A Love Supreme” captures the experience of transcendence better than any sacred text.
Life without the “heart” and “gut,” a life without “soul,” would be empty indeed.
While there is profound value in these subjective experiences, we have to be careful about extrapolating from them to objective declarations. Stephen Colbert famously captured this tendency when he coined “truthiness” for the quality of things that feel true regardless of the facts. Many of us are only too happy to make assertions about reality based on truthiness. It seems to me that we are, as a society, losing the ability to distinguish between facts and opinions. When you have “your truth” and I have mine, neither of us really has “truth”; we have opinions.
In his bestselling book “Thinking, Fast and Slow,” Daniel Kahneman popularized Keith Stanovich’s ideas about System 1 and System 2 thinking. System 1 is “fast” thinking: intuitive, and based on cognitive shortcuts and heuristics (mental models) that allow us to respond to our environment quickly. These shortcuts, however, mean that System 1 is loaded with inaccuracies. System 2 is more conscious and deliberate thinking, but it is slower. System 1 works well in the domain of subjective experience; System 2 works better in the realms of science and philosophy. The two systems evolved because they serve different purposes, and both have value. But we also need to recognize the limitations of each.
Applying a Hamlet-esque System 2 in the face of an onrushing car is not a good strategy (“To move or not to move, that is the question… Is it better to—” Splat!).
Implicitly trusting our System 1 intuitions is equally dangerous, and cognitive science shows us that our naïve intuitions and subjective perceptions are prone to a long list of cognitive biases.
These, too, permeate the Enneagram world. Three of the most common are:
- Confirmation bias is the tendency to non-consciously and unintentionally ignore information that goes against our assumptions and embrace information that supports them. This can lead us to make assumptions about the characteristics of a particular Ennea-type. If we believe that, say, all Eights, Nines, and Ones have big bellies, we will see big-bellied Eights, Nines, and Ones everywhere and not see the ones who are lean.
- The anchoring bias is the tendency to be rooted to a particular perception of something based on an initial valuation. For example, a savvy negotiator will set a price for something that is in his or her favor, knowing that the initial price serves as an anchor that will influence the final sale price. I know that I fall victim to an Enneagram-related anchoring bias pretty regularly. I unconsciously associate Ennea-type Two with females and have at times struggled to identify the Ennea-types of male-Two clients. The anchor limits my ability to see the bigger picture.
- The clustering bias is the tendency to see patterns where they do not exist or to overvalue patterns that do exist. It is easy to see patterns in people’s behavior if we are looking for them, and the geometric structure of the Enneagram inclines those prone to this bias to start seeing interrelationships everywhere they look.
Cognitive biases can work together as well. A common example occurs when people are introduced to the Enneagram and conclude that they are a particular type. They read all there is to read about that type, then join a panel discussion and start talking about what it is like to be that type. They say all the right words, but the affect seems all wrong, and it feels like they are reading from a script rather than relating genuine experience. This is because they have anchored on being a particular type and then found all the confirming evidence to support the conclusion while ignoring the signs that others seem able to see.
In the subjective-experience domain, we need to be careful not to confuse states and stages. Subjective experiences—especially those in emotionally heightened environments such as retreats or workshops—can make us feel like we have changed when in fact we have simply experienced a temporary heightened emotional state. While these states can provide a catalyst for growth, true maturation—the “stage” work—is more typically achieved through the ongoing grunt work of the philosophical domain and the deliberate practice of techniques related to the subjective-experience domain. When the subjective-domain work is random, non-deliberate, and conducted without equal attention to the philosophical domain, people end up as workshop junkies, going for their fix and later finding themselves unfulfilled until they get their next fix.
Subjective experience has value, but as with all ways of knowing, we have to tread cautiously and consciously, and not make assumptions that overstep our epistemic boundaries.
The battle between proponents of science and proponents of spirituality will continue, but it is a false battle. Not only are these two things not at war (even if their more-dogmatic proponents are), there is no reason to try to integrate them. In fact, attempting to do so cheapens them both. Rather than being swept up in the rhetoric of science vs. spirituality, the Enneagram community would do well to focus on clearly identifying epistemic categories and using the appropriate tools for the appropriate task. Doing so will elevate us above the confusion, and make our pursuit of truth that much easier.
*In Plato’s “Euthyphro” dialogue, Socrates points out the uselessness of relying on the gods to determine for us what is good. If they arbitrarily determine what is good, they can arbitrarily change their minds, and thus “good” has no lasting meaning; if the gods rely on some a priori notion of good independent of them, then the gods are irrelevant and serve as little more than messengers for a good we can identify ourselves.