In my 30s, I was involved in an organization that promoted Nonviolent Communication, a self-help practice that helped me gain self-awareness, empathy, and conflict resolution skills. Unfortunately, my involvement in this group had an unintended side effect: over the course of a decade, I gradually lost my ability to think critically about Nonviolent Communication itself, and I eventually became somewhat of a true believer; I go into more detail about this experience in an article titled “How I Became a Zealot (Then Freed Myself).” During this time, I had similar experiences—though to a lesser degree—with various spiritual and environmental groups, too.
Eventually, I noticed a pattern. After encountering helpful ideas, I would get involved with the groups promoting those ideas. In some cases, I got very involved; that was the case with Nonviolent Communication, which was my main vocation for a while. In other cases, I was involved to a lesser degree, as a student and spectator; for instance, I might study a group’s ideas, listen to its podcasts, and so forth. Whether or not I got highly involved, something disturbing was happening over time: I was losing my ability to think critically about the ideas these groups promoted—almost as if I’d been brainwashed.
I now believe this pattern affects many people involved in many types of groups, including political groups, spiritual groups, religious groups, self-help groups, and even professional groups. Having had a number of years to reflect on this pattern, this article is my attempt to shed some light on it and explore what we can do about it.
How True Belief Limits Critical Thinking
What does it mean to be a true believer? You’re a true believer in a set of beliefs when those beliefs seem like facts or obvious truths to you. True belief limits your ability to think outside the box of your belief system, because when your beliefs seem like obvious truths, you’re less likely to seek alternative ways of understanding things, and you’re more likely to dismiss other beliefs that conflict with yours.
I used to think that people become true believers by choice—by consciously, voluntarily choosing to put their faith in a belief system. However, I’ve come to realize that true belief is more like a habit than a choice, and that true belief can arise unconsciously and involuntarily through the influence of a group’s culture. In other words, you can become a true believer by accident when you’re under the influence of a group. In the sections below, I explore how this can happen and what you can do about it.
How We Become True Believers
If we start using some belief system to make sense of life and guide our actions on a regular basis—and if we stop considering other perspectives—viewing life through the lens of this belief system can start to become a habit. At first, learning and applying the belief system may have been a voluntary choice, but as the behavior becomes habitual, it can eventually turn unconscious and involuntary. At that point, we view life through the lens of that belief system automatically, and those beliefs no longer seem hypothetical to us; they seem like facts or obvious truths. We’ve become true believers without intending to—and, perhaps, without even recognizing that this has happened. I call this process belief installation.
Note that I’m intentionally choosing to use the term belief installation instead of the more familiar term brainwashing. Belief installation is something that happens to us when we take certain actions. Brainwashing, on the other hand, is something that others do to us. Belief installation can occur without brainwashing; belief installation is something that we can do to ourselves, without any help required from anyone else.
For belief installation to occur, you must do two things over an extended period of time: you must start using some belief system to make sense of life and guide your actions, and you must stop considering alternative perspectives. Unfortunately, in many groups, these very behaviors are encouraged by several aspects of group culture—so being involved with these groups tends to transform you into a true believer. I call these groups belief-installing groups.
How Groups Exert Influence
To understand how belief-installing groups work, let’s start by exploring two things that are present in the culture of many groups: norms and ideologies.
Many groups are focused on exerting some kind of influence. For instance, political groups strive for political influence, and spiritual or religious groups strive for spiritual or religious influence. In order to exert influence, the culture of these groups must contain two things: some way of specifying the particular form of influence that the group should exert, and some way of ensuring that group members act to exert that influence. What is it in the culture of these groups that makes all this happen?
One way of answering this question is through the concept of group norms—shared beliefs about what’s good or desirable and what’s bad or undesirable. These norms specify the vector of influence that a given group exerts: each group tries to influence people away from what it considers bad and toward what it considers good.
Group norms not only specify this vector of influence but also channel the energy of group members to exert this influence. To understand how this works, consider what happens to a group member who isn’t conforming to group norms: the group starts trying to bring that member’s behavior back into alignment. It may do so through disapproving glances, lectures about inappropriate behavior, and harsh words. If the misbehavior continues, the member may eventually be shunned by the group. And it doesn’t stop there: if the group has political power, the misbehaving person may eventually be imprisoned—or worse. As you can see, failing to conform to a group’s norms is generally an uncomfortable experience. Conversely, those who do conform to group norms are met with acceptance, approval, and other positive experiences.
Norms are enforced not only externally, but internally, as well; after we’ve internalized a group’s norms, we effectively have an inner police force which monitors our thoughts and feelings for misbehavior (and good behavior) and responds appropriately. At that point, we start punishing and rewarding ourselves.
Norms play an important role in all groups, but they tend to play a particularly important role in spiritual groups, religious groups, political groups, and other groups whose mission involves influencing people. The more important it is to a group to exert influence, the more central a role norms are likely to play in that group’s belief system.
What do you call a belief system that’s organized around norms? An ideology. An ideology is a belief system that uses norms to channel group energy to influence people away from what the group considers undesirable, toward what the group considers desirable. Spiritual groups, religious groups, and political groups are usually ideological. That’s because these groups are focused on exerting spiritual, religious, or political influence, and ideologies—that is, belief systems organized around norms—are the central mechanism by which groups exert influence.
Five Factors that Can Convert You into a True Believer
Norms and ideologies are present in many groups, but not all such groups install their belief systems in unsuspecting group participants. What is it that makes belief-installing groups different? What are the cultural factors that create the conditions for belief installation to take place?
Big Promises
One such factor is what I call big promises—promises that supporting the group’s mission will cause very good things to happen, prevent very bad things from happening, or both. All groups have norms, but not all groups make big promises. Norms simply tell us what a group values; big promises tell us that by getting involved with the group and conforming to its norms, we can help avert disaster, create paradise, or both.
Big promises attempt to hook into our deepest fears and desires to increase both the perceived urgency and the perceived importance of supporting a group’s mission. The more you’re convinced by a group’s big promises, the more likely you are to start viewing the group’s mission as very important, and the more likely you are to start using the group’s belief system to make sense of your life and guide your actions.
Zeal and Dogmatism
Zeal is another cultural factor that creates the conditions for belief installation. When a group’s communication has a zealous tone, this reinforces the urgency and importance of supporting the group’s mission; zeal calls attention to the group’s message and makes it more likely that you’ll take note of it.
Dogmatism is a third factor that promotes belief installation. I define dogmatism as the tendency to promote a belief system as absolutely true and to dismiss alternative perspectives. Some groups are so confident in their ideology that they reject all other approaches; “Our way is all we need!” becomes their mantra. The more involved you get with a group that dogmatically promotes its ideology, the harder it gets to think critically about that ideology.
Anti-Intellectualism
Critical thinking gets even more difficult when a fourth cultural factor is in place: anti-intellectualism. Many spiritual and religious groups devalue reason and intellect, focusing instead on cultivating other human faculties like attention, awareness, empathy, compassion, open-heartedness, and devotion. Many political groups devalue reason, too.
When anti-intellectualism becomes a group norm, any expression of critical thinking is met with disapproval. In a group that devalues reason, it can be difficult or impossible to have a critical discussion about the group’s ideology within the group itself. If you internalize the norm of anti-intellectualism, it gets hard to even think critical thoughts about the group.
High Involvement
Aside from these cultural factors, there’s another factor that’s required for belief installation to take place: high involvement. Here’s how this might play out. When you first encounter a belief-installing group, the first thing to catch your attention is likely to be its big promises. You may start learning more about the group and its views and practices, but no matter how zealous and dogmatic the group is, if your level of involvement with the group is low, you’re unlikely to become a true believer. That’s because your low involvement ensures that you’ll continue to be exposed to alternative perspectives, and that helps you maintain your ability to think critically about the group’s beliefs.
However, the more involved you get with a dogmatic group, the more the group’s dogmatism will impact your thinking. Consider what it’s like to be highly involved with a highly dogmatic group: everyone you interact with, every book you read, and all the media you consume affirm the absolute truth of the group’s ideology and dismiss all other perspectives. Your livelihood may even require you to promote the group’s ideology yourself. No one forces you to live this way; you do so voluntarily because you’ve bought into the group’s big promises and you view the group’s mission as very important.
Under these conditions, it’s very difficult to maintain your ability to think critically about the group’s ideology, so it’s quite likely that you’ll become a true believer. Once you do, you’ll start zealously promoting the group’s big promises and dogmatically promoting the views and practices that support the group’s mission. At that point, you’ve been assimilated! Since high involvement in a given group promotes installation of that group’s beliefs, belief-installing groups tend to encourage their members to get highly involved.
Put all the above factors together—big promises, zeal, dogmatism, anti-intellectualism, and high involvement—and what do you get? A belief-installing group; in other words, an oppressive, exploitative group that promotes and propagates itself at the expense of the autonomy and clarity of its members.
Would it be possible for a group to exert influence effectively without controlling participants via true belief? I think so. In belief-installing groups, participants eventually tend to notice that they’ve been taken advantage of, and they start to feel resentful about it. A group that avoids belief installation might build better and longer-lasting relationships with its participants, leading to greater effectiveness in the long run.
The Joy of True Belief
It can be joyful to be a true believer, for a number of reasons. We all want certainty, and when you’re a true believer, your group’s ideology provides clear, certain solutions to many of life’s problems. The fact that those solutions aren’t always very effective doesn’t bother you much because you don’t blame the group’s ideology for this; for instance, you may blame yourself for not being skillful enough in carrying out the group’s practices.
We all need a sense of belonging, and that need can be deeply fulfilled by true belief. When you become a true believer, you join a close-knit tribe of fellow true believers. Everyone else in this tribe seems trustworthy because they share your ideology and adhere to it unwaveringly. Their actions are highly predictable because they’re guided by the same ideology that guides you.
Of course, the price of admission to this tribe is the loss of your freedom to question the group’s ideology. If you start voicing critical thoughts about the group’s ideology, you’ll find yourself viewed as an outsider—someone who’s no longer to be trusted. At best, you’ll be ignored by those who used to be your compatriots; at worst, you’ll be shunned or attacked.
Uninstalling True Belief
Once you’ve become a true believer, how can you reverse the process of belief installation and escape true belief? The only way to escape true belief is to start seriously considering other ways of looking at things. This opens the possibility of recognizing and acknowledging the extent to which you’ve become a prisoner of your group’s belief system.
Unfortunately, for true believers, there’s a big obstacle to considering alternative beliefs: when you’re a true believer, your beliefs no longer seem like beliefs; they seem like facts or obvious truths. The possibility that there could be alternative ways of looking at things doesn’t even cross your mind. When you’re a true believer, you don’t consider yourself a true believer; you consider yourself lucky to have found the truth. So, to uninstall true belief, you must cultivate the willingness to question the most important truths that you currently take for granted.
True belief can be thought of as a set of deeply ingrained habits of thought and behavior—somewhat like an addiction. Recovering from this addiction involves learning to think and behave differently, and that requires effort. As I mentioned earlier, one reason true belief feels so good is the righteous certainty that goes with it; to recover from true belief, you have to let go of that certainty. This can be hard work, but the eventual rewards are greater mental and behavioral flexibility and greater effectiveness.
When you’ve been part of a dogmatic, zealous, belief-installing group, you can bet that the group won’t support your efforts to reduce your involvement and uninstall its ideology from your mind. If you try to help other group members uninstall the ideology too, the group will be even less supportive. Either way, expect your efforts to trigger negative reactions from the group.
If you’ve become highly involved in such a group—to the point where you’ve become dependent on the group to get many of your needs met—the group’s negative reactions may threaten many important needs of yours. Because of this, it can be very difficult to uninstall a group’s ideology while you’re still highly involved in that group. If you want to be successful, you’ll need to reduce your involvement and find other ways of getting your needs met.
What if you’re involved in a group, you recognize some belief-installing dynamics in the group, and you want to change the group to make it less exploitative—less belief-installing? Is this possible? Perhaps, but it likely won’t be easy. You’ll need to name the exploitative dynamics and advocate for changes in the group’s norms. Mostly, you should expect to be ignored. If you start gaining some traction, you can expect considerable pushback from many group members—especially those who benefit from the existing exploitative dynamics.
Preventing True Belief
Is it possible to participate in a belief-installing group without becoming a true believer? I think so, but doing so requires taking some precautions. You’ve already gotten started on the most important precaution, which is simply to understand how belief installation works. With this understanding, you can recognize the factors that put you at risk. Now, let’s explore what you can do to counteract each of these risk factors.
Let’s start with big promises. The key to counteracting big promises is to recognize the emotional dynamics that make them work. A group that makes big promises is like a manipulative lover in an exploitative relationship. The group tries to manipulate and control you through seduction, intimidation, or both. It tries to seduce you with promises of paradise or intimidate you with warnings of disaster, making big promises that paradise can be achieved or disaster averted if you get involved in the group and buy into its ideology.
If you want to be involved in a group that makes big promises, there are a couple of things you can do to counteract their manipulative power. The first is to cut the promises down to size—that is, to transform them into realistic expectations. Most spiritual groups, religious groups, and political groups have something helpful to offer; all groups have problems, too. See if you can look past a group’s big promises to get a realistic view of both the benefits and the drawbacks of being involved in the group. Seek actual evidence of how the group has been helpful and harmful. What do you admire about the group’s members and leaders? What turns you off about them? Trust your instincts and come to your own conclusions.
A spiritual or religious group may claim that its ideas would be helpful for anyone, and a political group may imply that its ideas would be beneficial for society at large. However, the reality is that each group’s ideology is helpful within a certain context—for instance, a group’s ideology may be helpful to a certain set of people having certain goals and certain issues, but less helpful or even harmful for others. So, another way to cut big promises down to size is to look for the limits of the helpfulness and effectiveness of each group’s ideology.
Another way to counteract the manipulative power of big promises is to reduce your emotional vulnerability to them. Big promises attempt to hook into our deepest desires and fears. If you find yourself getting hooked by big promises, ask yourself how you’re getting hooked. What is it that you desire or fear? What’s going on inside you that’s allowing you to be seduced or intimidated by the group’s promises? Every experience we must have and every experience we can’t allow makes us vulnerable to exploitation, so increasing your equanimity and your resilience—your ability to be okay with and adapt to a wide range of life’s ups and downs—is a great way to reduce your susceptibility to exploitation. When you recognize an emotional vulnerability within yourself, this is a great topic to discuss with a psychotherapist or trusted healer who can help you overcome it.
There’s nothing wrong with having big goals, and some groups have been effective in achieving remarkable results. The idea here isn’t to limit your aspirations but to limit your vulnerability to being exploited by manipulative, controlling group dynamics.
Now, let’s consider how you can counteract zeal and dogmatism. Every communication contains both an explicit and an implicit message; the explicit message is what is stated; the implicit message is how it’s stated. A zealous tone conveys the implicit message, “What I’m saying is urgent and important!” and a dogmatic tone conveys the message, “What I’m saying is the one and only truth!” If you’re unaware of these implicit messages, they can have a manipulative effect on you; bringing them into awareness counteracts that effect. So, when you encounter zeal or dogmatism, take note of them. For instance, you might say to yourself, “This person sounds pretty zealous and dogmatic.” Notice the implicit messages that are being communicated, and notice the effect that these messages have on you. Once you get good at spotting zeal and dogmatism, you’ll be much less susceptible to manipulation.
To counteract anti-intellectualism, recognize the benefits of critical thinking and open-mindedness. In political groups, notice how anti-intellectualism is a form of oppression. In spiritual groups and religious groups, avoid buying into the idea that you must let your intellect wither away in order to cultivate other faculties like attention, awareness, empathy, compassion, open-heartedness, and devotion. Recognize how a strong intellect empowers you and protects you from exploitation.
To counteract the risks of high involvement, avoid becoming overly involved in any given group and keep your life balanced. If your work requires a high level of involvement with a given group, cultivate friendships with people who are not involved with that group. If your friends are all highly involved with a given group, seek work in a different field. If your work and your friends are all connected to a given group, expose yourself to ideas from outside that group (via books, podcasts, and so forth).
By taking the precautions described here, I hope you’ll be able to explore spiritual, religious, and political groups more confidently while maintaining your ability to think clearly and critically about those groups and their ideas.
For Further Reading
- For insightful perspectives on how these dynamics are playing out in the realms of politics and culture, see “The Memetic Tribes Of Culture War 2.0,” by Peter Limberg and Conor Barnes, and Political Tribes: Group Instinct and the Fate of Nations, by Amy Chua.
- Indoctrination happens in professional realms too—even in science! For a detailed account of how belief installation has impacted the field of quantum physics, see What Is Real?: The Unfinished Quest for the Meaning of Quantum Physics, by Adam Becker.
- Sociologists have been studying the relationships between groups, ideologies, and exploitation for a long time now. If the ideas in this article seem interesting and you haven’t explored sociology much yet, I recommend you pick up an introductory sociology book. Here’s one I like: The Sociology Book: Big Ideas Simply Explained.
Photo “VSHn” by Conor Lawless is used under a Creative Commons Attribution 2.0 Generic license.