When the World Tilts: Cognitive Dissonance & the Collapse of Belief Systems

This month, I’m shifting focus from the mental health care system to a psychological phenomenon that is both fascinating and alarmingly pervasive.

As someone who works in mental health philanthropy, I find this phenomenon particularly relevant, not only because it affects individuals’ well-being, but because it presents an enormous challenge to social cohesion, public health, and trust in institutions. If we want to support resilience at the individual and collective level, we must understand what happens when belief systems break down, and why some people cling to them even more tightly in the face of evidence to the contrary.

I’m talking about what happens when everything you believed to be true is revealed to be a lie, or at least deeply flawed. When the narrative you’ve trusted collapses under the weight of new facts, global events, or public reckoning, the result can be more than ideological discomfort; it can be a profound psychological rupture. This phenomenon is called cognitive dissonance, and it helps explain everything from post-WWII soul-searching in Germany and Japan, to the unshakable conviction of flat earthers and conspiracy culture, to the hardened resolve of MAGA loyalists in the United States. Each of these could be its own article, but I want to take a high-level view of each to show how widespread this phenomenon is.

The Psychology of Dissonance

The term, coined by psychologist Leon Festinger in the 1950s, refers to the psychological stress experienced when a person holds two or more contradictory beliefs, values, or ideas at the same time. When people are confronted with information that conflicts with their existing worldview, they experience internal tension and seek to resolve it. But rather than update their beliefs, people often rationalize, deny, or double down. As Festinger observed in his book When Prophecy Fails, “[a] man with a conviction is a hard man to change. Tell him you disagree, and he turns away. Show him facts or figures and he questions your sources” (Festinger et al., 1956).

People typically reduce dissonance by reinterpreting conflicting information, seeking out supportive sources, or minimizing the importance of the contradiction. For example, a smoker who knows the health risks may rationalize the habit by citing anecdotal exceptions or emphasizing the stress-relief benefits (Ichikawa et al., 2011).

Modern neuroscience supports Festinger’s theory with studies showing that the anterior cingulate cortex, a brain region involved in emotional regulation and conflict monitoring, becomes active during episodes of dissonance. This physiological response reinforces how psychologically uncomfortable it can be to confront a deeply held but inaccurate belief (Ichikawa et al., 2011).

Historical Reckoning

In the aftermath of World War II, citizens of Nazi Germany and Imperial Japan faced unbearable truths. In Germany, the exposure of concentration camps shocked a public that had either believed in state propaganda or chosen willful ignorance. The psychological dissonance was immense—how could an advanced society justify (or deny) such atrocities (Buruma, 1995)?

The process of denazification and the Nuremberg trials forced the German public into moral reckoning, yet responses varied. Some experienced deep remorse, others remained silent, and many turned to quiet revisionism. This was not just political or social upheaval; it was psychological disorientation on a massive scale. As historian Ian Buruma writes in The Wages of Guilt, the existential crisis stemmed not only from military defeat, but from the collapse of the moral frameworks that had justified the Nazi regime (Buruma, 1995).

Japan’s reckoning took a different path. The Emperor, once believed to be divine, renounced his divinity, shattering the ideological backbone of Imperial Japan. The state’s narrative of rightful expansion and racial superiority crumbled. But unlike Germany, Japan’s post-war response leaned more heavily on denial, selective memory, and cultural silence. Official narratives downplayed responsibility, and history textbooks often softened or omitted wartime atrocities altogether. This has carried over into international memory as well: the Holocaust is taught in schools, but Unit 731 typically is not (Buruma, 1995).

These divergent responses illustrate how cognitive dissonance is resolved not just by individuals, but by entire societies. Germany's halting but visible process of memorialization stands in contrast to Japan's quieter, often unresolved discomfort. In both cases, generational trauma took root, shaping national identity, foreign policy, and public discourse for decades to come. And the echoes of that dissonance still resound today, in disputes over war memorials, restitution, and historical truth (Buruma, 1995).

Conspiracy as Comfort

Conspiracy movements like QAnon, flat earth theory, and anti-vaccine activism don’t flourish simply because people are uninformed. In fact, many adherents are intelligent and curious. What draws them in is not ignorance, but a need for certainty, control, and community in an increasingly unpredictable world (Weill, 2022).

These movements provide psychologically satisfying explanations as they reduce complexity, identify a clear enemy, and flatter believers as being members of an enlightened minority. In doing so, they ease the dissonance between how people want the world to be (stable, fair, and understandable) and how it often feels (chaotic, unjust, and overwhelming) (Weill, 2022). As Carl Sagan once warned, “[u]nable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness.” The ease with which these narratives offer clarity and comfort makes them dangerously seductive, especially when reality feels overwhelming.

A vivid example of cognitive dissonance in action appears in the documentary Behind the Curve, when flat earther Patricia Steere discusses the wild beliefs others hold about her, saying:

The funny thing is, is I’m a conspiracy realist but there are conspiracies about me. It started off with me being called a shill as if I’m doing this for money… Then I never thought that the name Patricia, which is my first name, would be spun into the fact that the last three letters are “cia,” which means I’m in the CIA. As if the government would be that dumb but OK. Other things that have been said are that I’m a reptilian and people see my eyes shapeshift while I’m on YouTube, that I drink blood, and the most recent one is that I’m transgender. Now the thing about all these things is I can’t prove any of it wrong. I could, and have, shown my birth certificate, my drivers license, photos of myself as a child and they’ll say well if you’re CIA all of that stuff can be constructed. People will still say you don’t have a family, you don’t have a brother and sister. There is nothing I can do. Anybody can believe what they want to believe, but I wonder if in their hearts people who do believe that know they’re lying. Or are they so conspiratorial that they actually believe it. Then it makes me worry about maybe things I believe in; am I like another version of that. But I know I’m not.

For a moment, she hesitates, visibly wrestling with the implication that perhaps her beliefs are also inaccurate. But rather than adjust her worldview, she doubles down, preserving her belief, and her identity. This perfectly illustrates one of the core tenets of cognitive dissonance theory mentioned above: when facts challenge a deeply held belief, especially one tied to identity, the facts often lose.

Conspiracy theories can thus serve as emotional scaffolding, holding up not just beliefs, but one’s sense of self. Belonging to a community of "truth seekers" offers affirmation and support. And when dissonance arises, group loyalty and personal identity can make belief revision feel like an existential threat (Weill, 2022).

However, it’s important to acknowledge that some conspiracies are rooted in elements of truth, showing that mistrust is sometimes warranted, and that skepticism can be a rational response to systems that have failed the very people they are meant to serve. This makes overcoming cognitive dissonance that much more challenging. History offers no shortage of examples, particularly in science and healthcare, sectors currently facing a sharp erosion of public trust. Unethical medical trials, industry-funded research, and regulatory failures have all at times confirmed conspiracy theorists’ worst fears. These moments of institutional failure complicate the landscape, helping to explain why anti-science and anti-establishment movements have gained such mainstream traction.

MAGA & the Politics of Identity

In the United States, the MAGA movement has evolved from a political position into a full-fledged identity. For many supporters, it provides not just policy preferences, but a moral worldview and social belonging. And as we’ve seen in the previous examples, when that worldview is challenged, whether by election outcomes, criminal indictments, or scientific consensus, the typical response is not reassessment but deeper entrenchment. As political psychologist Jonathan Haidt explains, “once people commit to a moral worldview, they become blind to counterevidence and often react with hostility to those who challenge it” (Haidt, 2012).

For those who feel economically displaced or culturally alienated, the MAGA movement offers a cohesive narrative: someone else is to blame for your suffering. It transforms personal frustration into political mission, reinforcing feelings of betrayal while offering validation and purpose. Belonging is a powerful psychological force, and when belief becomes a social identity, walking away can feel like abandoning your community, your values, and yourself (Cherwitz & Zagacki, 2025).

This process is further amplified by motivated reasoning and the tribal nature of modern politics. People unconsciously twist or reject facts that don’t align with their in-group’s narrative. In this environment, truth becomes not a matter of evidence, but of loyalty. Social media and partisan news further insulate individuals from dissonant viewpoints, creating self-reinforcing echo chambers that make cognitive dissonance harder, and more painful, to resolve. The result is a collective form of dissonance reduction: facts that contradict the identity are dismissed, while the identity itself becomes more rigid, more defiant, and more essential (Cherwitz & Zagacki, 2025).

Love & Control

Nowhere is cognitive dissonance more personal, or more devastating, than in abusive relationships. Survivors are often trapped in a psychological bind: they believe their partner loves them, and yet they are being hurt. The contradiction between care and control generates an excruciating inner conflict. And instead of walking away, many rationalize the abuse, believing it must be their fault or that things will improve (Gaba, 2021).

This dissonance is intensified by the cultural narratives we’re taught about love, loyalty, and redemption. Abusers often alternate cruelty with affection (what psychologists call “intermittent reinforcement”), using techniques such as “love bombing” and “gaslighting” that create powerful trauma bonds. Over time, the person’s internal world becomes distorted, with the abuser at its center. There are striking parallels between abusive relationships and coercive control systems like cults. In both, doubt is discouraged, information is manipulated, and independence is eroded (Gaba, 2021).

For mental health care providers, this presents a critical challenge. Escaping abuse is not only about physical safety; it’s also about cognitive recovery, helping survivors recognize the dissonance and rebuild their sense of self. Therapeutic support must be trauma-informed, empowering, and above all, compassionate (Gaba, 2021).

Cults & Multilevel Marketing

At their extremes, belief systems, whether ideological, conspiratorial, or relational, can begin to resemble cults. What differentiates a “cultic” system isn’t just devotion to a cause or person, but a set of psychological mechanisms designed to suppress doubt and enforce conformity. Psychiatrist Robert Jay Lifton, who studied ideological indoctrination in contexts such as Maoist China and Nazi Germany, identified eight key criteria of thought reform. Among these were milieu control (the regulation of information and communication), sacred science (the belief that the group’s ideology is beyond question), and loaded language (a lexicon that reduces complex realities to slogans) (Lifton, 1961).

These features are increasingly visible in digital spaces. Online echo chambers, where dissent is met with hostility and algorithms reinforce ideological purity, mirror the psychological insulation Lifton warned about. Once immersed in such an environment, adherents often find it difficult to evaluate evidence that contradicts their worldview, not because they are irrational, but because their social belonging, identity, and even sense of safety are entangled with the belief system itself (Barnty, 2024).

Multilevel marketing (MLM) schemes, while not always as extreme as cults, often operate using similar psychological tools. They promise financial freedom, personal empowerment, and a tight-knit community, especially appealing to women seeking flexible work and connection. But as sociologist Dr. Janja Lalich has argued, many MLMs exploit emotional vulnerability and social capital to maintain control. Members are often told that failure is due not to a flawed system, but to their own lack of faith or effort, an insidious reframing that intensifies cognitive dissonance (Lalich, 2004).

In both cults and MLMs, dissent is framed as betrayal, questions are discouraged, and a narrative of "us versus them" maintains group cohesion. This manipulation of belief and emotion traps individuals in a cycle of hope, shame, and renewed commitment, even when reality fails to deliver.

The New Face of Deprogramming

Historically, “deprogramming” involved dramatic interventions, efforts by family members to forcibly extract loved ones from cults and other high-control, coercive environments. These confrontational tactics, popularized in the 1970s and ’80s, often caused further psychological harm and reinforced mistrust. In many cases, they created new traumas without resolving the underlying dissonance or the vulnerabilities that had drawn the person into the high-control environment in the first place (Hassan, 2023).

Modern approaches to deprogramming have shifted dramatically. Today, mental health professionals, exit counselors, and peer support networks favor trauma-informed, person-centered strategies that focus on restoring autonomy and critical thinking, not imposing new beliefs. The process is often slow and deeply relational. As cult recovery expert Dr. Steven Hassan explains, helping someone “exit” a high-control belief system requires patience, empathy, and respect for their agency. Rather than directly attacking a person’s beliefs, practitioners aim to gently explore inconsistencies, build trust, and reconnect individuals with their pre-cult identity and values (Hassan, 2023).

Techniques such as motivational interviewing, harm reduction, and psychoeducation help individuals regain cognitive flexibility, the ability to entertain multiple perspectives without fear. Peer-led recovery groups also play a crucial role. Hearing from others who have navigated similar journeys can reduce shame and reinforce the idea that questioning one’s beliefs is not a weakness, but a strength (Barnty, 2024).

This softer, relational approach echoes what we know about trauma recovery, which is that safety is the prerequisite for change. As Robert Jay Lifton noted, “psychological coercion works by isolating the individual from any alternate framework of reality” (Lifton, 1961). Therefore, healing begins by reintroducing those alternate frameworks in non-threatening ways, through community, validation, and the rebuilding of personal narrative (Barnty, 2024).

Deprogramming is no longer about “breaking” someone out of a system; it’s about creating the emotional conditions in which they can begin to question it themselves.

Why This Matters for the Mental Health Sector

Cognitive dissonance is more than a psychological quirk; it often marks the beginning of a deeper internal reckoning. When left unaddressed, it can harden into denial, paranoia, or even aggression. But when met with empathy, support, and systems designed for healing, it can serve as a catalyst for transformation. Whether it’s a nation confronting historical wrongdoing, a person leaving an abusive relationship, or a community emerging from conspiracy-fueled division, healing often begins with acknowledging the tension between belief and reality (Festinger et al., 1956).

But this process doesn’t unfold in isolation. Mental health care must extend beyond clinical diagnosis and treatment to encompass the broader social, relational, and cultural systems people live within. Investing in trauma-informed care, survivor-led programs, peer support networks, and public education that encourages critical thinking and emotional safety helps lay the groundwork for this psychological reckoning (Hassan, 2023). Understanding what happens when deeply held worldviews are challenged is essential, not only to prevent further polarization, but to support mental resilience and collective healing.

Bibliography

Barnty, B. (2024). Exit strategies: Deprogramming and recovery from cult influence. ResearchGate.

Buruma, I. (1995). The Wages of Guilt: Memories of War in Germany and Japan. Farrar, Straus and Giroux.

Cherwitz, R., & Zagacki, K. (2025). Why MAGA Republicans’ cognitive dissonance is so hard to combat. Chicago Tribune.

Festinger, L., Riecken, H., & Schachter, S. (1956). When Prophecy Fails. University of Minnesota Press.

Gaba, S. (2021, May 24). Narcissists, relationships, and cognitive dissonance. Psychology Today.

Haidt, J. (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon.

Hassan, S. A. (2023). Beyond cult "deprogramming". Psychology Today.

Ichikawa, N., Siegle, G. J., Jones, N. P., Kamishima, K., Thompson, W. K., Gross, J. J., & Ohira, H. (2011). Feeling bad about screwing up: Emotion regulation and action monitoring in the anterior cingulate cortex. Cognitive, Affective, & Behavioral Neuroscience, 11(3), 354–371.

Lalich, J. (2004). Bounded Choice: True Believers and Charismatic Cults. University of California Press.

Lifton, R. J. (1961). Thought Reform and the Psychology of Totalism: A Study of "Brainwashing" in China. Norton.

Weill, K. (2022). Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything. Algonquin Books.
