Here's something that should bother you: propaganda works on smart people. Not just on the poorly educated, or the easily led, or the people on the "other side." It works on everyone, because it's designed to exploit features of the human brain that existed long before modern media — features that, in other contexts, are genuinely useful.

Psychologists Daniel Kahneman and Amos Tversky spent decades mapping these patterns beginning in the 1970s. Their work — which earned Kahneman the Nobel Prize in Economics in 2002 — showed that human decision-making deviates from rationality in systematic, predictable ways. They called these deviations cognitive biases. Here are the ten that manipulators rely on most — and what to do about each.

Confirmation Bias

What it is: The tendency to search for, interpret, and remember information that confirms what you already believe — while dismissing or underweighting information that challenges it.

How it's exploited: Political media, partisan news outlets, and social media algorithms all feed you more of what you already agree with. The goal is to keep you comfortable and keep you clicking. When you only see information that confirms your worldview, it feels like you're learning — when actually you're just getting more certain about things you were already certain about.

The catch: Confirmation bias doesn't feel like bias. It feels like being right. The fix is deliberate: regularly seek out the strongest version of the argument you disagree with. Not to be converted — but to understand what you're actually arguing against.
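
To make the feed dynamic above concrete, here is a minimal toy sketch of the selection effect. This is not any real platform's code; the engagement formula, the leans, and every number in it are invented assumptions for illustration only.

```python
import random

# Toy model: story and user "lean" both range from -1 (one side) to +1 (the other).
# Assumptions (invented for illustration): people engage most with stories close to
# their own lean, and the feed ranks purely by predicted engagement.

def predicted_engagement(user_lean, story_lean):
    return 1.0 - abs(user_lean - story_lean) / 2.0  # 1.0 = perfect match, 0.0 = opposite extreme

def build_feed(user_lean, stories, k=10):
    # Rank every candidate story by predicted engagement and keep only the top k.
    return sorted(stories, key=lambda s: predicted_engagement(user_lean, s), reverse=True)[:k]

random.seed(0)
stories = [random.uniform(-1.0, 1.0) for _ in range(200)]  # a roughly balanced pool of stories
user_lean = 0.4                                            # a mildly partisan user

pool_avg = sum(stories) / len(stories)
feed = build_feed(user_lean, stories)
feed_avg = sum(feed) / len(feed)

print(f"average lean of all available stories: {pool_avg:+.2f}")  # near 0.0: the pool is balanced
print(f"average lean of the user's feed:       {feed_avg:+.2f}")  # near +0.40: the feed is not
```

The sketch's point is narrow: the feed never has to lie. It only has to select, and selection alone is enough to make one side of a balanced pool look like the whole conversation.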

The Availability Heuristic

What it is: We judge how likely or common something is by how easily examples come to mind. If we can think of lots of examples quickly, we assume the thing is common. If we struggle to think of any, we assume it's rare.

How it's exploited: Fear-based news coverage works through this bias. If you see 40 stories about violent crime in a month, you will feel like violent crime is everywhere — even if the actual crime rate is declining. Terrorist attacks receive massive media coverage relative to their actual statistical frequency, causing people to dramatically overestimate their personal risk.

The catch: Ask yourself: is this thing actually common, or does it just feel common because I've seen it a lot recently? Check the actual numbers.

The Bandwagon Effect

What it is: The tendency to adopt beliefs or behaviors because other people do. Also called herd mentality or social proof.

How it's exploited: "Everyone is saying..." "The majority of people believe..." "Scientists agree that..." These phrases are used to short-circuit individual evaluation. If it's what everyone thinks, why bother checking? Cult recruiters use this constantly — the sheer number of believers is presented as evidence that the belief must be correct.

The catch: Consensus matters in science, where it's built on evidence. But consensus is manufactured in politics, advertising, and media all the time. "Lots of people believe it" is a data point, not a proof.

The Authority Bias

What it is: The tendency to attribute greater accuracy to the opinion of an authority figure — and to be more easily persuaded by them — regardless of whether their authority is actually relevant to the claim.

How it's exploited: Tobacco companies hired doctors to appear in cigarette ads. COVID-era misinformation often featured people with medical credentials making claims outside their expertise. PR firms coached witnesses for congressional testimony. The white coat, the title, the credential — these trigger deference even when the person behind them has a conflict of interest.

The catch: Check whether the authority's expertise is actually relevant to the claim. And always ask: who is paying them?

The Anchoring Effect

What it is: When making decisions, people rely heavily on the first piece of information they receive (the "anchor") — even if it's arbitrary or wrong — and adjust from there rather than evaluating from scratch.

How it's exploited: A product is listed at $500, then "discounted" to $299. The $500 number was always fake — but it anchors your perception of value. In political messaging, the first characterization of an event sets the frame. If the first story you read calls a protest a "riot," every subsequent story about it will be processed through that frame.

The catch: Identify what the anchor is before you accept any framing. Ask: where did this starting point come from, and who chose it?

The In-Group/Out-Group Bias

What it is: Humans naturally form social groups and develop favoritism toward their group (in-group) and suspicion or hostility toward other groups (out-group). This is a deep evolutionary feature — it once helped with survival in small tribes.

How it's exploited: This is the engine of almost all political propaganda. Define the enemy clearly. Make them alien, threatening, subhuman. Assign all positive traits to the in-group and all negative traits to the out-group. Every authoritarian movement in history has used this bias as its primary tool for building loyalty and enabling violence.

The catch: When you notice you're evaluating a claim differently based on who made it — your side vs. their side — that's this bias. A true claim is true regardless of who says it.

The Sunk Cost Fallacy

What it is: Continuing to invest in something (time, money, belief, identity) because of what you've already invested — rather than based on the current evidence for whether it's worth continuing.

How it's exploited: Cults rely heavily on this. The more time, money, and relationships a member sacrifices, the harder it becomes to accept that the sacrifice was for nothing. High-control groups escalate demands precisely because each act of compliance makes the previous ones harder to abandon. Political identity works similarly — people stay loyal to a party or leader long after the evidence has turned, because admitting they were wrong means admitting years of investment were misguided.

The catch: The only relevant question is: given what I know now, is this worth continuing? What you already spent is gone. It shouldn't determine what you do next.

The Affect Heuristic

What it is: We use our current emotional state as a source of information when making decisions. When we feel afraid, we judge risks as higher. When we feel good, we judge things as safer. Emotional state shapes judgment — and it can be manufactured.

How it's exploited: Every political ad that opens with ominous music, a crying family, or a frightening image is exploiting this. The emotion is activated before any facts are presented. By the time the claim arrives, you're already in a threat-processing state — not an analytical one. This is why fear-based messaging is so common: it works by bypassing rational evaluation.

The catch: Notice the emotional temperature of any message before evaluating its claims. The question isn't "does this feel right?" — it's "is this true?"

The Illusory Truth Effect

What it is: Repeated exposure to a claim — regardless of whether it's true — increases the likelihood that people will rate it as true. Simply hearing or reading something multiple times makes it feel more familiar, and familiarity is unconsciously equated with accuracy.

How it's exploited: This is the entire premise of propaganda as a system. It doesn't matter if the claim is true — if it's repeated often enough, through enough channels, it becomes the baseline assumption. The tobacco industry's strategy of manufacturing doubt explicitly relied on this: flood the zone with competing claims until the public gives up trying to adjudicate and just accepts uncertainty.

The catch: Something being repeated constantly is not evidence that it's true. In a saturated media environment, it may instead be evidence of an organized campaign.

The Backfire Effect

What it is: When presented with evidence that contradicts a strongly held belief, some people don't update their belief — they hold it more strongly. The correction triggers a defensive response.

How it's exploited: This one is less actively exploited than simply accounted for. Manipulators know that once someone is deeply committed to a belief, direct confrontation often strengthens rather than weakens it. This is why cult members often become more devoted after predictions fail. It's also why simply showing people fact-checks doesn't automatically correct misinformation.

The catch: Be aware that you probably experience this too. When something you believe is challenged and you feel your defenses going up — that's the moment to slow down, not to dig in. The defensiveness itself is a signal worth examining.

Putting It Together

These ten biases don't operate in isolation. A skilled manipulator — whether a cult leader, a political operative, or an advertising agency — deploys several simultaneously. An emotionally primed message (affect heuristic), delivered by a credentialed authority (authority bias), repeated across multiple channels (illusory truth), targeting a group's identity (in-group bias), within an algorithm that only shows confirming information (confirmation bias) — that is a complete system.

The good news: naming the mechanism weakens it. Kahneman's work and subsequent studies suggest that metacognitive awareness — knowing that you're susceptible to a bias — improves decision-making even if it doesn't eliminate the bias entirely. You don't have to be immune. You just have to be watching.

They didn't ask if we wanted to understand this. Here it is anyway.

References

Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
Tversky, A. & Kahneman, D. (1974). "Judgment under Uncertainty: Heuristics and Biases." Science, 185(4157), 1124-1131.
Brañas-Garza, P. et al. (2021). "The Impact of Cognitive Biases on Professionals' Decision-Making." Frontiers in Psychology, 12.
Hassan, S. (1990). Combating Cult Mind Control. Park Street Press.
Nyhan, B. & Reifler, J. (2010). "When Corrections Fail: The Persistence of Political Misperceptions." Political Behavior, 32(2), 303-330.
Cialdini, R. (1984). Influence: The Psychology of Persuasion. HarperCollins.