You've probably had this experience: you hear a political claim, it immediately feels true, and no amount of evidence to the contrary seems to change your mind. Or the opposite — someone you disagree with says something reasonable, and you can't bring yourself to agree even when part of you knows they have a point.
This isn't a character flaw. It's how human brains work. We evolved to make fast decisions, trust our group, and avoid being wrong in public. None of those instincts serve us well when we're trying to evaluate a healthcare bill or figure out whether our senator kept their promises.
The good news: once you know the names of these shortcuts, you start noticing them everywhere — in news coverage, in campaign ads, in your own reactions. That noticing is the first step toward thinking more clearly about politics.
Here are the six that show up most often, with illustrative examples from across the political spectrum.
If you already believe that government spending is wasteful, you'll notice every story about a failed government program. If you believe corporations are corrupt, you'll notice every story about corporate malfeasance. Both things happen constantly — you're just keeping score selectively.
This is confirmation bias: the tendency to search for, favor, and remember information that confirms what we already believe. It's not that we're lying to ourselves — our brains genuinely process confirming information more easily, and dismissing evidence that challenges our beliefs takes real mental effort.
Politicians and media outlets know this. They're not trying to change your mind — they're trying to confirm it, because that's what keeps you engaged.
Before sharing a political claim, ask: what would the other side say about this? Find one credible source that disagrees with you and actually read it. You don't have to change your mind — but you'll understand the full picture.
When a policy you support fails, you explain it as bad luck, underfunding, obstruction, or forces outside anyone's control. When a policy your opponents supported fails, it's proof they were incompetent or corrupt.
This is the fundamental attribution error applied to politics. It's why the same set of economic statistics can look like success to one side and failure to the other — we're not just disagreeing about the facts, we're disagreeing about whose fault the facts are.
This is directly relevant to how PolicyLogic scores promises. Our "Their Role" score is specifically designed to account for this — a senator who couldn't pass a bill because they were in the minority party shouldn't get the same score as one who had every advantage and still failed. Attribution matters.
When evaluating a political outcome, ask: would this have happened anyway? What else was going on? What did similar governments in other places experience? The answers won't always exonerate the person in question — but they'll tell you how much credit or blame actually belongs to them.
"The goal isn't to have no opinions. It's to make sure your opinions are based on what's actually true — not just what feels true."
Humans are tribal by nature. For most of history, your group was your survival. Trusting your group and being suspicious of outsiders was rational. That instinct doesn't go away just because we're talking about tax policy.
In-group bias explains why the same policy can be popular or unpopular depending on who proposes it. Studies have repeatedly found that people change their position on an issue when they're told the other party supports it — not because the policy changed, but because their team loyalty kicked in.
Test yourself: if the exact same policy were proposed by the other party, would you still support or oppose it? If your answer changes, that's worth examining. The policy is the policy — who proposes it shouldn't change whether it works.
This is the availability heuristic: we judge how likely or common something is based on how easily an example comes to mind. If you can vividly picture it, your brain tells you it must be frequent. This is why dramatic, emotional news stories have such an outsized effect on how we understand policy problems.
After a high-profile crime, people consistently overestimate crime rates. After a dramatic plane crash, flying feels more dangerous than driving — even though driving is, statistically, far more dangerous. The same thing happens with policy: a single vivid example of welfare fraud, a single story of a violent crime committed by an immigrant, a single story of corporate executives getting massive bonuses after a bailout — these stories shape how we think about entire systems, even when they're statistically rare.
When a vivid story is shaping your view of a policy, ask: how common is this, actually? Look for base rates, not just examples. One real story is not a trend — and policy designed around outliers often fails the majority.
Most of us think we form opinions by gathering evidence and drawing conclusions. In reality, we usually decide what we want to believe first — then we look for evidence that supports it, and we scrutinize any evidence against it much more harshly than evidence for it.
This is motivated reasoning — and it's not limited to uninformed people. Research has shown that smarter, more educated people are often better at motivated reasoning, because they have more sophisticated tools for rationalizing what they already believe.
In politics, motivated reasoning shows up whenever someone dismisses an inconvenient fact as "biased," "from a liberal/conservative source," or "just statistics." The dismissal happens before the evaluation.
When you find yourself dismissing a source or a fact, ask: am I doing this because it's actually unreliable — or because I don't like what it's saying? Those are different things. Apply the same standard of scrutiny to evidence that confirms your beliefs as to evidence that challenges them.
"You're either with us or against us." "If you support gun rights, you don't care about dead children." "If you support stricter immigration enforcement, you're racist." "If you oppose the policy, you want people to suffer."
These are false dichotomies — and they're one of the most common rhetorical tools in political messaging, because they force you to pick a side and make it feel like the stakes are absolute.
Real policy almost never works this way. Most serious policy questions involve tradeoffs, competing values, and multiple possible approaches — none of which are perfect. The false dichotomy collapses that complexity into a loyalty test.
When you're presented with a choice between two extreme options, ask: are those really the only two? Almost always, there's a spectrum of approaches between them. The person forcing the binary usually benefits from it — they want you choosing their option, not thinking about what else might be possible.
"Knowing about these biases doesn't make you immune to them. It just gives you a better chance of catching yourself."
So what do you do with all this?
The point isn't to become a political skeptic who trusts nothing. It's to become a more deliberate thinker — someone who can tell the difference between a claim that's persuasive because it's true and a claim that's persuasive because it's exploiting a shortcut in your brain.
A few practical habits that help:
Slow down the feeling of certainty. When something immediately feels obviously true, that's often a signal to look closer — not to dismiss it, but to verify it. The most effective misinformation is the kind that feels like something you already knew.
Ask who benefits from you believing this. Every piece of political communication is trying to get you to do or believe something. That doesn't make it wrong — but it's worth knowing whose interests are being served.
Read the original source. Most political outrage travels through several layers of interpretation before it reaches you. A "study that found X" rarely says exactly what the headline claims. Find the study.
Disagree with your own side sometimes. If you never find fault with the people you vote for, that's a sign your evaluation is based on loyalty rather than performance. The scorecards on this site are designed to give you that honest picture — for the officials you like and the ones you don't alike.