01 —The Study That Changed Everything
In 2018, researchers published a paper in Science that basically put numbers to what most of us already felt in our gut. They tracked every major disputed story shared on a prominent social media platform between 2006 and 2017. The dataset was massive: 126,000 stories, shared by roughly three million people, more than 4.5 million times.
The findings hit hard. False news — stories rated inaccurate by independent fact-checkers — reached 1,500 people roughly six times faster than accurate stories. It burrowed 20 times deeper into social networks. And at every step of the chain, people were 70% more likely to hit "share" on the lie than on the truth.
This wasn't opinion or media commentary. This was eleven years of cold, hard data from the biggest misinformation study ever done at the time. And what it found should bother all of us.
False news spreads farther, faster, deeper, and more broadly than the truth in all categories of information.
— Research published in Science, Vol. 359, Issue 6380, 2018
They looked at every category — politics, urban legends, business, terrorism, science, entertainment, natural disasters. Across every single one, the lies beat the truth on every measure of spread. No exceptions.
02 —The Surprising Culprit: It's Not Bots
Here's what caught the researchers off guard. Everyone expected bots to be the villain — automated accounts blasting fake content everywhere. But when they stripped all the bots out of the dataset, the numbers barely moved.
Bots shared true and false stories at roughly the same rate. The massive speed advantage that lies have over the truth? That comes almost entirely from real human beings.
We are the algorithm. Actual people — not machines — are 70% more likely to pass a falsehood along than a piece of verified info, and the lies they share travel six times faster. This isn't a technology crisis. It's a human psychology crisis.
So why do we share false information?
The answer comes down to one thing: novelty. False stories tend to be more unexpected than true ones. They contain information that feels surprising and unheard of. And human brains are biologically hardwired to pay attention to — and share — novel information.
Evolutionarily, this makes sense. If something bizarre or dangerous is suddenly happening in your environment, spreading the word to your group is a survival mechanism. The trap of the digital age is that a sensationalized political quote or a terrifying false health claim hijacks that exact same survival circuitry.
1. A piece of false or misleading information is introduced — often containing a grain of plausible detail, packaged with a startling or emotionally charged headline.
2. The claim triggers a strong emotional response — fear, outrage, disgust, or excited disbelief. This emotional arousal directly lowers the threshold for sharing.
3. Research shows that most viral content spreads fastest within the first 15–30 minutes. Speed is everything in the information cascade.
4. Each reshare exposes the content to new networks. Unlike truth, which tends to stay within the original audience, false content jumps between communities.
5. By the time a correction or debunking is published, the false version has already reached 20× more people. Corrections rarely catch up — and when they do, far fewer people see them.
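The cascade mechanics above can be sketched with a toy branching model. Everything in this sketch is illustrative: `share_prob`, `audience_per_share`, and the six-step depth are assumptions, not figures from the study. The point is only that a per-step sharing advantage of roughly 70% compounds, step after step, into a far larger total audience.

```python
# Toy model of an information cascade: each person who sees an item
# shares it with some probability, exposing a fixed number of new people.
# All parameter values below are made up for illustration.

def expected_reach(share_prob: float, audience_per_share: int, steps: int) -> float:
    """Expected cumulative audience after `steps` rounds of resharing."""
    seen = 1.0      # the original poster's first viewer
    frontier = 1.0  # people newly exposed in the current round
    for _ in range(steps):
        frontier = frontier * share_prob * audience_per_share
        seen += frontier
    return seen

# A true story shared with probability 0.05 per viewer...
true_reach = expected_reach(share_prob=0.05, audience_per_share=25, steps=6)
# ...versus a false one shared 70% more often (0.085 per viewer).
false_reach = expected_reach(share_prob=0.085, audience_per_share=25, steps=6)

print(f"true story:  ~{true_reach:,.0f} people")
print(f"false story: ~{false_reach:,.0f} people")
print(f"ratio: {false_reach / true_reach:.1f}x")
```

With these hypothetical numbers, a 70% per-viewer sharing edge turns into an audience more than eleven times larger after just six rounds of resharing — which is why a head start measured in minutes is so hard for a correction to overcome.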
03 —The Emotions That Drive Sharing
Not all lies go equally viral. The ones that travel farthest hit very specific emotional buttons. When researchers compared people's reactions to false vs. true stories, a pattern jumped out immediately.
True stories mostly made people feel anticipation and trust. Useful, but boring. False stories triggered something way more powerful: surprise, fear, disgust, and outright anger.
These are all high-arousal emotions — the kind that make you act before you think. When a headline makes your blood boil or your jaw drop, you're already reaching for the share button before you've finished reading the second paragraph.
How false content triggers each emotion:
Bar chart: relative emotional response rates associated with sharing behaviour, based on sentiment analysis of resharing patterns.
The more novel and emotionally provocative the information, the more likely humans are to share it — regardless of whether it is accurate.
— Sinan Aral, MIT Sloan School of Management
04 —A Brief History of Viral Lies
This problem is not new. Humans have been falling for — and spreading — false information for centuries. The internet just poured rocket fuel on a very old fire.
In 1835, a New York newspaper published a series of articles claiming astronomers had discovered life on the moon — complete with bat-winged humanoids. The story spread globally before any correction reached most readers.
In 1938, a dramatised radio adaptation of a science fiction story caused widespread anxiety when thousands of listeners missed the introduction and believed a fictional alien invasion was a live news broadcast. A single medium. Enormous reach. No fact-check possible.
Research begins documenting how false stories on early social media platforms outperform accurate ones. The infrastructure for viral misinformation — instant global reach, algorithmic amplification, no editorial gatekeeping — is now fully in place.
In 2018, the landmark paper quantifies the gap definitively: 6× faster, 20× deeper. Policymakers, platforms, and researchers begin taking the scale of the problem seriously.
In 2020, the WHO coins the term "infodemic" during the COVID-19 pandemic to describe the parallel spread of health misinformation alongside the virus itself. Studies find false health claims spreading across platforms before most official guidance can reach the public.
05 —Why Corrections Fail
You'd think that once a false story gets publicly debunked, the correction would spread just as quickly. But it doesn't. Not even close.
Multiple studies show that corrections and retractions are shared at a fraction of the rate of the original claim. One analysis found that retractions get roughly 1% of the engagement of the initial lie. Long before a fact-checker can hit "publish", the false version has already cemented itself as truth for millions of people.
This runs headfirst into a psychological roadblock researchers call the backfire effect. When you directly confront someone with facts that contradict a deeply held belief, their brain can double down on the falsehood. If the correction feels like an attack on their identity, the mind builds walls against it.
The Memory Problem
There's another layer that makes this worse. Your brain doesn't store facts like files on a computer — it rebuilds them from scratch every time you try to remember something. So a false story you read three weeks ago and later saw "corrected"? Your brain might still pull up the original lie, because that first version burned itself in deeper. The version you hear first almost always wins in memory, even if you logically know it was wrong.
"I'll believe it when I see it" is actually backwards for misinformation. Most people believe it the moment they read it, and then keep believing it no matter what they see afterwards.
06 —The Real-World Cost
Misinformation isn't just an abstract political talking point. It causes actual, measurable damage to everyday life.
In health: Panic-inducing health claims cause people to delay life-saving treatments, try dangerous folk remedies, and bypass accurate medical guidance. Researchers have actually traced preventable deaths right back to viral misinformation during major disease outbreaks.
In finance: Fabricated rumours about companies or investments drain bank accounts. Studies looking at financial markets found that viral lies can measurably crash or pump stock prices — and by the time the truth comes out, the trades have already been locked in.
In relationships: Surveys repeatedly show that fake news destroys trust between loved ones. Families are torn apart over conflicting online realities. When a society completely loses its shared baseline of facts, the social fabric starts to unravel.
The speed gap between a lie and its correction isn't a glitch. It exposes something fundamentally broken in how humans process attention and emotion.
— MIT Media Lab, Computational Journalism research
07 —What You Can Actually Do
The numbers look grim, but you aren't helpless. A growing body of evidence proves that a few tiny changes to your daily scrolling habits can short-circuit the entire misinformation machine.
Six habits worth building into your verification routine:
Studies show that a simple prompt asking "is this accurate?" before sharing reduces misinformation sharing by 20–30%. The pause breaks the automatic emotional-to-action chain.
Most viral misinformation travels through several layers before reaching you. Search for the original source — not the article sharing it, but the actual study, event, or quote being referenced.
A real story of significance will be reported by multiple independent outlets. If you can only find a single source for a startling claim, that's a major warning sign.
If a story makes you feel immediately outraged, vindicated, or frightened — that is precisely the moment to slow down. High emotional arousal is the most reliable predictor of uncritical sharing.
Multiple independent fact-checking organisations operate globally and publish their methodology. Checking verified fact-checking outlets before sharing a surprising claim takes under 60 seconds.
If you shared something that turned out to be false, post the correction to the same audience. You are unlikely to reach everyone — but you will reach some, and you'll build credibility in the process.
08 —Where We Go From Here
That 6× number is already a few years old. Newer research suggests the gap is actually getting wider, not smaller. There's more content than ever fighting for your attention, and the stuff that triggers the strongest emotional reactions still wins the algorithm lottery every time.
The conversation has moved past "is this a problem?" to "what actually works?" Teaching digital literacy to kids shows real promise. Adding small speed bumps before the share button — a prompt, an extra click, an "are you sure?" — consistently reduces mindless resharing. Some platforms are starting to experiment with ranking accuracy over pure engagement. All of these help, but none of them is a silver bullet.
The honest truth is that changing your own habits is the single most effective weapon against misinformation. Not because platforms shouldn't do better (they absolutely should), but because every lie that goes viral travels through individual human decisions. Every time you pause before sharing, you break one link in the chain. Do that enough times, across enough people, and the math starts to shift.
Truth doesn't spread slowly because it's weak. It spreads slowly because it rarely makes your blood boil or your jaw drop. Accurate information is usually… calm. And calm doesn't go viral. Which means if we want the truth to travel, we have to carry it ourselves — on purpose, every single time.
References & Sources
Disclaimer: The content on this page is provided for informational and educational purposes only. All statistics, research findings, and data referenced are sourced from publicly available peer-reviewed studies and reputable institutions; sources are listed above. This article does not constitute professional, legal, medical, or psychological advice. The emotional response percentages shown in the bar chart are illustrative representations based on published research trends and should not be interpreted as exact universal figures. Readers are encouraged to consult original sources for full research methodology and context. Random Internet Facts does not claim authorship of or affiliation with any of the cited studies or institutions.