On April 17, 2013, CNN reported that a suspect had been arrested in the Boston Marathon bombing. John King, the network's chief national correspondent, described the suspect as a "dark-skinned male." Other networks followed. The report was wrong. No arrest had been made. The FBI issued an extraordinary statement criticizing the media for "inaccurate" reporting that was "uninformed."

Within 48 hours of any major event, the information landscape is a disaster zone. And the evidence shows it is getting worse, not better.

## The Science of False Speed

A landmark 2018 study published in Science by researchers at MIT analyzed every major contested news story on Twitter from 2006 to 2017 -- some 126,000 stories, tweeted by 3 million people over 4.5 million times. Their finding: false stories were 70% more likely to be retweeted than true ones. True stories took approximately six times longer than false ones to reach 1,500 people. The researchers concluded that falsehoods spread faster because they are more novel, triggering more surprise and disgust -- emotions that drive sharing.

Craig Silverman's 2015 report for Columbia's Tow Center for Digital Journalism, "Lies, Damn Lies, and Viral Content," found that news organizations routinely publish unverified claims during breaking events, and that corrections, when they come, reach only a fraction of the original audience.

## Case Study: The Boston Marathon Bombing

The Boston Marathon bombing on April 15, 2013 remains the textbook example of breaking-news failure:

- CNN reported an arrest that had not happened, describing the suspect's skin color. John King later called it "embarrassing."
- The New York Post published a front-page photo of two innocent spectators, Salaheddin Barhoum and Yassine Zaimi, labeling them "Bag Men" with the subtitle "Feds seek these two pictured at Boston Marathon."
- Reddit users in the r/findbostonbombers subreddit incorrectly identified Sunil Tripathi, a missing Brown University student who had no connection to the bombing. His family was harassed; Tripathi, it later emerged, had died by suicide before the bombing.
- The Associated Press, Fox News, and the Boston Globe all reported an arrest at various points. None were accurate.

The actual suspects, Tamerlan and Dzhokhar Tsarnaev, were not publicly identified until the FBI released photos on April 18 -- three days after the bombing.

## Case Study: Sandy Hook

On December 14, 2012, initial reporting on the Sandy Hook Elementary School shooting was riddled with errors:

- The shooter was initially identified as Ryan Lanza -- actually the shooter's brother. Ryan was at work in New Jersey when his name went out on national television.
- Reports claimed the shooter's mother was a teacher at the school. She was not.
- Reports described a second shooter who did not exist.
- The weapon used was misidentified in early coverage.

Some of these errors were never fully corrected in public consciousness, creating fertile ground for conspiracy theorists who exploited the confusion to claim the entire event was staged.

## The Correction Problem

Here is the structural problem: corrections cannot outrun the original false report. Psychologists call this the "continued influence effect" -- even when people are shown corrections, the original misinformation continues to shape their understanding. Researchers Brendan Nyhan and Jason Reifler went further: their 2010 study, "When Corrections Fail," found that corrections can actually strengthen belief in the original false claim among those who are ideologically predisposed to believe it.

The asymmetry is brutal:

- A false report airs during prime breaking news coverage, reaching millions
- The correction is issued hours or days later, often in a lower-profile format
- Social media amplifies the original; the correction gets a fraction of the engagement
- People who saw the original may never see the correction

A University of Washington study of the Boston bombing found that misinformation was significantly more prevalent on Twitter than accurate information during the first hours after the attack, and that users who shared corrections often deleted their tweets, while the original false claims remained.

## The Economic Incentive

News organizations operate on attention. Breaking news drives traffic, ratings, and ad revenue. Being first is rewarded. Being right is quietly appreciated but rarely compensated.

The 24-hour cable news cycle and the social media attention economy have compressed the window for verification to nearly zero. A reporter who waits to confirm loses the scoop. A network that holds a story loses viewers. The incentive structure punishes caution and rewards speed -- even when speed means being wrong.

## What Actually Happens in the First 48 Hours

The pattern is remarkably consistent across events:

- Hours 0-6: Fragmentary, unverified reports. High error rate. Eyewitness accounts conflict. Casualty counts are wrong.
- Hours 6-24: Official statements begin, but are often incomplete or inaccurate. Law enforcement may release information they later revise. Media fills gaps with speculation.
- Hours 24-48: A clearer picture begins to emerge. Many initial claims are quietly walked back or corrected. By this point, the public has largely moved on.

The solution is not complicated, but it is inconvenient: treat all breaking news as provisional. Assume the first reports are partially wrong. Wait. Verify. The truth will not expire, but the falsehoods your brain absorbs in the first 48 hours may never fully leave.

They didn't ask if we wanted to know this. But the next time "breaking news" flashes across your screen, the smartest thing you can do is wait.

_- The Department_