Let's start with the math. In 2024, Meta — the company that owns Facebook, Instagram, and WhatsApp — generated $160.6 billion in advertising revenue. Alphabet, which owns Google and YouTube, generated $264.6 billion. TikTok brought in an estimated $28.6 billion. None of these companies charge you to use their platforms. So where does the money come from? It comes from selling advertisers access to your attention. And the longer they can keep your eyes on the screen, the more they get paid. That is the entire business model. Everything else flows from it.

The Engineering of Endless Scroll

Social media platforms don't become addictive by accident. They are designed by some of the most sophisticated behavioral psychologists and UX researchers in the world, working from a simple brief: keep people on the platform as long as possible. The tools they use borrow directly from casino design and behavioral psychology research:

Variable reward schedules. Psychologist B.F. Skinner showed in the 1950s that the most addictive pattern of reward isn't constant reward or no reward — it's unpredictable reward. Slot machines pay out randomly. Social media feeds work the same way: sometimes you scroll and find something amazing. Sometimes you find nothing. That unpredictability is what makes you keep pulling the lever.

Infinite scroll. Invented by UX designer Aza Raskin in 2006 (who has since publicly apologized for it), infinite scroll removes the natural stopping point that a page break or "load more" button would provide. There is no bottom. There is no moment where the platform says "okay, that's enough." Raskin has estimated that infinite scroll wastes around 200,000 collective human hours every day.

Notification systems. Every notification is a small hit of social validation — a like, a comment, a share. The red badge on the app icon is specifically designed to create a compulsion loop.
Platform engineers have confirmed in internal documents (later leaked) that notification timing is deliberately optimized to maximize return visits.

Autoplay. YouTube's autoplay feature was specifically designed to reduce the number of intentional choices a user makes. The less you consciously decide to watch, the more you watch.

What It's Doing to Attention

The data on what this is producing is striking. Studies found that average screen-based attention had dropped to around 43 seconds by 2024, down from 47 seconds in prior years, with users switching tasks an average of 566 times across an 8-hour workday. Research tracking social media behavior found that Gen Z averages approximately 6.5 seconds of focused attention per social media post. Deep reading habits declined by an estimated 39% between 2014 and 2024, correlating with the widespread adoption of infinite scroll and autoplay features.

The average person now spends approximately 2 hours and 40 minutes on social media every day. TikTok alone accounts for over 1 hour and 37 minutes of that for regular users. Across a year, that adds up to more than 40 full 24-hour days.

Former Google design ethicist Tristan Harris, who testified before Congress on these issues and co-founded the Center for Humane Technology, has described the dynamic this way: "A handful of people working at a handful of technology companies are steering the thoughts of two billion people."

The Insiders Who Warned Us

The most credible evidence for deliberate design comes from the engineers and executives who built these systems.

Sean Parker, Facebook's founding president, said in a 2017 interview: "How do we consume as much of your time and conscious attention as possible? [...] It's a social-validation feedback loop... exactly the kind of thing that a hacker like me would come up with, because you're exploiting a vulnerability in human psychology."
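Parker's "social-validation feedback loop" is Skinner's variable-ratio schedule running in software. A minimal simulation sketch of that mechanic (the 10% reward probability and the session length are illustrative assumptions, not platform data):

```python
import random

def session(reward_probability, pulls):
    """Simulate one scrolling session as a series of 'lever pulls'.

    Each pull (a swipe to the next post) pays off with a fixed
    probability, mimicking a variable-ratio reinforcement schedule:
    the long-run rate of reward is steady, but any individual pull
    is unpredictable.
    """
    rewards = [random.random() < reward_probability for _ in range(pulls)]

    # Track the longest run of unrewarded pulls -- the "dry spells"
    # that make the next hit feel larger and keep the user swiping.
    longest_drought = 0
    current = 0
    for hit in rewards:
        current = 0 if hit else current + 1
        longest_drought = max(longest_drought, current)

    return sum(rewards), longest_drought

random.seed(0)  # reproducible illustration
hits, drought = session(reward_probability=0.1, pulls=200)
print(f"{hits} rewarding posts out of 200, longest dry spell: {drought} pulls")
```

Even with a steady 10% payoff rate, long droughts occur naturally, and under a variable-ratio schedule it is precisely those droughts that make the eventual reward feel worth waiting for.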
Chamath Palihapitiya, Facebook's former VP of User Growth, said in 2017: "I think we have created tools that are ripping apart the social fabric of how society works [...] The short-term, dopamine-driven feedback loops we've created are destroying how society works."

Frances Haugen, a former Facebook data scientist, leaked tens of thousands of internal documents to Congress and the SEC in 2021 — what became known as the Facebook Papers. They revealed that Facebook's own internal research showed Instagram was damaging the mental health of teenage girls, that the algorithm amplified rage and division because it drove engagement, and that executives were aware of both and chose growth anyway.

These aren't critics speculating from the outside. These are people who built the machine, describing what it was built to do.

The Advertising Machine Underneath It All

Here's the part that ties everything together. Every second you spend on a social media platform generates data: what you looked at, how long, what you clicked, what you scrolled past, where you paused. This data is aggregated into a behavioral profile that advertisers pay to target with extraordinary precision.

Facebook's ad platform allows advertisers to target people based not just on age and location, but on stated interests, inferred emotional states, relationship status, political leanings, and purchasing behavior. Cambridge Analytica — the political data firm that harvested the data of 87 million Facebook users without consent — demonstrated how this targeting capability could be used not just to sell products, but to shift political behavior.

The business model only works if your attention is captured and held. Which means every design decision — every algorithm tweak, every notification, every autoplay — is made in service of that goal, not yours.

What You Can Actually Do

This isn't an argument to delete everything. It's an argument to be awake to what's happening when you open the app.
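Being awake to it can start with arithmetic. A quick check of the daily average quoted earlier (the 16 waking hours per day is my assumption, used only to translate hours into waking days):

```python
# Yearly cost of the average daily social media session,
# using the 2 h 40 min daily figure quoted above.
daily_minutes = 2 * 60 + 40              # 160 minutes per day
yearly_hours = daily_minutes * 365 / 60  # minutes/day -> hours/year
full_days = yearly_hours / 24            # full 24-hour days
waking_days = yearly_hours / 16          # assumes ~16 waking hours per day

print(f"{yearly_hours:.0f} hours/year = "
      f"{full_days:.1f} full days = {waking_days:.1f} waking days")
# → 973 hours/year = 40.6 full days = 60.8 waking days
```

Measured against waking hours rather than calendar days, the yearly total is closer to two months than to forty days.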
Some practical steps that researchers and former platform insiders recommend:

Turn off all social media notifications. Check intentionally, not reactively.
Use browser extensions like NewsFeed Eradicator (Facebook) or DF Tube (YouTube) to remove algorithmic feeds while keeping the utility of messaging and search.
Set specific windows for social media use rather than checking throughout the day.
Regularly audit who and what you follow — the feed is only as good as what you put into it.
Be aware that content producing strong emotional reactions (outrage, envy, fear) is being amplified algorithmically because it drives engagement, not because it's important.

The platforms are tools. Extraordinary, powerful tools. The problem isn't the tool — it's not knowing you're holding one. The platforms never asked whether we wanted to know how this works. Now you do.

References

Axios. (2017). "Sean Parker: Facebook was designed to exploit human vulnerability."
U.S. Senate Commerce Committee. (2021). Testimony of Frances Haugen: Protecting Kids Online.
Center for Humane Technology. "The Problem." https://www.humanetech.com/the-problem
BBC Ideas. (2019). "The slot machine in your pocket." Interview with Aza Raskin.
The Verge. (2017). Reports on Chamath Palihapitiya's Stanford statements on social media addiction.