Redesigning Platforms In Wake Of Social Media Trial

In a landmark case, a jury found this week that Meta and YouTube negligently designed their platforms and harmed the plaintiff, a 20-year-old woman referred to as Kaley G.M. The jury agreed with the plaintiff that social media is addictive and harmful and was deliberately designed to be that way. This finding aligns with my view as a clinical psychologist: that social media addiction is not a failure of users, but a feature of the platforms themselves. I believe that accountability must extend beyond individuals to the systems and incentives that shape their behavior.

In my clinical practice, I regularly see patients struggling with compulsive social media use. Many describe a pattern of “doomscrolling,” often using social media to numb themselves after a long day. Afterwards, they feel guilty and stressed about the time lost yet have had limited success changing this pattern on their own.

It’s easy to understand why scrolling can be so addictive. Social media interfaces are built around a powerful behavioral mechanism known as intermittent reinforcement, which Judson Brewer, an addiction researcher at Brown University, describes as the strongest and most effective type of reinforcement learning. This is the same mechanism that slot machines rely on: Users never know when the next reward, whether a shower of quarters or a slew of likes and comments, will appear. Not all the videos in our feeds captivate us, but if we scroll long enough, we are bound to arrive at one that does. The ongoing search for rewards ensnares us and reinforces itself.
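To make the mechanism concrete, here is a minimal simulation of a variable-ratio schedule, the intermittent-reinforcement pattern that slot machines and feeds share. The reward probability and the scroll loop are illustrative assumptions, not anything drawn from a real platform’s code.

```python
import random

# Hypothetical illustration of a variable-ratio reward schedule:
# each scroll "pays off" with some fixed probability, so the next
# reward is unpredictable but always feels within reach.
REWARD_PROBABILITY = 0.15  # assumed value, not from any real platform
random.seed(42)

def scroll_session(max_scrolls: int = 50) -> list[int]:
    """Return the scroll counts at which a 'captivating' item appeared."""
    hits = []
    for scroll in range(1, max_scrolls + 1):
        if random.random() < REWARD_PROBABILITY:
            hits.append(scroll)
    return hits

hits = scroll_session()
gaps = [b - a for a, b in zip([0] + hits, hits)]
print(f"Rewards arrived on scrolls {hits}")
print(f"Gaps between rewards: {gaps}")  # irregular gaps = intermittent reinforcement
```

The irregular gaps are the point: because a rewarding item can arrive on any scroll, stopping always feels like quitting one scroll too early.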

Why Social Media Feels Addictive

Individuals who try to address compulsive social media use on their own typically struggle. This should be no surprise: habits are rarely broken through sheer discipline, but by altering the reinforcement loops that sustain them. Brewer argues that “there’s actually no neuroscientific evidence for the presence of willpower.” Placing the burden to self-regulate solely on users misses the deeper issue: These platforms are engineered to override individual control.

A growing body of research identifies social media use and constant digital connectivity as important influences on the rising incidence of adolescent mental health problems. Brewer notes that adolescents are particularly vulnerable, as they are in a “developmental phase” in which reinforcement learning processes are especially strong. This vulnerability can be exploited by the design features of large social media platforms.

How Platforms Are Designed to Maximize Engagement

NPR uncovered records from a recent lawsuit filed by Kentucky’s attorney general against TikTok. According to these documents, TikTok implemented interface mechanisms such as autoplay, infinite scrolling, and a highly personalized recommendation algorithm that were systematically optimized to maximize user engagement.

TikTok’s algorithmically tailored “For You” feed continuously tracks user behaviors, such as how long a video is watched and whether it is replayed or quickly skipped. It then curates short videos for the user based on past scrolling behavior and whatever is most likely to hold attention.
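The public filings do not include TikTok’s actual ranking code, but the general pattern they describe, scoring candidate videos on predicted engagement signals and serving the highest scorers, can be sketched. The signal names, weights, and scoring rule below are hypothetical illustrations, not the platform’s method.

```python
from dataclasses import dataclass

@dataclass
class WatchSignals:
    """Hypothetical per-video engagement signals, illustrative only."""
    watch_fraction: float  # share of the video the user watched (0.0-1.0)
    replayed: bool         # did the user watch it again?
    skipped_fast: bool     # did the user swipe away almost immediately?

def engagement_score(s: WatchSignals) -> float:
    """Toy scoring rule: reward completion and replays, penalize fast skips.
    The weights are made up for illustration."""
    score = s.watch_fraction
    if s.replayed:
        score += 0.5
    if s.skipped_fast:
        score -= 0.8
    return score

# Rank candidate videos for the next feed slot by predicted engagement.
candidates = {
    "cooking_clip": WatchSignals(0.9, replayed=True, skipped_fast=False),
    "news_clip": WatchSignals(0.3, replayed=False, skipped_fast=True),
}
ranked = sorted(candidates, key=lambda v: engagement_score(candidates[v]), reverse=True)
print(ranked)  # the feed would serve 'cooking_clip' first
```

The broader point is that every weight in a pipeline like this is a design choice, tunable toward engagement or, as discussed below, toward well-being.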

These documents show one example of a tech company knowingly designing products to maximize attention. I believe social media companies also have the capacity to reduce addictiveness through intentional design choices.

How Governments Are Regulating Social Media

The good news is we are not helpless. There are multiple levers for change: how we collectively talk about social media, how our governments regulate its design and access, and how we hold companies accountable for practices that shape user behavior.

Some countries are moving quickly to set policy around social media use. Australia has imposed a minimum age of 16 for social media accounts, with similar bans pending in Denmark, France, and Malaysia.

These bans typically rely on age verification. Users without verified accounts can still passively watch videos on platforms like YouTube, but this approach removes many of the most addictive features, including infinite scroll, personalized feeds, notifications, and systems for followers and likes. At the same time, age verification creates problems of its own, most notably the privacy risks of requiring users to prove their identity online.

Other countries are targeting social media use in specific contexts. South Korea, for example, banned smartphone use in classrooms. And the United Kingdom is taking a different approach; its Age Appropriate Design Code instructs platforms to prioritize children’s safety while designing products. The code includes strong privacy defaults, limits on data collection, and constraints on features that nudge users toward greater engagement.

How Social Media Platforms Could Be Redesigned

A report called Breaking the Algorithm, from Mental Health America, argues that social media platforms should shift from maximizing engagement to supporting well-being. It calls for revamping recommendation systems to spot patterns of unhealthy use and adjusting feeds accordingly—for example, by limiting extreme or distressing content.

The report also argues that users should not have to intentionally opt out of harmful design features. Instead, the safest settings should be the default. The report supports regulatory measures aimed at limiting features such as autoplay and infinite scroll while enforcing privacy and safety settings.

Platforms could also give users more control by adding natural speed bumps, such as stopping points or break reminders during scrolling. Research shows that interrupting infinite scroll with prompts such as “Do you want to keep going?” substantially reduces mindless scrolling and improves memory of content.
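In practice, a speed bump can be as simple as pausing the feed after a set number of items and requiring an explicit choice to continue. The sketch below is a hypothetical illustration; the threshold and the prompt wording are assumptions, not any platform’s actual behavior.

```python
SCROLL_LIMIT = 20  # assumed number of items before a break prompt

def serve_feed(videos: list[str]) -> None:
    """Serve items until the user declines to continue at a speed bump."""
    for count, video in enumerate(videos, start=1):
        print(f"Now playing: {video}")
        if count % SCROLL_LIMIT == 0:
            answer = input("Do you want to keep going? (y/n) ")
            if answer.strip().lower() != "y":
                print("Feed paused. Come back anytime.")
                return

# The prompt interrupts automatic continuation at items 20, 40, ...
serve_feed([f"video_{i}" for i in range(1, 45)])
```

The value lies less in the interruption itself than in the explicit decision point, which turns automatic continuation into a deliberate choice.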

Some social media platforms are already experimenting with more ethical engagement. Mastodon, an open-source, decentralized platform, displays posts chronologically rather than ranking them for engagement, and does not offer algorithmically generated feeds like “For You.” Bluesky gives users control by letting them customize their own algorithms and toggle between different feed types, such as chronological or topic-based filters.
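The difference between these designs comes down to how the same set of posts is ordered, and who decides. Here is a minimal sketch of a user-selected feed mode; the field names and engagement proxy are hypothetical stand-ins for real platform data.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float  # seconds since epoch
    likes: int        # stand-in engagement signal

def build_feed(posts: list[Post], mode: str = "chronological") -> list[Post]:
    """Order the same posts by a user-selected mode rather than a fixed ranker."""
    if mode == "chronological":
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    if mode == "engagement":
        return sorted(posts, key=lambda p: p.likes, reverse=True)
    raise ValueError(f"unknown feed mode: {mode}")

posts = [Post("ana", 1700000300, likes=2), Post("ben", 1700000100, likes=90)]
print([p.author for p in build_feed(posts, "chronological")])  # ['ana', 'ben']
print([p.author for p in build_feed(posts, "engagement")])     # ['ben', 'ana']
```

Exposing the ordering rule as a setting, rather than fixing it server-side, is precisely the kind of control Bluesky-style customization gives back to users.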

In light of the recent verdict, it is time for a national conversation about accountability for social media companies. Individual responsibility will always be important, but so are the mechanisms employed by big tech to shape user behavior. If social media platforms are currently designed to capture attention, they can also be designed to give some of it back.