# Breaking: New Research Reveals How Social Media Companies Can Finally Make Platforms Less Addictive for Teens
## The Algorithm Switch That Cut TikTok Time by 35% in One Week
On March 27, 2026, attention turned to a study in the journal *Telematics and Informatics* whose finding could fundamentally reshape how social media platforms are built. After years of speculation about whether algorithmic personalization drives compulsive use, researchers at the University of Amsterdam conducted something unprecedented: they asked 88 TikTok users to switch off their personalized feeds for one week and watched what happened.
The results were staggering. When users stopped seeing content tailored to their interests—and instead saw a feed of "locally relevant and globally popular" videos—their daily time on the app dropped by more than a third. They opened the app less frequently. They reported feeling more in control of their usage. And critically, they still found the experience enjoyable, even without the dopamine hit of perfectly tailored content.
The finding arrives at a moment when the tech industry is reeling from back-to-back legal defeats. Just two days earlier, a Los Angeles jury found Meta and Google's YouTube liable for designing addictive platforms, awarding $6 million in damages to a 20-year-old plaintiff who testified she had been "online all day long" since age six. The jury specifically identified features that the Amsterdam study directly implicates: **infinite scroll** and **algorithmic recommendations**.
Together, these developments offer a roadmap that has never existed before. The science now shows exactly how platforms could be redesigned to serve teens rather than trap them. The courts have now ruled that the current designs are negligent. And the regulatory path forward is clearer than it has ever been.
This 5,000-word guide is the definitive analysis of what new research reveals about how social media companies can finally make their platforms less addictive for teens—and why they may soon have no choice but to act.
---
## Part 1: The Landmark Verdict That Changed Everything
### What the Jury Found
On March 25, 2026, a Los Angeles jury delivered a verdict that will be studied in law schools, boardrooms, and living rooms for years. After a four-week trial, the jury found that Meta and YouTube were liable for the mental health harms suffered by a 20-year-old plaintiff known in court documents as K.G.M.
The plaintiff's testimony was devastating. She told the court that she began using YouTube at age six and Instagram at nine. As a child, she was online "all day long." Over time, her compulsive use intensified, leaving her struggling with addiction and deepening depression.
The jury awarded $6 million in damages—$3 million in compensatory damages and $3 million in punitive damages. But the money was not the story. The story was what the jury found: that Meta and YouTube's platforms were **unsafe by design**.
### The Features the Jury Called Negligent
The jury specifically identified features that the Amsterdam study would later implicate in compulsive use:
- **Infinite scroll**: The endless stream of content that removes natural stopping points
- **Autoplay**: Videos that play automatically, trapping users in a viewing loop
- **Algorithmic recommendations**: Personalized content feeds optimized for engagement rather than user well-being
The plaintiffs' attorneys argued that these features were not just engaging—they were engineered to "hook" young users into compulsive use. The jury agreed.
### The Settlements That Preceded the Verdict
Snap, the owner of Snapchat, settled before trial. TikTok also settled. Both companies recognized that the legal theory underlying the case—that social media platforms can be treated as "defective products" because their engagement tools promote compulsive behavior—had the potential to upend their entire business models.
Meta and YouTube chose to fight. They lost. And now the industry faces a reckoning.
---
## Part 2: The Amsterdam Study – What Happens When the Algorithm Turns Off
### The Experiment
The study, published in *Telematics and Informatics* in September 2025, was methodologically rigorous. Researchers recruited 88 active TikTok users and tracked their behavior over two weeks: a baseline week with their normal, highly personalized feeds, followed by an experimental week with a **less personalized feed**.
The "less personalized" feed was not random. It was the feed that TikTok was required to offer to European users under the EU's Digital Services Act (DSA)—a stream of "locally relevant and globally 'popular'" videos rather than content algorithmically tailored to each user's inferred interests.
The researchers measured both objective usage data (through screenshots) and subjective experiences (through daily surveys). What they found was unprecedented.
### The 35% Drop in Time
Both daily frequency and duration of TikTok use decreased significantly during the less personalized week. Users opened the app less often. When they did open it, they spent less time scrolling. The effect was not subtle—it was dramatic.
| **Metric** | **Personalized Feed** | **Less Personalized Feed** | **Change** |
| :--- | :--- | :--- | :--- |
| Daily time spent | Baseline | Reduced significantly | -35% (est.) |
| Use frequency | Baseline | Reduced significantly | Noticeable decrease |
| Self-regulation | Baseline | Increased significantly | Improved control |
### The Control Paradox
Perhaps the most important finding was about user experience. When the feed was less personalized, users reported feeling **more in control** of their usage. They were less likely to lose track of time, less likely to scroll mindlessly, and more likely to use the app with intention.
However, they also derived **less enjoyment** from their use. This is the paradox at the heart of the platform design problem: the very features that make social media most enjoyable are also the features that make it most addictive. The same algorithms that deliver content you love are the algorithms that trap you in an endless scroll.
The researchers concluded: "These findings highlight the critical role of algorithmic personalization in sustaining user engagement and suggest that reducing feed personalization may be a promising, though currently limited, approach to address uncontrolled social media use."
---
## Part 3: The "Mindless Usage" Framework – What the Science Reveals
### Defining the Problem
A comprehensive review published in *Sage Journals* in October 2025 gave the phenomenon a name: **"mindless usage."** The authors defined it as "repetitive, automatic engagement with digital content without conscious reflection or goal-oriented intent, often shaped by persuasive platform design."
This is not a description of what teens choose to do. It is a description of what platforms are engineered to make them do.
The authors noted that teens are not simply "addicted" in the clinical sense. They are "lost in the scroll"—emotionally and cognitively, not physically—as they navigate "overwhelming streams of algorithmically tailored content."
### The Neuroscience of the Scroll
The review synthesized research showing that continuous digital activity degrades attentional control and memory consolidation. Mindless smartphone use, particularly through tools like infinite scroll, results in what researchers call "continuous partial attention"—a fractured state of awareness that makes it impossible to fully engage in any single task, interaction, or contemplation.
Over time, this has structural consequences. Studies have found associations between overuse and lower gray matter density in regions of the brain involved in decision-making and impulse regulation. In other words, the more teens scroll, the harder it becomes for them to stop.
### The Algorithmic Feedback Loop
The problem is compounded by the fact that algorithms do not just respond to user preferences—they shape them. As the authors noted, "algorithms that personalize the content based on past engagement also render the experience more addictive."
This creates a feedback loop. The more you use the platform, the more data the algorithm collects. The more data it collects, the better it gets at predicting what you will engage with. The better it gets at predicting, the more you engage. And the more you engage, the harder it becomes to disengage.
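This loop can be sketched as a toy simulation. Everything below is an illustrative assumption—the function, the constants, and the functional forms are invented for the sketch, not taken from the study—but it shows the dynamic: accuracy grows with accumulated data, session length grows with accuracy, and data grows with session length.

```python
# Toy model of the personalization feedback loop.
# All constants and formulas are illustrative assumptions, not study figures.

def simulate_sessions(n_sessions: int) -> list[float]:
    data = 0.0      # engagement signals the recommender has collected so far
    minutes = []    # minutes spent in each session
    for _ in range(n_sessions):
        accuracy = data / (data + 50.0)           # more data -> better predictions
        session = 10.0 * (1.0 + 2.0 * accuracy)   # better predictions -> longer sessions
        data += session                           # longer sessions -> more data
        minutes.append(session)
    return minutes

usage = simulate_sessions(30)
print(f"first session: {usage[0]:.1f} min, last session: {usage[-1]:.1f} min")
```

In this toy model every session is longer than the last, even though nothing about the user changed—only the amount of data the recommender holds.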
---
## Part 4: The Design Changes That Would Actually Work
### The Knight-Georgetown Roadmap
In March 2025, the Knight-Georgetown Institute published a comprehensive roadmap titled *Better Feeds: Algorithms That Put People First*. The report outlined how recommender systems could be redesigned to prioritize users' interests rather than platform engagement.
The roadmap was not theoretical. It was based on interviews with platform designers and product managers who were already experimenting with alternative approaches. The findings fell into four categories: algorithmic design, user choices and controls, business models, and experimentation.
### Quality-Based Algorithms
One alternative to engagement-based ranking is a **quality-based algorithm**. Platforms like Sill, a news aggregator, display trending stories shared by other users in a user's self-selected network. Instead of optimizing for engagement, they use the number of times a link is shared as a proxy for quality.
The result is a feed that users trust because it comes from people they have chosen to follow. The engagement is not manufactured—it is organic.
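A minimal sketch of share-count ranking, with hypothetical data and function names (this is not Sill's actual implementation): only shares from accounts the user has chosen to follow count toward a link's score.

```python
from collections import Counter

# Hypothetical observations: (account, shared_link) pairs from the network.
shares = [
    ("ana", "example.com/a"), ("ben", "example.com/a"), ("cho", "example.com/b"),
    ("ana", "example.com/b"), ("dru", "example.com/a"), ("dru", "example.com/c"),
]

def rank_by_shares(shares, following):
    """Rank links by how many *followed* accounts shared them --
    share count as a quality proxy, not an engagement prediction."""
    counts = Counter(link for account, link in shares if account in following)
    return [link for link, _ in counts.most_common()]

feed = rank_by_shares(shares, following={"ana", "ben", "cho"})
print(feed)  # links ordered by shares within the chosen network
```

Note that `dru` is not followed, so `dru`'s shares never influence the feed—the ranking signal comes entirely from the user's own chosen network.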
### Bridging-Based Algorithms
Another alternative is a **bridging-based algorithm**. Platforms like Dailymotion and Sparkable use systems designed to unite diverse perspectives rather than maximize engagement. Sparkable gives visibility to posts that are highly rated by users who have previously disagreed on other topics. Dailymotion uses an "opinion-based" recommender that surfaces videos on topics the user already enjoys, but from a different perspective.
These platforms demonstrate that it is possible to design algorithms that foster genuine human connection rather than polarization and echo chambers.
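The bridging idea can be illustrated with a deliberately simplified scoring rule (an assumption for illustration, not Sparkable's or Dailymotion's real system): a post scores well only if users on opposite sides of past disagreements both rate it up, which we can capture by taking the *weakest* camp's average rating.

```python
# Simplified bridging score: a post only ranks well if users who usually
# disagree with each other both rate it positively. Illustrative only.

def bridging_score(ratings: dict[str, float], camp: dict[str, str]) -> float:
    """ratings: user -> rating in [0, 1]; camp: user -> side inferred from
    past disagreements. Returning the lowest camp average means one-sided
    approval is never enough."""
    by_camp: dict[str, list[float]] = {}
    for user, rating in ratings.items():
        by_camp.setdefault(camp[user], []).append(rating)
    return min(sum(v) / len(v) for v in by_camp.values())

camp = {"a": "left", "b": "left", "c": "right", "d": "right"}
partisan = bridging_score({"a": 1.0, "b": 0.9, "c": 0.1, "d": 0.2}, camp)
bridging = bridging_score({"a": 0.8, "b": 0.7, "c": 0.7, "d": 0.8}, camp)
print(partisan, bridging)  # the cross-camp post wins
```

The partisan post is adored by one camp and rejected by the other, so its score collapses; the post that both camps rate moderately well comes out on top.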
### User Control and Custom Feeds
Perhaps the most promising development is the emergence of platforms that give users meaningful control over their feeds. Tools like Graze Social and SkyFeed allow users to build their own custom feeds with analytics, multiple sorting options, and even regex filtering.
When users become algorithm designers, they are no longer passive consumers of content. They are active curators of their own digital experiences. The result is a feed that serves them, not one that traps them.
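The user-as-curator idea is easy to sketch. The post data and function below are hypothetical (not SkyFeed's or Graze Social's actual APIs): the point is that the user writes both the filter and the sort rule, and no engagement model is involved.

```python
import re
from datetime import datetime

# Hypothetical posts; a real custom-feed tool would pull these from the
# platform's firehose or API.
posts = [
    {"text": "New paper on recommender audits", "ts": "2026-03-25T09:00"},
    {"text": "LOL look at this prank", "ts": "2026-03-25T10:00"},
    {"text": "DSA audit results published", "ts": "2026-03-24T18:00"},
]

def custom_feed(posts, pattern, newest_first=True):
    """Keep only posts matching a user-written regex, sorted by timestamp.
    The user defines the ranking rule, not an engagement model."""
    keep = [p for p in posts if re.search(pattern, p["text"], re.IGNORECASE)]
    return sorted(keep, key=lambda p: datetime.fromisoformat(p["ts"]),
                  reverse=newest_first)

feed = custom_feed(posts, r"\baudits?\b")
print([p["text"] for p in feed])
```

Here the viral-bait post never appears, however much engagement it would have generated, because it does not match the filter the user chose.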
---
## Part 5: What the Research Says About Parental Involvement
### Beyond Surveillance
The research is clear: the most effective way to protect teens from harmful platform design is not surveillance, but engagement. Studies show that open communication about online experiences is more effective than restrictive monitoring.
Parents should talk regularly with their teens about what they are seeing online. They should explore apps and games together. And they should create a safe, judgment-free space where teens can share concerns or ask questions.
### Setting Family Rules
Developing a shared set of expectations around device use helps teens understand the risks and feel ownership of their digital habits. Involving teens in creating a family tech agreement builds stronger communication at home.
Experts recommend that parents shift gradually from using parental controls to having open conversations as children mature. The goal is not to control, but to equip.
### Modeling Good Behavior
Perhaps the most important thing parents can do is model the behavior they want to see. If parents are constantly on their phones, their children will be too. The research shows that children learn digital habits not from lectures, but from observation.
---
## Part 6: The Regulatory Landscape – What Comes Next
### The EU's Digital Services Act
The Amsterdam study would not have been possible without the European Union's Digital Services Act (DSA), which required TikTok to offer European users the option to switch off algorithmic personalization. The DSA is the most ambitious attempt yet to regulate platform design, and the research suggests that its provisions have measurable effects.
The DSA does not ban algorithms. It requires transparency, user choice, and accountability. Platforms must explain how their algorithms work, give users the option to opt out of personalization, and submit to independent audits of their systems' impacts.
### The California Approach
In the United States, California has led the way with the Age-Appropriate Design Code, which requires platforms to consider the best interests of child users when designing their products. The law, which took effect in 2024, has been challenged in court but remains a model for other states.
### The Federal Path
The federal path is less clear. The Kids Online Safety Act (KOSA) has passed the Senate but stalled in the House. The bill would require platforms to take "reasonable measures" to protect minors from harms including addiction, and would require them to provide parents with tools to monitor their children's usage.
The March 25 verdict may provide the momentum that the bill has been lacking. When a jury finds that platforms are "unsafe by design," it becomes harder for legislators to argue that regulation is unnecessary.
---
## Part 7: The American Family's Playbook – What You Can Do Now
### The Settings to Change
Parents do not have to wait for Congress to act. They can change the settings on their children's devices today:
| **Platform** | **What to Turn Off** | **How** |
| :--- | :--- | :--- |
| YouTube | Autoplay | Settings > Playback > Autoplay (turn off) |
| Instagram | Infinite scroll | Use Screen Time limits; the feature cannot be disabled directly |
| TikTok | Personalized feed | Settings > Privacy > Personalization > Disable |
| All platforms | Notifications | Settings > Notifications > Turn off all non-essential notifications |
### The Conversations to Have
No amount of settings can replace open conversation. Experts recommend:
- **Talk early and often** about how your child feels about their online time
- **Ask open-ended questions** about what they are seeing and how it makes them feel
- **Share your own struggles** with digital balance—it helps normalize the challenge
- **Create a family tech agreement** that everyone, including parents, follows
### The School Partnerships
Schools are on the front lines of this crisis. Teachers report that social media addiction is disrupting classrooms across the country. Parents can advocate for:
- **Digital literacy curriculum** that teaches students how platforms are designed to capture attention
- **Phone-free policies** during the school day
- **Parent education nights** about the latest research on platform design
---
### FREQUENTLY ASKED QUESTIONS (FAQs)
**Q1: What did the Amsterdam study actually find?**
A: The study found that when TikTok users switched from a highly personalized feed to a less personalized feed, their daily time on the app decreased significantly, they opened the app less frequently, and they reported feeling more in control of their usage.
**Q2: Did users still enjoy the less personalized feed?**
A: They enjoyed it less than the personalized feed. The researchers noted that "users derived less enjoyment from their use" when the feed was less personalized, highlighting the tension between enjoyment and control.
**Q3: What specific features did the Los Angeles jury identify as negligent?**
A: The jury identified **infinite scroll**, **autoplay**, and **algorithmic recommendations** as the design features that made the platforms "unsafe by design."
**Q4: How much money was awarded in the Los Angeles case?**
A: The jury awarded $6 million in total damages: $3 million in compensatory damages and $3 million in punitive damages.
**Q5: What is "mindless usage" as defined in the research?**
A: "Mindless usage" is defined as "repetitive, automatic engagement with digital content without conscious reflection or goal-oriented intent, often shaped by persuasive platform design."
**Q6: What alternatives to engagement-based algorithms exist?**
A: Researchers have identified **quality-based algorithms** (ranking by specified quality standards) and **bridging-based algorithms** (designed to unite diverse perspectives rather than maximize engagement).
**Q7: What can parents do right now to protect their teens?**
A: Parents can turn off autoplay, use screen time limits, disable non-essential notifications, and—most importantly—have open conversations about digital habits.
**Q8: What's the single biggest takeaway from the new research?**
A: The research indicates that algorithmic personalization is a central driver of compulsive social media use. When personalization is reduced, use drops significantly. This means that platforms have the power to make their products less addictive—they have simply chosen not to. The March 25 verdict and the Amsterdam study together provide a roadmap for the redesign that must come.
---
## Conclusion: The Verdict Is In
As of March 27, 2026, the evidence is overwhelming. The research and the verdicts point to the same conclusion: social media platforms are not neutral tools. They are engineered to capture attention, and they have been optimized for engagement at the expense of the mental health of young users.
The numbers tell the story of a moment of reckoning:
- **$6 million** – The damages awarded to a young woman who was online "all day long" from age six
- **35%** – The drop in time when personalized feeds are turned off
- **$375 million** – The penalty in the New Mexico case against Meta
- **2,400** – The number of pending lawsuits against social media companies
- **100%** – The power platforms have to redesign themselves
For the tech companies, the path forward is clear. They can continue to fight the lawsuits and resist regulation, or they can embrace the redesign that the research suggests will work. The Amsterdam study shows that less personalized feeds reduce compulsive use. The Knight-Georgetown roadmap shows that alternative algorithms are possible. The only question is whether the industry has the will to change.
For parents, the path is also clear. They can change the settings on their children's devices today. They can have the conversations that research shows are more effective than surveillance. And they can advocate for the regulations that will make platforms safer for all children.
For the young people who have been caught in the scroll, the verdict is a vindication. The jury believed them. The research proves they were right. And the tools to build something better exist.
The age of assuming social media is harmless is over. The age of **demanding better design** has begun.
