Algorithms don’t tuck children into bed.
That sentence kept rattling around in my head after I read about the US ruling holding Meta and Google liable for contributing to a young user’s social media addiction. Many people are calling it historic. And in some ways, it is. For the first time, a court has said out loud what millions of parents, educators, and researchers have been whispering for years: digital platforms aren’t neutral tools. They’re engineered environments that shape behaviour—especially in the young, the vulnerable, the still-developing mind.

I sat in my wheelchair in Rimini, scrolling through reactions on my phone (yes, the irony isn’t lost on me), and I felt two things at once. Relief that someone in a courtroom finally named the problem. And a gnawing discomfort that we might be about to make the same mistake we always make—find a villain, punish them, and go home feeling righteous.
The Machine That Doesn’t Want You to Leave
Let’s be honest about what these platforms do. I’m simplifying complex design architecture here so it’s accessible, but the core idea is straightforward: infinite scrolling, algorithmic content suggestions, and relentless notifications exist for one reason—to keep you glued to the screen. This isn’t a bug. It’s the product. Your attention is the commodity being sold to advertisers, and every extra second you spend on the app is revenue.
When that system targets a 13-year-old whose prefrontal cortex won’t fully mature for another decade, “entertainment” and “addiction” start to blur into the same thing. The dopamine loop—post, like, refresh, repeat—mimics the reward pathways exploited by slot machines. Except slot machines are age-restricted and hidden in casinos. Social media lives in every pocket.
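To make the incentive concrete, here is a deliberately toy sketch of an engagement-maximising feed. None of this is any real platform's code, and the names (`Post`, `rank_feed`, `infinite_scroll`, `predicted_watch_seconds`) are my own illustrative inventions; the point is only that when the ranking objective is time-on-app, nothing in the loop ever says "you're done":

```python
# A toy model of an engagement-maximising feed.
# Purely illustrative -- not any real platform's system.

from dataclasses import dataclass


@dataclass
class Post:
    topic: str
    predicted_watch_seconds: float  # the model's guess at how long you'll linger


def rank_feed(candidates: list[Post]) -> list[Post]:
    # The only objective here is expected time-on-app.
    # Wellbeing, accuracy, and the viewer's age appear nowhere.
    return sorted(candidates, key=lambda p: p.predicted_watch_seconds, reverse=True)


def infinite_scroll(candidates: list[Post]):
    # The feed never signals an end -- it simply refills.
    while True:
        yield from rank_feed(candidates)


feed = infinite_scroll([
    Post("outrage", 42.0),
    Post("cute cats", 18.0),
    Post("homework help", 6.0),
])
first_three = [next(feed).topic for _ in range(3)]
# The "stickiest" content always surfaces first, and the loop never terminates.
```

Notice that the stopping condition is missing by design, not by accident: the generator's `while True` is the business model in three characters of code.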
So yes, the ruling carries weight. It establishes a principle: companies can be held responsible for the psychological architecture they build. That matters. That’s not nothing.
But Here’s Where It Gets Uncomfortable
The reaction split predictably down the middle. One camp celebrated. The other urged caution, warning that we risk “turning a complex problem into an overly simplistic solution: punishing companies and feeling good about oneself”.
I find myself straddling both sides—which is an uncomfortable place to be, though I’m used to uncomfortable positions (literally and figuratively).
Can a fine, no matter how large, truly fix a phenomenon with roots this deep? Think about it. We’re talking about an entire generation raised on touchscreens. About family dinners where every person stares at a different device. About bedrooms where the blue glow of a phone replaces the warmth of a goodnight conversation. The algorithm didn’t create that culture. It exploited it.
The Adults in the Room (or Not)
Here’s the part nobody wants to hear. The source I read puts it bluntly: “Behind every child with a smartphone there is—or should be—an adult”.
Should be. Those two words carry enormous weight.
In too many families, technology has become a default babysitter. Access is granted freely, but genuine dialogue about what’s happening on those screens? Rare. Limits get imposed—screen time caps, app restrictions—but without explaining why. A rule without understanding is just a wall a teenager will climb over.
I grew up in a household where obstacles were constant—my body didn’t cooperate, medical treatments pulled me across borders from Albania to Italy at age five, and every small victory required enormous effort. But my family talked to me. They explained. They didn’t just set limits; they helped me understand the shape of the world I was navigating. That dialogue—messy, imperfect, sometimes exhausting—was the real education.
Not every family has those resources, that time, that energy. I know this. Single parents working multiple jobs, families in crisis, communities without support structures—pointing the finger at them feels cruel. And yet the question remains: if we don’t equip adults to guide children through digital spaces, who will? The algorithm?
Schools Are Struggling Too
Education systems haven’t kept pace. Digital literacy gets mentioned in curricula, but it’s often surface-level—“don’t share your password” rather than “here’s how a recommendation engine decides what you see next.” Students learn to use technology without understanding how technology uses them.
Teaching young people to recognise when a tool stops being useful and starts becoming invasive—that’s a skill as important as reading or arithmetic in 2026. We don’t teach kids to drive without explaining brakes. Why do we hand them smartphones without explaining the mechanics of manipulation?
The Danger of a Single Story
What worries me most about this ruling isn’t the ruling itself. It’s the narrative it might create: Big Tech is the enemy, and once we punish them, the problem is solved.
That’s a comforting story. It has a clear villain and a satisfying ending. It also happens to be incomplete.
Regulation matters. Holding corporations accountable for design choices that harm minors is right and necessary. But “no regulation can ever replace the educational role of adults”. Both things are true simultaneously. The world is rarely kind enough to give us problems with only one cause and one solution.
I’ve spent years building FreeAstroScience, a community dedicated to making knowledge accessible. One thing I’ve learned is that real change doesn’t come from a single dramatic gesture. It comes from thousands of small, sustained efforts—a parent asking “what did you watch today?” instead of “how long were you on your phone?”, a teacher explaining how engagement metrics work, a platform redesigning its notification system not because a court ordered it but because it’s the decent thing to do.
The Right Question
The source I’ve been reflecting on ends with a provocation that I think deserves to echo: “The question is whether we are ready, as a society, to take our share of responsibility”.
Are we?
It’s easier to blame an algorithm than to examine our own habits. Easier to demand a company change its code than to change the way we parent, teach, and model behaviour around screens. Easier to celebrate a courtroom victory than to do the slow, unglamorous work of digital education.
Punishing Big Tech is a start. A necessary one. But if we stop there—if we treat the verdict as a finish line rather than a starting gun—we’ll have missed the point entirely. The biggest mistake isn’t the algorithm. It’s believing someone else will solve this for us.
I don’t have all the answers. I have a degree in physics, a wheelchair, and an internet connection—and I’ve seen firsthand how technology can liberate and how it can trap. The difference, almost always, comes down to awareness. To someone caring enough to say: let me show you how this works, and let’s figure it out together.
That’s not something you can legislate. But it’s something you can start tonight, at your own kitchen table, with the people you love.
Never give up on that conversation.
Gerd Dani — FreeAstroScience
Rimini, Italy — April 2026
