9 min read

Your kids are not your users

A manifesto for people who build tech and are also parents: on engagement, attention extraction, and a simple rule—build as if your child were the user.

You know the word “engagement.” You use it in meetings, in project docs, on calls with clients. You know what it means: time spent on the platform, return frequency, interactions per session. You know it’s the metric that matters. You know your work, ultimately, is judged by your ability to make it go up.

Then you go home. And your child is in front of a screen. And that screen is doing exactly what you know how to do: capturing attention, generating engagement, maximizing time-on-platform.

Only this time the user isn’t a user. It’s your child.


The dissonance #

If you work in tech and you have kids, you live with a cognitive dissonance that almost no one names. By day you design systems meant to keep people in. By night you try to pull your child away from identical systems. By day you talk about “user retention” as a success. By night your child’s “retention” in front of the tablet scares you.

You know how pull-to-refresh works. You implemented it, or at least you discussed implementing it. You know it replicates the lever gesture of a slot machine. You know the delay before loading isn’t a technical limitation but a design pattern: the calculated pause that maximizes dopamine release. You know it because it’s your job to know it.

And you know your child, at eight, at ten, at twelve, has no tools to defend themselves against what you know how to build.

It’s not a personal contradiction. It’s a systemic one. It concerns anyone who works in tech and has kids—which at this point means millions of people.


What we know and can’t pretend not to know anymore #

Lining up what we know helps us look it in the face.

We know that variable-ratio reinforcement schedules—the ones social media feeds are built on—are the most powerful psychological mechanism ever documented for inducing and maintaining compulsive behavior. We’ve known since the ’50s. Skinner proved it on pigeons. We apply it to children.
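The mechanism Skinner documented can be made concrete with a small simulation (a sketch, not a model of any real feed). A fixed-ratio schedule pays out on a perfectly predictable cadence; a variable-ratio schedule with the same average payout never tells you when the next reward comes—which is exactly why it keeps you pulling:

```python
import random

def fixed_ratio(n, ratio):
    # Reward exactly every `ratio`-th action: fully predictable.
    return [(i + 1) % ratio == 0 for i in range(n)]

def variable_ratio(n, mean_ratio, rng):
    # Reward each action with probability 1/mean_ratio: the same
    # average payout, but the next reward is never predictable.
    return [rng.random() < 1 / mean_ratio for _ in range(n)]

def gaps(rewards):
    # Number of actions between consecutive rewards.
    idx = [i for i, r in enumerate(rewards) if r]
    return [b - a for a, b in zip(idx, idx[1:])]

rng = random.Random(42)
fr_gaps = gaps(fixed_ratio(10_000, 10))
vr_gaps = gaps(variable_ratio(10_000, 10, rng))

print(min(fr_gaps), max(fr_gaps))  # always 10 10: the pattern is learnable
print(min(vr_gaps), max(vr_gaps))  # spread widely around 10: it never is
```

Same average reward rate, radically different psychology: with the fixed schedule you can stop right after a payout; with the variable one, stopping always means possibly walking away one action before the win.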

We know that an adolescent’s brain is in full development. The prefrontal cortex—the part that governs judgment, impulse control, the ability to evaluate consequences—doesn’t finish maturing until roughly age 25. We design systems that deliberately bypass it to speak directly to the limbic system. We do it for a living.

We know that habitual social media use is associated with a measurable reduction in cortical thickness in brain regions linked to cognitive control. These aren’t opinions. These are MRI data on thousands of adolescents.

We know that across the Western world, rates of depression, anxiety, self-harm, and suicide among adolescents have doubled or tripled since 2010, in step with the spread of smartphones.

We know that platforms know it. Meta’s internal documents, made public by Frances Haugen, showed that Instagram worsened body image issues for one in three girls. The company knew. It kept going.

We know all of this. And we keep building.


“But I don’t work on social” #

I know. Me neither. I build management software, platforms for public administration, B2B software. I don’t design algorithmic feeds and I don’t optimize recommendation systems for teenagers.

But the point isn’t what we build today. The point is the culture we build it in.

We work in an industry that has normalized the idea that human attention is a resource to extract. That “user experience” is synonymous with “time the user spends on the platform.” That success is measured in sessions, clicks, conversion rate, daily active users. We talk about human beings with the vocabulary of mining, and it doesn’t seem strange.

This culture runs through all of us. Even those who don’t work on social breathe in its metrics, its values, its priorities. When we go home, that culture comes home with us. It seeps into the way we look at our child’s screen. Sometimes with concern, sure, but also with a subtle, unconfessable familiarity. We recognize those mechanisms. We know they work. And a part of us—the professional part—can’t help admiring them.

That’s the familiarity we have to break.


The privilege of awareness #

Anyone who works in tech and has kids has something most parents don’t: knowledge of the architecture.

We know how those systems work, not just that they work. We know notifications don’t arrive at random but are optimized for the moment of maximum psychological vulnerability. We know infinite scroll isn’t an aesthetic choice but a capture device. We know “the algorithm” isn’t a mysterious entity: it’s code written by people like us, with specific goals, optimized to specific metrics.

This awareness is an enormous privilege. And privilege creates responsibility. If you see the fire and others don’t, you can’t pretend nothing’s happening and say “not my fire.”

And yet that’s exactly what we do, as an industry. We know. And we stay quiet. Because speaking up would mean admitting the problem isn’t “out there,” in kids’ bad habits, in parents’ inability, in the lack of “digital education.” The problem is also inside. In how we think about software. In the metrics we choose to optimize. In the questions we don’t ask.


The questions we don’t ask #

In twenty years in tech, I’ve never heard anyone in a project meeting ask these questions:

Could this system create addiction? If so, do we have a responsibility to prevent it?

Are we designing for the user’s well-being or for maximizing their time of use? Are those the same thing?

If a minor used this product, would it be safe? Not “compliant with regulations.” Safe.

Are we measuring success with metrics that align our interests with those of the people using our software? Or are we measuring something that’s convenient for us to measure?

Would we build this system exactly like this if we knew the first user would be our child?

The last question is the most important one. And it’s the one we never ask.


The child test #

I’m proposing a rule. Not a law, not a framework, not a process. A personal rule, for anyone who builds software and has kids.

Before you implement a system, ask yourself: am I okay with the first user being my child?

Not “my child at twenty, an adult, aware, trained.” My child now. With the age they have. With the prefrontal cortex they have. With the resistance to stimuli they have. With the blind trust in technology they have.

If the answer is yes, build it. If the answer is “it depends,” stop and ask what it depends on. If the answer is no, you have a problem. And the problem isn’t your child.

This isn’t a sentimental test. It’s a design test. It’s the precautionary principle translated into the language of software development. The applied version of Hans Jonas’s imperative: act so that the consequences of your action are compatible with the permanence of an authentic human life.

Except Jonas was talking about atomic bombs and genetic engineering. We’re talking about algorithmic feeds and push notifications. The fact that they seem harmless is exactly what makes them dangerous.


We are not powerless #

I know what you’re thinking. “I’m an employee. A freelancer. A team lead at a ten-person company. I don’t decide Meta’s policies.” True. You don’t.

But you decide how you build your software. Which metrics to optimize. Whether to implement dark patterns or refuse. Whether to add a usage timer or infinite scroll. Whether to design a system that respects the user’s attention or one that loots it.

Above all, you decide what kind of professional you want to be.

You can be the one who says “the market demands it” and implements whatever pays. Or you can be the one who says “I’m not building it like this” and proposes an alternative. That’s not idealism, it’s craftsmanship. A serious carpenter doesn’t use rotten wood because it’s cheaper. A serious cook doesn’t serve spoiled food because it increases margin. A serious engineer doesn’t sign off on a structurally unsafe design because the client is in a hurry.

And yet in software—where the consequences can involve millions of people and the brain of an entire generation—we accept standards we wouldn’t accept in any other trade.


The manifesto #

I work in tech. I have a child. I can’t keep the two separate anymore.

My child is not a user. Not a metric, not a session, not a daily active user. They are a human being with a developing brain, a judgment still under construction, a trust in the world that depends also on how I—and people like me—build that world.

My work has consequences. Not the abstract, distant consequences of moral philosophy. The concrete ones of a system that runs twenty-four hours a day and interacts with millions of brains. If I don’t take responsibility for it, who should?

Compliance is not enough. Respecting the GDPR, the AI Act, the EAA is the bare minimum, not the finish line. The question isn’t “is it legal?” The question is “is it right?” Those are two different questions, and the second matters more.

Speed is not a value. “Ship fast” isn’t a virtue when what you’re shipping can cause harm. Hurry is the refuge of those who don’t want to think about consequences. Critical thinking is slow by nature. Code is fast. Wisdom is not confusing the timelines of one with the timelines of the other.

Technical training is not enough. Knowing how to write code without knowing how to read the implications of that code isn’t competence—it’s specialized blindness. We need engineers who have read Jonas, developers who know Mill, designers who have studied developmental psychology. Not as general culture. As work tools.

My child will judge me. Not by the revenue I generated, not by the projects I delivered, not by the technologies I mastered. They’ll judge me by the world I helped build. And in that judgment, “I was just following orders” won’t be an acceptable defense. It never has been.


To those who build #

If you work in tech and you have kids, you know what I’m talking about. You know there’s a conversation our industry refuses to have. You know the discomfort you feel when your child disappears into a screen isn’t parental paranoia. It’s professional competence telling you something.

Listen to it.

I’m not saying stop building software. I’m saying build it as if the first user were your child. Because, for all intents and purposes, it could be.

And if not your child, someone else’s.

Which is the same thing.


“We have not inherited the world from our parents. We have borrowed it from our children.”
— proverb attributed to Antoine de Saint-Exupéry