Break It to Make It

It's 2026. Everyone's talking about AI. AI here, AI there, AI everywhere. So does it even make sense to talk about operating systems anymore? I think it does. In fact, I'd argue this conversation has never been more relevant.

Because before a child can meaningfully interact with artificial intelligence — before they can understand what a model does, what a prompt is, or why an output might be wrong — they need something more fundamental. They need to understand how a computer actually works. Not as a consumer. As a thinker.

And that understanding doesn't come from a machine that hides everything behind a polished interface. It comes from a machine that talks back. One that breaks.


The Day My Computer Died

My first operating system was Windows XP. I remember the exact moment everything changed: I got curious, poked around where I shouldn't have, and deleted a bunch of system files. One morning, the computer simply refused to boot.

If someone asked me when my real interest in technology began, I'd point to that day. Not the day I got a computer — the day I broke it.

That black screen wasn't a catastrophe. It was an invitation. Suddenly I had a question that mattered to me personally: How do I fix this? And that question opened a door to a world I didn't know existed. Finding a Windows ISO file online. Waiting an entire day for it to download. Burning it to a CD. Installing the OS. Hunting for a crack. Installing drivers one by one, each one a small victory.

Nobody told me to do any of this. Nobody assigned it as homework. The broken machine itself was the teacher, and curiosity was the only curriculum.

Then came Linux. Ubuntu first, then Fedora, SUSE. Eventually Arch — and with it, manually compiling the kernel. I broke everything that could be broken, and a few things that supposedly couldn't. And occasionally, just occasionally, I'd manage to put it all back together.

It was the most rewarding digital game I've ever played. Not because I was talented. Because I was allowed to fail.


What Breaking Things Actually Teaches

So what does all of this give a person? If you ask me — a tremendous amount. But let's start with the most important thing: it's genuinely fun.

There is a particular kind of joy that comes from solving a problem you created yourself. It's different from getting a good grade or finishing an assignment. It's closer to the satisfaction a child feels when they build a tower of blocks, knock it over, and build it again — taller this time. That cycle of creation, destruction, and reconstruction is not a flaw in the learning process. It is the learning process.

A child who breaks and fixes their own computer learns how a system works from the inside out. They learn what's possible, what's not, and how to find the boundary between the two. They learn how to diagnose a problem, how to work around it, and when to start over from scratch.

But more than any technical skill, they learn something deeper: they learn how to learn. They learn how to sit with frustration, how to search for answers, how to read something written for someone more experienced and extract what they need. They learn that not knowing is not a dead end — it's a starting point.

And those skills transfer to everything else they'll ever do.

This isn't just my opinion. MIT professor Seymour Papert spent over fifty years researching exactly this. His theory of constructionism is built on a simple, powerful idea: people construct knowledge most effectively when they're actively building — and occasionally breaking — things in the world. Programming and debugging, he argued as early as 1968, give children a way to think about their own thinking.

"Children learn best through tinkering, unstructured activities that resemble play, and research based on partial knowledge — by solving problems that are interesting to them." — Seymour Papert

The keyword there is interesting to them. Not interesting to the teacher. Not interesting to the curriculum designer. A child who desperately wants to install a game on Linux and hits a dependency error isn't suffering — they're engaged. The desire to play becomes the engine that drives real learning. The obstacle becomes the lesson.


The "Just Works" Trap

Here's the paradox. Modern operating systems — especially macOS and iOS — are beautifully designed. Everything just works. And that's exactly the problem.

When a computer hides its file system, restricts access to the terminal, manages all drivers automatically, and resolves every conflict silently in the background — a child learns to see it as a magic box. Something that does things, but never something they understand. The elegance becomes a wall. The simplicity becomes a ceiling.

There is a word for environments that protect you from every possible mistake: cages. Comfortable ones, certainly. Beautiful ones, even. But cages nonetheless.

Marc Scott, a UK computer science teacher and network manager, put it bluntly in his widely shared essay "Kids Can't Use Computers":

"Windows 7 and Mac OS X are great operating systems. They're easy to use, require almost no configuration, and generally 'just work.' It's fantastic that everyone can now use a computer with minimal technical literacy, but it's also a disaster."

He's right. When things never go wrong, there's no reason to understand how they work. And when there's no reason, there's no motivation. The comfort becomes the cage.

Scott goes further, and I think this is the part that resonates most: he argues that parents are part of the problem. Every time Techno-Dad rushes in to fix the WiFi, install the software, or troubleshoot the printer, a learning opportunity evaporates. He compares it to potty-training — we invest enormous time and patience in that because we know it's essential. We should treat technological literacy the same way. Buy them a computer, yes — but when it breaks, let them fix it.


The Data Says the Same Thing

You might think a generation raised with iPads in their cribs would be the most technically literate in history. The data says otherwise — and the trend is getting worse, not better.

The 2023 ICILS study — the International Computer and Information Literacy Study, covering 35 countries — found that U.S. eighth-graders scored below the international average in computational thinking. Their scores had dropped 37 points since 2018. Sixteen countries outperformed the U.S. in computer and information literacy. And only 33% of students reported using computers daily for actual learning purposes.

These are children who were born the year the iPhone launched. They grew up surrounded by screens. And yet, as NCES Commissioner Peggy Carr observed, many of them lack the basic skills they need to be safe while using the very technology that defines their generation.

Dirk Hastedt, executive director of the IEA, was even more direct. In an Edutopia interview, he called the concept of digital natives a "myth" and warned: "We can't keep assuming that students will pick up technology skills purely through osmosis. We're creating a digital divide and we're leaving some students behind."

The countries that do perform well — Denmark, Finland, South Korea — don't just hand kids devices and hope for the best. They integrate structured digital literacy programs that emphasize problem-solving and critical thinking. They treat computer science not as an elective luxury, but as a foundational skill — as fundamental as reading and arithmetic.

There is a painful irony here. The more seamless we make technology, the less people understand it. The more intuitive the interface, the more invisible the machine. We've optimized for convenience and, in doing so, accidentally optimized against understanding.


So What Should a Kid's First Computer Be?

Not a MacBook. Not because MacBooks are bad — they're excellent machines. I use one myself. But they're excellent at hiding complexity. And for a child who's just beginning to understand technology, complexity is the curriculum.

A cheap laptop with Linux on it will teach a child more about computers in six months than five years of using a Mac ever could. Not because Linux is better software (well, it is), but because it demands engagement. It shows its guts. It asks questions. It breaks — and trusts you to fix it.

As a How-To Geek writer recently put it: on Mac and Windows, the operating system is a bridge to apps — you learn it, adapt to it, and move on. On Linux, the OS itself is the learning experience. The rules are yours to rewrite. The workflow is yours to design. The system doesn't assume it knows better than you — it assumes you're capable of figuring it out.

Windows sits somewhere in the middle. It's more exposed than macOS, but still increasingly polished. It's a reasonable starting point, especially if gaming is part of the equation. But even Windows has been moving steadily toward the "just works" paradigm, sanding down the rough edges that once forced users to learn.

Linux is the real teacher. It's the operating system that says: "Here's everything. Figure it out." And when a child does figure it out — when they install their first package from the command line, fix their first broken boot, customize their first desktop environment — they don't just learn a technical skill. They learn that they are capable. That the machine serves them, not the other way around. That is a lesson that lasts a lifetime.
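For the curious, that first session at a terminal often looks something like this. A sketch, not a tutorial — the commands shown are common defaults, and the package-manager lines are distribution-specific examples rather than universal commands:

```shell
# A child's first look under the hood -- nothing here is hidden.

whoami            # which user am I logged in as?
uname -r          # which kernel version is this machine running?
ls /              # the whole filesystem, laid bare at the root

# Installing a first package looks different on each distribution.
# Illustrative examples only:
#   sudo apt install htop      # Debian / Ubuntu
#   sudo pacman -S htop        # Arch
```

Each command answers a question the machine never volunteers on its own — which is exactly the point. The system waits to be asked.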


The Bigger Picture

This isn't really about Mac vs. Linux. It's about a deeper question: do we want our children to be consumers of technology, or people who understand it?

Every generation has had its version of this choice. A hundred years ago, if you owned a car, you probably knew how to fix it. Today, most of us can't change our own oil. We've traded understanding for convenience, and in most domains, that tradeoff is fine. But technology isn't most domains. Technology is the medium through which nearly everything in modern life flows — work, education, communication, governance, finance, creativity.

In an age where AI is reshaping every industry, understanding what's happening under the hood isn't a nice-to-have — it's essential. A child who has wrestled with a command line, configured a network, or recovered a broken OS has a fundamentally different relationship with technology than one who has only ever tapped icons on a glass screen. The first child sees a tool. The second sees magic. And you cannot think critically about magic.

Seymour Papert understood this half a century ago. The computer, he believed, was not a device for delivering information — it was an object to think with. A medium for experimentation. A place where a child could test ideas, fail safely, and build understanding through their own hands.

We have drifted far from that vision. We've turned computers into appliances. Sealed, polished, and impenetrable. We've made them so easy to use that using them teaches you nothing at all.

It's time to give children back the gift of difficulty.

So buy your kid a computer. A cheap one. Install something that will fight back. And when it breaks — and it will — resist the urge to fix it for them.

That's not a failure. That's the first lesson.