How Open Source Is Quietly Rewriting the World

The biggest power shift of the decade isn’t a political one. It’s about who owns the means of computation — and the answer, slowly but unmistakably, is becoming “everyone who wants to.”

The thirty-year story of open source is usually told as a software-engineering story: Linus, GNU, Apache, Linux taking over server farms, Git eating Subversion, Kubernetes taking over orchestration. All true, all visible, all still important. But that telling misses the bigger arc.

The actual story is about ownership. Specifically: who owns the building blocks of digital civilisation, and on what terms can the rest of us use them?

Forty years ago that question had a single answer: a small set of vendors, in a small set of countries, with a small set of business models. Today the answer is genuinely different — and the world is quietly being rewritten because of it.

From product to substrate

The first wave of open source replaced expensive proprietary products: Apache replaced commercial web servers, Linux replaced commercial Unix, MySQL replaced commercial databases at the low end and Postgres replaced them at the high end.

The second wave replaced proprietary substrates: Kubernetes replaced cloud-vendor-specific orchestration, Kafka replaced proprietary message buses, Postgres extended into vector and time-series workloads, Linux became the assumed kernel underneath every cloud.

The third wave — the one we’re in — is replacing proprietary intelligence. Frontier-grade language models that governments debated restricting under export controls are now downloadable, runnable on a single GPU box, and improving every month. That is qualitatively different from open-sourcing a database.

Why this changes geopolitics

Pre-2023, the question “does country X have AI?” collapsed to “can country X negotiate a contract with one of three US vendors?” That answer encoded a remarkable amount of dependency: training data, compute, alignment policy, pricing, deprecation, sanctions exposure. Sovereign questions answered in someone else’s boardroom.

Today the same question collapses to “does country X have GPUs and an engineer who can run vLLM?” The dependency tree shrank by about three orders of magnitude.
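To make that concrete: vLLM (like llama.cpp’s server and most hosted vendors) exposes an OpenAI-compatible HTTP API, so the client side of “running your own model” is a plain POST. A minimal sketch, assuming a self-hosted server on port 8000; the model name is illustrative:

```python
import json
from urllib import request

def build_chat_request(base_url: str, model: str, prompt: str):
    """Return (url, payload) for an OpenAI-compatible /v1/chat/completions call."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

def chat(base_url: str, model: str, prompt: str) -> str:
    """POST the request and return the assistant's reply text."""
    url, payload = build_chat_request(base_url, model, prompt)
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Usage against your own box (model name is illustrative):
#   chat("http://localhost:8000", "mistralai/Mistral-7B-Instruct-v0.3", "Hello")
```

Because the wire format is the same everywhere, pointing this code at a hosted vendor, or away from one, is a one-line base-URL change. That is the whole dependency tree.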

That’s not a marginal improvement. It’s the entire substrate of digital sovereignty being relaid in two years.

Field note from Europe. Every European institution we’ve worked with in the last 12 months has independently arrived at the same conclusion: AI capability cannot live in a vendor relationship. It has to live on infrastructure they control. Open weights make that not just possible but cheap. There is no conference where this is the announced theme — and yet it is the thing every single CIO is quietly working on.

Why this changes economics

Closed software has a marginal cost approaching zero and a margin captured by whoever owns the licence. Open software has the same marginal cost and the margin captured by whoever uses it productively. That sounds like an accounting nuance. In aggregate it’s a tectonic shift in where value accrues.

Linux underpins the entire cloud industry, including the parts of it that ostensibly compete with open source. The cloud industry didn’t lose to Linux; the cloud industry was built on Linux, and the operating-system value that would have been captured by an Oracle-of-the-1990s now flows through the cost structure of every web app on Earth.

The same pattern is playing out in AI. The enterprise AI build-out underway right now increasingly sits on open-weight LLMs as its substrate. The next decade of value will accrue to whoever assembles the agents, the workflows, the integrations, the customer relationships — not to whoever owns the model file. The model file is becoming a Linux kernel: foundational, free, taken for granted.

Why this changes power

Closed software concentrates power. Open software diffuses it. That’s the whole story, in one sentence, since 1985.

What’s new is the kind of power being diffused. Linux diffused operating-system power. Postgres diffused database power. Kafka diffused message-bus power. Open weights are now diffusing language-and-reasoning power — which until very recently looked like it might end up in the hands of three or four companies forever.

It hasn’t. And the reason it hasn’t is the same reason Linux escaped the proprietary-Unix gravity well: a critical mass of researchers, engineers, and users decided the open path was better, contributed to it, and made the closed path a worse deal for the marginal user.

The most important sentence in computing remains: “You can run this. You can change it. You can keep running it after we’re gone.”

What this means in practice

If you build inside an enterprise — especially a European one — the practical takeaways are:

  • Treat “open vs. closed” as a sovereignty question, not a feature question. The feature gap closes. The sovereignty gap is structural.
  • Invest in the muscles that compound on top of open infrastructure. Inference operations, evals, retrieval pipelines, agent orchestration, integration adapters — these compound for you. Vendor-managed equivalents compound for the vendor.
  • Be sceptical of “open-ish.” Research-only licences, evaluation-only weights, weights-with-strings-attached — these are marketing categories, not open source. The four-freedoms test still applies.
  • Recognise that the open path is now the default. Five years ago picking open infrastructure was a contrarian call. In 2026 it’s the conservative one.
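As one sketch of the “evals” muscle above: an in-house eval harness is, at its core, a loop you own end to end. This is a hypothetical minimal example, where `model` stands in for any callable wrapping your self-hosted endpoint:

```python
def run_eval(model, cases):
    """Score a model (any callable prompt -> answer) against
    (prompt, expected) pairs, using exact-match accuracy."""
    if not cases:
        return 0.0
    hits = sum(
        1 for prompt, expected in cases
        if model(prompt).strip() == expected
    )
    return hits / len(cases)

# Because the harness is yours, every new test case, scoring rule,
# and regression check compounds for you, not for a vendor.
```

Real harnesses add richer scoring than exact match, but the ownership point is the same regardless of sophistication.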

The quiet part

Most of this is happening without a single announcement. There’s no “Year of Open Source AI” press release. The CIOs putting GPUs in their own data centres aren’t holding press conferences. The engineers swapping closed APIs for open weights are doing it in pull requests, one commit at a time.

That’s how open source has always won: not in keynotes, but in git log. The world doesn’t announce that it’s being rewritten. It just is.

Curious How This Looks in Production?

We deploy open weights on customer hardware every week. Book a 30-minute call and we’ll walk through real architectures.