Free Software, Closed Platforms, and the Older North Star
How Engelbart, open code, and the social layer ended up on different timelines
On the same phone where a free operating system quietly schedules your processes, a closed platform decides who you see, what you read, and which parts of your life count as “engagement”.
One layer is built on ideas that came out of hacker collectives, public labs and a decades-long argument about user freedoms.
The other is shaped by investor decks, ad dashboards and regulatory negotiations.
We rarely hold these two stories side by side. We treat “open source” as something that happens in infrastructure and developer tools; platforms are just “where everyone is”.
But if you look at both from the vantage point of Douglas Engelbart, they become two answers to the same question:
Who gets to shape the tools that shape our collective thinking?
And then a deeper question shows up:
How did we manage to demand freedom at the level of compilers and kernels, while accepting dependency at the level of identity, public discourse and coordination?
1. Engelbart before licenses: the timeless question
If you read Engelbart today, you don’t feel like visiting a museum. You feel like opening a bug report against the present.
He was not arguing about app stores, content moderation or “creator economy” revenue shares. His lens was older and simpler:
- How do groups of people face complex problems together?
- Can we increase that capacity over time?
- What infrastructures—conceptual and technical—do we need so that each project, each crisis, each discovery leaves the next group better equipped?
He called this Collective IQ. Tools, in his view, were not gadgets; they were part of a co-evolving system of methods, conventions, interfaces and institutions. If you change one piece, you’re rewiring the whole.
When Engelbart built NLS/Augment in the 1960s, the words “free software” and “open source” did not exist. There was no GPL, no OSI. But his work presupposed something very close to what those movements later tried to guarantee:
- People must be able to inspect and adapt the tools they depend on.
- The knowledge produced with those tools must be structured and reusable.
- Communities must be able to improve their own capacity to improve—the famous “bootstrapping”.
From that angle, the later debates about licenses and business models are details. Important details, but still details.
Engelbart is upstream of all that. He lives at the level of the exam question, not the answer key.
2. What free software actually fought for
When the free software movement emerges with Richard Stallman and the GNU project in the 1980s, it looks—on paper—like a completely different story. The protagonists are hackers and compilers, not knowledge workers in a Dynamic Knowledge Repository (DKR).
But the core demand is surprisingly compatible with Engelbart:
- Freedom to run the program for any purpose.
- Freedom to study how it works.
- Freedom to modify it.
- Freedom to share copies, modified or not.
These four freedoms are not about “free as in price”; they are about who holds the pen. Who can read the code, change the code, and pass those changes on without asking a distant owner for permission.
Free software made this moral and political:
- Proprietary software was framed as a restriction on user autonomy.
- Licenses like the GPL were created to turn this ethic into enforceable reality.
- Communities formed around the idea that users should not be mere consumers.
In Engelbart’s vocabulary, you might say: free software tried to ensure that the tools of Collective IQ would not become black boxes leased to the people who needed them most.
But the focus was still mostly on code itself: editors, compilers, kernels, utilities. The social layer—identity, discourse, coordination—was not yet the main battlefield.
3. “Open source”: translating ethics into business language
In the late 1990s, a second shift happens. The term “open source” is coined and popularized. The code is largely the same, the repos are often the same, but the story changes.
Instead of foregrounding user freedoms, the open source narrative foregrounds:
- Quality – more eyes, fewer bugs.
- Innovation – faster iteration, shared components.
- Business value – lower vendor lock-in, shared costs, bigger ecosystems.
Where free software talked about rights, open source talked about efficiency.
You can see why this worked:
- Governments and corporations could adopt open source without signing up to a full ethical program.
- Foundations and consortia could form around shared technical assets: Linux, Apache, Kubernetes, PostgreSQL.
- “Open source” became a language you could speak in boardrooms and procurement offices.
If you care about Engelbart’s concerns, this is a mixed victory:
- On the one hand, the world gets a huge amount of inspectable, modifiable, forkable infrastructure.
- On the other hand, the deeper question—what are these tools doing to our collective thinking?—is easy to sideline in favor of ROI and market share.
Open source wins many battles over how code is produced and shared, but it doesn’t automatically decide what kind of social worlds that code will sustain.
4. Platforms as the inverse gesture
While free software and open source were consolidating in compilers, servers and developer tooling, another architecture was rising on the social layer: the closed platform.
Seen through Engelbart’s eyes, platforms are almost the exact opposite of what he needed.
They take the fundamental freedoms and quietly invert them:
- You have no access to the code that shapes what you see.
- You have no right to modify that code or run your own variant against the same social graph.
- You “share” by renting space under terms that can change unilaterally.
The real interface is often not the screen but the Terms of Service. The real API is a set of business and regulatory constraints, exposed just enough to keep the ecosystem alive and just limited enough to prevent serious alternatives.
Ivan Illich would recognize this move immediately.
What could have been a convivial tool—something you can understand, repair and recombine—becomes a system that turns people into clients:
- Your identity is an account in someone else’s database.
- Your relationships are edges in someone else’s graph.
- Your past contributions are rows in someone else’s analytics pipeline.
On paper, nothing prevents a large platform from releasing more code, opening more APIs, inviting more participation in governance. In practice, the structural incentives pull the other way.
When your business is built on capturing and monetizing attention, the last thing you want is an easy exit.
5. Who can fork, who can leave, and what that does to Collective IQ
Free software and platforms tell very different stories about exit.
In a mature free software ecosystem:
- If a project’s direction no longer serves a community, a fork is possible.
- If you don’t like one vendor, you can often find another implementing the same stack.
- If you need to leave, you can at least take the code and the knowledge with you.
It’s not painless, but the option exists. Forkability exerts a quiet pressure on maintainers: they know that hard power sits, in the last resort, with those who can run their own copy.
In platforms, exit looks like this:
- You can deactivate the account.
- You can delete the app.
- You can “leave”.
But you cannot fork the context:
- The threads you started.
- The relationships you grew.
- The shared history your group wrote together.
All of that remains inside the mall. If you want the people, you must come back through the same door.
From an Engelbart perspective, this is devastating:
- Collective IQ depends on cumulative, navigable records of what groups have thought, tried and learned.
- When those records are locked into one company’s interface, they become fragile and non-transferable.
- The group’s capacity to learn does not fully belong to the group; it is mediated by a private entity.
In free software, the hard question is usually “who will maintain the fork?”
In platforms, the hard question is “can we afford to lose everything if we leave?”
Those are very different forms of dependency.
6. Governance: commons vs dashboards
Elinor Ostrom spent her life studying how communities govern shared resources without collapsing into tragedy.
Her questions apply uncomfortably well to digital infrastructure:
- Who can set or change the rules?
- Who monitors behavior?
- How are conflicts resolved?
- How can participants contest decisions?
Many free software projects, for all their flaws, at least approximate a commons:
- Rules of contribution and decision-making are often public.
- Mailing lists, issue trackers and governance documents are archived.
- Authority can be messy and political, but it is arguable; you can point to a proposal, a vote, a request for comments.
Contrast that with the governance of large platforms:
- The effective rules are written into the ToS, community guidelines and enforcement playbooks.
- The real constraints include advertisers, regulators, PR risk and quarterly earnings.
- Disputes about moderation, ranking and access happen under asymmetric opacity: the company knows everything; the users see only the outcome.
If you draw Ostrom’s map, platforms are not commons. They are firms managing critical parts of the social layer as private assets, with some consultative processes on top.
That doesn’t make every platform evil. It does make them structurally misaligned with the idea of the social layer as shared infrastructure.
From the outside, we call them “public squares”.
From the inside, they are closer to shopping centers with a PR department.
7. Arendt’s public space inside private malls
Hannah Arendt cared about something deceptively simple: a space where people can appear to one another as equals, speak, be seen, and take responsibility for what they do and say.
Modern platforms borrow her imagery while bending its structure.
They give us:
- A sense of visibility – your words and images travel far.
- A sense of publicness – you can address “everyone”.
- A sense of consequence – things go viral, reputations shift.
But the substrate is private:
- The “square” can be reconfigured overnight by an A/B test.
- What is visible is filtered through engagement metrics and ranking algorithms designed for business outcomes.
- The permanent record of “what happened” is modifiable—hidden behind interfaces that can be changed without public consent.
Arendt worried about factual truth—not in the sense of being right about everything, but in the sense of sharing a stable enough world that we are at least arguing about the same events.
When the primary surface for that shared world is optimized for engagement and controlled by a single owner, factual truth becomes a side effect, not a design goal.
The result is subtle:
- We experience ourselves as speaking in public,
- while structurally we are tenants in a privately governed arena whose shape we do not control.
8. Bitcoin and the protocol instinct we forgot to apply
Satoshi Nakamoto enters this picture from a different angle.
Whatever one thinks of Bitcoin as an asset or a political project, its architecture embodies a notable instinct:
- The rules are public and verifiable.
- The implementation is open source.
- Many clients can speak the same protocol.
- Forking is a real—if costly—option.
Again, Engelbart is upstream. Satoshi is not thinking about Collective IQ, DKRs or Networked Improvement Communities. But the design carries a lesson that the social web largely ignored:
If something is infrastructure, don’t make it a product.
Make it a protocol and let many products compete on top.
We applied that instinct to a form of money.
We did not apply it to everyday conversation, identity and coordination.
We built a protocol where anyone can, in principle, validate the history of transactions.
We did not build a protocol where anyone can, in principle, validate or migrate the history of their conversations and relationships.
Instead, we placed that history inside a handful of firms and called their interfaces “the internet”.
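The "anyone can validate the history" property is worth making concrete. This is a minimal sketch—not Bitcoin's actual data structures or wire format—of a hash-linked log: because each entry commits to the hash of the one before it, anyone holding a copy can re-derive every link and detect tampering, with no custodian required.

```python
# Illustrative sketch (not Bitcoin's actual format): a minimal
# hash-linked log that anyone with a copy can validate end to end.
import hashlib
import json

def entry_hash(prev: str, data: str) -> str:
    # Deterministic hash over the entry's canonical JSON form.
    payload = json.dumps({"prev": prev, "data": data}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(log: list, data: str) -> None:
    # Each new entry commits to the hash of the previous one.
    prev = log[-1]["hash"] if log else "0" * 64
    log.append({"prev": prev, "data": data, "hash": entry_hash(prev, data)})

def validate(log: list) -> bool:
    # Re-derive every hash and every link; no trusted party needed.
    prev = "0" * 64
    for entry in log:
        if entry["prev"] != prev:
            return False
        if entry["hash"] != entry_hash(entry["prev"], entry["data"]):
            return False
        prev = entry["hash"]
    return True

log = []
append(log, "alice pays bob 1")
append(log, "bob pays carol 1")
print(validate(log))        # the untouched history checks out
log[0]["data"] = "alice pays bob 100"
print(validate(log))        # tampering breaks the chain, visibly
```

A platform's feed is the opposite structure: the history lives only in the owner's database, so no outsider can check whether it was edited, reordered or quietly dropped.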
9. What free software taught the social layer—and what it didn’t
Free software and open source have already changed the world once.
They gave us:
- A vast layer of shared infrastructure—from operating systems to databases to cryptographic libraries.
- A culture of inspecting and fixing rather than just consuming.
- A set of institutions—foundations, licenses, practices—that embody, imperfectly, a commons.
They also revealed real limits:
- Governance struggles, personality conflicts, burnout.
- Funding challenges for critical but unglamorous components.
- Usability gaps that left many end-users in the arms of polished proprietary alternatives.
Seen from Engelbart’s vantage point, the verdict might be:
- You have done well in constructing tools that can, in principle, be adapted by their users.
- You have not yet systematically aimed those tools at the Collective IQ of whole societies, especially at the level of identity, discourse and coordination.
- You allowed the social layer itself to solidify as a set of closed platforms on top of your open stack.
That is not a condemnation. It is a sign that the work is incomplete.
The free software lineage teaches us that:
- Protocols matter more than platforms.
- Forkability matters more than brand continuity.
- Commons governance is possible, but it must be designed.
What it doesn’t give us, by default, is a blueprint for the social layer: how attention is stewarded (Weil), how tools remain convivial (Illich), how public space is protected (Arendt), how the digital commons is governed (Ostrom), and how resilience and verifiability are embedded at the architectural level (Satoshi).
For that, we still need Engelbart’s question in the center of the table.
10. Where Project WGN fits (and where it stays modest)
Project WGN is one attempt—among many that should exist—to take this constellation seriously at the level of a social protocol.
That means a few concrete commitments:
- Protocol before app – Treat the social layer more like email, the web or Bitcoin’s wire protocol—something many clients can implement—than like “yet another platform”.
- Identity you carry, not rent – Your presence should live in an encrypted file or key material you control, not as a fragile row in someone else’s database. Clients come and go; your identity and history remain yours.
- Attention as a moral resource – Following Simone Weil, treat attention not as a commodity to harvest but as something to protect. Filters you own rather than feeds that own you.
- Convivial tools instead of captive clients – In Illich’s terms, aim for tools that can be understood, inspected and, within reason, modified—by communities, not just vendors.
- Commons governance from day one – With Ostrom in mind, design the protocol and its surrounding institutions as a shared resource: clear rules, clear paths for contestation, meaningful ways for participants to shape its evolution.
- Architectural resilience without casino dynamics – Borrow from Satoshi the parts that matter for infrastructure—verifiability, forkability, resistance to unilateral control—while explicitly rejecting speculative token economics as the center of gravity.
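The “identity you carry, not rent” commitment can be sketched in a few lines. This is an illustrative toy, not WGN’s actual design: the identity is a seed file the user controls, from which any client derives the same stable public handle. A real protocol would use public-key signatures (e.g. Ed25519); HMAC over a local secret is a stdlib stand-in for the idea.

```python
# Sketch of portable identity (assumed names, not a real protocol):
# the seed file IS the identity; no server-side account row exists.
import hashlib
import hmac
import os
import secrets
import tempfile

def load_or_create_identity(path: str) -> bytes:
    # Copy this file and your presence moves with you across clients.
    if os.path.exists(path):
        with open(path, "rb") as f:
            return f.read()
    seed = secrets.token_bytes(32)
    with open(path, "wb") as f:
        f.write(seed)
    return seed

def public_id(seed: bytes) -> str:
    # A stable handle others can recognize, derived one-way from the seed.
    return hashlib.sha256(b"portable-id:" + seed).hexdigest()[:16]

def sign(seed: bytes, message: bytes) -> str:
    # Proof a message came from whoever holds the seed file.
    return hmac.new(seed, message, hashlib.sha256).hexdigest()

path = os.path.join(tempfile.mkdtemp(), "identity.seed")
seed_a = load_or_create_identity(path)   # first client creates the file
seed_b = load_or_create_identity(path)   # any later client finds the same one
assert public_id(seed_a) == public_id(seed_b)
tag = sign(seed_a, b"hello from any client")
assert hmac.compare_digest(tag, sign(seed_b, b"hello from any client"))
```

The design point is the inversion of custody: on a platform, the account row outlives your access to it; here, the handle and the proofs both derive from material only you hold.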
This is not an answer to everything. WGN is not “the solution”; it is a concrete question with running code:
What would it look like if the social layer felt more like free software and less like a feed—while keeping Engelbart, Weil, Illich, Arendt, Ostrom and Satoshi in the design loop from the start?
If it fails, it should at least fail in a direction that makes the next attempt easier.
11. A small shift: stop calling platforms “the internet”
There is one practical shift you can carry away from this, even before any protocol like WGN exists in your hands.
It is this:
Stop calling platforms “the internet”.
Each time you open a social app, try to see it clearly:
- You are not “entering the public square”; you are walking into someone’s mall.
- Your identity is not a fact of nature; it is an account on their servers.
- Your collective history there is not guaranteed; it exists at their discretion, under their ranking, their retention policies, their business model.
This does not require cynicism.
You can still enjoy the place, use it, benefit from it.
The shift is simply to keep Engelbart’s older, quieter question in the back of your mind:
- Does this environment help my group become more intelligent over time?
- Can we inspect and adapt the tools that shape our thinking here?
- If we had to leave, what part of our shared understanding would come with us?
Free software taught us to ask these questions about compilers, libraries and kernels.
The next step is to ask them, without flinching, about our feeds, our identities, our “public squares”.
Engelbart remains contemporary because we still haven’t answered him.
The more code we open and the more platforms we build, the more his timeless north star quietly insists:
Tools that shape collective thought belong, in the end, to the collectives that use them—or they will shape us in ways we did not choose.