Manifesto
Start with your kid. Not a statistic. Your actual kid.
Think about the last time you tried to have dinner with them and they were somewhere else — eyes down, thumb moving, face doing that particular thing where you can tell they are neither happy nor able to stop. Think about what they are like at fourteen compared to what you were like at fourteen, and be honest with yourself about the difference.
This is not a coincidence. This is an engineering achievement.
A small number of people decided that the way to build the most valuable company in human history was to make the product as difficult to put down as possible. Not useful. Not joyful. Difficult to put down. There is a difference. They knew the difference. They built toward the more profitable one.
Mark Zuckerberg holds 61% of Meta's voting power. One person. He watched TikTok's algorithm generate unprecedented engagement by serving content designed not to delight you but to keep you unable to stop. He replicated it. He did this with full knowledge of the internal research showing what it was doing to teenagers. The research was done inside Meta, by Meta's own employees, and then set aside.
There is no Instagram constitution. There is no process by which users — or their parents, or their elected representatives — can hold that 61% accountable. The voting structure was designed specifically to prevent it.
One man. One algorithm. Four billion people.
That is not a technology company. That is a power structure with a phone app on top.
The second wound is deeper, and most people have not yet named it clearly.
For thirty years, developers, engineers, researchers, and curious people built something extraordinary in public. They answered each other's questions on Stack Overflow. They pushed code to GitHub. They wrote documentation, tutorials, blog posts. They built open-source tools and gave them away. They created the Linux kernel. The Python ecosystem. React. Postgres. TensorFlow.
They did this for the commons. For each other. For the students who would come later. The ethos was explicit: this is ours, together.
Then AI arrived.
Buried in terms of service nobody read was a clause allowing platforms to use content for "improving their services." That turned out to mean: training models on everything you ever wrote, everything you ever contributed, every problem you ever solved and shared. Models that can now do what you do. Models being sold to your employer as a reason to hire fewer people like you.
You built the training data. You did not consent to it becoming someone else's private property.
OpenAI's last funding round: $40 billion. Anthropic: $10 billion. xAI: $12 billion. At the foundation of all of it, uncompensated and mostly unaware, are the millions of people who wrote the code, answered the questions, and built the commons that made it all possible.
The deal was: use the network, give us your attention.
Then it became: give us your attention, and we will sell it to people trying to manipulate you.
Now it is: give us your expertise, and we will use it to build the machine that replaces you.
Free has become too expensive.
Here is what we want you to sit with for a moment.
No one owns the sun. No one owns water. No one owns soil. These are the original commons — the things that existed before ownership was invented, that no reasonable society would allow a private party to fence off. When the English lords enclosed the common land in the sixteenth century, turning shared fields that peasants had farmed for generations into private property, history recorded it correctly: as a taking. Legal, perhaps. A taking nonetheless.
Human knowledge is the same kind of thing.
It was built by everyone. Over centuries. In every language, every discipline, every culture. Before any of the labs existed. Before the internet existed. The accumulated output of civilization — science, literature, code, medicine, law, craft, conversation — does not belong to whoever is first to enclose it in software. It belongs to the species that produced it.
What happened with AI training data is the digital enclosure movement. The labs found the commons. They scraped it. They ran it through training pipelines and the result was models worth hundreds of billions of dollars. They did not create the knowledge. They captured it.
This should feel as wrong as it sounds.
People sometimes ask: what percentage of Our One should belong to users? The question reveals the confusion. It is not a negotiation. No one can offer 51%, or 80%, or 99%, as though those numbers represent generosity — because no one person or team created the knowledge the platform is built on. The knowledge belongs to the people who produced it. Which is everyone.
100% is not idealism. It is the only number that is morally coherent.
You cannot take a cut of something you did not create. We maintain the infrastructure. We do not own the water.
Here is what neither wound has yet found: a practical answer.
You cannot fix this with outrage alone. You cannot fix it by deleting your apps. You cannot fix it by waiting for the companies that built these systems to fix them, because the systems are working exactly as intended.
You fix it by building something different, with different rules, before the window closes.
The math is simple. It costs less than one dollar per user per year to run a social network at scale — not what Meta spends, but what it costs if you build it without the extraction machine. Meta collects $270 per year from each American user. LinkedIn Premium costs $480. The gap between one dollar and $270 is not the price of a better product. It is the price of the surveillance apparatus. Strip it away, and the platform is small and cheap.
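The back-of-envelope arithmetic above can be checked directly. All figures are the ones stated in this manifesto (the $1-per-user infrastructure cost is our own estimate, not an audited number):

```python
# Back-of-envelope check of the cost gap described above.
# Figures are as stated in the text; the $1/user/year
# infrastructure cost is the manifesto's own estimate.

infra_cost_per_user_year = 1.00      # claimed honest cost of running the platform
meta_revenue_per_us_user = 270.00    # claimed Meta revenue per American user per year
our_one_price_per_year = 0.01 * 365  # one cent a day

extraction_premium = meta_revenue_per_us_user - infra_cost_per_user_year

print(f"Our One price per year: ${our_one_price_per_year:.2f}")        # $3.65
print(f"Extraction premium per user: ${extraction_premium:.2f}")       # $269.00
print(f"Meta revenue vs honest cost: {meta_revenue_per_us_user / infra_cost_per_user_year:.0f}x")
```

The point of the arithmetic is not the exact figures but the order of magnitude: the price of the product and the price of the surveillance differ by more than two hundred to one.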
Our One charges one cent a day — $3.65 a year. That covers the honest cost of honest infrastructure and your share of the steward team that maintains it. No ads. No behavioral tracking. No extraction premium.
One cent a day is not a subscription fee. It is a constitutional act.
Because price is governance. If a platform is free, advertisers own you. If a platform uses a crypto token, speculators own you. If you pay one cent a day — less than anything else in your life costs — you own the platform. The money changes the contract. It is the smallest amount that changes everything.
A published constitution makes these not promises but binding rules. Not policies that can be quietly updated in the next release. Constitutional provisions that cannot be changed without community ratification. The platform does not get to decide that your kid's attention is the product. The constitution says so.
The AI question is the most important one, and it is still open.
The labs are not going away. Competing with them at the frontier — building the next GPT-level model from scratch — is not the leverage point. A hundred million people cannot outspend OpenAI on GPU clusters.
But a hundred million people can do something no amount of money can buy.
They can provide real expertise.
AI quality depends critically on the quality of human feedback during training — on the people who rate outputs, correct errors, and demonstrate what good looks like. Today that work is done largely by outsourced workers paid a few dollars an hour to label data for models they will never benefit from.
What if it were done by the professionals whose knowledge is being trained on? By the engineers, doctors, lawyers, teachers, and designers who built the commons in the first place?
The open-weight models exist today. The gap between GPT-4 and the best open model was two years in 2024. It is nine months now. By 2027, the architecture will be commodity. What will not be commodity is the training data from real professionals who own what they contribute.
The gap between community-trained models and proprietary frontier models is closing faster than the labs want to acknowledge. What is missing is not the technology. It is the governance structure — the constitutional framework that ensures the community owns what it builds, that the model cannot be quietly enclosed, that the benefits flow back to the people whose expertise made it possible.
That is what Our One is built to provide.
When the people who train the model own the model, the structure of who benefits from AI begins to change. Not as a promise. As a constitution.
We are not asking you to believe we can fix everything.
We are asking you to consider what is available right now, in 2026, that was not available five years ago.
Building is nearly free. Infrastructure is nearly free. Open-source AI models exist. The tools to build constitutional governance into products from the start exist. The understanding of what went wrong with the first internet, and how to architect around it, exists.
The window is open. The labs are raising rounds and closing it.
We are building the place to go.
Not a protest. Not a manifesto that stops at the manifesto. Actual products, built constitutionally, owned by their users, protected from capture, building toward an AI that belongs to the people whose knowledge made AI possible.
The old internet asked you to join platforms.
We are asking you to own infrastructure.
The knowledge was always yours. We are building the place where it stays that way.
I grew up in Czechoslovakia. I was fifteen years old in November 1989 when the Velvet Revolution happened — when hundreds of thousands of people came into the streets of Prague and, in the space of a few weeks, peacefully ended forty years of one-party rule.
I was there. I watched it happen.
What I learned from that experience — what I have carried for thirty-seven years — is that systems which seem permanent and unchallengeable are not. That concentrated power has a brittleness beneath its apparent strength. That when enough people decide the arrangement is wrong and refuse to pretend otherwise, the arrangement can change faster than anyone believed possible.
I also learned what it costs when power concentrates in too few hands. What it does to culture, to creativity, to the ordinary human ambition to build a life on your own terms. The socialism I grew up under was not evil in its stated intentions. It was harmful in its structure. It removed the connection between contribution and benefit. It eliminated accountability. It replaced trust with surveillance. It made the system's continuation the highest priority, above the wellbeing of the people it claimed to serve.
I have spent the last decade watching the internet complete a version of that same arc. The parallel is not subtle.
I have been building software for thirty years. I have seen every wave of the technology industry from close enough to feel the undertow.
And I want to tell you what I believe, after all of it:
The current structure of the internet is not the result of neutral market forces. It is the result of specific choices made by specific people who benefited from making them. The surveillance business model was not inevitable — it was adopted, consciously, because it was more profitable than the alternatives. The engagement optimization that hooks teenagers was not an accidental side effect — it was engineered, A/B tested, and deployed with full knowledge of what it was doing to the people in its path.
These were choices. They can be unmade.
But they will not be unmade by asking the people who made them to make different ones. They will be unmade by building alternatives that are structurally different — not just better-intentioned, but architecturally incapable of the same betrayals.
That is what a product constitution does. It does not depend on the stewards remaining idealistic. It builds the idealism into the structure.
My sons Adam and Oliver are twenty-one and nineteen. They are both building things, learning to build things, imagining futures in technology. My daughter Laura is twelve years old.
For thirty years, developers all over the world — millions of them — contributed to a digital commons. Stack Overflow answers. GitHub repositories. Open-source libraries. Documentation, tutorials, forum posts, code comments. Knowledge given freely, in the belief that shared knowledge multiplies.
That commons became the training data for the most powerful AI systems ever built.
We did not consent to this specifically. We could not have — the implications did not yet exist when the terms were written. But the result is that the collective intellectual output of a generation of people who believed in openness has been enclosed into private capital worth hundreds of billions of dollars, in companies now being positioned to automate the work of the people who created that value in the first place.
I think about what world Adam and Oliver are building toward. I think about whether the value they create will belong to them, or whether the architecture of that world has already been set up to ensure it flows elsewhere.
I think about Laura at fourteen. And who designed the software she will encounter. And what for.
I do not intend to find out by watching.
We still have time. Not unlimited time. But now — right now — the window is open.
I am not a utopian. I spent my formative years watching what happens when a system is built on promises that cannot be kept by the structure it runs on. I believe in economics. I believe in incentives. I believe that good values, without good architecture, eventually produce the same outcomes as bad values.
So let me be precise about what I am claiming.
I am claiming that at 100 million users, a social platform costs approximately one dollar per user per year to run. That number comes from infrastructure pricing that is publicly verifiable.
I am claiming that a team of fifty excellent people, paid well, can maintain what Meta employs tens of thousands to run — because most of those tens of thousands exist to operate the extraction machine, not the platform. Without the extraction machine, the platform is remarkably simple.
I am claiming that open-weight AI models, trained with real professional expertise from communities who own the result, can close the quality gap with proprietary frontier models faster than the labs want to acknowledge — and that the people who contribute that expertise deserve to own what they build.
These are not leaps of faith. They are claims that can be verified, and I am committed to verifying them in public, product by product, constitution by constitution.
We are starting with the platform.
A professional network. A public feed. Private messaging. Simple on purpose. The constitutional core made visible in its simplest possible form: you see what the people you follow share, in the order they shared it. Your professional identity lives on a platform that cannot sell it. No one is ranking your reality for profit.
The constitution is published. The forbidden behaviors are named. The governance process is documented. The economics are transparent.
You can read it before you join. You can hold us to it after you join. That is the whole point.
I am building this for Laura. For Oliver and Adam. For the developers who gave their knowledge to a commons that was enclosed around them. For the parents who watched their kids disappear into systems engineered to capture them. For the people who felt the early internet's promise of liberation and watched it slowly invert into something more like the systems they were promised it would replace.
I am building it because I was fifteen in Prague in 1989, and I know that things which seem permanent are not.
I am building it because I have spent thirty years in this industry and I know exactly what the current structure is, how it works, and what it would take to offer something genuinely different.
I am building it because my daughter is twelve years old, and she deserves software that is not designed to capture her.
And I am building it now, because the window is open now, and I am not willing to explain to her in ten years that I saw it and chose to wait.
The commons is still ours. Not for long. But right now, it is.
Come and own it with us.
Rado
Founding Steward, Our One
Prague, 2026