They trained AI on everything we ever built.
Now it is replacing us.
For thirty years, people built the internet in public. They wrote open-source code. They answered questions on Stack Overflow. They shared professional knowledge on LinkedIn. They posted, discussed, documented, taught. They did it for the commons. For each other.
Then AI labs took all of it.
Every open-source repository. Every forum post. Every article, tutorial, and career history. Scraped, processed, enclosed into private models worth hundreds of billions of dollars. The people who created that knowledge were not asked. They were not paid. They were not told.
Now those models are being sold to their employers — as a reason to hire fewer of them.
In 2026, the playbook is out in the open: fire half your team, credit AI, watch your stock rise 40%. This is not a prediction. It is happening. Company after company is doing it right now, in public, to applause from the market.
The knowledge was yours. The AI was trained on it. And now it is being used to replace you.
Human knowledge is a commons. Not a product.
What happened is an enclosure movement, run digitally: AI labs took thirty years of publicly shared knowledge, in code, answers, research, and professional expertise, and fenced it into private models. The people who created it are now being replaced by the result.
The window to do anything about it is closing. Every month, more data enters the training pipelines. Every month, the models get more entrenched.
What it feels like.
You open Our One. Your Brief is waiting — a finite reading set drawn from people you follow, your professional field, and a small window of serendipity. Every piece is there for a reason you can see.
You read. No engagement counts. No "847 likes" priming your reaction before you have one. No infinite scroll manufacturing the feeling that there is always more. When the Brief is done, it says so: "You're done. Your full streams are there if you want more."
Images are grayscale in the feed. Ideas compete with ideas, not with the saturation of the photograph attached to them. When you respond, you do not "like" — you signal how something mattered: Clear. Practical. Brave. Original. The author gets texture, not a number.
Your feed is chronological. Your data is yours. Your session leaves no behavioral trace. The platform does not know what you almost clicked, because it does not track what you almost clicked.
The emotional target for every session is relief. Not urgency. Not fear of missing out. Relief that the environment is calm, that the product is not trying to trick you, that reading feels possible again.
This is not a better LinkedIn. This is what professional software looks like when no one profits from your compulsion.
We are building the place to take it back.
This is not a social network with an AI feature. It is a movement to take the commons back, and the social network is how we get there.
Our One is a professional network, a public feed, and private messaging, owned by its members and protected by a published Constitution. Not owned by a founder. Not owned by investors. Not owned by a board that can sell it when the price is right. It belongs, in full, to its members. The Constitution makes that ownership structural: not a promise that erodes under pressure, but architecture that prevents the erosion.
Step one is done. The platform exists. The Constitution is published. Our One is a constitutional professional network: a place where professionals own their identity, their network, and their data. No surveillance. No algorithmic manipulation. No one training AI on your expertise without your explicit consent. One cent a day covers the honest cost. No ads. No extraction. No hidden business model.
Step two is yours. Join. Bring one person. The platform does not grow by algorithm. It grows by people who decide that nodding along is not enough.
Step three is math. One million professionals providing real expertise can train an AI model that competes with anything the labs produce — because the labs' advantage was never the architecture. It was the data. And the data was always ours. Our One AI: trained by the professionals who own it, governed by the same Constitution. When it generates revenue, it returns to the community whose knowledge made it possible. This is not five years away. This is as soon as step two is done.
The people who show up now shape everything.
Every community has a founding generation — the people who joined before the network effect made it obvious. Those people do not just use the platform. They define its culture. Their standards, their judgment, their willingness to build before it was finished — that is what determines whether the thing that grows is worth growing.
The platform is live. The Constitution is published. The economics are transparent. What is missing is you.
One cent a day.
Our One costs one cent a day — $3.65 a year. That covers infrastructure and the steward team that maintains it. The breakdown is published. No ads. No token. No speculation. No hidden cost structure.
One cent a day is not a subscription fee. It is a constitutional act. It means no advertiser owns your attention. No AI lab owns your expertise. You own the platform. The Constitution says so.
The honest part.
We are just getting started. The network is small. If you need to reach someone who has not joined yet, you will still need the old platforms.
But consider what you are actually choosing between.
On one side: continue giving your professional knowledge to platforms that use it to train AI that replaces you. For free. With no governance. No recourse. No share of the value.
On the other side: pay one cent a day. Own your identity. Join a community building toward AI that belongs to the people whose expertise made it possible.
The question is not whether this matters. The question is whether you act on it while action still shapes the outcome.
The commons is being enclosed. The window is closing. You still have time to be one of the people who built the alternative.
Read the Constitution · The Product · Economics · Manifesto