Every company taking AI seriously is building some kind of knowledge layer right now. They're the way forward. They're going to make a lot of work better. But they rely on you to grow and thrive. And once you're out the door, for good reasons or bad, they still have your stuff, and they own it forever. That's the part that needs to change before it's too late.

So here's a wacky idea.

We used to think we owned Photoshop. We didn't, technically — it was always a license — but the license was perpetual, the disc was on your shelf, and Adobe had no practical way to revoke it. Then the industry switched to subscriptions, and the licensing nature of software became impossible to ignore. You stop paying, it stops working. The shift was uncomfortable for a minute, and then everybody got used to it, because the new arrangement turned out to be better for both sides — Adobe got a steady revenue stream, users got continuous updates, and the fiction of permanently owning software quietly disappeared.

We should treat our knowledge like we treat software subscriptions.

Three half-baked ideas.

The strict subscription version is the simplest. Your work contract is the license. The whole time you're employed, the company has the right to use what your knowledge contributes to the model. The day you leave, the license ends with the employment, the same way a SaaS subscription ends when you stop paying. Cleanest in concept, a technical nightmare in practice: surgically removing one person's contributions from a trained model isn't something the technology can really do yet, and may never be able to do cleanly.

The time-limited license is softer. The company can use what you contributed, but only for a set period after you leave. Two years, five, whatever the contract says. After that, it has to be purged or relicensed at a new rate.

The royalty model is the messiest of the three and probably the most promising. You license your knowledge in perpetuity, but the company pays you, or your estate, for as long as the model uses it. Closer to a record-label deal than a software subscription. Different analogy, same energy: ownership and use get separated, and the person who created the value keeps getting paid every time it generates value.

IP law still works, but big cracks are starting to show.

I floated this idea to my wife the other night, because she's a lawyer and would absolutely have thoughts.

She did. "No way. Anything you make on company time belongs to them — that's just IP law."

She's right. For how much longer, though?

Traditional IP has edges. Whatever you make for the company belongs to the company. And while your tangible footprint might remain their property forever, you could at least walk out the door with everything that made that footprint possible: taste, judgment, experience. They'd never get someone who contributed exactly the way you did.

That part is gone.

For a company's knowledge layer to actually work, it needs more than the documents lying around. Documents are the official record of what people thought after they thought it. The biggest value isn't in the deck you made for the Johnson account. The real signal lives in how the sausage gets made: the sentiment in a Slack thread, the inference buried in a Zoom transcript.

Some of that gets captured passively in the systems people already work in. But the part that actually matters has to be drawn out. Somebody has to sit you down and ask you to articulate what you know. A real knowledge layer isn't surveillance. It's an interview. A long one. Repeated, contextual, structured to extract the part of you that doesn't show up in the artifacts.

Every session you sit through, the model of you gets a little more accurate, a little more useful to the company. When you leave — when they decide the real you isn't worth what the model of you already cost — the model stays. It doesn't negotiate salary, it doesn't have a hard quarter, and it doesn't go find something better.

That last bit doesn't exactly make you want to sign up for the interview, does it?

Researchers at Harvard Business School and MIT studied this and published their results in March. Their finding, in one sentence: workers know they control how much they share, and they restrict it the moment they learn their expertise may be used to train AI to do their job. The effect was strong enough that participants in their experiment gave up real money to opt out, once they understood the deal.

I wrote earlier about how the real bottleneck for expressing intent in AI design isn't the tech; it's getting experienced people to sit down and bare their brains to the system. I believe in what deep knowledge layers can do for design. Yet if I were the one being asked to externalize thirty years of judgment into a system someone else owns, I don't know that I'd do it.

The deal doesn't work. I sit down, I tell you everything I know, you get something out of it forever, and I get nothing out of it forever. A document I make for the Johnson account stops being useful when Johnson moves on. A model trained on how I think keeps producing original output for any client long after I'm gone. The taking is one-time. The using is forever. That's the asymmetry no previous version of this deal had to account for.

So here's where we are. The technology works. Companies need your knowledge. The legal framework doesn't properly account for it. The people who'd have to sit down for the interview have the most reason to refuse. And the three ideas I floated earlier are probably all unworkable — the strict version is technically impossible, the time-limited version is gameable, and the royalty version will get fought tooth and nail by every company whose AI strategy depends on never paying anybody twice for the same thinking.

I know. I'm throwing shit at the wall.

But somebody has to. The current deal — sit down, train your replacement, leave, get nothing — is going to break before the people with the most leverage have any reason to fix it. The Harvard team also found that when workers were given the choice, sixty-four percent preferred a policy where they could own and sell their own work data. People aren't confused about what's happening. They've already figured out the deal — they're just waiting to see whether anyone in charge plans to offer them a different one.