
Tim Baker
A converted sceptic with 40 years of scar tissue
TCB Consulting Ltd | March 2026
For the first time in my career, the speed at which we can build sophisticated, custom software systems is genuinely astonishing. The promise of AI is not a future prospect; it is here, and the competitive advantage it offers is enormous.
But a profound and largely unrecognised danger has emerged alongside these benefits. It is a trap that many organisations are walking into without realising it. And by the time they do, it will be too late.
The intellectual property generated during an AI-assisted project — the strategic decisions, the architectural rationale, the iterative problem-solving — is currently being locked inside the AI account of whoever is doing the work. When that relationship ends, the client retains the product but loses the “brain” that built it.
This is a fundamentally new problem. Historically, when a developer or a consultant left a project, the loss of their personal context was expected and accepted. Knowledge lived in people’s heads, in fragmented meeting minutes, and in occasional design documents. The majority of it was never written down. This was an acknowledged, if frustrating, limitation of the craft.
AI changes this entirely. For the first time, the full context of a project — every discussion, every considered option, every dead end, every decision and its rationale — is explicitly recorded, beautifully organised, and immediately retrievable. It does not have to live in someone’s head anymore.
The tragedy is that, at present, this extraordinary record is tied to a personal account on a third-party platform. The client, in effect, does not own the most valuable asset they have paid for: the understanding of how their system works.
The final deliverable (a website, an app, a database schema) is only part of the value. The accumulated context — why certain decisions were made, what was rejected and why, how the system is intended to evolve — is equally valuable intellectual property. This context currently lives in the AI’s project memory, which is owned by the developer’s account, not the client’s.
Think of it like this: you hire a master chef to create a signature dish for your restaurant. He delivers a stunning plate of food. But when he leaves, he takes the recipe, the list of ingredients, the memory of the techniques he tried and discarded, and the fundamental understanding of the flavour profile with him. You have the dish, but you cannot replicate it, adapt it, or train a new chef to make it. You have a delicious black box.
When the knowledge was only ever in the chef’s head, this was unavoidable. But when the entire recipe book is sitting in his personal cloud account, is it still professionally acceptable to let him walk away with it?
The answer is not to avoid AI-assisted development; the benefits are too significant to ignore. The answer is to change our professional standards. We must systematically and deliberately extract the context from the AI’s memory and document it in client-owned artefacts — design documents, decision logs, architectural rationale notes.
This requires discipline. It requires a new way of working, where documenting the why is as important as delivering the what. It means building what we at TCB call an “IT Playbook” for every client, ensuring that the brain of the system belongs to the organisation that paid for it, not the consultant who built it.
As AI-assisted development becomes mainstream, questions of IP ownership in AI contexts will become a significant legal and commercial issue. Contracts, service level agreements, and professional standards will need to evolve to address this.
What the industry needs to develop — and quickly — is the ability for AI platforms to productise the entire project context and hand it over alongside the system itself, just as any responsible developer hands over documentation, credentials, and access rights at the end of an engagement. The technology to do this does not yet exist in any meaningful form. It will come. But the organisations commissioning AI-assisted work today should be pushing their suppliers and platforms to make it happen, not waiting for it to arrive as a standard feature.
But you do not have to wait for the lawyers to catch up. The next time you engage a consultant to build a system using AI, ask a very simple question: when you leave, who gets to keep the brain?