The AI Service Operator

25 March 2026

Why the most valuable companies of the next decade won’t be software firms

For twenty years, software was the dominant value creation model of the digital economy. Companies built systems, customers entered data, processes became visible and manageable. Value came from licenses, subscriptions, and transaction fees. Whether SaaS, ERP, CRM, or platform: software was the interface between people and organizations.

That model created enormous value. But it has hit a limit that can’t be pushed further.

Software never did the work. It coordinated the work.

The actual value creation – understanding, prioritizing, communicating, deciding – remained human. That’s why the most valuable companies of the next decade won’t be those that process data customers enter more efficiently. They’ll be the ones that run manual, fragmented value chains by machine – and in doing so, restructure markets that have been bottlenecked by human attention for decades.

I call this new type of company the AI Service Operator.

Software as interface – and its structural limit

SaaS is often described as if it replaced processes. In reality, it merely formalized them. Most software products are structured data management systems with workflows. They store information, distribute tasks, and make dependencies visible. Decisions, exceptions, and accountability stay with the human.

That works well where processes are stable, standardizable, and low in variance. It works poorly where reality enters the picture: local conditions, unclear requirements, document chaos, liability questions, time pressure, human interaction.

In precisely these areas – services, healthcare, construction, real estate, legal, trades – platforms and horizontal software models have hit the same wall over and over. Not for lack of demand. For lack of operational control.

Why service markets don’t have a matching problem

The recurring error of the platform economy was the assumption that services could be brokered like products. Request, match, transact. But real services aren’t a single step. They’re a chain.

What actually happens is a sequence of operational states: A request has to be understood, classified, and qualified. Requirements are rarely complete. Documents arrive unstructured. Schedules collide. Exceptions are the rule. Communication isn’t a ticket – it’s a stream of follow-ups, corrections, and alignments. And after delivery, the most relevant part often begins: follow-through, liability, retention.
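The chain described above can be sketched as a simple state machine. This is an illustrative model only, not a real system's design; all state names and the loop-back rule are assumptions made for the sketch.

```python
from enum import Enum, auto

# Illustrative states for a service request's lifecycle.
# Names are hypothetical, chosen to mirror the chain in the text.
class RequestState(Enum):
    RECEIVED = auto()             # request arrives, unstructured
    QUALIFIED = auto()            # understood, classified, qualified
    DOCUMENTS_COLLECTED = auto()  # unstructured documents resolved
    SCHEDULED = auto()            # colliding schedules reconciled
    DELIVERED = auto()            # the service itself
    FOLLOW_THROUGH = auto()       # liability, retention, follow-ups

# Each state has a defined successor in the chain.
NEXT = {
    RequestState.RECEIVED: RequestState.QUALIFIED,
    RequestState.QUALIFIED: RequestState.DOCUMENTS_COLLECTED,
    RequestState.DOCUMENTS_COLLECTED: RequestState.SCHEDULED,
    RequestState.SCHEDULED: RequestState.DELIVERED,
    RequestState.DELIVERED: RequestState.FOLLOW_THROUGH,
}

def advance(state: RequestState, exception: bool = False) -> RequestState:
    """Move a request one step along the chain.

    Exceptions are the rule: an exception sends the request back to
    re-qualification instead of forward. A terminal state stays put.
    """
    if exception:
        return RequestState.QUALIFIED
    return NEXT.get(state, state)
```

The point of the sketch is the shape, not the code: every step can fail backward, and something has to own the transition each time.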

Platforms failed at this reality because they didn’t run it – they outsourced it. The bottleneck remained human attention. Growth didn’t increase efficiency; it increased coordination load. The economics broke.

The technological break: from description to execution

What has changed with modern AI systems is not primarily their intelligence, but their role in the system. Language models, agentic architectures, and persistent context make it possible, for the first time, to not just document unstructured work but to run it continuously.

This shifts software from the description layer to the execution layer. Systems no longer stop at visibility – they take ownership of the next step. Not occasionally, but persistently.
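The difference between the two layers can be made concrete in a few lines. This is a minimal sketch under assumed names, not any operator's implementation: the description layer reports state and waits for a human; the execution layer also performs the next step.

```python
# Hypothetical task records; "status" values are assumptions for the sketch.
def describe(tasks: list[dict]) -> list[str]:
    """Description layer: make work visible, leave the next step to a human."""
    return [f"{t['id']}: {t['status']}" for t in tasks]

def execute(tasks: list[dict]) -> list[dict]:
    """Execution layer: take ownership of the next step, persistently."""
    for t in tasks:
        if t["status"] == "open":
            t["status"] = "in_progress"  # the system acts on the task itself
        elif t["status"] == "in_progress":
            t["status"] = "done"         # and keeps acting until completion
    return tasks
```

Both functions see the same data; only the second one changes it. That one-line difference is the entire shift the section describes.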

Out of this shift emerges the AI Service Operator: an operating organization where AI doesn’t assist – it carries. Software here isn’t a product. It’s infrastructure. Value doesn’t come from usage. It comes from outcomes.

Three markets, one pattern

Three markets, different services – yet the same structural shift: value is no longer created primarily through execution, but through the management of processes and relationships. An AI Service Operator doesn't intervene ad hoc – it organizes the entire workflow, from the initial inquiry to ongoing support.

Real Estate

Real estate brokerage has long been a coordination problem. High demand met limited attention. Inquiries had to be sorted, documents reviewed, showings scheduled. Most of the work sat in front of the actual transaction – and was barely captured as value creation.

An AI Service Operator changes this fundamentally. The market doesn’t become more efficient in the traditional sense – it becomes quieter. Unqualified inquiries disappear early. Interactions are run consistently. The client experiences not process, but direction.

The result isn’t a marginal efficiency gain. It’s a different economic model. Commissions drop, throughput rises, quality becomes reproducible. Brokerage becomes an operational service – not a matchmaking role.

For the owner of a brokerage, this shifts not the business but the basis of control. What protected them before – local knowledge, personal relationships, experience handling exceptions – doesn’t become worthless, but it’s no longer a defensible position on its own. Their operational knowledge either becomes the foundation of a new model – or it gets absorbed from the outside.

Home Services

Home services were almost always organized transactionally. A problem appeared, a contractor showed up, the system forgot the customer. Value creation ended with the job. Repeat business was accidental.

The AI Service Operator shifts this from the ground up. Services are no longer triggered – they’re managed. Ongoing care replaces one-off dispatch. The customer experiences continuity, not reaction.

Economically, this creates a fundamentally different business. Acquisition loses importance, retention wins. Revenue becomes predictable, margins stabilize. Service stops being project work and becomes recurring value creation.

For the owner of a trades or service company, this is a quiet shift with hard consequences. The business runs on availability, trust, and local reputation – things that can’t be digitized. But an operator that manages customer relationships by machine, anticipates maintenance cycles, and coordinates jobs before the customer calls doesn’t change the service. It changes who owns the relationship.

Legal Services

Legal services have long suffered from a mismatch between effort and value. Initial consultations were expensive, routine cases consumed highly qualified time, and complex decisions were made under pressure.

With an AI Service Operator, what changes isn’t legal responsibility but its distribution. Routine disappears from expert attention. Humans are deployed where judgment, liability, and experience are decisive.

The result isn’t the automation of law. It’s a new economic viability. Flat fees become possible, quality rises, and expertise concentrates where it has impact.

For the owner of a law firm or legal services business, this means: the model built on the value of qualified time isn’t replaced – but its economic moat shrinks. Whoever can run standard work by machine sets the price. Whoever can’t doesn’t lose clients immediately – but they lose the margin that carries their firm.

The common structure

In all three cases, the AI Service Operator doesn’t change individual steps – it changes the state of the market. Attention is relieved. Relationships become persistent. Execution replaces coordination. Value comes from operational control, not from brokering.

This isn’t a product innovation. It’s an organizational shift.

Why AI Service Operators aren’t software companies

AI Service Operators don’t sell licenses. They’re not neutral. They carry outcome responsibility. Their margin doesn’t come from usage – it comes from control over execution.

That makes them resemble traditional software firms only on the surface. In reality, they’re closer to industrial organizations – except their machines are made of models, context, and orchestration.

Skin in the game – and the question that follows

Nassim Taleb drew a distinction that cuts more precisely here than any technology analysis: the distinction between actors who bear the consequences of their decisions and those who don’t. He calls it skin in the game – and considers it the only reliable ordering principle in complex systems.

Software companies traditionally had no skin in the game. They sold tools. The outcomes – good or bad – were someone else’s problem. The AI Service Operator breaks with that logic. It runs processes, it carries outcome responsibility, its margin depends on execution quality. In Taleb’s terms: it’s liable.

That sounds like progress. And for the market, it is. But for operators and owners who hold control today, the question looks different.

What shifts with control

In every case described above, what shifts isn’t just value creation – it’s the control architecture. Those who previously commanded operational processes – through experience, through people, through relationships built over decades – don’t lose competence. They lose grip. Execution migrates into systems they didn’t build and whose internal logic they don’t fully control.

This isn’t an efficiency loss. It’s an ownership shift.

As long as software was a tool, decision authority stayed with the human. You entered what the system processed. Accountability stayed where it always was – with the entrepreneur, the decision-maker, the person carrying liability.

The AI Service Operator changes that structure fundamentally. Systems don’t just decide which information becomes visible – they decide which step comes next. Orchestration replaces instruction. And with that, a question shifts that was previously answered in silence: Who carries responsibility for processes that no one runs manually anymore?

This isn’t an abstract development. It’s a concrete ownership question: Does the operational control built over decades become a strategic asset – or a legacy that gets outrun from the outside?

Conclusion

The decisive line of the coming years doesn’t run between “AI” and “non-AI.” It runs between interface and execution.

Software made work visible.

AI Service Operators run it.

For entrepreneurs and owners, that raises a question that can't be deferred: When execution becomes machine-driven and operational control migrates into systems that didn't originate in your house – who reorders ownership? And who bears the consequences when that reordering doesn't happen?
