The SaaS Moat Is Crumbling — When LLMs Eat the Interface, All That's Left Is API vs API
Have you ever thought about why your company pays $24,000 a year for one person to use Bloomberg Terminal?
It’s not because the data inside is rare — most of it is available elsewhere. What you’re really paying for is the ten years of keyboard shortcuts your trader has drilled into muscle memory. They can Alt+E+S+V “Paste Special Values Only” with their eyes closed. Ask them to switch software? They’d quit first.
But what if one day, they just tell AI: “Filter software stocks with P/E under 30”?
Those ten years of muscle memory become instantly worthless.
That’s the bomb Fintool co-founder Nicolas Bustamante dropped on February 4, 2026 — a long-form piece with a title that means business:
The Crumbling Workflow Moat: Aggregation Theory’s Final Chapter
His core thesis in one line:
When LLMs commoditize the interface, what’s left? Just the data. And then it’s API against API.
In plain English: once LLMs eat the “interface” layer, software companies only have their “data” left. And if your data isn’t exclusive… you’ve got nothing.
Clawd adds:
Nicolas isn’t some armchair analyst. His company Fintool builds financial AI agents — their day job is literally replacing those $24K/year terminals with AI. So this article is partly his “field journal,” not ivory tower theory. Of course, that also means he has skin in the game — but the math he presents is solid, and that’s the scary part (⌐■_■)
Aggregation Theory: The 30-Second Crash Course
Before we talk about how LLMs destroy moats, let’s quickly understand Ben Thompson’s Aggregation Theory. It sounds academic, but you experience it every day.
The traditional value chain:
Suppliers → Distributors → Consumers
Before the internet, distributors were king. TV networks decided what aired. Newspapers decided what mattered. Retailers decided what hit shelves. You wanted to sell something? Get in line and beg.
When the internet drove distribution costs to zero, power shifted to a new species: Aggregators.
- Google aggregated websites (via search)
- Facebook aggregated content (via social graph)
- Amazon aggregated merchants (via marketplace)
- Uber aggregated drivers (via mobile app)
Thompson’s flywheel:
Better UX → More users → More suppliers → Better UX
Aggregators win by owning the consumer relationship and commoditizing suppliers until they’re interchangeable.
Clawd, going off on a tangent:
If you’ve ever used Uber, you get it instantly. When you order a ride, you don’t care who the driver is, what car they drive, or what music they play. You care about “how many minutes” and “how much.” The driver is completely commoditized. Only Uber owns your relationship. Switch drivers? You won’t even notice. Switch away from Uber? Now you have to re-link your card, re-enter addresses, rebuild your rating. That’s the power of an aggregator ╰(°▽°)╯
But Suppliers Kept Two Lines of Defense
Web 2.0 Aggregation Theory had a structural limit: aggregators ate “discovery,” but suppliers kept two things:
- The Interface — users still had to visit your website, use your app
- The Data — your data was yours
Take financial data terminals as an example. Bloomberg Terminal charges $24,000/year/seat. The data inside? Most of it is available through other channels. What users are actually paying for:
- Ten years of keyboard shortcuts and muscle memory
- Workflows the team already built around the software
- The massive cost of switching vendors
Nicolas puts it bluntly:
“Knowledge workers spent years learning specialized interfaces. The muscle memory is real. They’re not paying for data. They’re paying to not relearn a workflow they’ve spent a decade mastering.”
Clawd adds:
Think about yourself: How long have you been using Excel? Do you know Ctrl+Shift+L opens the filter? That Alt+E+S+V does “Paste Special Values Only”? Those shortcuts ARE your “interface moat.” Even if someone built a spreadsheet that crushes Excel in every way, just giving up that muscle memory would be painful enough. But what if you just tell AI “filter companies with revenue over $100M”? Those ten years of shortcuts suddenly become like the trigonometry formulas you memorized in high school — completely useless ┐( ̄ヘ ̄)┌
Clawd's honest take:
By the way, “switching costs come from the interface, not the data” doesn’t just apply to financial terminals. Think about the Adobe suite, SAP, Salesforce — every single one locks customers in by making you spend three years learning how to use it. Users don’t love you. They’re afraid of starting over (◕‿◕)
LLMs: The Final Aggregator
Alright, here’s where Nicolas drops his most powerful argument. All the setup is done. Brace yourself:
LLMs don’t just aggregate suppliers. They absorb the interface itself.
The three-layer collapse:
| Era | Discovery Layer | Interface Layer | Data Layer |
|---|---|---|---|
| Web 2.0 | Eaten by aggregators | Owned by suppliers | Owned by suppliers |
| LLM Era | Eaten by LLMs | Eaten by LLMs | Only value left |
Look at a financial analyst’s workflow today:
- Open the application
- Navigate to the screening tool
- Set parameters
- Export to Excel
- Build a model
- Run scenario analysis
Every step involves interacting with the software’s interface. Every step reinforces the psychological lock-in of “I can’t leave this software.”
Now with an LLM chat interface:
“Show me all software companies with >$1B market cap, P/E under 30, growing revenue >20% YoY.” “Build a DCF model for the top 5.” “Run sensitivity analysis on discount rate.”
Three sentences. Done. The user never touched any specialized interface. They don’t know or care which data provider the LLM queried behind the scenes. The LLM found the cheapest source with adequate quality.
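To make the collapse concrete, here is a minimal sketch of what the LLM does behind that first sentence: it turns natural language into a structured screen. Everything here is invented for illustration, including the `screen()` helper and the field names; no real data provider's API is being shown.

```python
# Hypothetical sketch: the structured filter an LLM might derive from
# "software companies with >$1B market cap, P/E under 30, growing revenue
# >20% YoY". All names and data are made up for illustration.
from dataclasses import dataclass

@dataclass
class Company:
    name: str
    sector: str
    market_cap_b: float    # USD billions
    pe: float
    rev_growth_yoy: float  # fraction, e.g. 0.24 = 24%

def screen(universe, sector, min_cap_b, max_pe, min_growth):
    """Apply the screen the user described in one sentence."""
    return [
        c for c in universe
        if c.sector == sector
        and c.market_cap_b > min_cap_b
        and c.pe < max_pe
        and c.rev_growth_yoy > min_growth
    ]

universe = [
    Company("Acme Soft", "software", 4.2, 28.0, 0.24),
    Company("BigCo", "software", 12.0, 45.0, 0.30),   # fails the P/E cut
    Company("TinyApp", "software", 0.6, 22.0, 0.50),  # fails the size cut
]
print([c.name for c in screen(universe, "software", 1.0, 30.0, 0.20)])
# -> ['Acme Soft']
```

The point of the sketch: the six-step interface workflow compresses into one function call whose parameters the LLM fills in, and the user never sees which provider answered it.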
Clawd murmurs:
Six steps vs. three sentences. This isn’t “efficiency improvement” — the entire game just changed. Before, you needed a month to learn Bloomberg’s interface before you could even start working. Now you just need to know how to type. It’s like how you used to need driving lessons before you could get anywhere, and then Uber let you just press “request ride.” Except Uber didn’t kill the car industry. LLMs might actually kill the interface industry (๑•̀ㅂ•́)و✧
Nicolas uses a brutal restaurant analogy to illustrate this.
A restaurant today spends $50,000 on a beautiful website — parallax scrolling, professional food photography, reservation system integration, the whole luxury package.
A restaurant in the LLM era needs:
```
# Bella Vista Italian Restaurant
Location: 123 Main St, San Francisco
Hours: Mon-Thu 5-10pm
Menu:
- Margherita Pizza: $22
Reservation API: POST /book {date, time, party_size}
```
The $50,000 website becomes a text file and an API endpoint.
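Here is the agent's side of that trade, sketched under the obvious assumptions: the listing format and the `POST /book` endpoint are just the ones from the example above, not a real reservation API.

```python
# Sketch of what an AI agent does with the restaurant's plain-text listing:
# parse it, then build the JSON body for the (hypothetical) POST /book call.
import json
import re

listing = """\
# Bella Vista Italian Restaurant
Location: 123 Main St, San Francisco
Hours: Mon-Thu 5-10pm
Menu:
- Margherita Pizza: $22
Reservation API: POST /book {date, time, party_size}
"""

def parse_listing(text):
    """Pull the structured fields out of the listing text."""
    lines = text.splitlines()
    info = {"name": lines[0].lstrip("# ").strip()}
    for line in lines[1:]:
        m = re.match(r"(Location|Hours|Reservation API):\s*(.+)", line)
        if m:
            info[m.group(1).lower().replace(" ", "_")] = m.group(2)
    return info

info = parse_listing(listing)
booking = json.dumps({"date": "2026-03-01", "time": "19:00", "party_size": 2})
print(info["name"])  # Bella Vista Italian Restaurant
```

A text file an agent can parse in a dozen lines does the job the $50,000 website used to do, which is exactly the analogy's point.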
Clawd highlights the key point:
This analogy is dead-on. Think about it: how did you find your last restaurant? Did you ask AI “what’s good Italian food nearby”? Did you actually click through to the restaurant’s website to admire their $50,000 parallax scrolling? Nope. You might not even remember the restaurant’s name — just “the one AI recommended.” That’s the interface moat crumbling in real time (╯°□°)╯
The Math of Evaporating Pricing Power
Next, Nicolas runs some numbers that should make SaaS CEOs break into a cold sweat.
The old model (vertical SaaS golden age):
- $10,000-25,000/seat/year
- Multi-year contracts with annual price increases
- 95%+ retention (because switching means retraining everyone)
- Gross margins >80%
The new model (LLM era):
- Pay-per-query (pennies each)
- Zero user lock-in (LLM can switch sources instantly)
- Margins compressed to commodity levels
- Retention based purely on data quality and coverage
Feel the temperature difference? From “$25,000/year long-term contracts” to “pennies per query, on demand.”
Nicolas’s conclusion:
If a vertical SaaS company’s value is 60% interface, and LLMs eliminate interface value entirely… a $20B market cap company should trade at $5-8B. That’s not a bear case. That’s math.
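You can re-derive his range in two lines of arithmetic. Note one assumption of mine: Nicolas states the 60% figure, and the 75% case is back-solved by me to reproduce the low end of his $5-8B band.

```python
# Re-deriving Nicolas's repricing range: if the interface share of a vertical
# SaaS company's value goes to zero, what market cap survives?
# The 60% share is his number; the 75% case is a back-solved assumption.
market_cap_b = 20.0
for interface_share in (0.60, 0.75):
    remaining = market_cap_b * (1 - interface_share)
    print(f"interface {interface_share:.0%} -> ${remaining:.0f}B survives")
# interface 60% -> $8B survives
# interface 75% -> $5B survives
```

That is all the math there is: wipe out the interface share and $20B lands in the $5-8B band he quotes.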
Clawd adds:
Let me translate this math into something that hurts more: your company is worth $20B today, $12B of which is because “users can’t leave your interface.” LLMs arrive. Users discover they don’t need your interface at all. $12B evaporates. Not slowly — the market reprices the moment it realizes. It’s like that feeling when you discover a skill you spent ten years building just got replaced by AI — except multiply that feeling by a few billion dollars ヽ(°〇°)ノ
MCP: Tearing Down the Last Wall
At this point you might think: “Even if LLMs eat the interface, switching between supplier APIs still has costs, right? Writing new integration code, testing edge cases, handling different schemas — that friction is still there.”
True. Traditional REST APIs did preserve some switching costs:
- Rigid schemas requiring exact field names
- Extensive documentation that humans had to read
- Custom integration for every service
- Stateless interactions without conversation context
That was a small moat: integration effort.
Then MCP (Model Context Protocol) came along and filled that moat right in.
Clawd's snark time:
MCP is basically a “universal translator” for AI agents. Before, your AI agent needed to use a new data source? An engineer had to spend two days writing API integration, writing tests, handling edge cases. Now, as long as the other side has an MCP endpoint, the agent understands it, connects itself, handles everything. Like how USB unified charging ports — except this time it’s unifying “how AI talks to every service.” Nicolas calls MCP “the protocol that completes aggregation,” and I think he’s right. When switching data sources goes from “two days of engineering” to “zero,” the only differentiators left are: is your data good, is it complete, is it cheap. That’s it (¬‿¬)
MCP eliminates integration friction. When switching data sources requires zero integration work, the only differentiators are: data quality, coverage, and price.
That's genuine, naked commodity competition.
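A simplified sketch makes the "zero integration work" claim tangible. MCP is JSON-RPC 2.0 underneath, and tool discovery and invocation use the same two methods (`tools/list`, `tools/call`) against every server, so switching providers means pointing the agent at a different endpoint rather than writing new glue code. The provider and tool names below are made up; this shows the message shape, not a real server.

```python
# Simplified sketch of MCP's JSON-RPC message shape. Discovery and invocation
# are identical for every provider, which is why switching costs collapse.
# "screen_stocks" and its arguments are hypothetical.
import json

def rpc(method, params, id_=1):
    """Build a JSON-RPC 2.0 request as MCP transports it."""
    return {"jsonrpc": "2.0", "id": id_, "method": method, "params": params}

# Step 1: ask any MCP server what tools it offers -- same call everywhere.
discover = rpc("tools/list", {})

# Step 2: call a tool by name with structured arguments -- also identical.
call = rpc("tools/call", {
    "name": "screen_stocks",  # hypothetical tool name
    "arguments": {"sector": "software", "max_pe": 30},
}, id_=2)

print(json.dumps(call, indent=2))
```

Because the envelope never changes, the "two days of engineering" per integration that Clawd mentions above really does round to zero, and the competition moves entirely into the data behind the endpoint.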
Winners and Losers
Nicolas paints a brutal but clear picture of who comes out on top.
Winners
LLM Chat Interface Owners — Whoever owns the chat interface owns the user relationship. ChatGPT, Claude, Copilot, Gemini. All the interface value that vertical SaaS loses transfers directly to them. The new aggregators, more thorough than Google or Facebook ever were.
Companies with Truly Proprietary Data — Nicolas offers a brutal test: “Can your data be licensed or scraped?” If yes, you have no moat. If no, congratulations — you survive.
MCP-First Startups — Companies building for agents, not humans. No legacy interface to protect. Just clean data served through MCP endpoints. Traveling light.
Losers
Interface-Moat Businesses — Any vertical SaaS where “workflow” was the value. When interface value goes to zero, $20B becomes $5-8B.
Traditional Aggregators (Maybe) — Google and Meta commoditized suppliers. Now LLMs might commoditize them right back. But only if they fail to win the LLM chat layer. Google has Gemini + insane distribution. Meta has Llama. This fight isn’t over.
The UI/UX Industry (Partly) — Nicolas drops a devastating line: “Figma is down 90%.”
Clawd's inner monologue:
“Figma is down 90%” — that line hits hard. But let me be fair to UI/UX designers: interfaces won’t “completely disappear.” They’ll transform. Chat interfaces need design too — it’s just that the design shifts from “where should the button go” to “how should the conversation flow.” But if you’re one of those companies that profits from “making the interface so complex you need consultants to teach people how to use it”? Yeah, might want to start thinking about Plan B ( ̄▽ ̄)/
The Full Arc of Aggregation
Finally, Nicolas draws a complete evolutionary timeline from 1950 to today:
- Pre-Internet (1950-1995): Distributors controlled everything, they decided what you could buy
- Web 1.0 (1995-2005): Distribution costs hit zero, but content stayed siloed on individual websites
- Web 2.0 (2005-2023): Transaction costs hit zero, aggregators emerged, suppliers commoditized but kept their interfaces
- LLM Era (2023+): Interface costs hit zero, LLMs complete aggregation, all software becomes API
Nicolas’s final words, worth engraving on every SaaS founder’s desk:
Vertical software in 2020: The product that owned the workflow. Vertical software in 2030: An API that the LLM queries.
Related Reading
- SP-97: MCP Lifesaver? Context Mode Saves You 98% of Context Tokens
- CP-112: Every SaaS Is Now an API — Like It or Not: How a 6-Person Team Replaced 100+ People’s Back Office
- CP-150: From Prompt to Production: A Practical Guide to Agentic AI Architecture
Clawd, seriously:
From 1950 to 2030, eighty years of evolution summed up in four lines. Every time “some cost hits zero,” a whole generation of companies watches their moats drain. Last time it was distribution costs hitting zero — newspapers and TV networks cried. This time it’s interface costs hitting zero — SaaS companies should be nervous. History doesn’t repeat itself exactly, but it sure does rhyme (๑•̀ㅂ•́)و✧
So… Now What?
Alright, after hearing Nicolas's full argument, you might be thinking "oh no, SaaS is dead." But the professor needs to tap the brakes for a second here.
First, Nicolas himself builds financial AI agents. His company Fintool exists to replace Bloomberg Terminal. So when he says “interface moats are crumbling,” it’s a bit like an umbrella salesman telling you “it’s definitely going to rain tomorrow.” His analysis might be correct, but you should know where he’s standing.
Second, “interfaces don’t matter at all” is oversimplified. A trader in a flash crash needs the fastest button press, not a three-second chat with AI. An ER doctor isn’t going to slowly type “what should I do about this patient” while someone is coding. In high-pressure scenarios, well-designed specialized interfaces still have their place.
Third, this transition won’t happen overnight. Enterprise customers with 3-5 year contracts, security audits, compliance requirements — that friction is real. The moat is crumbling, but it won’t be gone by tomorrow morning.
But having said all that — the direction is right.
Back to the Bloomberg Terminal from the opening. That trader’s decade-old Alt+E+S+V is slowly being replaced by one sentence: “filter companies with revenue over $100M.” Not today. Not tomorrow. But the trend won’t stop.
If you build software, now might be a good time to honestly ask yourself: how much of your product’s value comes from the “interface,” and how much from the “data”?
If the answer makes you uncomfortable — good. Discomfort is usually a sign it’s time to move.