Why Vertical Integration is the Only True Defensibility in AI

With the growth of AI technologies, moats have become a much more discussed topic among aspiring startup founders. What will prevent you from facing endless competition?

“What we’re trying to find is a business that for one reason or another, it can be because it’s the low-cost producer in the field, it can be because it has a powerful franchise, has a protective moat around it.”

— Charlie Munger, Vice Chairman of Berkshire Hathaway

If application layer startups are building solutions on models they don't control, can there really be a moat for AI startups that aren't building infrastructure or foundation models?

The answer, as demonstrated by early winners, is Vertical Integration, often manifesting as what some call the “Innovation Stack”. True defensibility is found not in superior models or thin feature wrappers, but in deeply integrating AI capabilities into human systems, workflows, data, and distribution channels.

At Remagine Ventures, we've invested in a number of AI agents that automate tasks, but what we're often looking for is not just the model's ability to automate the task, but also how it's integrated into the team's workflow: the tools the majority of the industry uses, and the time to value for a user. There have been many posts about moats (or the lack thereof) in AI, and I wanted to suggest a few approaches based on the common wisdom available today.

The Illusion of the Application Layer Moat

For founders today, relying solely on thin application layers built atop public foundation models is a dangerous trap.

We are firmly in the era where AI tools have commoditised capabilities that once required significant resources. Simply wrapping a UI around an LLM or making minor prompt tweaks is "easy prey for incumbents with distribution". This danger was illustrated when OpenAI launched polished end-user experiences like Sora 2, directly competing with, and evaporating the competitive advantages of, startups that thought they had room to build on the underlying model. There's very rarely a safe space: Google launched a vibe coding platform, OpenAI announced it's working on music generation, and Apple announced deep, on-device generative AI integration across iOS and macOS, instantly making specialised, single-purpose mobile AI apps for functions like real-time transcription and image editing redundant, or severely disadvantaged, on its platforms.

If that’s not enough of a threat, the intelligence layer itself is commoditising. Open-source models are often only three to six months behind the frontier models, and for most consumer use cases, the difference is virtually indistinguishable. Even major model builders like Microsoft recognise that building an amazing product requires more than just marching up intelligence scores; it requires focusing on what people want in the product, leading to the “spiky intelligence era”.

The competitive moat, therefore, cannot be intelligence alone. It must shift to how that intelligence is integrated. "The New New Moats" by Greylock's Jerry Chen argues that AI also doesn't change how startups market, sell or partner, so in a way, the old moats still matter.

The Integration Moat: Building the Innovation Stack

The strongest defensive positions available to founders today are built upon Innovation Stacks: integrated layers where combinations of hardware, software, proprietary data, and distribution reinforce each other. The company that nails this integration for a specific industry will secure an advantage for a surprisingly long time, primarily by creating a virtuous, proprietary data loop that feeds back into and improves the model.

This focus means moving beyond simply bolting AI onto existing, flawed systems. If the underlying system or workflow is ineffective, it must be re-engineered to properly utilise the new AI capabilities. You cannot build a new skyscraper on a crumbling foundation.

For early-stage AI startups, vertical integration translates into three core strategies:

1. Owning Proprietary Context and Workflow

The true bottleneck for AI value creation is the lack of context. Trillion-dollar models can master quantum physics but fail when asked to find a company’s Q2 board deck because they aren’t connected to proprietary information.

This reality creates immense opportunity for vertical players:

  • Dropbox Dash: This product aims to close the context gap by building a universal context layer that connects a user’s work apps—like Google, Slack, email, and Salesforce—to the AI assistant. Unlike external consumer AI tools, Dash knows about the user’s specific work, company, and files. This integrated approach allows the AI to answer mission-critical questions that foundation models otherwise cannot, such as “When does our office lease expire?”. This strategy represents a natural evolution for Dropbox, which has a track record of reliability and trust in organizing and sharing important information.
  • Rivet (Tax Prep): In specialised professional fields like tax preparation, context management is the defensible workflow. Rivet built a backend system that quietly ingests, saves, and tracks all client information, including emails, Zoom calls, Slack transcripts, and texts. This organization automates the tedious parts of the accountant’s job—remembering everything and organizing scattered information. This deep integration into the customer’s complex context is difficult for generic platforms to replicate, allowing the firm to run efficiently and at scale.

2. Integration into the Physical World (or High-Opex Workflows)

Vertical integration often involves replacing highly fragmented, manual operational expense (opex) with specialized technology.

  • Traba (Industrial Staffing): Traba built a technology company to address the highly fragmented and manual industrial staffing industry. They integrate the entire workflow—vetting, guiding workers to shifts, fraud detection, and follow-ups—all powered by backend models. This integration leads to significant commercial value, offering double the fill rate and better quality workers within hours, rather than weeks. Furthermore, by mastering this industrial supply chain, Traba positions itself to lead the charge in deploying humanoid robots when the technology matures.
  • Drone-Based Infrastructure Inspection (e.g., Sky-Future): This integrated stack replaces the high-risk, high-opex manual process of infrastructure inspection (bridges, wind turbines, power lines). The company provides the entire solution: AI-driven autonomous drone flight paths, custom hardware for data collection (e.g., thermal, lidar), edge computing to detect defects in real-time, and a proprietary software layer that translates findings directly into remediation work orders. The moat is the deep, annotated dataset of high-resolution defect imagery combined with a workflow that makes the manual inspection process obsolete.

3. Verticalising “Taste” and Feedback Loops

In an era where building is easy, competitive advantage shifts to taste—the “deep, data-driven understanding of what customers actually want”. Achieving this requires vertically integrating market research into the product development cycle.

  • Dialogue AI and Listen Labs (AI-Native Research): These platforms use AI to conduct thousands of high-quality voice and video interviews simultaneously, replacing the slow, expensive traditional market research dominated by Fortune 500 companies. Dialogue AI, for instance, focuses on "interview quality," creating experiences that mimic expert human researchers who are adaptive and capable of probing deeper. Crucially, they found that participants are often more candid and truthful with AI interviewers than with human moderators, leading to higher quality insights that teams can rapidly act on. By providing superior insights faster and cheaper, these integrated research platforms help define the "moat" for customers like Wayfair and the music AI startup Suno.

In startup parlance, particularly in Y Combinator circles, taste has become the watchword for product-market fit in an era of abundant building capacity.

("Why 'Taste' Is Silicon Valley's New Competitive Moat", Forbes)

Conclusion for Early-Stage Founders

The lesson for early-stage AI founders is clear: avoid building features dressed up as businesses. If you are not building a solution that combines hardware, software, proprietary data, and distribution in a reinforcing way, you risk having your temporary advantage vanish fast.

AI doesn’t eliminate moats; it simply changes which ones matter. The era of “move fast and break things” is ceding ground to “move fast and listen carefully”. In the AI economy, deep integration, ownership of proprietary context, and relentless focus on the customer experience are not nice-to-haves – they are the only true defensibility.

If you want to learn more about moats in AI startups, I recommend this chat with Y Combinator partners:

Eze Vidra is Co-Founder and Managing Partner at Remagine Ventures, a seed fund investing in ambitious founders at the intersection of tech, entertainment, gaming and commerce, with a spotlight on Israel.

He is a former general partner at Google Ventures, head of Google for Entrepreneurs in Europe and founding head of Campus London, Google's first physical hub for startups.

He is also the founder of Techbikers, a non-profit bringing together the startup ecosystem on cycling challenges in support of Room to Read. Since inception in 2012, Techbikers has built 11 schools and 50 libraries in the developing world.