Opinion: The "selling shovels" approach in the AI gold rush has become ineffective.
Author: Ben Basche
Translation: Shen Chao TechFlow
Editor's note (Shen Chao TechFlow): "Sell shovels in a gold rush" has long been a golden rule of the startup world. But in the AI era this logic no longer works, because the miners are opening hardware stores themselves. OpenAI, Anthropic, and Google are systematically swallowing the middleware layer, coding assistants, browser automation, and other startup categories. Ben Basche argues that the companies that survive won't be tool sellers but "jewelers" who use AI as raw material in vertical fields: players who deeply understand a specific industry, command local knowledge, and hold irreplaceable context.
Full text below:
There's a saying that became gospel in the startup community around the dot-com era: "In a gold rush, sell shovels and pickaxes." The real money isn't made by the miners but by those who supply them. It was Levi Strauss who got rich, not the gold prospectors.
This is a good framework. It worked well for a while.
But in AI, it’s wrong. If your company is built on this logic, you should take a good look at what has happened over the past twelve months.
Labs are the entire tech stack
Here’s what has actually happened—initially quietly, then suddenly exploding.
OpenAI released Operator, a computer-use agent that can browse the web, fill out forms, and execute tasks end to end. Then came the Responses API and the Agents SDK, giving developers native tool calling, memory, and orchestration without third-party frameworks. Next came Codex, a cloud-based coding agent that can autonomously write, test, and iterate on software. Plus Deep Research. Two years ago, any one of these products would have been enough to anchor a funded startup.
Anthropic released Claude Code, Computer Use, Projects with persistent memory, and MCP (the Model Context Protocol), which almost overnight became the de facto standard for connecting AI to external tools and data. They then donated MCP to the Linux Foundation, ensuring it's infrastructure, not a product. Later came Claude in Excel, Claude in Chrome, and Cowork.
Google released Gemini 2.0 with native tool calling and multimodal perception, embedded it into Vertex AI as an enterprise-grade control plane, and shipped out-of-the-box organizational policies and orchestration.
Each of these moves is eating into a territory once held by startups.
The “selling shovels” logic assumes: labs stay on their own track. They build foundational models, provide APIs, and leave the tool layer, orchestration layer, and application layer to the ecosystem. That assumption is dead.
The middleware slaughter
Let’s look at what’s happening specifically at the middleware layer.
LangChain was the quintessential "shovel seller" bet of the 2023 AI boom: a framework for chaining LLM calls, connecting tools, and managing memory. Thousands of teams built products on it, and it passed 100,000 GitHub stars. By 2024, teams were blogging about why they were removing it from production. Not because it was bad, but because the underlying models had become smart enough not to need it. The abstraction layer LangChain built solved yesterday's problems.
Meanwhile, OpenAI released its own Agents SDK. Microsoft launched AutoGen and Semantic Kernel. Labs and their parent companies didn’t acquire LangChain; they simply built what LangChain does natively into their platforms.
The same script plays out at every layer: agent frameworks, prompt management tools, RAG pipelines, evaluation frameworks, observability tools. All of these are being absorbed into native products by the underlying model providers.
The brutal truth: when OpenAI or Anthropic embed orchestration directly into their APIs, they don't need to win on features. They only need to be "good enough" and "already there." Developers default to the path of least resistance. A startup with clever middleware must build a huge lead, hold it while the models keep evolving underneath, and compete against rivals with effectively unlimited capital and control of the foundational infrastructure. That's not a business; it's a science project on a countdown clock.
The miners opened hardware stores; there are no shovels left to sell
The "selling shovels" analogy fails in AI because of a key structural difference. In 1849, Levi Strauss and the other hardware merchants didn't mine gold themselves. Miners and suppliers were separate roles with separate interests.
In AI, the labs are mining and selling shovels and building roads and printing maps, all at once. They have strong incentives to control the entire stack, because every additional layer they own deepens lock-in, expands margins, and widens their distribution moat.
Anthropic donating MCP to the Linux Foundation isn't charity. It ensures that a standard they designed becomes universal infrastructure, the way Ethernet did. Standards are the most powerful moat in tech: intangible and permanent.
So, if your startup’s value proposition is “we make it easier for developers to work with models,” you need to face a fact: the entity in the middle has already noticed you, has resources to copy you, and has structural reasons to do so.
What actually works?
Back to the gold rush analogy. If you can’t sell shovels anymore, what should you sell?
Jewelry.
Or more precisely: treat gold as an industrial raw material, and make products the miners themselves aren't interested in—too niche, too localized, too deeply embedded in domain knowledge they will never own.
The AI version is building applications in vertical fields—areas that require real-world context that labs don’t have and can’t easily acquire.
Think about what OpenAI, Anthropic, and Google are not good at structurally:
They don't deeply understand your industry's workflows. They have no relationship with your clients. They can't cheaply access the private data that makes models truly effective in specific scenarios. They will never deeply study why South African artisans invoice the way they do, why mobile payments in Kenya aren't simple, or why US medical pre-authorization is a complex, deeply embedded operational problem.
Labs are building horizontal infrastructure. The opportunity lies in verticals—areas requiring local knowledge of geography, regulation, culture, and industry-specific nuances to truly succeed.
That’s why emerging-market fintech, jurisdiction-specific legal AI, regulated industry compliance tools, and niche workflow automation are more defensible than “building a better LangChain.”
The moat isn't in the model. It's in the context.
The industrial use of gold
There's a second version of this idea worth spelling out: use AI like industrial gold. Not as a store of value or a display piece, but as a component embedded in products that create lasting economic value.
Gold is an excellent conductor that never corrodes. It's in every circuit board. No one talks about it; there's no hype around it. It quietly serves as a key input in larger systems.
The most durable AI companies being built now treat models as components—inputs into products that solve real problems—rather than products themselves. AI is the gold in the circuit board, not the display case.
The practical approach: pick a domain with real pain points, complex workflows, and hard-to-get data, then build a product that leverages models to make it much better. AI is the implementation detail; the product is what replaces painful manual processes.
This is the opposite of “wrapping GPT-4 in a shell.” The shell is the display case; the circuit board is invisible.
Recent tracks being phased out
To be clearer, here are some startup categories that labs have been systematically swallowing since late 2024:
Agent orchestration frameworks. Now native in OpenAI Agents SDK, Anthropic toolchains, Google Vertex Agent Builder.
AI coding assistants. OpenAI's Codex can now work autonomously across entire repositories. So can Claude Code. GitHub Copilot is Microsoft's native answer. The standalone market for pure coding assistants has been squeezed hard.
Browser and computer automation. OpenAI's Operator, Anthropic's Computer Use, Google's Project Astra. Every leading lab now has a product in this space, and every startup using LLMs for RPA is on the defensive.
RAG pipelines and vector search tools. Basically commoditized. Most model APIs now have native retrieval capabilities. Differentiation at the framework level has disappeared.
General AI assistants and productivity tools. Directly eaten by Claude, ChatGPT, and Gemini.
Prompt management and evaluation tools. Increasingly becoming native features. LangSmith still has some room, but it’s a race against time.
The pattern is consistent: a lab spots a category with significant developer interest, judges it adjacent to its core product, and ships its own version. Not necessarily better, but more integrated, cheaper by default, and backed by distribution that startups can't match.
What should you do now?
If you're building an AI startup today, the question isn't whether there's demand. Demand is everywhere. The real question is: will this be wiped out by the next product release from a lab with billions in funding?
If the answer is “yes” or even “maybe,” then it’s not a business; it’s a feature.
A durable approach has these traits: deep vertical specificity (labs excel at the general case, not your specific one), private data or relationships that can't be scraped from public sources, regulatory and compliance complexity that makes "just call an API" insufficient, and distribution through communities where trust and local context matter more than raw capability.
The gold rush is real. Gold is everywhere. But miners are now opening stores, backed by unlimited capital.
Sell jewelry. Use gold as industrial raw material. Make things miners aren’t interested in—because they’re too niche, too localized, too deeply embedded in domain knowledge they will never own.
That’s the right approach, in my view.