Jensen Huang's GTC Closed-Door Strategy Q&A Transcript: Holding Trillions of Dollars in Orders, Confronting Wall Street Analysts, Predicting the Collapse of a $2 Trillion Software Empire, and Fully Revealing the $8 Trillion New Opportunity in AI Agents
Question: How will AI token economics disrupt the traditional software industry?
The City Lord says | After yesterday's two-hour keynote, Jensen Huang appeared again at GTC 2026 today for a long closed-door Q&A with financial analysts, fielding many pointed questions.
In this session, Huang detailed how data centers are evolving from mere computing tools into "AI factories" that manufacture tokens, and disclosed more than $1 trillion of demand visibility for the Blackwell and Rubin architectures. The discussion covered how tokenomics will reshape the IT software industry, the enormous potential of physical AI, the evolution of hardware architecture (including Groq and copper-interconnect technology), and how the company plans to create shareholder value while sustaining high profit margins.
00:00:02 The third inflection point in AI development and agentic systems
00:10:21 Tokenomics and the trillion-dollar market outlook
00:19:24 Software industry reshaping and AI investment return pathways
00:30:06 Hardware architecture evolution: Rubin vs. Groq differentiation
00:42:06 Full-stack AI factory design: memory optimization and rack architecture
00:50:52 Capital strategies and competitive moats: understanding factory output efficiency
00:59:23 Data center connectivity evolution: from copper cables to silicon photonics
01:08:59 Reducing token costs and innovations in mixture-of-experts models
01:17:55 AI vision outlook: integration of physical robots and training inference
Core insights
• The third inflection point in AI: The industry has moved from generative AI and reasoning phases into “agentic systems,” where AI can autonomously perform complex tasks.
• Compute as manufacturing: Computers have shifted from tools to manufacturing devices (AI factories), with their output being economically valuable Tokens, measured by token generation efficiency per unit of power consumption.
• Trillion-dollar market potential: NVIDIA has visibility into more than $1 trillion of demand for the Blackwell and Rubin architectures ahead of 2027.
• IT industry transformation: The traditional $2 trillion software licensing industry is transitioning toward a model of reselling Tokens via agentic systems, with market size potentially expanding to $8 trillion.
• Full-stack architecture advantage: By integrating GPUs, CPUs (Grace), storage, and networking, NVIDIA can reach the roughly 40% of the market outside the hyperscale clouds (enterprise, industrial, edge).
• “AI has moved from conversational to ‘agentic’—no longer just answering questions, but executing tasks.”
• “Future computers are no longer just tools but manufacturing devices, with the core product being economically valuable Tokens.”
• “Our demand visibility for Blackwell and Rubin exceeds $1 trillion. This is not a number floating in the air; it is clear orders and visibility.”
• “The traditional $2 trillion software licensing industry is transforming into a model of reselling Tokens, with market size potentially reaching $8 trillion.”
• “If you don’t understand Tokenomics, you don’t understand AI business. Buying the cheapest equipment doesn’t mean winning; the key is token generation efficiency per unit of power.”
• “Physical AI will far surpass digital AI in scale because the world doesn’t happen inside laptops but in the physical space of atoms.”
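The per-watt efficiency claim in the last two quotes can be made concrete with a back-of-envelope sketch. Every figure below (prices, throughput, power draw, token pricing) is an illustrative assumption, not an NVIDIA or customer number:

```python
# Back-of-envelope "token economics" comparison of two hypothetical systems.
# All numbers are illustrative assumptions, not NVIDIA or customer figures.

def tokens_per_joule(tokens_per_second: float, watts: float) -> float:
    """Token generation efficiency per unit of power (tokens/s divided by W)."""
    return tokens_per_second / watts

def annual_token_revenue(tokens_per_second: float, price_per_million: float) -> float:
    """Revenue from running the system flat out for one year."""
    seconds_per_year = 365 * 24 * 3600
    return tokens_per_second * seconds_per_year * price_per_million / 1e6

# A cheaper, less efficient box vs. a pricier, more efficient one.
cheap_box   = {"price": 100_000, "tokens_per_s": 20_000, "watts": 10_000}
premium_box = {"price": 300_000, "tokens_per_s": 90_000, "watts": 15_000}

for name, system in (("cheap", cheap_box), ("premium", premium_box)):
    eff = tokens_per_joule(system["tokens_per_s"], system["watts"])
    rev = annual_token_revenue(system["tokens_per_s"], price_per_million=2.0)
    print(f"{name}: {eff:.1f} tokens/J, ${rev:,.0f}/yr at $2 per million tokens")
```

Under these assumed numbers the pricier system wins on both tokens per joule and annual token revenue, which is exactly the point of the "buying the cheapest equipment doesn't mean winning" quote.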
The third inflection point in AI: from “answer questions” to “execute tasks”
Huang states plainly that the industry is at a critical juncture in the evolution of generative AI. If the first wave was content generation and the second was widespread reasoning, we are now at the third inflection point: agentic systems. These systems possess autonomy; given a goal, they can decompose a complex task and execute it.
This shift directly redefines talent and corporate operations. “In the past, when engineers joined a company, they were given a laptop. Now they are also given a token budget.” Huang emphasizes that if a highly paid engineer does not consume tokens at work, their output will be questioned. In this context, open-source projects like OpenClaw are no longer “toys” but operating systems of the AI era, responsible for resource scheduling, network management, and driving agents to perform tasks.
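The token-budget framing invites a simple break-even check. The $300,000 salary figure appears later in the transcript; the budget size and the notion of a fractional productivity uplift are hypothetical illustrations:

```python
# Break-even check for an engineer's token budget: the spend pays off only if
# the resulting productivity uplift covers its cost. The salary figure comes
# from the transcript; the $15,000 budget is a hypothetical illustration.

def breakeven_uplift(salary: float, token_budget_usd: float) -> float:
    """Fraction of a salary's worth of extra output needed to justify the spend."""
    return token_budget_usd / salary

uplift = breakeven_uplift(salary=300_000, token_budget_usd=15_000)
print(f"break-even productivity uplift: {uplift:.1%}")
```

Under these assumptions, a 5% uplift in output already justifies the budget, which is why, in Huang's telling, an unspent token budget raises more eyebrows than a spent one.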
Compute as manufacturing: heading toward a trillion-dollar “AI factory”
NVIDIA is redefining what a computer is. Huang compares modern data centers to ASML's lithography equipment or to power plants: their raw material is electricity, and their output is highly valuable tokens. The yardstick for computing power, then, is no longer chip price alone but “token economics.”
“Computers used to be mere tools; future computers are manufacturing devices. Their energy and production efficiency are critical because they determine your revenue.” Huang points out that customers buy expensive Blackwell systems not to resell hardware but to produce the lowest-cost, highest-value tokens. By continuously improving token generation efficiency per watt, NVIDIA ensures its products deliver unmatched ROI.
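The "lowest-cost token" claim is, at bottom, amortization arithmetic: spread the machine's price and electricity bill over every token it will ever produce. A minimal sketch, in which every input (capex, lifetime, power, electricity price, throughput) is an illustrative assumption:

```python
# All-in cost per million tokens: straight-line capex amortization plus
# electricity over the system's lifetime. Every input is an assumption.

def cost_per_million_tokens(capex: float, lifetime_years: float, watts: float,
                            price_per_kwh: float, tokens_per_second: float) -> float:
    seconds = lifetime_years * 365 * 24 * 3600
    kwh_consumed = (watts / 1000) * (seconds / 3600)   # kW times hours
    energy_cost = kwh_consumed * price_per_kwh
    total_tokens = tokens_per_second * seconds
    return (capex + energy_cost) / total_tokens * 1e6

cost = cost_per_million_tokens(capex=3_000_000, lifetime_years=4, watts=120_000,
                               price_per_kwh=0.08, tokens_per_second=1_000_000)
print(f"all-in cost: ${cost:.4f} per million tokens")
```

Raising `tokens_per_second` at fixed wattage (i.e., tokens per watt) shrinks both the capex share and the energy share of each token, which is the mechanism behind the ROI argument.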
Confidence in the roadmap: Blackwell and Rubin’s trillion-dollar visibility
Addressing concerns about the sustainability of AI investment, Huang offers a data-backed rebuttal: NVIDIA has visibility into more than $1 trillion of demand for the Blackwell and Rubin architectures ahead of 2027.
“This is not a number floating in the air; we have clear foresight, explicit demand forecasts, and purchase orders exceeding $1 trillion.” That visibility comes not only from the hyperscale cloud providers but also from NVIDIA's full-stack platform advantage. Huang explains that by integrating Grace CPUs, GPUs, networking, and storage, NVIDIA can reach the roughly 40% of the market that a chip alone cannot serve, such as enterprise on-premises deployments and the industrial edge. “If you only make chips, you can't reach that 40%; they buy platforms.”
Tokenomics: transforming the $2 trillion IT industry into an $8 trillion market
In the session, Huang makes a bold prediction: the global $2 trillion software licensing industry is being transformed. Future IT companies will no longer merely license software; they will become token distributors and resellers.
“The current IT industry could grow from $2 trillion to $8 trillion. 100% of the global IT industry will become resellers of models from OpenAI, Anthropic, and the open-source world.” This shift in business model will reshape profit margins: although the cost of goods sold (COGS) for token production is higher than for licensed software, the value AI agents deliver far exceeds that of traditional software, driving exponential market expansion. “Future tech companies will lease tokens and generate tokens; their business models will fundamentally change.”
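The margin claim can be checked with one line of arithmetic. The $2 trillion and $8 trillion market sizes come from the text; both gross-margin figures are hypothetical assumptions, chosen only to show that a lower-margin but much larger market can still carry more absolute gross profit:

```python
# Gross-profit comparison: high-margin licensing vs. lower-margin token resale.
# Market sizes are from the text; both margin figures are assumptions.

def gross_profit(revenue: float, gross_margin: float) -> float:
    return revenue * gross_margin

licensing = gross_profit(2e12, gross_margin=0.85)     # today's licensing model
token_resale = gross_profit(8e12, gross_margin=0.40)  # hypothetical token model

print(f"licensing: ${licensing / 1e12:.2f}T gross profit")
print(f"token resale: ${token_resale / 1e12:.2f}T gross profit")
```

Even with margins cut by more than half, the assumed 4x revenue expansion roughly doubles absolute gross profit, which is the shape of the argument being made.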
Hardware architecture game: Groq, copper interconnects, and the liquid-cooling revolution
At the micro-architecture level, Huang demonstrates NVIDIA's careful balancing act. He analyzes Groq's position in the inference market, arguing that its ultra-low latency could capture about 25% of high-end inference.
On the “optics vs. copper” debate, Huang offers a pragmatic judgment: “You should use copper technology as long as possible because it is reliable and easy to manufacture. Only when physical limits are reached will we switch to CPO (co-packaged optics).” He reveals that even as NVIDIA moves toward the 1152 architecture, copper will remain prevalent in storage, management, and certain links to preserve system resilience and cost advantages. In addition, fully liquid-cooled racks are now standard for NVIDIA AI factories to handle rising power demands.
Physical AI: the ultimate frontier of artificial intelligence
While digital AI is reshaping office work at a rapid pace, Huang believes the real blue ocean lies in the physical world. He predicts physical AI will eventually account for 70% of the market.
“The scale of industries related to physical AI far exceeds that of digital AI. The world doesn’t happen inside our laptops but in the physical space where atoms exist.” From factory automation to autonomous driving and robots with long-term memory, physical AI models must process continuous physical laws rather than simple discrete Tokens. This demands higher compute power and makes NVIDIA’s Omniverse and similar simulation platforms indispensable for physical AI training.
Inference as thinking, compute as national strength
Finally, Huang reiterates: “99% of future compute will be used for inference. Nobody pays for training itself; they pay for results. Inference is the process of converting tokens into economic value.”
This continuous cycle from pretraining to post-training to real-time inference exemplifies NVIDIA's full-stack strength. NVIDIA is not just manufacturing chips; it is setting the pace of the AI era. In this trillion-dollar factory, every generated token redefines the boundaries of productivity. As Huang says, if you don't understand this economic logic, you are destined to be left behind in the new era.
Web3 Sky City full transcript
The three inflection points in AI and the rise of agentic systems
Host: Good morning, everyone. Hope you enjoyed yesterday's presentation. It was a bit long, but it was a good summary for us. Now we'll use this time to focus on your needs and follow-up questions. We'll start with a few slides, maybe just the first one, then open the floor for questions, and I'll hand it over to Jensen.
Jensen Huang: As I mentioned yesterday, there are three recent inflection points in AI. The first is generative AI. The second is reasoning. And now we are at the third: agentic systems. Each builds on the previous.
Agentic systems are autonomous—they can set goals and decompose complex tasks to execute them. They’re no longer just answering questions but performing tasks. Tasks can be anything, but one of the hottest applications is software coding. I believe in your companies, and of course in mine, engineers are using agentic systems every day.
In the past, when engineers joined a company, they were given a laptop. Now, when you join, you’re also given tokens. Token budgets are now real. Every engineer will have their own token budget. Imagine hiring a $300,000/year engineer—if they don’t consume tokens during work, their productivity will be questioned. So it’s clear: every engineer will have a large number of tokens to consume, and these tokens will be produced.
Earlier I mentioned that if you connect these dots: previously, engineers or programmers came to work with just a tool, a laptop. Now we give them a laptop plus tokens, and those tokens must be manufactured. So computers used to be tools; future computers are manufacturing devices. They produce a saleable product: tokens. It is like generators producing electricity a century ago. These are manufacturing systems. Their energy and production efficiency are critical because they determine your revenue. Understand? That's the third inflection point.
Open source. Many open-source projects, when first released, look like toys. But if you step back and analyze from first principles—what is open source? I explained yesterday.
From first principles, open source is a computer: a computer running an AI operating system. A personal AI computer. It has all the attributes of a computing system. It manages resources, schedules tasks, handles I/O, and connects to networks. It's a complete computing device. Clear? And when you look at the chart, don't read the y-axis; look at the growth trend of the red line. That's the extraordinary part.
Now, every company in the world needs to think: what is your open source strategy? Every software company, every company, needs an open source strategy, just as you once had a Linux strategy, an internet strategy, a mobile strategy, or a cloud strategy. Now the big question is: what is your open source strategy? This is a very important matter.
Roadmap update: Blackwell and Rubin’s trillion-dollar demand
Jensen Huang: Let me clarify further based on what I just said. A year ago, at some point in 2025, I said we had a clear forecast of $500 billion in Blackwell and Rubin shipments before 2026, including demand forecasts and purchase orders: a very clear $500 billion. Many of you have asked: where do things stand now? What is the latest update?
So I want to give you an update. What month is it now? March. So we are in March, and there is still a long way to 2027; I want to make that clear. But because our customers are building infrastructure and factories, and delivery cycles are long, they want to lock in demand and purchase orders early to secure supply, understand?
Therefore, we now have very clear visibility into more than $1 trillion of demand for Blackwell and Rubin. Note: this is not a number floating in the air; it is a clear forecast, explicit demand, and confirmed purchase orders. And that visibility comes not only from the hyperscale cloud providers. Because we integrate Grace CPUs, GPUs, networking, and storage, we can reach the roughly 40% of the market that a chip alone cannot serve, such as enterprise on-premises and the industrial edge. If you only make chips, you can't reach that 40%; they buy platforms.