In the era of AI, tokens are king
On March 16, 2026, at NVIDIA’s GTC conference, Jensen Huang redefined data centers with a single statement.
Holding a championship belt labeled “InferenceX,” he declared that data centers are no longer mere powerhouses of computation but “Token factories”: feed in data and electricity, and the output is the core value product of the AI era, a new “industrial oil.”
Tokens are the basic units by which AI large models measure text processing. Activities such as AI content generation and data processing consume computing power that is metered in Tokens, so the efficiency and cost of producing Tokens directly determine a company’s competitiveness in the AI age.
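Because Tokens are metered like a utility, a company’s Token bill reduces to simple arithmetic: volume times unit price. A minimal sketch, with entirely hypothetical volumes and prices (the article quotes no specific figures):

```python
# Back-of-envelope sketch of Token economics.
# All volumes and prices below are hypothetical, for illustration only.

def monthly_token_cost(tokens_per_day: float, price_per_million: float) -> float:
    """Cost of serving a given daily Token volume for 30 days."""
    return tokens_per_day / 1_000_000 * price_per_million * 30

# Hypothetical workload: 500M Tokens/day at $0.50 per million Tokens.
cost = monthly_token_cost(500_000_000, 0.50)
print(f"${cost:,.0f}/month")  # $7,500/month
```

Lowering the unit price or raising throughput per watt moves this number directly, which is why the article frames Token production efficiency as the competitive variable.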
This transformation driven by computing power and capital not only changes how data centers, cloud service providers, and AI companies operate but also presents Chinese enterprises with an excellent opportunity for domestic substitution.
Reconstructing the Value of Tokens
The concept of a “Token factory” proposed by Jensen Huang isn’t new, but at GTC 2026, this positioning gained real industry significance.
Huang previously said at GTC 2024 that during the last industrial revolution, raw materials entering factories were water, and the output was electricity. “Today, the raw materials entering these server rooms are data and electricity, and the output is Tokens. Although intangible, these Tokens are highly valuable and will be distributed worldwide.”
Two years later, this prediction has become reality, driven by the industry’s shift from “model training” to “inference applications.”
Yuan Shuai, Deputy Director of Investment at the China Urban Development Research Institute, stated that the explosion of AI Token services and the AI computing power industry chain is driven by the widespread adoption of AI agents, surging Token demand, and decreasing computing costs. Market expectations have shifted from conceptual hype to practical demand. Future commercialization challenges will focus on cost control and regulatory compliance, with long-term implications for reshaping the value distribution in the AI industry.
Yuan further analyzed that the explosive application of AI agents, represented by “little lobsters,” has become a key engine. These intelligent agents replace manual labor to automate tasks, creating unlimited demand for Tokens. Meanwhile, NVIDIA’s new generation of chips significantly reduces inference costs, forming a positive cycle of “rising Token demand—lower computing costs—more applications.”
The Token economy is shaping new workplace models.
The latest recruitment trend in Silicon Valley is: “How many Tokens are in your offer?” Huang predicts that in the future, every engineer at a company will have an annual Token budget. Their base annual salary might be several hundred thousand dollars, with NVIDIA providing roughly half of that amount as a Token allocation, enabling them to achieve a tenfold efficiency increase.
To seize control of Token production, NVIDIA has launched a comprehensive solution.
It is reported that NVIDIA’s new AI platform, Vera Rubin, aims to reduce Token costs by 90%.
This industry transformation quickly propagated through the entire supply chain, with Alibaba Cloud’s actions being the most direct.
On March 18, Alibaba Cloud announced a 5%–34% price increase for computing cards like the PingTouGe ZhenWu 810E. Behind this is the continuous surge in Token calls caused by the previous AI agent boom, further intensifying supply-demand tensions and prompting price adjustments.
Sources say Alibaba Cloud is focusing its scarce AI computing resources on Token-related businesses and has established the Alibaba Token Hub division to integrate full-chain resources and seize opportunities.
From NVIDIA’s computing layout to Alibaba Cloud’s resource allocation, one clear signal has emerged: Tokens are the key link connecting computing power, models, and commercial value. Whoever controls them will stand firm in the AI era.
Capital Frenzy and Domestic Breakthroughs
The rise of the Token economy has already stirred the capital markets.
In March 2026, Hong Kong’s “Token First Stock,” Xunce Technology, suddenly surged 37% in late trading.
On March 17, Beijing time at 2:00 AM, Jensen Huang delivered a highly anticipated keynote at GTC 2026, signaling the strongest message of the AI computing wave: “Every future data center will become a ‘factory’ producing Tokens.”
This statement immediately drew market attention to the “Token” industry chain, making Xunce Technology, as the first Token stock in Hong Kong, a sought-after target.
Data shows that Xunce was founded in 2016, focusing on real-time data infrastructure and analytics platforms. It listed in Hong Kong on December 30, 2025, often called the “Chinese version of Palantir.”
On March 6, Xunce released its 2025 earnings forecast: for the year ended December 31, 2025, the company achieved revenue of 1.283 billion yuan, up 102.95% year-on-year, while its net profit after non-recurring items narrowed to 55 million yuan. By half-year, revenue was 198 million yuan in the first half of 2025 and 1.085 billion yuan in the second half, a 448% increase half-over-half, compared with 630 million yuan in the same period of 2024.
The company attributed its growth mainly to the accelerated data demand driven by the deployment of AI large models. As large model capabilities become more commercialized, the market’s core competition shifts from the model layer to the deployment layer. Real-time data infrastructure, as the core foundation for AI deployment, is experiencing a full-scale demand explosion.
Notably, after Alibaba Cloud’s price hike, Alibaba’s Hong Kong-listed stock also rose 2.4%.
Yuan Shuai pointed out that the stock movements of Hong Kong AI companies and Alibaba Cloud’s price adjustments reflect a profound market shift from “big model competition” to “Token economy deployment.”
He believes that the market’s focus has shifted from large model technical parameters to operational indicators like Token call volume and computing utilization. Alibaba Cloud’s price increase driven by surging Token calls demonstrates market recognition of its scarce computing resources and commercialization potential.
He predicts that future investment hotspots will focus on intelligent agent development, computing leasing, and Token operations. Companies providing high-frequency essential intelligent services and consolidating idle computing resources will be more favored by capital.
“Recent market expectations have shifted from mere technology hype to actual business realization,” said Gao Heng, an expert at the China Society of Science and Technology News. That Alibaba’s stock could still rise after a price increase shows the capital market no longer blindly favors cut-rate pricing to grab market share. It has accepted a new reality: in the AI era, cheaper computing power is not necessarily better; computing power should be high-quality, stable, and low-latency, and it naturally carries a premium.
He further explained that in the past, there was concern that cloud providers would engage in price wars, squeezing profits. Now, the market believes AI will bring cloud providers back to resource pricing power. Future investment will not only focus on GPUs and servers but also on more promising high-utilization data centers, liquid cooling, power infrastructure, inference clouds, model gateways, Token billing and scheduling platforms, and AI applications that can convert Token consumption into cash flow.
Compared to foreign companies, China’s AI large models have a clear cost-performance advantage—OpenRouter data shows that since February 2026, Chinese AI large models’ Token unit prices are only 1/6 to 1/10 of foreign competitors, with weekly call volumes often surpassing U.S. counterparts.
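The 1/6 to 1/10 price ratio cited above translates directly into per-workload cost. A hypothetical illustration (the dollar prices here are invented; only the ratio comes from the article):

```python
# Illustrative per-workload cost at different Token unit prices.
# The specific prices are hypothetical; the article cites only a 1/6-1/10 ratio.

def workload_cost(tokens: float, price_per_million: float) -> float:
    """Cost of processing a given number of Tokens at a per-million price."""
    return tokens / 1e6 * price_per_million

foreign = workload_cost(1e9, 3.00)    # e.g. $3.00 per million Tokens
domestic = workload_cost(1e9, 0.40)   # roughly 1/7.5 of the foreign price
print(foreign, domestic, foreign / domestic)
```

At a billion Tokens per workload, the same job costs $3,000 at the assumed foreign price and $400 at the assumed domestic one, which is the mechanism behind the call-volume figures the article cites.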
This advantage stems from China’s long-term accumulation in computing resource scheduling, data processing, and other fields, combined with the domestic large model industry’s market competition, hardware cost advantages, and commercial pricing strategies.
In the future, as the “East Data, West Computing” project advances, China’s advantages in green power supply and computing-cluster construction will become more pronounced, allowing domestic enterprises to achieve full domestic substitution in Token infrastructure.
Implementation Challenges
Amid the boom, the practical challenges of Token economy deployment are emerging, notably power supply.
“The end of AI is computing power; the end of computing power is electricity,” a widely circulated phrase in the tech community, is gradually becoming tangible.
Training GPT-4 reportedly consumed up to 240 million kWh of electricity; the popular intelligent agent OpenClaw requires tens to hundreds of times the computing power of traditional conversational AI for complex tasks; and in Shenzhen, a super-scale intelligent computing center exceeding 6,000 PFLOPS spends over 70% of its operating costs on electricity.
As demand for computing power surges, electricity has become a “hard threshold” for large-scale Token production. However, energy-saving technologies like liquid cooling, policies promoting green electricity, and hardware energy efficiency upgrades are gradually easing this pressure.
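The claim that electricity dominates operating costs follows from straightforward arithmetic on power draw and tariffs. A rough sketch, where the GPU count, per-card wattage, utilization, and electricity price are all hypothetical placeholders (the article gives none of these):

```python
# Rough sketch of annual electricity cost for an inference cluster.
# GPU count, wattage, utilization, and tariff are hypothetical placeholders.

def annual_electricity_cost(num_gpus: int, watts_per_gpu: float,
                            utilization: float, price_per_kwh: float) -> float:
    """Energy drawn over a year (kWh) times the electricity price."""
    kwh = num_gpus * watts_per_gpu / 1000 * 24 * 365 * utilization
    return kwh * price_per_kwh

# 10,000 GPUs at ~700 W each, 60% average utilization, $0.08/kWh.
print(f"${annual_electricity_cost(10_000, 700, 0.6, 0.08):,.0f}/year")
```

Under these assumptions the power bill alone runs to roughly $2.9 million a year; scaling the cluster or the utilization scales the bill linearly, which is why liquid cooling and energy-efficiency upgrades matter for Token production margins.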
With AI development, computing power demand and electricity consumption continue to grow exponentially.
According to IEA data, by 2030 global data centers are expected to consume up to 945 TWh of electricity annually, with China and the U.S. leading this growth and accounting for nearly 80% of the increase in global data-center power consumption. Grid construction cycles, by contrast, often run 5 to 10 years, so power supply lags demand.
Gao Heng believes that the biggest current problem in the Token industry chain is not insufficient demand but “a lot of activity, but few complete cycles.” Many companies’ revenue growth relies on capital expenditure expansion and short-term orders, but stable, repeatable, and scalable business models are still lacking.
He states that, more practically, many AI applications seem to have high call volumes, but customers may not be willing to pay long-term because many scenarios are still “usable” rather than “indispensable.” The key to sustainable development is not to keep adding cards or burning more money but to truly link Token consumption with customer value. Those who can prove they are selling ongoing efficiency improvements and revenue-generating capabilities, rather than one-time compute, will survive.
Wang Peng from the Institute of Social Sciences, Beijing, pointed out that the Token industry’s rapid development also brings multiple regulatory challenges. First, algorithmic black-box and traceability problems: high-frequency AI content generation metered in Tokens greatly increases the difficulty of content regulation, and intercepting harmful information in real time while ensuring traceability is the primary challenge.
Second, data rights and benefit distribution: there is no clear, fair mechanism for sharing copyright fees on training data or the proceeds of Token generation, which could trigger copyright disputes. Third, computing-power security: as a core strategic resource of the AI era, computing power demands stricter policies and regulations on cross-border scheduling compliance and supply-chain security.
Despite these difficulties, industry consensus is that the rise of the Token economy is an inevitable trend for AI development. Overcoming these challenges depends on technological innovation, ecological collaboration, and policy guidance.
Gao Heng predicts that AI will evolve from a technological industry into a new infrastructure sector. It will no longer only serve a few internet companies but will permeate manufacturing, finance, government, healthcare, education, and other sectors—restructuring enterprise cost structures, software pricing models, and even the entire digital economy’s pricing logic. The real big opportunity lies not in model launches but in who first gains the right to charge for this new infrastructure.
Looking ahead, Yuan Shuai believes that the future will see major changes in the industry chain: standardization of computing resources, marketization of Token trading, and ecosystem development of intelligent agents. Tokens may become a universal valuation unit in the digital economy, spanning AI services, data trading, and computing leasing. This shift will drive AI industry transformation from technology-driven to scenario-driven, with more vertical intelligent agent applications landing, accelerating digital economy penetration and integration.
Meanwhile, regulators will gradually improve rules for Token trading and computing services, establish data security assessments and anti-monopoly mechanisms, and guide healthy industry development.
Based on reports from Daily Economic News, DuKe, and others.