IBM Evaporates $40 Billion, Block Cuts Half Its Staff While Stock Price Rises: In the AI Era, What Assets Deserve Tokenization?

PANews

On February 23, 2026, a seemingly calm Monday, IBM’s stock suffered its worst single-day decline since October 2000, closing down 13.2% and erasing approximately $40 billion in market value within hours. The trigger was not a disastrous earnings report or a regulatory crackdown but a product announcement: AI startup Anthropic said its Claude Code tool could modernize COBOL programs running on IBM systems, the very legacy business that forms IBM’s profitable “moat.”

Three days later, a similar story played out in the opposite direction. On February 26, Jack Dorsey’s fintech company Block announced layoffs of about 4,000 employees, nearly 50% of its workforce, citing AI-driven efficiency gains. Yet the market’s reaction was the opposite: Block’s stock surged over 24% in after-hours trading. In a letter to shareholders, Dorsey admitted, “I believe that within the next year, most companies will reach the same conclusion and make similar structural adjustments.”

These two events, driven by the same factor, AI, elicited opposite market responses: one stock plummeted, the other soared. What happened behind the scenes? The answer points to a deeper question: AI is redefining what constitutes a valuable asset. For executives, investors, and traditional business decision-makers, understanding this revaluation logic is no longer a matter of strategic foresight but an urgent matter of survival.

1. The Same AI, Different Market Judgments

To understand the stark contrast between these two cases, we must first examine their respective asset structures.

IBM’s plunge, on the surface, was due to the technological threat posed by Claude Code. In essence, it was a re-pricing of its core asset model. COBOL, a programming language developed in the late 1950s, still supports about 95% of global ATM transactions and many critical systems in finance, aviation, government, and other sectors. Anthropic stated in its blog: “Trillions of lines of COBOL code run in production daily, powering critical systems. Yet, the number of people who understand COBOL is decreasing year by year.”

For a long time, modernizing COBOL systems has been a complex, costly endeavor—forming IBM’s profitable “moat.” But Anthropic claimed, “With AI, teams can modernize COBOL codebases in just a few quarters without spending years.” The market’s underlying message was clear: IBM’s reliance on labor-intensive maintenance and mainframe-related service revenues is being eroded by AI technology.
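To make the modernization task concrete, here is a minimal illustrative sketch, not taken from Anthropic's or IBM's materials: a short hypothetical COBOL statement (shown in a comment) and a behaviorally equivalent Python function of the kind an AI translation tool might emit. The COBOL snippet and the function name are assumptions for illustration only.

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical COBOL source (illustrative):
#
#   COMPUTE WS-INTEREST ROUNDED =
#       WS-BALANCE * WS-RATE / 100.
#
# COBOL arithmetic is decimal and fixed-point, and ROUNDED rounds
# half away from zero, so a faithful translation must use Decimal
# with explicit rounding rather than binary floats.

def compute_interest(balance: Decimal, rate: Decimal) -> Decimal:
    """Translate the COMPUTE statement: balance * rate / 100,
    rounded to 2 decimal places per COBOL's ROUNDED rule."""
    raw = balance * rate / Decimal(100)
    return raw.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(compute_interest(Decimal("1005.00"), Decimal("2.5")))  # 25.13
```

As the analysts quoted below argue, statements like this are the easy part; the hard part is proving behavioral equivalence (rounding modes, fixed-point overflow, batch job semantics) across millions of lines embedded in the Z platform.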

Interestingly, IBM’s stock rebounded 2.68% the next day. Wall Street analysts at Wedbush and Evercore ISI quickly came to IBM’s defense, calling the plunge an “unfounded overreaction.” Their reasoning cut to the core: corporate clients will not abandon their mainframe systems just because a new AI tool can translate legacy code. There is a huge gap between translating code syntax and the deep hardware-software integration that system modernization actually requires.

IBM responded the same day with a key point: the real challenge of modernization is not the COBOL language itself but the IBM Z platform. Translated code cannot capture the platform’s complexity, whose value derives from decades of hardware-software synergy that no code translation can replicate.

Block’s case, by contrast, involved similarly large-scale, AI-driven layoffs, yet the market responded with a 24% gain. The key difference lies in Block’s changing asset structure. Since 2024, Block has been restructuring its business model and workforce, investing heavily in AI tools to improve efficiency, including developing its own tool, Goose.

Block’s CFO, Amrita Ahuja, explained the layoffs: “We are taking bold, decisive actions, but they are built on strength.” This “strength” is supported by data: in 2025, gross profit reached $10.36 billion, up 17% year-over-year. Strong financials provided a buffer for the company to push forward with large-scale restructuring.

Market interpretation was straightforward: Block isn’t passively shrinking under AI pressure but proactively optimizing its asset structure—exchanging more “human assets” for higher “technological asset” productivity. Cutting 50% of staff while raising full-year guidance indicates that AI is amplifying the value per employee.

2. In the AI Era, Four Asset Types Are Being Repriced

These cases reveal an emerging trend: AI is becoming a “re-pricer” of asset value. Different asset types show sharply divergent value trajectories under AI valuation frameworks.

First are labor-intensive assets. The value of IBM’s COBOL maintenance teams, traditional analysts, and programmers—“information processors”—is being diluted by AI. Anthropic mentioned that Claude Code can identify risks that would take human analysts months to find. This doesn’t mean humans are no longer important, but that jobs relying on information asymmetry and procedural knowledge are being compressed in value by technology.

However, it’s crucial to recognize that AI replaces “information processing,” not “value creation.” Futurum Group analyst Mitch Ashley pointed out that successful COBOL modernization involves multiple dimensions—business scope definition, technical assessment, data migration planning, behavioral equivalence verification, observability, and organizational change management—of which code translation is only one part. The ability to understand complex systems, grasp business essence, and make strategic judgments remains scarce.

Second are data assets, which are becoming high-value in the AI era. With the rapid development of generative AI, the value attributes of data are being reshaped. A study published in PLOS One by Tang et al. indicates that generative AI alters how data is acquired, processed, and utilized. Data’s value depends not only on its intrinsic quality and relevance but also on its application scenarios, transformation capacity, and market demand within generative AI frameworks.

This means that data’s uniqueness, continuity, and governance are becoming core value dimensions. A dataset might be extremely valuable in one context but useless in another. Companies capable of providing exclusive, high-quality, continuous data for AI training are gaining new pricing power.

Third are algorithm and model assets. The collaboration between OpenAI and Paradigm on EVMbench, which evaluates AI’s ability to detect, repair, and exploit smart contract vulnerabilities, illustrates that algorithms are becoming quantifiable assets. Model weights, algorithm frameworks, and training methodologies are increasingly recognized, controllable, and monetizable intangible assets.

Fourth are traditional tangible assets, which are experiencing divergence. Assets relying on “information asymmetry” and “intermediary labor” face devaluation pressure, while physical assets with “AI immunity”—such as energy facilities, scarce resources, and critical infrastructure—maintain relatively stable value. The reason is simple: AI can analyze and optimize their operations but cannot replace their physical existence and intrinsic value.

3. From “Asset Revaluation” to “AI Immunity”

Based on the above analysis, companies need a systematic framework to determine whether their assets are appreciating or depreciating in the AI era. RWA Research Institute proposes an “AI Immunity” asset identification framework, which includes three core features.

The first is uncodifiability: value elements that are difficult for AI to fully learn or replicate. While COBOL code can be translated by AI, the transaction processing capabilities built into IBM’s Z series hardware, its quantum-safe encryption, and the near-perfect reliability of these systems cannot be duplicated by AI tools. As Futurum Group’s research notes, “Code translation cannot capture the actual complexity; platform value stems from decades of hardware-software integration.” Similarly, offline scene control, tacit industry knowledge, and complex relationship networks, all hard to encode, constitute the first line of AI immunity.

The second feature is data moat. Does the enterprise possess exclusive, continuous, and governable data assets? Is it merely using publicly available data, or can it generate data others cannot access? CITIC Bank has begun exploring using large models to evaluate data asset value and is attempting to “bring data assets onto the balance sheet.” The logic is that in the AI era, data is not just raw material for production but an asset itself. However, not all data can form a moat—public web data can be quickly consumed by AI models, whereas exclusive data sources enable premium valuation.

The third feature is AI-enabled resilience. Can the asset be enhanced rather than replaced by AI? This distinguishes IBM’s impact—where legacy COBOL systems are being replaced—from Block’s transformation—where payment and financial services can be AI-empowered. IBM has developed watsonx Code Assistant for Z, a dedicated tool allowing clients to securely refactor and modernize legacy code on the platform while maintaining enterprise-grade security. When assets can synergize with AI rather than be threatened by it, their value increases.

Conversely, AI-vulnerable assets share three traits: reliance on “information processing” as core value, susceptibility to standardization and automation, and lack of data generation or accumulation ability. Enterprises can perform a “stress test” on their asset portfolios based on these traits.
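As a minimal sketch of what such a stress test might look like, assuming a simple 1-to-5 scoring on each of the three features of the framework (the asset names, scores, equal weighting, and threshold below are all hypothetical, chosen only to illustrate the mechanics):

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    uncodifiability: int  # 1-5: how hard is the value for AI to replicate?
    data_moat: int        # 1-5: exclusive, continuous, governable data?
    ai_resilience: int    # 1-5: can AI amplify rather than replace it?

    def immunity_score(self) -> float:
        # Equal weighting is an illustrative assumption; real weights
        # would come from the enterprise's own strategy review.
        return (self.uncodifiability + self.data_moat + self.ai_resilience) / 3

def classify(asset: Asset, threshold: float = 3.0) -> str:
    """Flag assets below the (hypothetical) threshold as AI-vulnerable."""
    return "AI-immune" if asset.immunity_score() >= threshold else "AI-vulnerable"

portfolio = [
    Asset("legacy code maintenance services", 2, 1, 1),
    Asset("proprietary transaction dataset", 3, 5, 4),
]
for a in portfolio:
    print(f"{a.name}: {a.immunity_score():.2f} -> {classify(a)}")
```

The point of the sketch is the shape of the exercise, scoring each business unit against the three features and ranking the portfolio, not the specific numbers.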

4. New Opportunities in RWA: Which Assets Are Worth Tokenizing?

Extending this framework to the tokenization of RWAs (real-world assets) leads to a clear conclusion: RWA does not mean “anything can go on-chain.” Rather, amid AI-driven revaluation, it means selecting hard assets capable of withstanding AI cycles.

As of March 2026, total on-chain RWA value exceeded $25 billion, nearly quadruple the figure from a year earlier. However, the RWA industry white paper published by the Hong Kong Web3.0 Standardization Association in August 2025 states: “The idea that everything can be RWA is a fallacy.” Successful large-scale implementation requires meeting three thresholds: value stability, clear legal rights, and verifiable off-chain data.

Combining the “AI Immunity” framework, we can further specify: assets suitable for tokenization are those whose value remains stable under AI revaluation.

First are “AI immune” physical assets. These include energy assets, infrastructure, and scarce resources. Their value does not depend on information processing but on physical existence and utility. The white paper mentions renewable energy RWA (such as charging stations, photovoltaic assets) and GPU computing assets—these fall into this category. GPU computing power, driven by AI industry demand and trustworthy “digital DNA,” is becoming an ideal anchor for RWA.

Second are programmable data assets: assets with exclusive data sources that can be monetized via smart contracts, combining a “data moat” with “AI empowerment,” such as proprietary datasets, IP rights, and carbon credits. But caution is needed: not all data qualifies as an asset, only data that can be continuously generated, clearly rights-defined, and independently verified.

Third are hybrid assets, combining “uncodifiable” physical control rights with “programmable” digital rights. For example, property rights of commercial real estate can be tokenized, but physical operation, maintenance, and leasing—offline scene control—remain with professional entities. This “physical + digital” dual-layer structure leverages blockchain’s liquidity while anchoring offline “AI immunity” value.

Conversely, two asset types require caution in tokenization in the AI era: those heavily reliant on intermediary labor, whose value is compressed by AI; and standardized assets lacking data moats, which have limited bargaining power under AI valuation.

5. Action Guidelines: From Cognition to Decision

IBM’s $40 billion evaporation signals an era—assets relying on information asymmetry and labor stacking are being re-priced by AI. Block’s countertrend rise signals another—companies that embrace AI and optimize their asset structure are being revalued by the market.

For listed companies and traditional enterprises, this is not just technological anxiety but a fundamental restructuring of asset valuation systems. CEOs must answer an unavoidable question: What is the value of my asset portfolio in the eyes of AI?

Based on this analysis, three actionable recommendations are proposed:

First, immediately initiate an “AI stress test” of assets. Using the “AI Immunity” framework’s three features—uncodifiability, data moat, and AI-enabled resilience—evaluate core business units. Identify which are most vulnerable to AI impact and which may benefit from AI amplification.

Second, establish a dynamic asset portfolio management mechanism. In the context of AI revaluation, asset allocation should shift from a static “buy-and-hold” strategy to actively increasing “AI immune” assets and planning transformation or divestment of AI-vulnerable assets. This requires coordination across strategy, technology, and business departments.

Third, revisit RWA strategies. Before tokenizing assets, use the “AI Immunity” framework to screen underlying assets. The core value of RWA isn’t just “on-chain” but enabling better liquidity and valuation through tokenization. If the underlying assets are devalued in the AI era, tokenization merely accelerates value loss.

It’s important to note that, according to China’s No. 42 document jointly issued by eight departments, any form of token issuance and tokenized trading within mainland China is strictly prohibited. The discussion of RWA tokenization here refers only to compliant offshore digital practices. Enterprises exploring related business must strictly adhere to the “strictly prohibited domestically, registered offshore” regulatory red lines.

As AI begins to price assets, the only true safety lies in things AI cannot price—not code, not data, but human judgment of value itself.


(This article is based on publicly available data and sources, including Nasdaq, Tencent News, Futurum Group, PLOS One, 21st Century Business Herald, Industrial and Commercial Times, and other authoritative media and research institutions. The views expressed do not constitute investment advice.)
