On November 12, 2025, the artificial intelligence company Anthropic PBC announced it would invest $50 billion, in partnership with UK-based Fluidstack Ltd., to build customized AI data centers, with sites in Texas, New York, and other locations. The first facilities are expected to come online in 2026. The project is projected to create 800 permanent jobs and 2,400 construction jobs, and to advance the Trump administration's strategy of “maintaining America's AI leadership” by strengthening domestic technological infrastructure. The move marks Anthropic's first direct involvement in data center construction; it has previously relied on partners such as Amazon Web Services and Google Cloud.
Anthropic Strategic Layout and Infrastructure Architecture
Anthropic's $50 billion investment is a key move in the AI arms race. The collaboration model with Fluidstack is notable: the emerging cloud provider will supply “gigawatt-level” power capacity, extending the arrangement in which Google backstopped Fluidstack's deals with cryptocurrency miners TeraWulf Inc. and Cipher Mining Inc. The data center designs are tailored to large-scale training needs, using technologies such as liquid cooling and high-voltage direct current power delivery, with computing power density expected to reach 5-8 times that of traditional data centers.
Core Parameters of the Project
Total investment: $50 billion
First facilities online: 2026
Computing power target: support training of hundred-billion-parameter models
Energy solution: gigawatt-level power supply
Job creation: 2,400 construction and 800 permanent positions (3,200 total)
The site selection reflects both energy and policy considerations. Texas has attracted several tech giants to build AI infrastructure with its light regulation and low energy costs, while New York offers a concentration of talent and tax incentives. Anthropic CEO Dario Amodei emphasized in a statement: “These sites will help us build stronger AI systems that can drive breakthroughs while creating jobs in the U.S.” The statement echoes the Trump administration's policy of linking AI infrastructure to job growth.
AI Infrastructure Competition Intensifies
Anthropic's entry further escalates the already heated competition for AI data centers. OpenAI announced the $500 billion “Stargate” project in the U.S. earlier this year and has since extended it to other countries; Meta is building a 2 GW mega-facility in Louisiana, and CEO Mark Zuckerberg recently said the company will invest $600 billion in U.S. data centers and jobs over the next few years; Microsoft, Nvidia, and other giants have each announced investment plans in the tens of billions of dollars. By one partial tally, major tech companies have committed more than $1.5 trillion to AI infrastructure.
Investment on this scale has raised bubble concerns among some investors. AI has yet to establish a stable profit model: global AI companies' total revenue is estimated at around $380 billion for 2024, while annual infrastructure spending has already surpassed $300 billion. OpenAI CFO Sarah Friar recently tried to calm these worries, saying that, considering the actual impact and value to individuals, she believes enthusiasm for AI is still not enough. The argument, however, does little to dispel uncertainty about the capital-return cycle.
Anthropic Technological Breakthrough and Computing Power Demand Explosion
Anthropic's decision to build data centers directly reflects the exponential growth in compute required for large-model training. Its latest model, Claude 4, requires approximately 10^25 FLOPs of training compute, roughly 1,000 times that of the first-generation model. At the current pace, model compute requirements could grow by another three orders of magnitude by 2027. Cloud providers can offer elastic capacity, but customized facilities hold the advantage in energy efficiency and cost; building its own data centers could cut Anthropic's compute costs by an estimated 30-40%.
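As a rough illustration of the scale behind these figures, the sketch below applies the commonly used approximation that dense-transformer training compute is about 6 × parameters × tokens. The model size, token count, per-GPU throughput, and utilization are hypothetical assumptions chosen only to land near the 10^25 FLOPs figure cited above; they are not disclosed Anthropic numbers.

```python
# Back-of-the-envelope training-compute sketch (illustrative only).
# Uses the common approximation: training FLOPs ~= 6 * parameters * tokens.
# All model sizes, token counts, and GPU figures below are assumptions.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs for a dense transformer."""
    return 6.0 * params * tokens

def gpu_years(flops: float, gpu_flops_per_s: float = 1e15, utilization: float = 0.4) -> float:
    """Convert a FLOP budget into GPU-years at a given sustained throughput."""
    seconds = flops / (gpu_flops_per_s * utilization)
    return seconds / (365 * 24 * 3600)

if __name__ == "__main__":
    # A hypothetical frontier-scale run near the 1e25 FLOPs figure cited above:
    # e.g. ~4e11 parameters trained on ~4e12 tokens.
    flops = training_flops(params=4e11, tokens=4e12)   # ~9.6e24 FLOPs
    print(f"Training compute: {flops:.2e} FLOPs")
    print(f"~{gpu_years(flops):,.0f} GPU-years at 1 PFLOP/s per GPU, 40% utilization")
    # Three more orders of magnitude (the 2027 scenario mentioned above):
    print(f"~{gpu_years(flops * 1e3):,.0f} GPU-years for a 1000x larger run")
```

Under these assumed inputs, a single 10^25-FLOP run already corresponds to hundreds of GPU-years, and a further thousandfold increase pushes the requirement toward hundreds of thousands, which is the kind of scale that makes dedicated facilities attractive.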
On the technical path, Anthropic is likely to continue using its “Constitutional AI” training approach, whose iterative alignment process demands very high computational stability. Meanwhile, the disclosed “gigawatt-level” power requirement underscores the tight link between AI compute and energy density: a single data center can now draw power comparable to the residential consumption of a mid-sized city. This concentration of demand may accelerate the commercialization of advanced energy technologies such as next-generation fusion and small modular fission reactors.
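A quick back-of-the-envelope check, under assumed inputs, shows why “gigawatt-level” invites the city comparison. The 1 GW continuous draw is an assumption rather than a disclosed specification, and the household figure is an approximate U.S. average, so treat the result as an order-of-magnitude illustration.

```python
# Rough check of the "gigawatt-level ~ a mid-sized city" comparison.
# The household figure is an approximate U.S. average (roughly 10,500 kWh
# per household per year); the 1 GW continuous draw is assumed.

CAMPUS_POWER_GW = 1.0                    # assumed continuous draw
HOURS_PER_YEAR = 8_760
HOUSEHOLD_KWH_PER_YEAR = 10_500          # approximate U.S. average

campus_kwh_per_year = CAMPUS_POWER_GW * 1e6 * HOURS_PER_YEAR   # GW -> kW, then kWh
households_equivalent = campus_kwh_per_year / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual consumption: {campus_kwh_per_year / 1e9:.2f} TWh")
print(f"Equivalent to roughly {households_equivalent:,.0f} U.S. households")
# ~8.76 TWh/year, on the order of 800,000 households: comparable to the
# residential load of a mid-sized metropolitan area.
```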
Impact of AI Industry Chain and Investment Logic
The data center construction wave is reshaping the surrounding industrial chain. GPU supplier Nvidia's data center revenue is expected to grow 120% year-on-year in 2025; power equipment maker Eaton's stock is up 65% this year; cooling provider Vertiv has an order backlog stretching into 2027. The Anthropic-Fluidstack arrangement could become an industry template: specialized cloud providers handle infrastructure, AI companies focus on model development, and tech giants provide credit backstops.
Investment logic should distinguish returns at different stages. During the construction phase, focus on grid equipment and cooling-system suppliers; during the operational phase, shift to computing power leasing and model-service companies; over the long run, the actual profitability of AI must be assessed. Current valuation models suggest market pricing of AI infrastructure embeds a 40-50% growth premium, an optimistic expectation that will need revenue validation over the next 2-3 years. Investors should watch the commercialization progress of companies like Anthropic closely: if its claimed 300,000 enterprise customers cannot be converted into sustainable revenue, the current scale of investment may face correction pressure.
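To give a hedged sense of what “revenue validation” would have to look like, the sketch below back-solves the annual revenue needed to recover a $50 billion build-out over several horizons. The gross margin and operating-cost inputs are assumptions chosen for illustration, not Anthropic or Fluidstack figures.

```python
# Illustrative payback arithmetic for a $50B build-out (all inputs assumed).
# Not a valuation model; just a sense of the revenue scale needed to
# recover the capital over different horizons.

CAPEX_USD = 50e9
GROSS_MARGIN = 0.50          # assumed margin on compute/model services
ANNUAL_OPEX_USD = 5e9        # assumed power, staffing, maintenance

for payback_years in (5, 7, 10):
    # Revenue R must satisfy: (R * margin - opex) * years >= capex
    required_revenue = (CAPEX_USD / payback_years + ANNUAL_OPEX_USD) / GROSS_MARGIN
    print(f"{payback_years}-year payback -> ~${required_revenue / 1e9:.0f}B revenue per year")
```

Under these assumptions, even a ten-year payback implies roughly $20 billion of annual revenue attributable to the new capacity, which illustrates why the 2-3 year revenue-validation window matters.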
AI Geopolitical Competition and Policy Drivers
The Trump administration treats AI leadership as a national security issue, and the American AI Infrastructure Act passed in 2025 offers a 25% tax credit for related investments. The Anthropic project fits this policy direction squarely and may receive federal and state subsidies. This government-business collaboration model contrasts with France's approach, where Fluidstack announced an €11 billion “supercomputer” in February with direct support from the Macron government.
From a geopolitical perspective, the United States is maintaining its AI advantage over China through massive investments. Currently, the U.S. leads by a factor of 3.2 in total computing power and by a factor of 5 in the number of top models. However, China has advantages in application scenarios and data accumulation, along with stronger government coordination capabilities. The infrastructure layout of companies like Anthropic is not only a choice of technological direction but also a strategic deployment in the context of global technological competition. In the next five years, the distribution of AI infrastructure may redefine the boundaries of national competitiveness in the digital age.