The AGI arms race
Superintelligence will be the most powerful technology in history, creating dominance unseen since the nuclear era; whoever builds it first gains decisive strategic advantage
“Show me the incentive and I'll show you the outcome.” ― Charlie Munger
There is currently an arms race to build AGI first, unlike any other arms race in history. The powerful incentives driving it can be viewed through multiple lenses, and in my view it will not stop, even in the face of huge public backlash. Governments and companies alike will race ahead to the end, with the ferocity of competition inversely proportional to the safety measures taken1.
Here’s why:
Geopolitical supremacy
Superintelligence will be the most powerful technology—and most powerful weapon—mankind has ever developed. Those who have it stand to wield dominance over those who don’t—a situation unseen since the nuclear era2.
The resulting dynamic is existential for the US and China. Whoever builds AGI first, then ASI, will wield a decisive strategic and military advantage.
Economic power
Staggering investments are pouring into AI development, with the US, EU, China, and UAE allocating hundreds of billions of dollars. Goldman Sachs forecasts that global tech giants and utility companies will funnel an astonishing trillion dollars into AI advancement in the near future.
To put this in context, that is four times the cost of the Apollo project (spaceflight) and over forty times the Manhattan project (nuclear) spend3.
Houston… we have an arms race!
Companies
Apple — $125bn (2025)
Amazon — $100bn (2025)
Microsoft — $80bn (2025)
Meta — $65bn (2025)
Google — $75bn (2025)
OpenAI — $57.9bn (last 5 yrs)
Alibaba — $53bn (2025-2027)
Bytedance — $20bn (2025)
xAI — $16bn (last 5 yrs)
Anthropic — $14.3bn (last 5 yrs)
Tencent — >$10bn per year
SSI — $2bn
Mistral — $1.05bn (last 5 yrs)
Thinking Machines — $2bn (2025)
Baidu — not disclosed
DeepSeek — not disclosed
Governments
United States — $510bn+ (includes Stargate project)
China — $137bn+ (est.)
European Union — €200bn
France — €109bn
United Arab Emirates — $100bn
United Kingdom — £14bn (private sector)
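As a rough sanity check on the combined scale of these commitments, the disclosed figures above can be summed directly. This is a sketch only: the € and £ conversion rates are approximate assumptions, the figures span different timeframes (single years vs. multi-year totals), and some government and company pledges may overlap, so treat the result as an order-of-magnitude estimate.

```python
# Sum the disclosed AI investment figures listed above (all in billions).
# Assumed conversion rates (approximate, for illustration only):
#   EUR -> USD at ~1.1, GBP -> USD at ~1.3.
# Figures cover mixed timeframes, so this is an order-of-magnitude check.

companies_bn = {
    "Apple": 125, "Amazon": 100, "Microsoft": 80, "Meta": 65, "Google": 75,
    "OpenAI": 57.9, "Alibaba": 53, "Bytedance": 20, "xAI": 16,
    "Anthropic": 14.3, "Tencent": 10, "SSI": 2, "Mistral": 1.05,
    "Thinking Machines": 2,
}

governments_bn = {
    "United States": 510,
    "China": 137,
    "European Union": 200 * 1.1,   # EUR 200bn
    "France": 109 * 1.1,           # EUR 109bn
    "United Arab Emirates": 100,
    "United Kingdom": 14 * 1.3,    # GBP 14bn (private sector)
}

company_total = sum(companies_bn.values())
government_total = sum(governments_bn.values())
print(f"Companies:   ~${company_total:,.0f}bn")
print(f"Governments: ~${government_total:,.0f}bn")
print(f"Combined:    ~${company_total + government_total:,.0f}bn")
```

Even with the undisclosed figures (Baidu, DeepSeek) excluded, the disclosed commitments alone comfortably clear the trillion-dollar mark.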
Estimates vary, but economists and AI researchers have suggested AGI could generate tens of trillions of dollars in economic value. McKinsey estimated AI could generate up to $23 trillion annually by 2040, and AGI would likely exceed this4.
A true AGI breakthrough might represent the largest economic discontinuity in human history—potentially comparable to or exceeding the industrial revolution in impact, but compressed into a much shorter timeframe.
The size of the prize is real.
Existential risk
Some believe AGI is inevitable, so being first means being able to set safeguards and avoid a worst-case scenario (e.g. misalignment, or a hostile AGI built elsewhere). Ilya Sutskever, pioneer of deep learning and co-founder of OpenAI, left in early 2024 because he didn’t feel OpenAI was taking the appropriate safety measures, and founded Safe Superintelligence Inc. He subsequently raised $1bn (rumoured $2bn) with one goal and one product: a safe superintelligence5.
"If we don’t, someone else will, and maybe less safely" is a powerful motivator.
The combination of these factors—geopolitical supremacy, economic power, and existential risk—creates a winner-takes-most race dynamic, one that encourages speed over caution.
Summary
An arms race to build AGI is underway, with competition likely to erode safety measures as nations and companies race to be first
Superintelligence will be the most powerful technology in history, creating dominance unseen since the nuclear era; whoever builds it first gains decisive strategic advantage
Massive investments (over $1 trillion total) from tech giants and governments dwarf historical projects like Apollo; AGI could generate tens of trillions in economic value
Some argue being first allows setting safeguards, creating an "if we don't, someone else will" dynamic that further accelerates the race
Read Armstrong, S., Bostrom, N. & Shulman, C. (2013), “Racing to the precipice: a model of artificial intelligence development”, Technical Report #2013-1, Future of Humanity Institute, Oxford University, pp. 1–8, to understand this dynamic.
For a full breakdown of the geopolitical importance of superintelligence, read Leopold Aschenbrenner’s essay The Free World Must Prevail, part of his Situational Awareness series. Here’s an excerpt:
“Superintelligence is not just any other technology—hypersonic missiles, stealth, and so on—where US and liberal democracies’ leadership is highly desirable, but not strictly necessary. The military balance of power can be kept if the US falls behind on one or a couple such technologies; these technologies matter a great deal, but can be outweighed by advantages in other areas.
The advent of superintelligence will put us in a situation unseen since the advent of the atomic era: those who have it will wield complete dominance over those who don’t.
I’ve previously discussed the vast power of superintelligence. It’ll mean having billions of automated scientists and engineers and technicians, each much smarter than the smartest human scientists, furiously inventing new technologies, day and night. The acceleration in scientific and technological development will be extraordinary. As superintelligence is applied to R&D in military technology, we could quickly go through decades of military technological progress.”
The full Apollo project cost about $250bn USD in 2020 dollars, and the Manhattan project less than a tenth that.
There’s a lot of talk about AI not yet living up to the hype within real organisations. Much of this can be explained by (in)effective data integration.
Most companies face serious data problems:
Much data still exists only on paper (lab notebooks, QA reports)
Some is "barely digitized" as unsearchable scanned PDFs
Data often remains siloed within individual machines on factory floors
There's rarely a unified dataset connecting all operations
There are two major challenges:
Data Access is a Human Problem
Security concerns, especially in industries worried about espionage
Political/personal resistance to sharing data
Interdepartmental rivalries
The author describes how Palantir dedicates about a third of its workforce just to negotiate data access
Data Cleaning Requires Judgment
Converting data to standardized formats
Understanding what the data actually represents in real-world contexts
Making case-by-case decisions that are hard to automate
Even modern LLMs struggle to effectively automate this process
The author argues that AI's economic impact will be limited by these data integration challenges:
Implementation will happen at "the speed of human adaptation"
Each company will require its own unique data integration effort
The process will be labor-intensive and expensive
Like computer adoption decades ago, it will occur gradually and unevenly across industries
The conclusion is not that "AI is overrated," but that AI transformation will be gated by these practical data integration challenges - the hard, human work of accessing, cleaning, and standardizing data that must happen before AI can deliver its full value.
See excerpt from Safe Superintelligence’s website:
“Superintelligence is within reach.
Building safe superintelligence (SSI) is the most important technical problem of our time.
We have started the world’s first straight-shot SSI lab, with one goal and one product: a safe superintelligence.
It’s called Safe Superintelligence Inc.
SSI is our mission, our name, and our entire product roadmap, because it is our sole focus. Our team, investors, and business model are all aligned to achieve SSI.
We approach safety and capabilities in tandem, as technical problems to be solved through revolutionary engineering and scientific breakthroughs. We plan to advance capabilities as fast as possible while making sure our safety always remains ahead.
This way, we can scale in peace.
Our singular focus means no distraction by management overhead or product cycles, and our business model means safety, security, and progress are all insulated from short-term commercial pressures.”