Analysis of AI Chips by Nvidia, AMD, Google, and Tesla
This article reviews the artificial intelligence (AI) chips produced by Nvidia, AMD, Google, and Tesla. The AI computing landscape is evolving rapidly, and Tesla, with its planned Dojo 3 chip, could surpass AMD in both AI chip performance and production volume.
The Tesla Abundance slide notably features AI compute alongside projects such as Optimus, Robotaxi, and Full Self-Driving (FSD). This highlights the critical role of Tesla's Dojo 2 and Dojo 3 AI training chips, which are vital for refining FSD and training the Optimus robot.
Estimates indicate that AMD shipped roughly 300,000 to 400,000 units of its Instinct MI300 AI chips in 2024, generating around $5 billion in revenue. Dividing revenue by unit volume gives an implied average selling price (ASP):
$5 billion ÷ 300,000 units = ~$16,667 per chip
$5 billion ÷ 400,000 units = ~$12,500 per chip
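The implied ASP range can be checked with a quick calculation. This is a minimal sketch using the analyst estimates above (both the revenue and unit figures are estimates, not reported numbers):

```python
# Implied AMD Instinct MI300 ASP from estimated 2024 revenue and shipments.
revenue = 5_000_000_000           # ~$5B estimated MI300 revenue (2024)
for units in (300_000, 400_000):  # estimated shipment range
    asp = revenue / units
    print(f"{units:,} units -> ASP ~${asp:,.0f}")
# -> 300,000 units -> ASP ~$16,667
# -> 400,000 units -> ASP ~$12,500
```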
Looking ahead, AMD is expected to increase its sales to about 500,000 AI chips in 2025, which could yield $7.5 billion in revenue.
Nvidia AI Chips in 2025
For Nvidia, data center revenue serves as a benchmark for its AI chip sales. Analysts estimate Nvidia's data center revenue at approximately $110.36 billion for 2024. Given Nvidia's stronghold in the AI market, a revenue estimate of $120 billion for 2025 appears realistic.
The total number of chips sold depends on the ASP; Nvidia's H100 GPUs reportedly sell for between $20,000 and $40,000 each. Assuming an average ASP of $30,000 per chip, the projection looks like this:
$120 billion ÷ $30,000 = 4,000,000 chips (or 4 million chips).
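The same division can be run across the full reported price range, not just the $30,000 midpoint, to show how sensitive the unit estimate is to the ASP assumption:

```python
# Implied 2025 Nvidia chip volume under the revenue and ASP assumptions above.
revenue_2025 = 120_000_000_000        # ~$120B projected data center revenue
for asp in (20_000, 30_000, 40_000):  # reported H100 price range and midpoint
    chips = revenue_2025 / asp
    print(f"ASP ${asp:,} -> ~{chips / 1e6:.1f} million chips")
# At the $30,000 midpoint this gives 4.0 million chips.
```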
Google TPUs in 2025
Google, on the other hand, develops Tensor Processing Units (TPUs) mainly for its internal use, meaning its figures refer to production and utilization within Google's data centers rather than external sales. In 2024, worldwide shipments of self-developed cloud AI ASIC accelerators, a category that includes TPUs, are projected at around 3.45 million units. Google is anticipated to hold a 74% market share, amounting to approximately 2.55 million TPUs. With an expected 20% market growth in 2025, total shipments may rise to around 4.14 million units, with Google likely maintaining its 74% share:
4.14 million × 0.74 = 3.06 million TPUs.
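The projection above chains two assumptions (20% market growth, a constant 74% share) onto the 2024 shipment estimate. A minimal sketch of that chain:

```python
# Projected 2025 Google TPU shipments from the 2024 market estimates.
market_2024 = 3_450_000  # est. cloud AI ASIC accelerator shipments, 2024
growth = 0.20            # assumed 2025 market growth
google_share = 0.74      # Google's estimated share, assumed constant

market_2025 = market_2024 * (1 + growth)
tpus_2025 = market_2025 * google_share
print(f"2025 market ~{market_2025 / 1e6:.2f}M units; Google TPUs ~{tpus_2025 / 1e6:.2f}M")
# -> 2025 market ~4.14M units; Google TPUs ~3.06M
```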
Regarding performance, the current TPU v4 offers capabilities of 275 teraFLOPs (bfloat16), while the newer TPU v5e provides 197 teraFLOPs (bfloat16). The upcoming TPU v6, expected in 2025, is projected to achieve around 400 teraFLOPs per chip (bfloat16).
Tesla Dojo 2 in 2025
Tesla's Dojo 2 AI training chips are expected to enter mass production by late 2025. Tesla has previously said that its Dojo 1 chips, which offer 367 teraFLOPs of performance each, equate to about 5% of 50,000 to 100,000 Nvidia H100 chips, suggesting that Tesla produced between 15,000 and 30,000 Dojo 1 chips.
The company is reportedly investing around $500 million annually in its Dojo supercomputers. Assuming an ASP of $10,000 per chip, similar to high-end AI chips, the calculations are:
$500 million ÷ $10,000 = 50,000 chips.
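This implied chip volume follows directly from the two assumptions, the reported annual spend and the assumed per-chip cost:

```python
# Implied annual Dojo 2 chip volume from Tesla's reported Dojo spend.
annual_spend = 500_000_000  # ~$500M/year reported Dojo investment
asp = 10_000                # assumed per-chip cost, in line with high-end AI chips
chips = annual_spend / asp
print(f"~{chips:,.0f} chips per year")
# -> ~50,000 chips per year
```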
Given that Dojo 2 is expected to be ten times more powerful than its predecessor, its performance could reach roughly 3-4 petaFLOPs per chip, positioning it at approximately twice the performance of an Nvidia H100.
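The "roughly twice an H100" claim can be sanity-checked against the 10x figure. As an assumption for the comparison, the H100's FP16 Tensor Core peak with sparsity is taken as about 1,979 teraFLOPs (roughly 2 petaFLOPs); a different precision or the dense figure would change the ratio:

```python
# Rough sanity check: Dojo 2 at 10x Dojo 1 vs. one Nvidia H100.
dojo1_tflops = 367                 # stated Dojo 1 performance
dojo2_tflops = dojo1_tflops * 10   # ~3.67 PFLOPs if the 10x claim holds
h100_tflops = 1_979                # assumed H100 FP16 peak with sparsity
ratio = dojo2_tflops / h100_tflops
print(f"Dojo 2 ~{dojo2_tflops / 1000:.2f} PFLOPs, ~{ratio:.1f}x an H100")
# -> Dojo 2 ~3.67 PFLOPs, ~1.9x an H100
```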
Tesla also plans to develop Dojo 3 chips by 2026. If successful, these could push performance to around 40 petaFLOPs per chip, competing closely with Nvidia's B300 and putting Tesla in a strong position in the AI chip market.
If Tesla delivers on its goal of million-unit AI data centers by 2026 and the Dojo 3 chip succeeds, Tesla could emerge as the second-largest player in AI chips, surpassing AMD.
Nvidia, AMD, Google, Tesla, AI