
How Intel Can Still Succeed in the AI Market

Published February 26, 2025

Chip giant Intel (NASDAQ: INTC) has faced challenges in breaking into the profitable AI accelerator market, which is currently dominated by Nvidia. Although the company’s Gaudi family of AI accelerators showed promise and was competitively priced, sales have been hindered by an immature software ecosystem. Intel has not only missed its projected AI chip sales for 2024 but has also decided to pivot its strategy. The Falcon Shores project, originally intended as a successor to Gaudi 3, has been canceled as a commercial product, leading Intel to concentrate on rack-scale AI solutions that are expected to be available by 2026.

Although Intel is a minor player in the AI accelerator arena, its CPU business may help fill the gap. As the AI sector evolves and shifts its focus from training AI models to deploying them for real-world use, Intel's Xeon server CPUs could become vital assets in the competition.

Affordable AI as an Opportunity

The recent achievement by the Chinese start-up DeepSeek, which successfully trained an AI model comparable to those of prominent U.S. companies at a significantly lower cost, could eventually benefit Intel. While powerful accelerators like those from Nvidia are essential for training advanced AI models, this is not necessarily true for inference—the actual execution of these models. Smaller, simpler AI models can already operate effectively on CPUs, particularly those equipped with AI acceleration capabilities.
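To make that point concrete, below is a minimal sketch of CPU-only inference with a small model. It assumes the open-source Hugging Face transformers and PyTorch packages; the specific model named is purely illustrative and is not tied to Intel's own tooling.

```python
# Minimal sketch: running a small sentiment model entirely on CPU.
# Assumes the "transformers" and "torch" packages are installed; the model
# name below is illustrative only, not a claim about Intel's software stack.
import torch
from transformers import pipeline

torch.set_num_threads(8)  # pin inference to a fixed number of CPU threads

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
    device=-1,  # -1 selects the CPU; no GPU or accelerator required
)

print(classifier("Inference on a server CPU is fast enough for this workload."))
```

For a small model like this, a modern server CPU can return results in milliseconds, which is why lightweight inference workloads do not automatically call for an accelerator.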

Intel recently expanded its Xeon 6 family of server CPUs with new models aimed at lower price points and specialized applications. The Xeon 6500 and 6700 series are designed for data center use. These latest server CPUs are more efficient and compact than their predecessors, and Intel asserts that customers could see up to 68% lower cost of ownership compared with systems that are five years old. Beyond serving as host processors alongside AI accelerators in training clusters, these chips also deliver, by Intel's account, up to 50% better AI inference performance than AMD's latest server CPUs.

In addition to its Xeon 6 chips for data centers, Intel launched models specifically for network and edge computing applications. These chips are up to 70% more power efficient than earlier versions and also feature AI capabilities. For example, a 38-core video edge server can handle AI inference tasks across 38 simultaneous camera streams.
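As a rough illustration of that multi-stream pattern, the sketch below runs one small, CPU-only model worker per simulated camera stream. The thread pool, placeholder frames, and MobileNet model are assumptions chosen for demonstration, not Intel's edge software stack.

```python
# Sketch of multi-stream edge inference: one worker per camera stream,
# each running a small image model on the CPU. Frames are random stand-ins
# for decoded video; the model choice is illustrative only.
from concurrent.futures import ThreadPoolExecutor

import numpy as np
import torch
import torchvision.models as models

NUM_STREAMS = 4  # scaled down from the 38 streams cited above

model = models.mobilenet_v3_small(weights=None).eval()  # small CNN, CPU-only

def process_stream(stream_id: int) -> str:
    # Stand-in for decoding one frame from a camera stream.
    frame = torch.from_numpy(np.random.rand(1, 3, 224, 224).astype(np.float32))
    with torch.no_grad():
        logits = model(frame)
    return f"stream {stream_id}: top class {int(logits.argmax())}"

with ThreadPoolExecutor(max_workers=NUM_STREAMS) as pool:
    for result in pool.map(process_stream, range(NUM_STREAMS)):
        print(result)
```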

Not every AI application requires the most advanced models trained on vast datasets. As AI technology becomes more refined and affordable, CPUs with built-in AI acceleration can play a significant role in a company's AI strategy.

Expanding Market Potential

As the AI landscape matures, the initial rush to deploy AI will give way to harder questions about return on investment. Intel's cost-focused pitch for its AI accelerators fell short because of software shortcomings, but the same focus could pay off for its CPUs.

For workloads built around small, optimized AI models that do not need expensive accelerators, systems based on Intel's latest Xeon 6 CPUs could emerge as the most cost-effective choice. Where accelerators are required, Intel's CPUs can still serve as the host processors alongside them.

According to IDC, total annual expenditure on machine learning and analytics is predicted to reach $361 billion by 2027, with generative AI accounting for $153 billion of that sum. As AI models become more efficient and easier to operate, a rising portion of this spending may be directed toward infrastructure that does not rely on high-end AI accelerators.

Although Intel has been sidelined in the AI accelerator market, its broader opportunity in AI remains intact. Gaudi's failure to gain traction has diminished that opportunity somewhat, but the company still has a credible way to compete in the ongoing AI race.
