
Intel Doubles AI Chip Power to Expand the Revolution

Stephen Shankland
Intel Mobileye self-driving robotaxi

Intel's Mobileye subsidiary uses Gaudi processors to train the AI systems that pilot its self-driving robotaxis.

Intel

What's happening

Intel offers details about Gaudi2, its second-generation chip for accelerating AI and a key part of its effort to catch up to Nvidia.

Why it matters

The AI revolution needs more horsepower to tackle challenges like self-driving cars and fusion energy.

What's next

Intel's Israel-based Habana Labs unit is working on another, faster successor called Gaudi3.

Intel took the wraps off a highly anticipated AI accelerator chip on Tuesday, a key part of the chipmaker's effort to reclaim ground lost to Nvidia and other rivals in the hot computing area.

The Gaudi2, designed by Intel's Israel-based Habana Labs, is twice as fast as its first-generation predecessor, the chipmaker said at its Vision conference for Intel customers and partners. The chip should be in servers that ship by the end of the year, said Eitan Medina, Habana's chief operating officer.

AI chips like the Gaudi line accelerate the particular math calculations at the heart of today's artificial intelligence technology. A third-generation Gaudi3 is already being developed with higher performance, more memory and better networking abilities, Medina said.

The Gaudi2 and similar chips, like Nvidia's new H100, are designed to boost the artificial intelligence revolution that's sweeping the computing industry. The powerful chips are behind efforts to train AI models more quickly and economically; those models learn by processing complex real-world data to find patterns. They promise improved voice recognition for auto-generating captions, as well as more involved operations, such as self-driving cars. (Mobileye, Intel's autonomous vehicle subsidiary, trains its AI systems with first-generation Gaudi processors, Medina said, but the company has other automotive customers, too.)
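To make that concrete, here's a minimal sketch of the kind of math that dominates AI training and that accelerators like Gaudi2 and the H100 are built to speed up. It uses plain NumPy with made-up dimensions, not any Intel or Habana software, and stands in for the vastly larger workloads these chips actually run:

```python
import numpy as np

# Toy example: repeated training steps of a single-layer model.
# Real models stack thousands of layers like this, which is why
# hardware dedicated to matrix multiplication matters so much.
rng = np.random.default_rng(0)

batch, features, outputs = 64, 512, 10        # made-up sizes for illustration
x = rng.standard_normal((batch, features))    # a batch of input data
y = rng.standard_normal((batch, outputs))     # the "right answers"
w = rng.standard_normal((features, outputs))  # weights the model learns

for step in range(100):
    pred = x @ w                # matrix multiply: the hot loop AI chips accelerate
    error = pred - y            # how far off the predictions are
    grad = x.T @ error / batch  # another large matrix multiply
    w -= 0.01 * grad            # nudge the weights toward better predictions
```

The dense multiply-accumulate operations in loops like this, scaled up by many orders of magnitude, are what AI accelerators devote their silicon to, rather than the general-purpose work a CPU handles.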

AI technology spending will surge 20% to $433 billion in 2022, IDC predicted in February. "AI has emerged as the next major wave of innovation," analyst Ritu Jyoti said in a statement.

Competing on price could be a winning strategy as AI spreads beyond giants with deep technical expertise like Amazon and Google, which use the technology for tasks like cutting shipments' packaging and showing search results. At a lower price tag, AI will likely spread to newer applications, such as screening for fraud, monitoring crop health and flagging trouble spots on medical scans.

"From the business penetration of AI," Medina said in an interview, "we are in the very early phases."

Intel trying to catch up

Along with new graphics processing units, Gaudi2 is a centerpiece of Intel's effort to reclaim computing leadership it's lost over the last two decades. During Intel's heyday, central processing units, the all-purpose brain of every computing device, were the stars of the computing show. GPUs, which Nvidia specialized in designing, were dedicated to speeding up video games.

Over time, GPUs took on important computing tasks that had been the domain of CPUs and expanded into AI. Investors noticed, giving Nvidia a market cap of $424 billion, more than double Intel's $181 billion.

Although AI-specific accelerators are a hot area, Nvidia is sticking with GPUs, which can also be used for supercomputer calculations and other high-performance computing tasks. That flexibility is a selling point, said Ian Buck, vice president of Nvidia's hyperscale and high-performance computing group.

Intel Gaudi2 AI processor

The Gaudi2 AI processor from Intel's Habana Labs division

Intel

GPUs' flexibility advantage

"You don't know where your AI model is necessarily going to go," Buck said about the flexibility of GPUs. "If you're an AI startup, your productivity is everything."

Cruise, General Motors' self-driving car subsidiary, seems to agree with that approach. The company rents Nvidia GPUs on Google's cloud computing infrastructure because GPUs have more mature AI software and "extreme amounts of flexibility," said Hussein Mehanna, head of Cruise's AI work.

"There's always something new," and GPUs and GPU software can rapidly be adapted to cope, Mehanna said. "There's always a new architecture, some new types of layers that we're adding, merging [AI] models and separating models."

Plenty of startups, including Graphcore, SambaNova Systems, Tenstorrent and Cerebras, are, like Intel, working on more specialized processors to accelerate AI. In the view of Cerebras Chief Executive Andrew Feldman, GPUs were better than CPUs for AI, but now it's apparent their graphics origins are holding them back, and AI accelerators will prevail.

With AI accelerators now on the market to challenge the GPU approach, "the battle will be over the next five years," Feldman said.

Intel's two-pronged approach

Intel is betting both on AI-specific accelerators and flexible GPUs. Its Ponte Vecchio GPU is an enormously complicated processor that powers the Argonne National Laboratory's Aurora supercomputer, which is expected to be powered up this year. In 2023, Intel will sell Ponte Vecchio to the broader market and develop successor chips that are cheaper and made in larger quantities, said Raja Koduri, who worked on GPUs at two Intel chip rivals, AMD and Apple, before joining Intel in 2017.

Koduri also leads the new Arc line of conventional GPUs that accelerate video games in Intel PCs. The first of those products, code-named Alchemist, are now shipping, with more powerful products arriving later this year for laptops and gaming PCs. With a road map stretching to 2025, Intel also is working on successors called Battlemage and Celestial.

In other words, Intel is attacking Nvidia on all fronts. "The market is really hungry for a third player" besides Nvidia and AMD, Koduri said.

For AI customers, it's potentially confusing for Intel to offer both AI accelerators and general-purpose GPUs. Server processor chief Sandra Rivera, who oversees Intel's AI work, said Intel opted for a wider product range instead of a one-size-fits-all approach. The idea is to meet customers where they are, she said.

Expect Intel to take advantage of its position as a seller of CPUs, GPUs and AI accelerators that can be linked tightly together so customers don't have to assemble their own collections of IT gear.

"It's a playbook we've run for a long time," Rivera said. "Innovate and integrate."