
Nvidia and Groq Strike $20 Billion Licensing Deal for High-Performance Inference Tech

Groq licenses its LPU architecture to Nvidia as its founders join the chip giant to scale real-time AI.

Decoded

Published Dec 28, 2025

Groq today announced a major non-exclusive licensing agreement with Nvidia for its high-performance inference technology. The deal is reportedly valued at approximately $20 billion in cash and grants Nvidia access to Groq's Language Processing Unit (LPU) architecture. The move aims to expand global access to low-cost, high-speed AI processing through shared hardware innovations.

As part of the historic agreement, Groq's top leadership will join Nvidia to help integrate and scale the licensed technology. Founder Jonathan Ross and President Sunny Madra will lead efforts to bring Groq's unique hardware designs into Nvidia's massive AI infrastructure. Their expertise is expected to help Nvidia maintain its dominance as the industry shifts its focus toward running AI models efficiently.

Despite the massive transfer of talent, Groq will continue to operate as an independent company. Simon Edwards, formerly the Chief Financial Officer, has been appointed as the new Chief Executive Officer to manage the company's remaining operations. Groq's popular developer platform, GroqCloud, will continue to function without any interruption for its current users.

The partnership centers on Groq's specialized chip architecture, which uses static random-access memory (SRAM) rather than traditional high-bandwidth memory (HBM). Keeping model weights in on-chip SRAM provides deterministic latency and high throughput, making the design well suited to real-time applications such as chatbots and voice agents. Nvidia's integration of the technology could significantly reduce the cost and energy requirements of enterprise-level AI deployments.
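To see why memory bandwidth is the lever here, consider that each token generated by a large autoregressive model requires streaming roughly all of the model's weights through the compute units once, so peak tokens per second is bounded by bandwidth divided by model size. The sketch below illustrates that arithmetic; the bandwidth and model-size figures are illustrative assumptions for this example, not vendor specifications.

```python
# Back-of-envelope estimate: memory bandwidth as the ceiling on
# single-accelerator inference speed. All numbers below are
# illustrative assumptions, not published chip specifications.

def tokens_per_second(model_params_billions: float,
                      bytes_per_param: int,
                      memory_bandwidth_tbps: float) -> float:
    """Rough upper bound on tokens/sec, assuming every generated
    token streams the full set of weights through compute once."""
    bytes_per_token = model_params_billions * 1e9 * bytes_per_param
    return memory_bandwidth_tbps * 1e12 / bytes_per_token

# A hypothetical 70B-parameter model stored in 8-bit weights:
hbm_bound = tokens_per_second(70, 1, 3.3)    # HBM-class bandwidth (~3.3 TB/s, assumed)
sram_bound = tokens_per_second(70, 1, 80.0)  # on-chip SRAM-class bandwidth (assumed)

print(f"HBM-bound:  ~{hbm_bound:.0f} tokens/s")
print(f"SRAM-bound: ~{sram_bound:.0f} tokens/s")
```

Under these assumptions the SRAM-class design is bandwidth-bound at more than an order of magnitude higher token rate, which is the gap the article's "real-time applications" framing refers to. Real deployments complicate this with batching, KV caches, and multi-chip sharding, but the first-order relationship holds.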

Industry analysts view the $20 billion transaction as a strategic effort to neutralize emerging hardware rivals. By licensing the intellectual property and hiring the core engineering team, Nvidia secures a competitive edge in the inference market without the regulatory hurdles of a full corporate buyout. This ensures Nvidia remains at the forefront of the generative AI revolution as global demand for speed grows.


The Strategic Shift to AI Inference

Decoded Take

This agreement signals a fundamental shift in the AI hardware landscape, where the ability to run models efficiently now matters as much as the power required to train them. By securing Groq's technology, Nvidia is moving to close the efficiency gap that specialized startups have exploited over the past year. The massive valuation underscores that the next phase of the AI race will be won on latency and cost-effectiveness rather than raw computing power alone. The deal effectively lets Nvidia maintain its market dominance by incorporating best-of-breed inference technology directly into its ecosystem.