Newsvidia

DeepSeek has rattled large AI players — but smaller chip firms see it as a force multiplier

DeepSeek has rattled the U.S.-led AI ecosystem with its latest model, shaving hundreds of billions of dollars off chip leader Nvidia’s market cap. While the sector’s leaders grapple with the fallout, smaller AI companies see an opportunity to scale with the Chinese startup.

Several AI-related firms told CNBC that DeepSeek’s emergence is a “massive” opportunity for them, rather than a threat. 

Andrew Feldman, CEO of artificial intelligence chip startup Cerebras Systems, said:

Developers are very keen to replace OpenAI’s expensive and closed models with open source models like DeepSeek R1…

The company competes with Nvidia’s graphics processing units and offers cloud-based services through its own computing clusters. Feldman said the release of the R1 model generated one of Cerebras’ largest-ever spikes in demand for its services.

Feldman added,

R1 shows that [AI market] growth will not be dominated by a single company — hardware and software moats do not exist for open-source models,

Open source refers to software whose source code is made freely available for modification and redistribution. DeepSeek’s models are open source, unlike those of competitors such as OpenAI.

DeepSeek also claims its R1 reasoning model rivals the best American tech, despite running at lower costs and being trained without cutting-edge graphics processing units, though industry watchers and competitors have questioned these assertions.

Feldman said,

Like in the PC and internet markets, falling prices help fuel global adoption. The AI market is on a similar secular growth path,

Inference chips 

DeepSeek could increase the adoption of new chip technologies by accelerating the AI cycle from the training to the “inference” phase, chip startups and industry experts said.

Inference refers to the act of using and applying AI to make predictions or decisions based on new information, rather than the building or training of the model.

Phelix Lee, an equity analyst at Morningstar focusing on semiconductors, said:

To put it simply, AI training is about building a tool, or algorithm, while inference is about actually deploying this tool for use in real applications,

While Nvidia holds a dominant position in GPUs used for AI training, many competitors see room for expansion in the “inference” segment, where they promise higher efficiency for lower costs.

AI training is very compute-intensive, but inference can work with less powerful chips that are programmed to perform a narrower range of tasks, Lee added.

A number of AI chip startups told CNBC that they were seeing more demand for inference chips and computing as clients adopt and build on DeepSeek’s open source model. 

Sid Sheth, CEO of AI chip startup d-Matrix, said:

[DeepSeek] has demonstrated that smaller open models can be trained to be as capable or more capable than larger proprietary models and this can be done at a fraction of the cost,

“With the broad availability of small capable models, they have catalyzed the age of inference,”

he told CNBC, adding that the company has recently seen a surge in interest from global customers looking to speed up their inference plans. 

Robert Wachen, co-founder and COO of AI chipmaker Etched, said dozens of companies have reached out to the startup since DeepSeek released its reasoning models.

He said:

Companies are [now] shifting their spend from training clusters to inference clusters,

“DeepSeek-R1 proved that inference-time compute is now the [state-of-the-art] approach for every major model vendor and thinking isn’t cheap – we’ll only need more and more compute capacity to scale these models for millions of users.”

Jevons Paradox

Analysts and industry experts agree that DeepSeek’s accomplishments are a boost for AI inference and the wider AI chip industry. The dynamic echoes the Jevons paradox: the observation that efficiency gains in using a resource can increase, rather than reduce, its total consumption.

According to a report from Bain & Company,

DeepSeek’s performance appears to be based on a series of engineering innovations that significantly reduce inference costs while also improving training costs,

It added,

In a bullish scenario, ongoing efficiency improvements would lead to cheaper inference, spurring greater AI adoption,
