DeepSeek's new V4-Pro platform is trained on Huawei chips, not Nvidia tech. Reuters

Will DeepSeek's new AI model crash Nvidia's $5tn party?


Alvin R Cabral

In early 2025, Nvidia lost nearly $600 billion in market value after an upstart generative AI large language model from China tried to prove there is no need to splash billions of dollars to develop such a platform.

Now, after Nvidia on Friday became the first company to cross the $5 trillion market capitalisation threshold, Chinese platform DeepSeek marked the occasion by dropping another apparently powerful and pocket-friendly model – one that, in theory, could trigger another slide in Nvidia's market value.

DeepSeek burst on to the scene with its highly affordable generative AI services and its aim of challenging American dominance in AI technology. It launched V4-Pro on the same day Nvidia closed with a market cap of $5.06 trillion, nearly a trillion more than second-placed Alphabet.

V4-Pro uses a 1.6 trillion parameter AI model and costs about a fiftieth of what Anthropic's Claude Opus does to run through an application programming interface.

Also notable is that it was trained on Chinese chips, specifically from Huawei Technologies, not Nvidia chips. DeepSeek's earlier V3 and R1 LLMs were trained on chips from the world's most valuable listed company.

In terms of its capabilities, V4-Pro activates only 49 billion of its 1.6 trillion parameters per token, meaning it has "frontier-level output at the compute cost of a 37-billion model", said Christian Schmidt, chief commercial and revenue officer at Samsung Mena.

V4-Pro costs only $3.48 for a million tokens of output, versus Anthropic's $30 and OpenAI's $25 for the same workload. That is already rock bottom, but the lower-tier V4-Flash's price is, at this point, almost comically low at $0.28.

"The numbers are real ... the cost of frontier AI just dropped again," Mr Schmidt said. "The chip monopoly the US relied on to slow China is no longer guaranteed. Open source is now competing with closed source at the highest level."

Familiar playbook

California-based Nvidia lost nearly $590 billion in the aftermath of DeepSeek R1's launch. DeepSeek even managed to unseat ChatGPT at the top of Apple's App Store at the time.

That DeepSeek emerged from China to offer such value is no surprise. The country, the world's second-biggest economy, is considered America's biggest tech rival and has competed with the rest of the world by delivering more affordable products, including smartphones and cars.

DeepSeek claims to have spent just $6 million – and only two months – to train its R1 model. By comparison, it has been reported that Meta Platforms used $60 million for its Llama model, while OpenAI splashed out more than $6 billion, with both companies taking longer to develop their models as well. It is unclear how much was invested in developing the V4 LLMs.

An understated fact here is that Huawei has committed to helping DeepSeek by providing its Ascend 950 chips. Huawei began to make more noise in generative AI almost a year ago when the company was reported to be developing the Ascend 910D.

"That Huawei piece is buried in most coverage. So here we have a frontier-adjacent model trained and served on non-Nvidia silicon, priced below anything closed source can sustain," said Shwetank Kumar, chief scientist at California-based EnCharge AI.

Nvidia chief executive Jensen Huang appeared unhappy, warning the partnership between DeepSeek and Huawei would be "horrible for the US". Perhaps he was also miffed that, reportedly, Huawei denied Nvidia, as well as OpenAI, access to its new Ascend chips.

Nvidia boss Jensen Huang issued a warning about Huawei's support for DeepSeek. AFP

"The old assumption that you cannot train a frontier model without Nvidia hardware is now empirically wrong," said Rishav Ganguli, founder of New Dawn AI in India.

While he noted DeepSeek's admission that it trails GPT-5.4 and Gemini 3.1 Pro by three to six months, the 2026 Stanford AI Index describes Chinese AI labs as having "effectively closed" the performance gap.

"This is no longer a lagging-edge phenomenon ... for the last two years, a lot of strategy has been built on the assumption that a small number of US labs would sit at the top of a steep capability curve, and everyone else would pay to rent from them," Mr Ganguli said. "That assumption is being repriced in real time."

Mr Kumar said the price is "where this turns strategic", with companies spending tens of billions of dollars on training LLMs.

DeepSeek's pricing is a "ceiling, set deliberately below what OpenAI and Anthropic would need to charge to cover their own frontier training", he added. "DeepSeek doesn't need margin on tokens. They need OpenAI and Anthropic to be unable to raise prices without looking extractive."

Is Nvidia still good value?

As it stands, Nvidia's market standing has not yet been affected by DeepSeek's latest salvo – though it is worth noting that its multibillion-dollar market cap slide last year took about a week to play out.

Perry Wu, founder and chief executive of Darrius FinTech, which runs the Darrius.AI platform designed to help investors understand shifting market dynamics, said Nvidia's stock price increase was a "trend-level advance".

Darrius.AI's analysis showed an "early buy signal" on April 9, when the stock was at $183, turning into a "buy confirmation" when it hit $187. The stock has since trended upwards, closing at a record $208.26 on Friday, "confirming an uptrend".

Since that fateful day on January 27, 2025, when Nvidia's stock tanked, it has ballooned by nearly 76 per cent. "This is no longer about 'whether to get in', it's about the fact that smart money has already been in for a while," Mr Wu said.

His upbeat assessment, however, comes with a note of caution – not for investors, but for Nvidia itself. "Nvidia must keep outperforming. There is little room for disappointment."

Updated: April 25, 2026, 11:27 AM