DeepSeek-V2 Shakes Up AI Industry: GPT-4 Performance at 1% Cost

CTOL Editors
2 min read

DeepSeek-V2 Shakes Up AI Industry: High Performance Meets Unbelievably Low Cost

DeepSeek-V2, a second-generation Mixture of Experts (MoE) large model from Hangzhou-based DeepSeek AI Tech, is making headlines with its remarkably low pricing and competitive performance. Developed by a subsidiary of the quant hedge fund Huanfang, DeepSeek-V2 has quickly earned the nickname "price butcher" because its performance matches OpenAI's GPT-4 at roughly one-hundredth of the cost. With 236 billion parameters and highly efficient training, DeepSeek-V2 represents a significant challenge to leading AI models globally.

Key Takeaways

  • Performance and Cost: DeepSeek-V2 delivers strong performance on both Chinese and English benchmarks, rivaling top models such as GPT-4 and Llama 3 70B. Its cost-effectiveness is particularly disruptive: API pricing is 1 RMB per million input tokens and 2 RMB per million output tokens, roughly 1% of the cost of GPT-4.
  • Institutional Backing: DeepSeek is strategically supported by Huanfang Quantitative, which is venturing deep into AI technology, leveraging significant investments and infrastructure, including thousands of Nvidia A100 GPUs.
  • Market Impact: DeepSeek-V2's launch is set to redefine market dynamics by offering high-performance AI capabilities at substantially lower prices, threatening the dominance of larger tech firms and potentially altering the competitive landscape in AI technology.
  • Profitability and Expansion: Despite its low prices, DeepSeek maintains a profit margin above 70%, made possible by exceptional training efficiency and high server utilization.
  • Test on CTOL-Human-F1: Using the official API, we tested DeepSeek-V2 against our proprietary test set, CTOL-Human-F1. Our initial evaluation shows that DeepSeek-V2 does not beat Llama 3 70B, though the margin is small. We suspect the root cause is a language disparity: our test is conducted in English, while DeepSeek-V2 outperforms all other models on Chinese tasks. Nonetheless, we are optimistic that DeepSeek will outperform Llama 3 70B in the near future.
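To make the pricing concrete, here is a minimal back-of-envelope sketch of what an API workload would cost at the quoted rates. The workload size is a hypothetical example for illustration; only the per-million-token prices (1 RMB input, 2 RMB output) come from the article.

```python
def api_cost_rmb(input_tokens: int, output_tokens: int,
                 in_price: float = 1.0, out_price: float = 2.0) -> float:
    """Cost in RMB, given per-million-token prices.

    Defaults reflect DeepSeek-V2's quoted pricing:
    1 RMB per million input tokens, 2 RMB per million output tokens.
    """
    return (input_tokens / 1e6) * in_price + (output_tokens / 1e6) * out_price


# Hypothetical daily workload: 10M input tokens, 2M output tokens.
daily_cost = api_cost_rmb(10_000_000, 2_000_000)
print(f"DeepSeek-V2: {daily_cost:.0f} RMB/day")  # → DeepSeek-V2: 14 RMB/day
```

At one-hundredth of these rates' GPT-4 equivalent, the same workload would cost on the order of 1,400 RMB per day, which is what makes the "price butcher" nickname stick.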


DeepSeek-V2's market entry is not just a technological advancement but a strategic move that shakes the very foundations of the AI industry's economic models. By leveraging Huanfang's existing infrastructure and research capabilities, DeepSeek has delivered a product that not only outperforms its rivals but does so at a fraction of the cost. The model's ability to train on 8.1 trillion tokens while achieving superior throughput underscores a significant shift toward more economically sustainable AI practices. The ramifications for AI adoption in business, especially where cost has been a prohibitive factor, are profound.

Furthermore, the large language model (LLM) industry may be on the cusp of significant disruption. OpenAI risks losing its leading position if it does not improve the efficiency of its model inference. In this fiercely competitive sector, slowing innovation and weak consumer-facing product acumen pose additional threats to OpenAI's dominance. Microsoft, OpenAI's principal cloud partner, along with competitors such as Amazon and Google that have invested heavily in general AI yet produced underwhelming products, could face substantial financial repercussions.

Did You Know?

  • AI as a Stock Market Tool? Despite speculation, Huanfang's management asserts that their AI advancements, including DeepSeek, are not intended for stock market manipulation but have broader, more significant applications.
  • Massive Investment in AI: Since 2019, Huanfang has invested heavily in AI training platforms, with the latest, Yinghuo-2, supported by 10,000 Nvidia A100 GPUs, highlighting the firm's commitment to leading in AI development.
  • Strategic Locations: DeepSeek's expansion includes a massive office setup in Beijing's Haidian District, covering an area equivalent to 20 tennis courts, illustrating the scale at which the organization is operating to drive AI innovations.

This article is submitted by our user under the News Submission Rules and Guidelines. The cover photo is computer generated art for illustrative purposes only; not indicative of factual content. If you believe this article infringes upon copyright rights, please do not hesitate to report it by sending an email to us. Your vigilance and cooperation are invaluable in helping us maintain a respectful and legally compliant community.
