China's AI Lab DeepSeek Takes On OpenAI, Other Leaders With R1 Open Source Model: Everything You Need To Know
DeepSeek has open-sourced its flagship models along with six smaller variants, ranging from 1.5 billion to 70 billion parameters.

Chinese artificial intelligence (AI) research lab DeepSeek is making waves globally with the launch of its open-source AI model, DeepSeek-R1. With claimed advances in mathematical reasoning, code generation, and cost efficiency, the model positions DeepSeek as a serious competitor to industry leaders such as OpenAI.
Market Disruption, Stock Swings
The global financial markets reacted sharply to DeepSeek's breakthrough, reported Moneycontrol on Monday. US stock futures saw significant declines, with Nasdaq contracts falling 1.9 per cent during Asian trading hours, driven by concerns that DeepSeek’s affordable AI model could disrupt the business models of major US firms like Nvidia, OpenAI, and Google. Japanese chipmaker stocks also took a hit, with shares of Nvidia supplier Advantest Corp. plunging 8.6 per cent in Tokyo.
In contrast, Chinese and Hong Kong tech shares surged, with the Hang Seng Tech Index rising by 2 per cent, reflecting optimism about DeepSeek’s rapid growth and potential to challenge Silicon Valley’s dominance.
What Is DeepSeek-R1?
DeepSeek-R1 is touted as an advanced AI reasoning model that outperforms existing models across a range of benchmark tasks, Forbes reported.
🚀 DeepSeek-R1 is here!
⚡ Performance on par with OpenAI-o1
📖 Fully open-source model & technical report
🏆 MIT licensed: Distill & commercialize freely!
🌐 Website & API are live now! Try DeepThink at https://t.co/v1TFy7LHNy today!
🐋 1/n pic.twitter.com/7BlpWAPu6y
— DeepSeek (@deepseek_ai) January 20, 2025
Unlike traditional large language models (LLMs), which rely on supervised fine-tuning, the precursor model DeepSeek-R1-Zero was trained with reinforcement learning (RL) alone, from which it developed robust reasoning capabilities. The refined DeepSeek-R1 model achieves performance comparable to OpenAI’s o1 while requiring significantly fewer resources.
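To illustrate the general idea in rough terms: RL for reasoning can rely on rewards that are computed automatically, rather than on hand-labelled reasoning traces. The snippet below is a minimal, hypothetical sketch of such a rule-based reward, not DeepSeek’s actual reward design; the <think> tag convention and the "Answer:" marker are assumptions made purely for illustration.

```python
# Minimal, hypothetical sketch of a rule-based reward for RL on reasoning tasks.
# This is NOT DeepSeek's actual reward design; it only illustrates how a completion
# can be scored automatically, without supervised labels for the reasoning steps.
import re

def reward(model_output: str, reference_answer: str) -> float:
    """Score a completion: +1 if the final answer matches the reference,
    plus a small bonus if the output follows the assumed <think>...</think> format."""
    score = 0.0
    # Format reward: the model is asked to reason inside <think> tags (assumed convention).
    if re.search(r"<think>.*</think>", model_output, flags=re.DOTALL):
        score += 0.1
    # Accuracy reward: compare the text after a final "Answer:" marker to the reference.
    match = re.search(r"Answer:\s*(.+)\s*$", model_output.strip())
    if match and match.group(1).strip() == reference_answer.strip():
        score += 1.0
    return score

# Example: a completion that reasons, then states an automatically checkable answer.
sample = "<think>2 + 2 is 4</think>\nAnswer: 4"
print(reward(sample, "4"))  # 1.1
```

Because the reward depends only on the final answer and the output format, the training signal scales without human annotators, which is part of why RL-based approaches are attractive for reasoning models.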
The company’s technical innovations, such as multi-head latent attention (MLA) and a mixture-of-experts approach, have drastically reduced computing requirements. DeepSeek claims its model consumes just one-tenth of the computational power needed for comparable systems, such as Meta’s Llama 3.1.
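The compute saving in a mixture-of-experts design comes from routing each token to only a few of the available expert networks, so only a small fraction of the model’s parameters does work for any given input. The sketch below is a generic top-k routing example, not DeepSeek’s architecture; the tiny linear "experts" and the dimensions are placeholder assumptions.

```python
# Generic mixture-of-experts routing sketch (not DeepSeek's exact architecture):
# a gating network picks the top-k experts per token, so only k of n_experts
# parameter blocks are touched for each input.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# One tiny linear "expert" per slot; in a real MoE each would be a full FFN block.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
gate_w = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route a single token vector x through its top-k experts only."""
    logits = x @ gate_w                                          # one gating score per expert
    top = np.argsort(logits)[-top_k:]                            # indices of the k best experts
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum()    # softmax over the chosen experts
    # Only k of the n_experts matrices are multiplied: the source of the compute savings.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)  # (16,)
```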
DeepSeek’s Open-Source Strategy
DeepSeek has open-sourced its flagship models along with six smaller variants, ranging from 1.5 billion to 70 billion parameters.
Released under an MIT license, these models can be modified, fine-tuned, and commercialised by developers worldwide. This open approach challenges the dominance of proprietary models by Western AI firms and fosters broader access to advanced AI tools.
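In practice, the MIT licence means the published checkpoints can be pulled and run with standard open-source tooling. The example below is a hypothetical sketch using the Hugging Face transformers pipeline; the repository identifier for the 1.5-billion-parameter distilled variant is an assumption and should be checked against DeepSeek’s official model listings.

```python
# Hypothetical usage sketch: loading a distilled DeepSeek-R1 checkpoint locally.
# The repository id below is an assumption for illustration; verify the exact
# identifier on DeepSeek's official Hugging Face page before use.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B",  # assumed repo id for the 1.5B distill
)
prompt = "Solve step by step: what is 17 * 24?"
print(generator(prompt, max_new_tokens=256)[0]["generated_text"])
```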
Founded in 2023 by Liang Wenfeng, a hedge fund veteran and AI industry expert, DeepSeek emerged as an independent entity from High-Flyer, a Chinese quantitative hedge fund established in 2015. Unlike many Chinese AI firms tied to tech giants like Baidu or Alibaba, DeepSeek operates autonomously, focusing on long-term innovation rather than short-term profitability.
Liang, who holds degrees in engineering from Zhejiang University, redirected High-Flyer’s resources toward DeepSeek to pursue groundbreaking AI research. His vision was fuelled by scientific curiosity, a stark contrast to the profit-driven motives of many competitors.
Driven By Young Researchers From Chinese Universities
DeepSeek’s workforce comprises young researchers from top Chinese universities, including Tsinghua and Peking University. Despite their limited industry experience, these graduates bring fresh academic perspectives and collaborative energy, enabling the company to tackle resource-intensive research challenges effectively.
DeepSeek’s achievements are particularly remarkable given the US government’s export restrictions on advanced chips, such as Nvidia’s H100. While the company initially secured a stockpile of these chips, its focus shifted to optimising resource efficiency, allowing it to thrive despite hardware limitations.
Through techniques like custom communication schemes and memory optimisation, DeepSeek has developed strategies that minimise resource requirements without sacrificing performance. This approach exemplifies the lab’s emphasis on innovation over dependence on high-end hardware.
DeepSeek’s decision to open-source its AI models has been hailed as a significant step toward democratising access to advanced technology. By sharing its research and tools, the company is empowering developers worldwide while challenging the hegemony of Western AI firms.
