Ant Group Unveils Ling-1T, Its Latest Trillion-Parameter Open-Source Language Model


(Image source: Photo taken at the Ant Group booth during the 2025 World Artificial Intelligence Conference)

Ant Group on Thursday announced the release and open-sourcing of its latest large language model, Ling-1T, marking a major milestone in global AI development.

Ling-1T, the flagship model in Ant Group’s Bailing (Ling) 2.0 series, is touted as the largest and most capable non-reasoning model the Bailing team has developed to date, combining unprecedented scale with strong performance.

According to benchmark tests, Ling-1T achieves state-of-the-art results across multiple complex reasoning tasks, outperforming other open-source models in high-difficulty areas including code generation, software development, competition mathematics, professional mathematics, and logical reasoning.

On the AIME 25 competition-mathematics leaderboard, for instance, Ling-1T reached 70.42% accuracy while using just over 4,000 output tokens on average, surpassing Google’s Gemini-2.5-Pro, which scored 70.10% while averaging more than 5,000 tokens. By delivering higher accuracy with fewer tokens, Ling-1T demonstrates both efficiency and cost-effectiveness, positioning Ant Group’s technology at the forefront of reasoning precision.

Ling-1T’s release comes amid a broader acceleration of AI development in China and the U.S., with companies racing to expand capabilities across large language models and multimodal AI systems. Around the National Day and Mid-Autumn Festival holidays, several major AI initiatives were unveiled. OpenAI launched its AI video model Sora 2, announced GPT-5 Pro, and introduced the ChatGPT Apps SDK. DeepSeek released DeepSeek-V3.2-Exp, which improves training and inference efficiency and reduces API costs. Alibaba’s Tongyi team unveiled the next-generation Qwen3-Omni multimodal model and open-sourced its Tongyi DeepResearch framework, while Zhipu released its flagship GLM-4.6, which topped Hugging Face’s global trending chart and ranked fourth worldwide on LMArena.

As Kai-Fu Lee, founder and CEO of 01.AI, has noted, foundational AI models represent a high-stakes “tech arms race” with tens of billions of dollars at play. Nvidia CEO Jensen Huang has highlighted surging AI computing demand over the past six months, pointing to unprecedented interest in the company’s Blackwell-architecture chips and calling the surge the beginning of a “new industrial revolution.” Nvidia recently announced plans to invest up to $100 billion in OpenAI, supporting the deployment of systems requiring 10 gigawatts of power, equivalent to millions of GPUs. OpenAI CEO Sam Altman added that the future of artificial general intelligence (AGI) depends on smarter models, longer context processing, and improved memory systems, with AI’s ability to generate new knowledge as the core metric.

Ant Group’s Ling team has been building large models steadily, beginning with the open-source Ling-Lite and Ling-Plus in March of this year. Ling-Lite contains 16.8 billion parameters, while Ling-Plus reaches 290 billion. These models laid the foundation for more advanced applications in life services, finance, and healthcare.

Ling-1T adopts the Ling 2.0 architecture and was pre-trained on more than 20 trillion high-quality tokens with high reasoning density. It supports a context window of up to 128K tokens and employs an evolutionary chain-of-thought (Evo-CoT) approach spanning mid-training and post-training, improving both reasoning efficiency and precision.
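For readers curious what working with an open-weights release of this kind looks like in practice, the sketch below shows a generic loading-and-generation call using the Hugging Face transformers library. It is a hypothetical usage example: the repository name "inclusionAI/Ling-1T" is assumed here for illustration and should be checked against Ant Group’s official release, and a trillion-parameter checkpoint would in reality be sharded across a large multi-GPU cluster or consumed through a hosted inference endpoint rather than run from a single script.

```python
# Hypothetical usage sketch for an open-weights chat model via Hugging Face
# transformers. The repo id below is an assumption for illustration only;
# consult the official release for the real identifier and hardware guidance.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "inclusionAI/Ling-1T"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype="auto",      # keep the dtype stored in the checkpoint
    device_map="auto",       # shard layers across available GPUs
    trust_remote_code=True,  # custom architectures ship their own modeling code
)

messages = [{"role": "user", "content": "Sketch a proof that the sum of two even integers is even."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The point is less the specific call than the distribution model: open-sourced weights can be pulled into standard tooling and self-hosted infrastructure instead of being reachable only through a proprietary API.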
Notably, Ling-1T is trained with FP8 mixed precision, making it the largest known base model trained this way. The technique delivers memory savings, flexible parallelization, and a training speedup of more than 15%.

For reinforcement learning, the Bailing team also introduced Linguistics-Unit Policy Optimization (LingPO), a sentence-level policy-optimization algorithm that stabilizes training at the trillion-parameter scale. A hybrid reward mechanism additionally checks code correctness and feature completeness while steadily improving the model’s understanding of visual aesthetics. On the ArtifactsBench frontend benchmark, Ling-1T scored 59.31, ranking first among open-source models for visualization and frontend development tasks.

Alongside Ling-1T, Ant Group is developing Ring-1T, a trillion-parameter deep-reasoning model. A preview version was open-sourced on September 30, giving developers early access to Ant’s latest AI capabilities.

Large language models typically demand high-performance AI chips such as Nvidia’s H100 and H800, which remain scarce and costly. The Ling team emphasizes a technical framework that allows seamless switching between heterogeneous compute units and distributed clusters, enabling performance optimization without relying on cutting-edge GPUs. The approach aims to make high-quality AI accessible and cost-efficient even in resource-constrained environments.

The team is led by He Zhengyu, Ant Group’s Vice President and CTO, who combines deep technical expertise with years of experience in large-scale systems. He holds a Ph.D. in Computer Science from Georgia Tech and previously led Google’s open-source gVisor project. Since joining Ant Group in 2018, he has overseen the company’s cloud-native transformation, green computing, confidential-computing innovation, and open-source strategy. Under his leadership, the Bailing large-model initiative targets applications spanning life services, finance, and healthcare.

The release of Ling-1T coincides with record levels of AI venture capital investment. According to PitchBook, AI startups worldwide have attracted $192.7 billion since the start of 2025, and more than half of all global venture capital is expected to flow into AI this year. Mature startups dominate funding allocations, while smaller or non-AI companies struggle to secure capital amid tighter IPO and M&A conditions.

In the latest quarter, 62.7% of U.S. venture capital went to AI companies, compared with 53.2% globally. PitchBook research director Kyle Stanford observed that “the market is becoming increasingly polarized: either you’re working on artificial intelligence, or you’re not; either you’re a big company, or you’re not.” Analyst Dimitri Zabelin noted that most current exits are frequent but low-value acquisitions, alongside a smaller number of high-value IPOs.

OpenAI exemplifies the rapid growth of AI startups. Since ChatGPT’s debut in 2022, its user base has grown to 800 million weekly active users, and a recent share sale worth about $6.6 billion valued the company at $500 billion, surpassing SpaceX to make it the world’s most valuable startup. In the first seven months of 2025, OpenAI’s revenue roughly doubled, with annual revenue projected at $12 billion, and agreements covering nearly $1 trillion in computing power position OpenAI as a potential leader in AI profitability.
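On the engineering side, the FP8 mixed-precision training mentioned above can be made a little more concrete. The snippet below is a minimal, generic illustration of per-tensor scaling into the E4M3 format, which is where FP8’s memory savings come from; it is not Ant Group’s training code, and it assumes PyTorch 2.1 or later for the torch.float8_e4m3fn dtype.

```python
# Generic illustration of per-tensor FP8 (E4M3) scaling; not Ant Group's
# implementation. Requires PyTorch >= 2.1 for the float8 dtypes.
import torch

def to_fp8_and_back(x: torch.Tensor):
    """Scale a tensor into the E4M3 range, cast to FP8 (1 byte per element),
    then dequantize so the rounding error can be inspected."""
    fp8_max = torch.finfo(torch.float8_e4m3fn).max    # about 448 for E4M3
    scale = x.abs().max().clamp(min=1e-12) / fp8_max  # per-tensor scale factor
    x_fp8 = (x / scale).to(torch.float8_e4m3fn)       # half the bytes of BF16
    x_deq = x_fp8.to(torch.float32) * scale           # back to full precision
    return x_fp8, x_deq

weights = torch.randn(4096, 4096) * 0.02              # toy weight matrix
w_fp8, w_deq = to_fp8_and_back(weights)
print("bytes, FP32 vs FP8:", weights.nbytes, "vs", w_fp8.nbytes)
print("max abs rounding error:", (weights - w_deq).abs().max().item())
```

Production FP8 pipelines of the kind described for Ling-1T go further: they keep higher-precision master weights, track scaling factors per tensor or per block, and rely on fused FP8 matrix-multiply kernels so that the reduced precision accelerates compute as well as shrinking memory.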
Ling-1T’s release underscores the growing pace of competition in AI development, particularly between China and the United States. The model demonstrates advances not only in parameter scale but also in efficiency, accuracy, and reasoning capability. As foundational AI models become central to economic and technological strategy, companies like Ant Group are pushing the envelope on open-source accessibility, infrastructure optimization, and practical application scenarios.

With global venture capital surging and competition intensifying, the AI landscape is entering a critical phase. Ling-1T’s open-source debut signals that high-performance AI is no longer limited to a handful of tech giants but is increasingly accessible to researchers, developers, and enterprises worldwide, setting the stage for the next wave of innovation in large language models and artificial general intelligence.