Friday, 02 January 2026
Published: 01 January 2026, 20:33
DeepSeek sets out to revolutionize the artificial intelligence market again

Khaberni - At the start of 2026, the Chinese company DeepSeek published a new technical research paper, co-authored by company founder Liang Wenfeng, that clearly signals the company's push to rethink the fundamental infrastructure for training massive artificial intelligence models, with the aim of cutting costs while staying competitive.

The paper, published on the open platform arXiv, introduces a new methodology called Manifold-Constrained Hyper-Connections (mHC), part of the company's effort to make the training of foundation models more efficient amid fierce competition with American companies that command far greater computing power and funding.

Higher efficiency with less computational load
According to the research team, DeepSeek tested the new methodology on models with 3 billion, 9 billion, and 27 billion parameters, and the results showed that the new structure scales up smoothly without a significant increase in computational burden, according to a report by the South China Morning Post (SCMP) reviewed by Al Arabiya Business.

The researchers explained that mHC enables stable, large-scale training with better scalability than earlier solutions based on hyper-connections, adding that the gains come from careful engineering at the infrastructure level, which makes the additional cost almost negligible.

Building on ByteDance's ideas
The idea builds on the Hyper-Connections concept first introduced by ByteDance researchers in September 2024 as an improvement on residual connections, the design popularized by the ResNet architecture that underpins many modern artificial intelligence models, including OpenAI's GPT models and Google DeepMind's AlphaFold system.
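To make the contrast concrete, the sketch below compares a classic residual connection with a heavily simplified hyper-connection. The function names, the fixed identity mixing matrix, and the even spread of the layer output across streams are illustrative assumptions; in ByteDance's actual method the connection weights are learnable (and can depend on the input).

```python
import numpy as np

def layer(x, W):
    """Toy feed-forward layer standing in for an attention/MLP block."""
    return np.tanh(x @ W)

def residual_block(x, W):
    """Classic residual connection (ResNet style): output = input + f(input)."""
    return x + layer(x, W)

def hyper_connection_block(streams, W, mix):
    """Simplified hyper-connection with n parallel residual streams.

    `streams` has shape (n, d). `mix` is an (n, n) mixing matrix that is
    learnable in the real method; here it is fixed for illustration.
    The layer reads a weighted combination of the streams, and its output
    is distributed back across all of them.
    """
    n, _ = streams.shape
    layer_input = mix[0] @ streams        # weighted "read" from the n streams
    out = layer(layer_input, W)
    # remix the streams ("width" connections) and add the layer output
    # ("depth" connections), spread evenly here for simplicity
    return mix @ streams + np.tile(out, (n, 1)) / n

rng = np.random.default_rng(0)
d, n = 8, 4
W = rng.normal(size=(d, d)) * 0.1
x = rng.normal(size=d)

y_res = residual_block(x, W)              # one residual stream
streams = np.tile(x, (n, 1))              # expand the input into n hyper streams
y_hyper = hyper_connection_block(streams, W, np.eye(n))
```

With a single stream and an identity mixing matrix, the hyper-connection block collapses back to the plain residual block, which is the sense in which it generalizes the ResNet design.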

However, DeepSeek argues that the earlier approach did not account for rapidly rising memory costs, which limit its practical use in training giant models.
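That memory concern is easy to quantify: keeping n parallel residual streams multiplies the activation memory of the residual path roughly by n. The back-of-the-envelope calculation below uses illustrative numbers (batch size, sequence length, hidden size, and fp16 activations are assumptions, not figures from the paper):

```python
# Back-of-the-envelope activation memory for the residual path.
# All numbers are illustrative assumptions, not figures from the paper.

def residual_stream_bytes(batch, seq_len, hidden, n_streams=1, bytes_per_val=2):
    """Bytes held by the residual stream(s) at one layer boundary
    (bytes_per_val=2 assumes fp16/bf16 activations)."""
    return batch * seq_len * hidden * n_streams * bytes_per_val

base = residual_stream_bytes(batch=8, seq_len=4096, hidden=4096)                # 1 stream
hyper = residual_stream_bytes(batch=8, seq_len=4096, hidden=4096, n_streams=4)  # 4 streams
print(f"baseline: {base / 2**30:.2f} GiB, hyper-connections (n=4): {hyper / 2**30:.2f} GiB")
# → baseline: 0.25 GiB, hyper-connections (n=4): 1.00 GiB
```

The n-fold growth appears at every layer boundary where the streams are materialized, which is why it matters at giant-model scale.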

This is where mHC comes in: it adds mathematical constraints (manifold constraints) that ensure more efficient use of resources.

An early indicator of what's to come
Observers note that DeepSeek's research papers often serve as an early indicator of the engineering directions the company will adopt in its upcoming models.

German researcher Florian Brand, who specializes in China's artificial intelligence ecosystem, said that DeepSeek's publications often precede major announcements of new models.

Expectations are growing that the company will launch its next model in the weeks before the Chinese New Year (Spring Festival) in mid-February, especially since DeepSeek unveiled its landmark R1 model just before the same national holiday last year.

The founder is actively involved in research
The research paper also reflects the continued direct involvement of Liang Wenfeng in the company's core research, despite his limited media appearances.

His name appearing as the last author of the study signals his close engagement with the company's deep technical work.

Ultimately, DeepSeek's move confirms that the artificial intelligence race in 2026 will not only be about who has the most computing power, but also about who can build smarter, more efficient models at lower cost.
