MiniMax, the large-model developer, has made good on its promise and today officially open-sourced MiniMax 2.7, the large model it released in March. This is another major domestic open-source release following Zhipu's GLM-5.1, signaling that domestic large models are further closing the gap with the international top tier in programming capability and cost-effectiveness.

After GLM-5.1, MiniMax 2.7 is officially open-sourced: Top domestic AI is waiting for DeepSeek V4

Significant Improvement in Programming Ability, Catching Up with Claude Opus

The most attention-grabbing aspect of the MiniMax 2.7 open-source release is its enhanced programming ability. According to official figures and test data:

  • SWE-Pro Benchmark: M2.7 scored 56.22%, nearly matching the top-level performance of Claude Opus.

  • MMClaw Evaluation: Its real-world performance in the OpenClaw environment significantly surpasses M2.5 and approaches the latest Sonnet 4.6.

  • Cost Advantage: Compared with overseas paid models such as GPT-5.4 Pro and Claude Opus, MiniMax 2.7 gives developers a more cost-effective high-performance option.

Domestic models already hold a significant share of the global open-source large-model market. With GLM-5.1 and MiniMax 2.7 open-sourced in quick succession, the competitive landscape among the leading players has shifted subtly: rumor has it that, owing to strategic adjustments, the future Qwen 3.6 Plus may no longer adopt a fully open-source operating model, even as the team continues to deepen its open-source ecosystem and consolidate its developer base.

All Eyes on DeepSeek V4

With the other major players having made their moves, the biggest open question in the market now centers on DeepSeek. It is reported that Liang Wenfeng has stated that DeepSeek V4 will be released in late April. V4 will natively support multimodality, offer "fast, expert, and visual" modes, and be fully compatible with domestic AI hardware systems, with a focus on security, controllability, and low-cost deployment.