MiniMax has kept its promise and today officially open-sourced Minimax2.7, the large model it first released in March. This is another major domestic open-source release following Zhipu's GLM-5.1, signaling that domestic large models are drawing closer to the international state of the art in both programming capability and cost-effectiveness.

Significant Improvement in Programming Ability, Catching Up with Claude Opus
SWE-Pro Benchmark: M2.7 scored 56.22%, nearly matching the top-tier Claude Opus.
MMClaw Evaluation: in the OpenClaw environment, its real-world performance significantly surpasses M2.5 and approaches the latest Sonnet4.6.
Cost Advantage: compared with paid overseas models such as GPT-5.4Pro and Claude Opus, Minimax2.7 gives developers a more cost-effective high-performance option.
Domestic models already hold a significant share of the global open-source large-model market. With GLM-5.1 and Minimax2.7 open-sourced in quick succession, the competitive landscape among the leading players is shifting subtly: rumor has it that, owing to strategic adjustments, the future Qwen3.6Plus may no longer follow a fully open-source model, while MiniMax continues to deepen its open-source ecosystem and consolidate its developer base.
All Eyes on DeepSeek V4
With the other major players having made their moves, the market's biggest open question now centers on DeepSeek. It is reported that Liang Wenfeng has stated
