Real AI Wins ISCRA Project; Europe's First Humanistic Large Language Model, HOMINIS, Set to Launch

BrowserUse unveils BU-30B-A3B-Preview, a 30B-parameter MoE model for web automation that balances high performance with lightweight operation to reduce the cost of AI-driven browsing.
Japanese data scientist Takahito Honda has released Sui, an open-source programming language aimed at the accuracy problems of code generated by large language models; he claims it can achieve 100% accuracy. Its design is inspired by the Japanese aesthetic of wabi-sabi, emphasizing refinement and the elimination of redundancy, with core principles that include guaranteeing a zero syntax-error rate and using numbers as variables.
Ant Technology Research Institute launched the LLaDA2.0 series in 16B and 100B versions, the 100B version being the industry's first hundred-billion-parameter discrete diffusion large language model. The model breaks through the scalability bottleneck of diffusion models, significantly improving generation quality and inference speed, and points to a new direction for the field.
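Unlike autoregressive models, discrete diffusion models generate text by starting from a fully masked sequence and unmasking positions in parallel over several denoising rounds. A toy sketch of that decoding loop (the `predict` stand-in and all names here are illustrative assumptions, not LLaDA2.0's actual API):

```python
import random

MASK = "<mask>"

def toy_discrete_diffusion_decode(predict, length, steps):
    """Toy discrete-diffusion decoding: start fully masked, then over
    `steps` rounds fill in a fraction of the remaining masked positions
    in parallel. `predict(seq)` returns a token guess for every position.
    """
    seq = [MASK] * length
    for t in range(steps, 0, -1):
        guesses = predict(seq)
        masked = [i for i, tok in enumerate(seq) if tok == MASK]
        if not masked:
            break
        # unmask roughly 1/t of the still-masked slots each round
        k = max(1, len(masked) // t)
        for i in random.sample(masked, min(k, len(masked))):
            seq[i] = guesses[i]
    # fill anything still masked in a final pass
    guesses = predict(seq)
    return [guesses[i] if tok == MASK else tok for i, tok in enumerate(seq)]

# toy "model" that always predicts the position index as a string
out = toy_discrete_diffusion_decode(
    lambda s: [str(i) for i in range(len(s))], length=6, steps=3)
```

Because every round refines many positions at once, the number of forward passes is the (small) step count rather than the sequence length, which is the source of the inference-speed advantage mentioned above.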
"Yangguang Qingyan" V1.0, the world's highest-altitude large language model, launched in Tibet with over one trillion parameters and 28.8 billion tokens of training data, covering multiple domains and filling a gap in Tibetan-language AI. In response to the national AI+ initiative, its AI customer-service and translation services have been deployed in Lhasa communities and at Gongga Airport.
The MiniMax M2 model uses a full attention mechanism, forgoing linear and sparse attention techniques. The development team argues that although those variants save compute, full attention performs better in industrial applications and improves model quality; the decision is aimed at optimizing real-world deployment results.
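For reference, full attention computes a score between every query and every key, giving the quadratic (n x n) cost that linear and sparse variants try to avoid. A minimal NumPy sketch of the mechanism (illustrative only, not MiniMax's implementation):

```python
import numpy as np

def full_attention(Q, K, V):
    """Scaled dot-product attention over ALL key positions.

    Every query attends to every key, producing an (n, n) score
    matrix -- O(n^2) compute and memory in sequence length n.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # (n, n): all pairs
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ V

# toy example: 4 tokens, head dimension 8
rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = full_attention(Q, K, V)   # shape (4, 8)
```

Sparse and linear attention restrict or approximate the (n, n) score matrix; M2's choice is to pay the quadratic cost and keep the exact computation.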