Yi-34B-Chat Fine-tuned Model and Quantized Version from Lingyi Wanwu


Xiaomi announced that the free public testing period of its self-developed large model MiMo-V2-Flash has been extended by 20 days, until January 20, 2026. The model has 309 billion total parameters, of which 15 billion are activated per inference, and performs strongly in reasoning and code generation. The extension is intended to give users more time with the model and to signal Xiaomi's continued investment and confidence in the AI field.
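The reported figures imply a sparse Mixture-of-Experts design, where only a small fraction of the weights is used for any given token. A back-of-the-envelope sketch of that ratio, using only the numbers quoted above (the MoE interpretation itself is an assumption, not stated by Xiaomi):

```python
# Sketch: fraction of weights active per forward pass, assuming an
# MoE-style architecture with the parameter counts reported for MiMo-V2-Flash.
total_params = 309e9   # 309 billion total parameters (reported)
active_params = 15e9   # 15 billion activated parameters (reported)

active_ratio = active_params / total_params
print(f"Active fraction per token: {active_ratio:.1%}")
```

Roughly 5% of the weights participate in each forward pass, which is how such models keep inference cost closer to that of a 15B dense model despite the 309B total size.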
Tencent has released the open-source Hunyuan Translator 1.5, supporting 33 languages and optimized for mobile devices. It is available in 1.8B and 7B versions; after quantization, the 1.8B model uses only about 1 GB of memory, enabling fast offline real-time translation on devices such as smartphones.
The Allen Institute for AI (Ai2) has released the open-source Molmo2 series of video language models, including 4B and 8B versions built on Alibaba's Qwen3 and a fully open-source 7B version built on Ai2's Olmo. It has also made the training data publicly available, underscoring its commitment to open source.
Meta plans to release an AI model codenamed "Avocado" in spring 2026; it may shift to being closed-source and was reportedly trained using Alibaba's open-source Qwen model. The news drew market attention and lifted Alibaba's stock price.
OpenAI released the GPT-5.2 series, positioned for everyday professional use and aimed at increasing economic value for users. The series comprises Instant, Thinking, and Pro versions, with significant improvements over GPT-5.1 on tasks such as spreadsheets, presentations, code writing, long-text understanding, and image processing.