Mistral AI introduces Mixtral 8x7B, a sparse mixture-of-experts (SMoE) model offering performance comparable to GPT-3.5. The model handles tasks such as mathematics, code generation, and reading comprehension well, and delivers faster inference. Mixtral 8x7B and Mixtral 8x7B – Instruct have been open-sourced under the Apache 2.0 license. Mistral AI plans to open-source a GPT-4-level model in 2024, positioning itself at the forefront of open-source large language models.
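To illustrate the SMoE idea mentioned above, here is a minimal sketch of a sparse mixture-of-experts feed-forward layer with top-2 routing, the general mechanism Mixtral-style models are described as using. The dimensions, module names, and expert count here are illustrative assumptions, not Mistral AI's actual implementation.

```python
# Minimal sketch of a sparse mixture-of-experts (SMoE) layer with top-2 routing.
# All sizes (d_model, d_ff, n_experts) are illustrative, not Mixtral's real config.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                           # (tokens, n_experts)
        top_w, top_idx = scores.topk(self.top_k, dim=-1)  # keep 2 experts per token
        top_w = F.softmax(top_w, dim=-1)                  # normalize their weights
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[:, k] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += top_w[mask, k:k + 1] * expert(x[mask])
        return out


tokens = torch.randn(10, 64)
print(SparseMoE()(tokens).shape)  # torch.Size([10, 64]); only 2 of 8 experts run per token
```

The sparsity is what keeps inference fast: every token only activates its top two experts, so the compute per token is a fraction of what a dense model with the same total parameter count would need.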