A high-profile launch in Japan's tech industry has turned into a public controversy over technological transparency. Rakuten Group recently announced what it claims is its "largest and most powerful" self-developed large language model. The 70-billion-parameter model was developed with support from the Ministry of Economy, Trade and Industry (METI) GENIAC project. Soon after its release, however, the open-source community began questioning whether it was merely a "shell" over an existing model.
Developers found that the model's underlying architecture and its config.json configuration file clearly pointed to a model developed by a Chinese team. The evidence suggests that Rakuten's model not only retained the architecture name "DeepseekV3ForCausalLM" but was in fact that model fine-tuned on Japanese data, rather than developed from scratch.
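The check the developers describe boils down to inspecting the "architectures" field of config.json. A minimal sketch (the config excerpt below is hypothetical; real Hugging Face configs carry many more fields):

```python
import json

# Hypothetical excerpt of a model repo's config.json; the "architectures"
# field records which model class the checkpoint was built for.
config_text = """
{
  "architectures": ["DeepseekV3ForCausalLM"],
  "model_type": "deepseek_v3"
}
"""

config = json.loads(config_text)

# Lineage check of the kind the community performed: an architecture name
# matching DeepSeek-V3's strongly suggests the weights derive from that
# model rather than from a from-scratch pretraining run.
is_deepseek_v3 = "DeepseekV3ForCausalLM" in config.get("architectures", [])
print(is_deepseek_v3)
```

Because the architecture string is set by the model class that produced the checkpoint, it survives fine-tuning unless it is deliberately edited, which is why it serves as a simple provenance signal.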
The controversy centers around the "gray area" in Rakuten's communication and handling of the license:
Missing Disclosure: Rakuten's press release spoke only of "integrating the essence of the open-source community," without naming the base model or crediting its developers.
License Handling Controversy: The community pointed out that Rakuten appears to have removed the required MIT license file from the initial release. Although a NOTICE file was later added to satisfy the legal requirements, this after-the-fact patching was criticized as lacking openness and sincerity.
So far, Rakuten Group has not directly addressed either the removal of the license file or the close match with the underlying architecture.
