Japan Team Develops AI Foundation with Fugaku Supercomputer

Tokyo (Jiji Press)—A team of researchers from the Tokyo Institute of Technology, Fujitsu Ltd. and others said Friday they have developed a large language model that can serve as a foundation for generative artificial intelligence, using the Japanese supercomputer Fugaku.

Trained extensively on Japanese-language data, which accounts for 60 pct of its total training data, the Fugaku-LLM model is expected to spur research on generative AI tailored to domestic needs.

The researchers, who also include members from Tohoku University, Nagoya University, the government-backed research institute Riken, CyberAgent Inc. and Kotoba Technologies Inc., launched the project in May 2023, employing the supercomputer jointly developed by Fujitsu and Riken.

Fugaku-LLM demonstrates its strong Japanese language ability by fluently answering questions about poems by haiku master Matsuo Basho, they said.

Unlike most other models with Japanese language capabilities, which are built through continual learning on top of existing models, Fugaku-LLM was trained from scratch on the team's own data, screened to exclude harmful content, so the entire learning process can be traced. This makes the model superior in terms of transparency and safety, they said.
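For context, continual learning starts from the weights of an already trained model, while from-scratch training begins with randomly initialized ones. The sketch below contrasts the two using a Hugging Face-style API; the base model name and configuration values are illustrative assumptions, not details of the team's actual pipeline.

```python
# Minimal sketch contrasting the two training approaches.
# Model names and configuration values here are illustrative
# assumptions, not the Fugaku-LLM team's actual setup.
from transformers import AutoModelForCausalLM, GPT2Config, GPT2LMHeadModel

# Continual learning (the common route for Japanese-capable models):
# load an existing pretrained checkpoint and keep training it, so the
# model inherits whatever data its original developers used.
continued = AutoModelForCausalLM.from_pretrained("gpt2")  # hypothetical base

# Training from scratch (the Fugaku-LLM approach): randomly initialized
# weights, so every token the model has ever seen comes from the
# team's own curated corpus and the whole process can be audited.
config = GPT2Config(vocab_size=50_000, n_layer=12, n_head=12, n_embd=768)
scratch = GPT2LMHeadModel(config)  # fresh weights, no inherited data
```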

It is also significant that the team successfully trained a large language model on Fugaku, which uses central processing units rather than graphics processing units.

GPUs are the standard choice for language model training, but they are now in short supply due to a fierce global development race.

Fugaku's effective calculation ability was enhanced by optimizing communication among its nodes, the researchers said.
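To illustrate why communication matters here, the following is a minimal sketch of CPU-only data-parallel training in PyTorch, using the gloo backend for CPU collectives. It is a toy under stated assumptions, not the team's actual stack; Fugaku's training relied on its own optimized communication layer, which this example does not reproduce.

```python
# Minimal sketch of CPU-only data-parallel training with PyTorch.
# Illustrative only; Fugaku's real training stack and communication
# optimizations are not shown here.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # "gloo" is PyTorch's standard backend for CPU collectives; on
    # Fugaku, communication would instead go through MPI-based libraries.
    dist.init_process_group(backend="gloo")
    model = torch.nn.Linear(512, 512)      # stand-in for a transformer
    ddp_model = DDP(model)                 # syncs gradients across CPU ranks
    opt = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)

    for _ in range(10):
        x = torch.randn(8, 512)
        loss = ddp_model(x).pow(2).mean()  # dummy objective
        opt.zero_grad()
        loss.backward()                    # gradient all-reduce happens here
        opt.step()

if __name__ == "__main__":
    main()
```

A script like this would be launched with a tool such as `torchrun --nproc_per_node=4 train.py`. At supercomputer scale, the all-reduce during the backward pass can dominate training time, which helps explain why the communication tuning the researchers describe was necessary.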

“We’ve proven our ability to overcome challenges posed by Fugaku,” TIT Prof. Rio Yokota told a press conference. “We haven’t relied on foreign products at all, which is a great achievement,” he added.

Fugaku-LLM's source code has already been made public and is also available on Fujitsu's website.