
Alibaba Open-Sources Its Seven-Billion-Parameter AI Model Similar to Meta’s Llama 2

Source: TiPost

Credit: Visual China

BEIJING, August 4 (TiPost) -- Alibaba’s cloud computing arm announced on Thursday that it will open-source its seven-billion-parameter large language model (LLM) Tongyi Qianwen and make it free for commercial use, further intensifying the global competition to build ecosystems around open-source LLMs.

Compared with the vibrant AI open-source ecosystem in the West, the Chinese community still lacks strong foundation models. Open-sourcing Tongyi Qianwen is expected to give the open-source community more options and help build a Chinese AI open-source ecosystem, AliCloud said in its statement.



On April 7 this year, Alibaba opened its ChatGPT-like product Tongyi Qianwen for invitation-only testing. As an ultra-large language model, Tongyi Qianwen can hold multi-round conversations, write emails and novels, solve simple math problems and write code.

AliCloud has never disclosed the full size of Tongyi Qianwen’s parameters, and said the open-sourced model is only a miniaturized version. It added that the move aims to simplify model training and deployment for users: instead of training models from scratch, users can build high-quality models quickly by downloading pre-trained models and fine-tuning them.
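For readers curious what that download-and-fine-tune workflow looks like in practice, the sketch below uses generic open-source tooling. It is only an illustration under assumptions not stated in the article: it supposes the open-sourced weights are published on the Hugging Face Hub under the repo ID "Qwen/Qwen-7B", and it relies on the standard `transformers`, `torch` and `accelerate` packages rather than any official AliCloud example.

```python
# Minimal sketch of "download a pre-trained model and fine-tune it" instead of
# training from scratch. Assumes (not confirmed by the article) that the
# open-sourced checkpoint lives on the Hugging Face Hub as "Qwen/Qwen-7B",
# and that torch, transformers and accelerate are installed with a large GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Qwen/Qwen-7B"  # assumed Hub repo ID for the open-sourced 7B model

# Download the pre-trained weights and tokenizer rather than training from scratch.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,
)

# Toy fine-tuning step on a single example; a real run would use a proper
# dataset, many optimisation steps, and typically a parameter-efficient
# method such as LoRA to keep memory usage manageable.
text = "Question: What can Tongyi Qianwen do?\nAnswer: Hold conversations and write code."
batch = tokenizer(text, return_tensors="pt").to(model.device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model.train()
loss = model(**batch, labels=batch["input_ids"]).loss  # standard causal-LM loss
loss.backward()
optimizer.step()
print(f"one fine-tuning step done, loss = {loss.item():.3f}")
```

This is the kind of workflow AliCloud is pointing at: the heavy lifting of pre-training has already been done, so users only pay for the comparatively cheap fine-tuning stage on their own data.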

Zhou Jingren, CTO of AliCloud Intelligence, said at the AliCloud Guangzhou Summit in June this year that the company strongly supports open-sourced models, which lower the cost of learning and enable people to achieve breakthroughs of their own.

In February, Meta, the parent company of Facebook, made its LLM LLaMA available to research institutions in four versions with seven billion, 13 billion, 33 billion and 65 billion parameters. On July 18, Meta introduced Llama 2, with seven billion, 13 billion and 70 billion parameters, free for research and commercial use.

In China, Baichuan Intelligence, a large-model startup founded by Wang Xiaochuan, the founder of Sogou, released Baichuan-7B, a seven-billion-parameter open-sourced model, in June this year, and Baichuan-13B, a 13-billion-parameter model, in July. According to Wang, Baichuan Intelligence will later release closed-source large models with tens of billions and hundreds of billions of parameters.

Open-sourced LLMs are safer because developers and researchers in the community can stress-test them to find and fix problems quickly, and Meta can further improve its own models by patching the holes they uncover, the company said in a statement on open-sourcing Llama 2.

However, Meta"s intention to catch up with OpenAI and Google is also very clear. OpenAI made its model available from its inception to the release of GPT-2 in 2019, and since then it has closed the source of its models in order to make profits, including the latest GPT-4 released in March this year. Google"s latest PaLM 2 is also a closed source model.

In May this year, a Google software engineer wrote in a post that the open-source community could pose a threat to OpenAI and Google with lower-cost, faster-evolving AI models.

An investor in the AI field also said that if OpenAI’s ChatGPT brought the "iPhone moment" for AI, he is watching for the arrival of an "Android moment". The biggest difference between the Android operating system and Apple’s iOS is that the former is open source, which has helped it secure more than 80% of the global smartphone market share.
