
Tongyi Qianwen Qwen1.5
Qwen1.5 is a large-scale language model series launched by Alibaba, offered in multiple parameter scales from 0.5B to 72B. It supports multilingual processing and long-text comprehension, and performs strongly on several benchmark tests.
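As a rough illustration of how such a chat model is typically prompted, the sketch below builds a ChatML-style prompt by hand. This assumes the conventional `<|im_start|>`/`<|im_end|>` role markers; in practice one would normally let a tokenizer's chat template do this, so treat the helper name and format details as illustrative rather than the model's official API.

```python
def build_chatml_prompt(messages):
    """Assemble a ChatML-style chat prompt from role/content message dicts.

    `messages` is a list of {"role": ..., "content": ...} entries; roles are
    typically "system", "user", and "assistant".
    """
    parts = []
    for msg in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Leave the final assistant turn open so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Briefly introduce large language models."},
])
print(prompt)
```

In a real setting the returned string would be tokenized and fed to the model for generation; the open trailing `assistant` turn is what cues the model to respond.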
I. Basic information
II. Technical details
III. Participants and organization
IV. Significance and impact
V. Version updates
VI. Application prospects
