YAYI2

A Refined Multilingual Large Language Model Built on the Transformer Architecture

Product Description

YAYI2, developed by Wenge Research, is a multilingual large language model pre-trained on a corpus of over 2 trillion tokens. It is aligned for general and domain-specific use through fine-tuning on millions of instructions and reinforcement learning from human feedback, bringing its behavior in line with human values. The model delivers strong language understanding, reasoning, and code generation, outperforming open-source models of similar size. A detailed technical report provides further information, and the 30B-parameter release invites the community to help advance the open-source pre-training model ecosystem.
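As a base pre-trained model, YAYI2-30B can be loaded and queried with the Hugging Face Transformers library. The sketch below is a minimal illustration, not an official usage guide: the repo id wenge-research/yayi2-30b, the trust_remote_code flag, and the generation parameters are assumptions based on the project's public release.

```python
# Minimal sketch: loading the 30B base model with Hugging Face Transformers.
# The repo id and trust_remote_code flag are assumptions; adjust as needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "wenge-research/yayi2-30b"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit the 30B weights
    device_map="auto",           # shard the model across available GPUs
    trust_remote_code=True,
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a base model completes text rather than following chat-style instructions; instruction-following behavior comes from the fine-tuned, human-feedback-aligned variants described above.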
Project Details