Linly
This project improves Chinese language modeling by extending the LLaMA and Falcon foundation models with large-scale Chinese-English bilingual data. It introduces Linly-ChatFlow, a conversational model fine-tuned on instruction data, and open-sources the Linly-OpenLLaMA models (3B, 7B, and 13B) for a wide range of applications. The project supports full-parameter training as well as several CUDA-based deployment options.
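As a minimal sketch of what inference with one of these models might look like, the snippet below loads a chat checkpoint through the Hugging Face `transformers` API and generates a reply on GPU. The repository id, prompt template, and generation settings are assumptions for illustration, not the project's official usage; consult the released weights and documentation for the exact names and dialogue format.

```python
# Hypothetical inference sketch; the model id and prompt format are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Linly-AI/ChatFlow-7B"  # assumed hub id; replace with the actual checkpoint path

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision for GPU inference
    device_map="auto",          # place layers on available CUDA devices
)

# Example dialogue prompt (the real template may differ).
prompt = "用户：介绍一下深圳的旅游景点。\n机器人："
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Running this on a single GPU with `device_map="auto"` lets `transformers` handle placement; for CPU-only or quantized deployment, the loading arguments would need to change accordingly.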