InternEvo
InternEvo is an open-source, lightweight framework for model pre-training with minimal dependencies. It supports both large-scale GPU cluster training and single-GPU fine-tuning, achieving nearly 90% acceleration efficiency when training across 1024 GPUs. The project regularly releases advanced large language models, including the InternLM series, which outperform many notable open-source LLMs.

Installation is straightforward, with optional support for torch, torch-scatter, and flash-attention to accelerate training. Comprehensive tutorials and tooling support efficient model development, and community contributions are welcome.
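Since the acceleration packages above are optional, it can help to check which are installed before enabling the corresponding features. A minimal sketch (the module names `torch`, `torch_scatter`, and `flash_attn` are assumed to match the PyPI packages mentioned above):

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# Report which optional acceleration dependencies are present.
for mod in ("torch", "torch_scatter", "flash_attn"):
    status = "found" if has_module(mod) else "missing"
    print(f"{mod}: {status}")
```

Running this before training gives a quick view of which acceleration paths are usable in the current environment.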