# EasyLM
EasyLM is an efficient framework for pre-training, fine-tuning, evaluating, and serving large language models in JAX/Flax. It scales across multiple TPU/GPU hosts using JAX's pjit utility and integrates with Hugging Face's tools for straightforward customization. Supported models include LLaMA and its successors. Join the Discord to discuss JAX-based LLM training.
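As a rough illustration of the pjit-style sharded execution EasyLM builds on (a minimal sketch using current JAX sharding APIs, not EasyLM's actual training code): define a device mesh, place an array with a named sharding, and jit a function over it.

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Build a 1-D mesh over whatever devices are present (TPU, GPU, or CPU);
# with multiple hosts/devices the same code shards data across them.
devices = np.array(jax.devices())
mesh = Mesh(devices, axis_names=("dp",))

# Shard the input along the "dp" (data-parallel) mesh axis.
sharding = NamedSharding(mesh, P("dp"))
x = jax.device_put(jnp.arange(8.0), sharding)

@jax.jit
def loss(w, x):
    # Toy objective; jit compiles and partitions it across the mesh.
    return jnp.mean((x * w) ** 2)

print(loss(0.5, x))
```

On a single-device machine the mesh has one entry and the code still runs unchanged, which is what makes this style convenient for developing locally before scaling out.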