mergoo
mergoo is a library for merging and training multiple LLM experts. It supports merging methods such as Mixture-of-Experts (MoE) and layer-wise merging, so the merging strategy can be chosen per layer. It is compatible with base models including Llama, Mistral, Phi-3, and BERT, and runs on CPU, GPU, and Apple MPS. After merging, you can train only the routers of the MoE layers or fully fine-tune the merged LLM. Tutorials demonstrate the integration methods, and contributions from the open-source community are welcome.
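To make the merge workflow concrete, the sketch below shows the shape of a declarative merge config for composing two Mistral experts into a single MoE checkpoint. The config keys, the `ComposeExperts` class named in the comments, and the model IDs are assumptions based on mergoo's documented usage pattern, not verbatim from this description.

```python
# Hypothetical merge config for mergoo; the keys and model IDs below are
# assumptions drawn from the library's documented usage pattern.
moe_config = {
    "model_type": "mistral",             # base architecture family
    "num_experts_per_tok": 2,            # experts routed to per token
    "experts": [
        # one entry per expert checkpoint to merge
        {"expert_name": "base_expert", "model_id": "mistralai/Mistral-7B-v0.1"},
        {"expert_name": "expert_1", "model_id": "path/to/finetuned-expert"},
    ],
    # layers that receive a trainable router instead of a static merge
    "router_layers": ["gate_proj", "up_proj", "down_proj"],
}

# Sketch of how such a config would be consumed (names are assumptions):
#   from mergoo.compose_experts import ComposeExperts
#   merger = ComposeExperts(moe_config, torch_dtype=torch.float16)
#   merger.compose()                        # build the unified MoE checkpoint
#   merger.save_checkpoint("data/mistral_moe")
```

The saved checkpoint can then be loaded like an ordinary Hugging Face model, after which either just the router weights or the full merged model are fine-tuned.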