
JetMoE

JetMoE-8B: A Budget-Friendly Model Surpassing LLaMA2

Product Description

JetMoE-8B is an open-source mixture-of-experts model that outperforms Meta AI's LLaMA2-7B while costing less than $0.1 million to train. Built with only public datasets and modest compute, it is accessible to academic labs, and its 2.2B active parameters (out of 8B total) keep inference costs low. JetMoE-8B scores higher on the Open LLM Leaderboard and MT-Bench, showing that capable large language models can be trained without massive budgets. Technical details and access options are available via MyShell.ai and the associated resources.
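The low inference cost comes from sparse activation: a router sends each token to only a few experts, so only a fraction of the total parameters run per token. The sketch below is a generic, illustrative top-k mixture-of-experts layer, not JetMoE's actual implementation; the class name, sizes, and expert count are made up for demonstration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoELayer(nn.Module):
    """Illustrative sparse MoE layer (hypothetical, not JetMoE's code):
    only the top-k experts chosen by the router run for each token, so the
    active parameter count per token is a fraction of the total."""
    def __init__(self, d_model=64, d_hidden=128, n_experts=8, k=2):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        self.k = k

    def forward(self, x):                        # x: (tokens, d_model)
        scores = self.router(x)                  # (tokens, n_experts)
        weights, idx = torch.topk(scores, self.k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # mix only the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e         # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

x = torch.randn(4, 64)           # 4 example tokens
print(TinyMoELayer()(x).shape)   # torch.Size([4, 64])
```

With k=2 of 8 experts active, each token touches roughly a quarter of the expert parameters, which is the same principle behind JetMoE-8B's 2.2B-active / 8B-total split.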
Project Details