DeepSeek-Coder-V2

Comprehensive Open-Source MoE Code Model with Broad Programming Language Coverage

Product Description

DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code model that achieves performance on code and math tasks comparable to closed-source alternatives. Further pre-trained on an additional 6 trillion tokens, it supports 338 programming languages and a 128K-token context length. The model is released in 16B and 236B total-parameter variants and can be used via direct download, API platforms, or local deployment for a wide range of coding tasks.
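
As a minimal sketch of the local-deployment option mentioned above (assuming the Hugging Face transformers library and the publicly listed deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct checkpoint, the 16B variant; exact flags may vary by transformers version):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; the 16B "Lite" instruct variant is the most
# practical for a single-GPU local setup.
model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision to reduce memory footprint
    device_map="auto",           # place layers on available GPU(s)/CPU
    trust_remote_code=True,
)

# Build a chat-formatted prompt for a simple coding task.
messages = [{"role": "user", "content": "Write a quicksort function in Python."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
```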
Project Details