
DeepSeek-Coder

Multilingual Code Model for Advanced Programming Tasks

Product Description

DeepSeek-Coder, trained on 2 trillion tokens, excels at multilingual code modeling. Supporting over 80 programming languages, it performs strongly on benchmarks such as HumanEval, MBPP, and DS-1000. Available in sizes from 1.3B to 33B parameters, it fits a range of computational budgets. It is designed for effective code completion and infilling, making it suitable for both project-level applications and educational use. This open-source code language model's efficiency and scalability offer distinct advantages over CodeLlama and GPT-3.5 Turbo.
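As a minimal sketch of the code-completion use case, the snippet below loads a DeepSeek-Coder checkpoint through the Hugging Face transformers library; the model ID deepseek-ai/deepseek-coder-1.3b-base and the generation settings are assumptions, so substitute the model size and parameters that match your needs.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; swap in the 6.7B or 33B variant if preferred.
model_id = "deepseek-ai/deepseek-coder-1.3b-base"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Prompt the model to complete a partially written function.
prompt = "# Write a function that reverses a string\ndef reverse_string(s):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))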
Project Details