Yi-Coder
Overview
Yi-Coder is a series of open-source large language models (LLMs) that deliver state-of-the-art coding performance with fewer than 10 billion parameters. It comes in two sizes, 1.5B and 9B parameters, each available as a base and a chat version designed for efficient inference and flexible training. Yi-Coder-9B was additionally trained on 2.4 trillion high-quality tokens drawn from a repository-level code corpus on GitHub and from code-related data filtered out of CommonCrawl. Yi-Coder excels at a wide range of programming tasks, including basic and competitive programming, code editing, repository-level completion, long-context understanding, and mathematical reasoning.
Target Users
Yi-Coder is suitable for software developers, competitive programmers, and AI researchers. It enhances coding efficiency, assists in tackling complex programming challenges, and provides a research foundation for AI applications in the programming domain.
Total Visits: 1.5K
Website Views: 67.3K
Use Cases
Developers use Yi-Coder for automatic code completion, enhancing development efficiency
Competitive programmers utilize Yi-Coder to solve algorithmic problems, achieving higher rankings
AI researchers employ Yi-Coder for studies in code generation and inference capabilities
Features
Pre-trained on high-quality tokens covering 52 major programming languages
Long-context modeling: a maximum context window of 128K tokens enables project-level code understanding and generation (see the sketch after this list)
Exceptional performance among models with fewer than 10 billion parameters, matching the performance of much larger models
Achieved a 23.4% pass rate on LiveCodeBench with Yi-Coder-9B-Chat, surpassing other models of comparable size
Demonstrated outstanding performance in code modification tasks on CodeEditorBench with Yi-Coder-9B
Excelled in cross-file code completion in CrossCodeEval
Showcased long-context modeling capabilities in the 'Needle in the code' task
Excelled in mathematical reasoning in program-aided math reasoning (PAL) evaluations with Yi-Coder-9B
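The long-context and cross-file completion features above can be exercised by packing several source files into a single prompt for the base model. The following is a minimal sketch, not an official Yi-Coder prompt format: the Hugging Face model ID 01-ai/Yi-Coder-9B and the path-header concatenation scheme are assumptions for illustration.

```python
# Minimal sketch: repository-level (cross-file) completion with the Yi-Coder base model.
# Assumptions: the model is published as "01-ai/Yi-Coder-9B" on Hugging Face, and
# concatenating files with path headers is an adequate way to supply project context.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-Coder-9B"  # assumed model ID; check the Yi-Coder README
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
).eval()

# Pack related files into one prompt so the 128K context window can see the whole project.
project_files = {
    "utils/math_ops.py": "def add(a, b):\n    return a + b\n",
    "main.py": "from utils.math_ops import add\n\ndef total(values):\n",
}
prompt = "\n".join(f"# File: {path}\n{content}" for path, content in project_files.items())

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
# Print only the newly generated continuation of main.py.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```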
How to Use
Visit the Yi-Coder GitHub page for detailed model information and usage guidelines
Download and install necessary software dependencies, such as the Python environment and the Transformers library
Set up the model training or inference environment as per the Yi-Coder README documentation
Use the Yi-Coder API for code generation or for editing existing code (a minimal loading-and-generation sketch follows this list)
Integrate Yi-Coder into projects to leverage its long-context understanding and code generation capabilities and streamline development
Participate in Yi-Coder community discussions to gain technical support and share best practices
Contact the Yi-Coder development team via Discord or email for deeper assistance and discussions
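As referenced in the steps above, a typical way to run the chat model locally is through the Transformers library. This is a minimal sketch that assumes the chat variant is available on Hugging Face as 01-ai/Yi-Coder-9B-Chat; consult the Yi-Coder README for exact model IDs and recommended generation settings.

```python
# Minimal sketch: code generation with the Yi-Coder chat model via Transformers.
# The model ID "01-ai/Yi-Coder-9B-Chat" is assumed; see the Yi-Coder README for exact names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-Coder-9B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
).eval()

# Build a chat-formatted prompt and generate a code snippet.
messages = [
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256, do_sample=False)
# Decode only the tokens generated after the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[1]:], skip_special_tokens=True))
```

The 1.5B variant can be substituted for the 9B model on lighter hardware; the same loading and generation pattern applies.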