Stable Code 3B
Overview:
Stable Code 3B is a decoder-only language model with 2.7 billion parameters, pre-trained on 130 billion tokens of diverse text and code. It covers 18 programming languages and achieves state-of-the-art results among models of comparable size across multiple languages in BigCode's evaluation. The model supports long contexts, having been trained with a sequence length of 16,384 tokens, and incorporates the Fill-in-the-Middle (FIM) technique. Built by Stability AI on the GPT-NeoX library, it handles both English and programming languages, and users can get started with text generation via the code snippets on the model's Hugging Face page.
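The usage described above can be sketched with the Hugging Face `transformers` library. This is a minimal, hedged example, not the official snippet: it assumes the `transformers` and `torch` packages are installed and that the published checkpoint id is `stabilityai/stable-code-3b` (check the model card for the exact id and any loading flags).

```python
# Sketch: generating code with Stable Code 3B via Hugging Face transformers.
# Assumptions: `transformers` and `torch` are installed, and the checkpoint
# id "stabilityai/stable-code-3b" matches the model card.

def generate_code(prompt: str, max_new_tokens: int = 64) -> str:
    """Return a completion for `prompt`.

    The import is deferred so this module can be loaded without the
    heavy dependencies; the first call downloads the ~3B-parameter weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("stabilityai/stable-code-3b")
    model = AutoModelForCausalLM.from_pretrained("stabilityai/stable-code-3b")

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

A typical call would be `generate_code("def fibonacci(n):")`, which returns the prompt followed by the model's continuation.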
Target Users:
Stable Code 3B serves both as a text-generation model and as a foundation model for fine-tuning on downstream applications.
Total Visits: 29.7M
Top Region: US (17.94%)
Website Views: 127.0K
Use Cases
Example of using Stable Code 3B to generate Python code
Example of using Stable Code 3B to generate JavaScript code
Example of using Stable Code 3B for text generation
Features
Fill-in-the-Middle (FIM) functionality
Supports long contexts, trained with a sequence length of 16,384
Text generation
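The FIM feature listed above lets the model complete code between a given prefix and suffix rather than only continuing from the left. A minimal sketch of building such a prompt follows; it assumes the StarCoder-style special tokens `<fim_prefix>`, `<fim_suffix>`, and `<fim_middle>` (consult the model card for the exact tokens Stable Code 3B uses).

```python
# Sketch: constructing a Fill-in-the-Middle (FIM) prompt.
# Assumption: the model uses the common <fim_prefix>/<fim_suffix>/<fim_middle>
# token convention; verify against the tokenizer's special tokens.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange prefix and suffix so the model generates the middle span."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Ask the model to fill in the body of a function: everything before the
# gap goes in `prefix`, everything after it goes in `suffix`.
prompt = build_fim_prompt(
    prefix="def add(a, b):\n    return ",
    suffix="\n\nprint(add(1, 2))\n",
)
```

The resulting string is passed to the tokenizer like any other prompt; the text the model generates after `<fim_middle>` is the inferred middle section.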
AIbase
Empowering the Future, Your AI Solution Knowledge Base
© 2025 AIbase