Stability AI has released Stable Code 3B, its newest model for AI-assisted software development. stable-code-3b is a 2.7-billion-parameter decoder-only language model pre-trained on 1.3 trillion tokens of diverse textual and code data. The model specializes in accurate, responsive code completion, rivaling models 2.5 times its size. Impressively, Stable Code 3B can run offline on common laptops without a dedicated GPU, bringing robust AI coding assistance to more developers.
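Running the model locally typically goes through Hugging Face Transformers. The sketch below shows one way to wrap it for code completion; the model id `stabilityai/stable-code-3b` comes from the Hugging Face Hub, while the generation settings and the `complete` helper are illustrative assumptions, not an official API.

```python
# Minimal local code-completion sketch using Hugging Face Transformers.
# The generation settings below are illustrative defaults, not official ones.

def complete(model, tokenizer, prompt, max_new_tokens=48):
    """Return only the model's continuation of `prompt`."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,  # greedy decoding for deterministic completions
    )
    # Strip the prompt tokens; keep only the newly generated text.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("stabilityai/stable-code-3b")
    model = AutoModelForCausalLM.from_pretrained("stabilityai/stable-code-3b")
    print(complete(model, tokenizer, "def fibonacci(n):"))
```

On a laptop without a GPU the model stays on CPU; generation is slower but works entirely offline once the weights are downloaded.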
Delivering performance comparable to Code Llama 7B while being roughly 60% smaller, Stable Code 3B represents a major efficiency gain. It builds on Stability AI's foundational 3-billion-parameter Stable LM, which was pre-trained on 4 trillion tokens of natural-language data and then further trained on software engineering data.
The streamlined model lets developers build AI-assisted applications privately, in real time, without an internet connection. Stable Code 3B supports expanded context lengths of up to 100,000 tokens, far beyond its training sequence length of 16,384 tokens. This expanded window lets the model parse and auto-complete far larger, more complex code.
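Even with the expanded window, a caller may want to bound prompts to the 16,384-token length the model was trained at. A hypothetical helper (the function name and token-id representation are assumptions for illustration) could keep only the most recent tokens, since code nearest the cursor matters most for completion:

```python
# Hypothetical helper: bound a prompt to a context window by keeping only the
# most recent tokens. Token ids are plain ints here; in practice they would
# come from the model's tokenizer.

def truncate_to_context(token_ids, max_context=16384):
    """Keep at most the last `max_context` tokens of a token-id sequence."""
    if len(token_ids) <= max_context:
        return list(token_ids)
    return list(token_ids[-max_context:])

# Usage: a 20,000-token file trimmed to the 16,384-token training window.
ids = list(range(20000))
trimmed = truncate_to_context(ids)
assert len(trimmed) == 16384
assert trimmed[-1] == 19999  # the most recent token is preserved
```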
Across 18 programming languages, Stable Code 3B demonstrates state-of-the-art proficiency for models of its size on the multi-language programming benchmark MultiPL-E. It can ingest sophisticated, multi-file projects and propose context-aware completions in languages such as Python, JavaScript, and more.
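Context-aware completion inside an existing file is typically driven by a fill-in-the-middle (FIM) prompt, where code before and after the cursor frames the gap the model should fill. The sentinel tokens below follow the common FIM convention used on the stable-code-3b model card; treat their exact spelling as an assumption and verify against the tokenizer's special tokens before relying on them.

```python
# Fill-in-the-middle (FIM) prompt sketch. The sentinel tokens <fim_prefix>,
# <fim_suffix>, and <fim_middle> are an assumption based on the common FIM
# convention; check the model's tokenizer for the exact special tokens.

def build_fim_prompt(prefix, suffix):
    """Wrap the code before/after the cursor so the model fills the gap."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = build_fim_prompt(
    "def add(a, b):\n    return ",      # code before the cursor
    "\n\nprint(add(1, 2))",             # code after the cursor
)
assert prompt.startswith("<fim_prefix>def add")
assert prompt.endswith("<fim_middle>")
```

The model then generates the missing middle, which the editor splices back between the prefix and suffix.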
Stable Code 3B is available for both non-commercial and commercial use with a Stability AI Membership, alongside the company's other core models such as SDXL Turbo and Stable Video Diffusion.