stable-code-3b

PyTorch

1 version

Stable Code 3B is a coding model with instruct and code-completion variants that performs on par with models such as Code Llama 7B, which is 2.5x larger.

Run this model

  1. Install Magic, our package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX pipelines to run this model:

    magic global install max-pipelines
  3. Start a local endpoint for stable-code/3b:

    max-pipelines serve --huggingface-repo-id stabilityai/stable-code-3b

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal to send a request using curl:

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "stable-code/3b",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'
  5. 🎉 Hooray! You’re running Generative AI. Our goal is to make this as easy as possible.
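The curl request in step 4 can also be sent from Python. The sketch below uses only the standard library and assumes the endpoint from step 3 is running at http://0.0.0.0:8000; it sends the same OpenAI-compatible payload, but non-streaming to keep the client simple:

```python
import json
from urllib import request

# Same OpenAI-compatible payload as the curl example above,
# with streaming disabled so the response is a single JSON body.
payload = {
    "model": "stable-code/3b",
    "stream": False,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the World Series in 2020?"},
    ],
}

def chat(url="http://0.0.0.0:8000/v1/chat/completions"):
    """POST the payload to the local endpoint and return the reply text."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard OpenAI-style response shape: first choice, message content.
    return body["choices"][0]["message"]["content"]
```

Calling `chat()` while the server from step 3 is running returns the assistant's reply as a string.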

About

Stable Code 3B is a 3 billion parameter Large Language Model (LLM) developed by Stability AI that delivers accurate and responsive code completion on par with larger models like Code Llama 7B. It is designed with a decoder-only transformer architecture and robust Fill-in-the-Middle (FIM) capabilities, supporting sequence lengths up to 16,384 tokens.
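Fill-in-the-Middle prompts ask the model to complete the gap between a known prefix and suffix. A minimal sketch of assembling such a prompt is below; the sentinel token names follow the StarCoder-style format shown on the upstream stabilityai/stable-code-3b model card, and the exact endpoint to send it to is an assumption (FIM is a raw completion task, so a completions-style endpoint rather than chat is the natural fit):

```python
def fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a Fill-in-the-Middle prompt; the model generates the
    missing middle after the <fim_middle> sentinel."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

# Example: ask the model to fill in the body of a function.
prompt = fim_prompt(
    prefix="def fib(n):\n",
    suffix="\n    return a\n",
)
```

The resulting string is sent as an ordinary completion prompt; the model's output is the code that belongs between the prefix and the suffix.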


Model             Size   Python  C++     JavaScript  Java    PHP     Rust
Stable Code       3B     32.4%   30.9%   32.1%       32.1%   24.2%   23.0%
CodeLlama         7B     30.0%   28.2%   32.5%       31.1%   25.7%   26.3%
DeepSeek Coder    1.3B   28.6%   29.2%   28.7%       29.0%   23.6%   18.5%
WizardCoder       3B     31.6%   25.6%   26.2%       25.8%   25.3%   20.4%
StarCoder         3B     21.6%   19.8%   21.5%       20.5%   19.0%   16.9%
Replit Code V1.5  3B     23.0%   25.9%   26.2%       23.6%   23.2%   21.5%
DeciCoder         1B     19.1%   6.8%    18.4%       16.7%   2.1%    1.7%

The model is trained on a diverse mix of code and text spanning 18 programming languages, including Python, C++, JavaScript, Java, and Rust. Notable training datasets include Falcon RefinedWeb, CommitPackFT, and StarCoderData. Positioned as a foundational model, Stable Code 3B is well suited to fine-tuning for specific applications, but it may exhibit limitations or biases inherited from its training data, so users should rigorously evaluate the model before deployment.

DETAILS

MODEL CLASS
PyTorch

MODULAR GITHUB

Modular

CREATED BY

stabilityai

MODEL

stabilityai/stable-code-3b

TAGS

arxiv:1910.02054
arxiv:2104.09864
arxiv:2204.06745
arxiv:2305.06161
arxiv:2307.09288
arxiv:2309.12284
arxiv:2310.10631
autotrain_compatible
causal-lm
code
dataset:EleutherAI/proof-pile-2
dataset:bigcode/commitpackft
dataset:bigcode/starcoderdata
dataset:bigcode/the-stack-github-issues
dataset:meta-math/MetaMathQA
dataset:tiiuae/falcon-refinedweb
en
endpoints_compatible
gguf
license:other
model-index
region:us
safetensors
stablelm
text-generation
transformers

© 2024 Modular Inc.