stablelm2-1.6b

PyTorch

2 versions

Stable LM 2 is a state-of-the-art language model, available in 1.6B and 12B parameter variants, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.

Run this model

  1. Install our Magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX pipelines to run this model:

    magic global install max-pipelines

  3. Start a local endpoint for stablelm2/1.6b:

    max-serve serve --huggingface-repo-id stabilityai/stablelm-2-1_6b

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)

  4. Now open another terminal to send a request using curl:

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "stablelm2/1.6b",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'

  5. 🎉 Hooray! You’re running Generative AI. Our goal is to make this as easy as possible.
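The shell pipeline in step 4 scrapes `content` fields out of the streaming response with grep and sed. As a sketch, the same parsing can be done more robustly in Python, assuming the endpoint emits standard OpenAI-style server-sent events (`data: {...}` chunks terminated by `data: [DONE]`):

```python
import json

def extract_content(sse_stream: str) -> str:
    """Collect the streamed "content" deltas from an OpenAI-style SSE body."""
    parts = []
    for line in sse_stream.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)
```

Feeding this function the raw body returned by the curl command above yields the assistant's full reply as one string, without the quoting pitfalls of chained sed expressions.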

About

Stable LM 2 is a cutting-edge language model available in 1.6 billion and 12 billion parameter variants; the 1.6B model served here is designed for high performance while maintaining a relatively small size. The model is trained on multilingual data covering English, Spanish, German, Italian, French, Portuguese, and Dutch, making it versatile across multiple languages.

The training process leverages a combination of publicly available datasets and synthetic datasets. This blend ensures a broad knowledge base while incorporating diverse and high-quality data. Furthermore, Stable LM 2 employs Direct Preference Optimization (DPO) to align the model with human preferences, improving the quality and helpfulness of its responses.
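To make the DPO step concrete, here is a minimal, hypothetical sketch of the per-pair DPO loss. The function name and scalar inputs are illustrative only; a real training loop computes this over batched sequence log-probabilities from the policy and a frozen reference model:

```python
import math

def dpo_loss(logp_chosen, logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """DPO loss for a single preference pair.

    logp_* are total log-probabilities of the chosen/rejected responses
    under the policy being trained; ref_logp_* are the same quantities
    under the frozen reference model. beta scales the implicit reward.
    """
    margin = beta * ((logp_chosen - ref_logp_chosen)
                     - (logp_rejected - ref_logp_rejected))
    # -log(sigmoid(margin)) == log(1 + exp(-margin)), the stable softplus form
    return math.log1p(math.exp(-margin))
```

Minimizing this loss pushes the policy to assign relatively more probability to preferred responses than the reference model does, without training a separate reward model.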

With these innovations, Stable LM 2 exemplifies the evolution of smaller-scale language models that maintain state-of-the-art capabilities. It responds to the growing demand for efficient, multilingual AI solutions that prioritize accuracy and real-world applicability.

References

Announcement

HuggingFace

DETAILS

MODEL CLASS
PyTorch

MODULAR GITHUB

Modular

CREATED BY

stabilityai

MODEL

stabilityai/stablelm-2-1_6b

TAGS

arxiv:1607.06450
arxiv:1910.02054
arxiv:1910.07467
arxiv:2101.00027
arxiv:2104.09864
arxiv:2204.06745
arxiv:2206.11147
arxiv:2305.06161
arxiv:2305.14201
arxiv:2307.09288
arxiv:2309.09400
arxiv:2309.16609
arxiv:2402.17834
autotrain_compatible
causal-lm
dataset:CarperAI/pilev2-dev
dataset:DataProvenanceInitiative/Commercially-Verified-Licenses
dataset:bigcode/starcoderdata
dataset:tiiuae/falcon-refinedweb
dataset:togethercomputer/RedPajama-Data-1T
dataset:uonlp/CulturaX
de
en
endpoints_compatible
es
fr
it
license:other
nl
pt
region:us
safetensors
stablelm
text-generation
transformers

© 2024 Modular Inc.