sqlcoder-7b

MAX Model

2 versions

SQLCoder is a code completion model fine-tuned on StarCoder for SQL generation tasks

Run this model

  1. Install our magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX Pipelines to run this model:

    magic global install max-pipelines
  3. Start a local endpoint for sqlcoder/7b:

    max-pipelines serve --huggingface-repo-id defog/sqlcoder-70b-alpha

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal to send a request using curl (a non-streaming variant is sketched after these steps):

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "defog/sqlcoder-70b-alpha",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'
  5. 🎉 Hooray! You’re running Generative AI. Our goal is to make this as easy as possible.
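
The request in step 4 streams tokens and uses grep and sed to pull the text out of each chunk. If you prefer a single JSON response, omit the "stream" flag. The sketch below assumes the same endpoint and model name as step 3; the question is only a placeholder:

    # Non-streaming variant: without "stream": true the server returns one JSON body
    # instead of a server-sent event stream.
    curl http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "defog/sqlcoder-70b-alpha",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }'

Assuming the endpoint follows the OpenAI chat completions response format, the generated text appears under choices[0].message.content in the returned JSON.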

About

SQLCoder is a 15B parameter AI model fine-tuned on the StarCoder base model. It excels at natural language to SQL generation tasks, outperforming GPT-3.5-turbo on the sql-eval framework and exceeding the performance of other popular open-source models. Remarkably, SQLCoder outperforms text-davinci-003, a model over ten times its size.

This 15B parameter completion model is optimized for efficiency, requiring a minimum of 16GB of RAM. Users interact with SQLCoder by supplying a natural-language question alongside a Postgres database schema. The model’s outputs adhere to rigorous standards, including a detailed review of the schema, use of table aliases to prevent ambiguity, and explicit casting for calculations such as ratios.

SQLCoder is capable of generating highly complex SQL queries that can join multiple tables, perform aggregations, and filter data based on user-defined conditions. Its robust reasoning ensures accuracy, making it a powerful tool for developers, data analysts, and other professionals working with relational databases.
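
To make that concrete, here is a sketch of a request that follows the schema-plus-question pattern described above, reusing the endpoint and model name from the steps at the top of this page. The system prompt, schema, and question are illustrative assumptions, not part of the model card; substitute your own database schema.

    # Illustrative request: the schema and question below are placeholders.
    curl http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "defog/sqlcoder-70b-alpha",
        "messages": [
            {"role": "system", "content": "Generate a single PostgreSQL query that answers the question. Use table aliases and cast operands explicitly when computing ratios."},
            {"role": "user", "content": "Schema:\nCREATE TABLE customers (id int, name text, region text);\nCREATE TABLE orders (id int, customer_id int, total numeric, created_at date);\n\nQuestion: What was the total order value per region in 2023, from highest to lowest?"}
        ]
    }'

A response in line with the description above would be a single SELECT that joins customers to orders on customer_id, filters created_at to 2023, groups by region, and orders by the summed totals in descending order.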

The model is released under the CC BY-SA 4.0 license.

References

Hugging Face

DETAILS

MODEL CLASS
MAX Model

MAX Models are highly optimized inference pipelines that deliver state-of-the-art performance on both CPU and GPU. For many of these models, the MAX version is the fastest implementation in the world.

Browse 18+ MAX Models

MODULAR GITHUB

Modular

CREATED BY

defog

MODEL

defog/sqlcoder-70b-alpha

TAGS

autotrain_compatible
endpoints_compatible
license:cc-by-sa-4.0
llama
region:us
safetensors
text-generation
text-generation-inference
transformers

© Copyright Modular Inc. 2024