deepseek-coder-1.3b

MAX Model

3 versions

DeepSeek Coder is a capable coding model trained on two trillion tokens of code and natural language.

Run this model

  1. Install our magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX Pipelines to run this model:

    magic global install max-pipelines
  3. Start a local endpoint for deepseek-coder/1.3b:

    max-pipelines serve --huggingface-repo-id deepseek-ai/deepseek-coder-1.3b-base

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal to send a request using curl:

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "deepseek-coder/1.3b",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'
  5. 🎉 Hooray! You're running Generative AI. Our goal is to make this as easy as possible.
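The grep/sed pipeline in step 4 strips the streamed chunks down to plain text. The same extraction can be sketched in Python by parsing each server-sent `data:` line of the stream (a minimal sketch; the sample chunk below is illustrative, not captured from a real run):

```python
import json

def extract_content(sse_line: str) -> str:
    """Pull the partial text out of one server-sent-events line
    from an OpenAI-compatible /v1/chat/completions stream."""
    line = sse_line.strip()
    if not line.startswith("data:"):
        return ""
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":  # end-of-stream sentinel
        return ""
    chunk = json.loads(payload)
    # Each streamed chunk carries its text in choices[0].delta.content
    return chunk["choices"][0]["delta"].get("content", "")

# Illustrative chunk shaped like an OpenAI-compatible streaming response
sample = 'data: {"choices": [{"delta": {"content": "Hello"}}]}'
print(extract_content(sample))  # -> Hello
```

Parsing the JSON directly is more robust than the grep/sed approach, which can mangle content that itself contains quotes.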

About

DeepSeek Coder is an AI model trained from scratch with a dataset comprising 87% code and 13% natural language in English and Chinese. It has been pre-trained on 2 trillion tokens, enabling it to understand and generate both programming and natural language text across diverse contexts.

Models Available

  • 1.3 billion parameter model
  • 6.7 billion parameter model
  • 33 billion parameter model
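Each size is served the same way; only the Hugging Face repo id changes. A small helper can build the serve command per size (a sketch assuming the `max-pipelines serve` entry point from step 2; only the 1.3B repo id appears on this page, the 6.7B and 33B ids are assumed from DeepSeek's naming scheme):

```python
# Map model size to its Hugging Face repo id.
# Only the 1.3B id is listed on this page; the others are assumed.
REPOS = {
    "1.3b": "deepseek-ai/deepseek-coder-1.3b-base",
    "6.7b": "deepseek-ai/deepseek-coder-6.7b-base",  # assumed
    "33b": "deepseek-ai/deepseek-coder-33b-base",    # assumed
}

def serve_command(size: str) -> str:
    """Return the shell command that serves the given model size."""
    return f"max-pipelines serve --huggingface-repo-id {REPOS[size]}"

print(serve_command("1.3b"))
```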

Example API Usage

Generate a response using the API with a simple curl command:

curl -N http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
    "model": "deepseek-coder/1.3b",
    "stream": true,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Please complete the following Python function: `def fibonacci(n):`"}
    ]
}'
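The same request can be sent from Python using only the standard library. The sketch below mirrors the JSON body above in a non-streaming variant; `ENDPOINT`, `build_request`, and `complete` are illustrative names, and the final call assumes the endpoint from step 3 is live:

```python
import json
import urllib.request

ENDPOINT = "http://localhost:8000/v1/chat/completions"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for an OpenAI-compatible chat completion."""
    body = {
        "model": model,
        "stream": False,  # non-streaming: one JSON reply instead of SSE
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }
    return json.dumps(body).encode("utf-8")

def complete(model: str, prompt: str) -> str:
    """POST the request and return the assistant's reply text."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    return reply["choices"][0]["message"]["content"]

# With the endpoint from step 3 running, you would call:
# print(complete("deepseek-coder/1.3b",
#                "Please complete the following Python function: `def fibonacci(n):`"))
```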

References

HuggingFace

DETAILS

MODEL CLASS
MAX Model

MAX Models are highly optimized inference pipelines that deliver state-of-the-art performance on both CPU and GPU. Many of them are the fastest available versions of their models.

Browse 18+ MAX Models

MODULAR GITHUB

Modular

CREATED BY

deepseek-ai

MODEL

deepseek-ai/deepseek-coder-1.3b-base

TAGS

autotrain_compatible
code
endpoints_compatible
license:other
llama
pytorch
region:us
text-generation
text-generation-inference
transformers

© Copyright - Modular Inc - 2024