dolphin-mixtral-8x7b

PyTorch

2 versions

Uncensored 8x7b and 8x22b fine-tuned models, based on the Mixtral mixture-of-experts architecture, that excel at coding tasks. Created by Eric Hartford.

Run this model

  1. Install our Magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX Pipelines to run this model.

    magic global install max-pipelines
  3. Start a local endpoint for dolphin-mixtral/8x7b:

    max-pipelines serve --huggingface-repo-id cognitivecomputations/dolphin-2.7-mixtral-8x7b

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal to send a request using curl (a Python equivalent is sketched after this list):

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "dolphin-mixtral/8x7b",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'
  5. 🎉 Hooray! You’re running Generative AI. Our goal is to make this as easy as possible.
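Because the endpoint speaks the OpenAI-compatible chat completions API used by the curl request above, you can also call it from Python. Below is a minimal sketch, assuming the `openai` Python package (v1+) is installed via `pip install openai`; the placeholder API key is an assumption, since the local server in this walkthrough accepts unauthenticated requests:

    from openai import OpenAI

    # Point the client at the local MAX endpoint started in step 3.
    # The API key is a placeholder; this local server does not validate it.
    client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="EMPTY")

    # Stream a chat completion, mirroring the curl request above.
    stream = client.chat.completions.create(
        model="dolphin-mixtral/8x7b",
        stream=True,
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"},
        ],
    )

    # Print tokens as they arrive, instead of post-processing with grep/sed.
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)
    print()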

About

The Dolphin model, created by Eric Hartford, is based on the Mixtral architecture and has been enhanced with training on additional datasets, including Synthia, OpenHermes, and Pure-Dove, as well as the new Dolphin-Coder and MagiCoder datasets. This combination improves its capabilities across a range of coding and reasoning tasks.
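To make the "mixture of experts" idea concrete: a Mixtral-style layer routes each token to a small subset of expert feed-forward networks (2 of 8 in the 8x7b model) and mixes their outputs using router weights. The following is an illustrative PyTorch sketch of top-2 routing; the class name, dimensions, and layer structure are hypothetical, and this is not the actual Mixtral or Dolphin implementation:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Top2MoELayer(nn.Module):
        """Illustrative Mixtral-style sparse MoE layer: each token is
        routed to its top-2 experts out of num_experts feed-forward nets."""

        def __init__(self, dim=512, hidden=2048, num_experts=8, top_k=2):
            super().__init__()
            self.top_k = top_k
            self.router = nn.Linear(dim, num_experts, bias=False)
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
                for _ in range(num_experts)
            )

        def forward(self, x):  # x: (num_tokens, dim)
            logits = self.router(x)                         # (num_tokens, num_experts)
            weights, idx = logits.topk(self.top_k, dim=-1)  # top-2 experts per token
            weights = F.softmax(weights, dim=-1)            # renormalize over chosen experts
            out = torch.zeros_like(x)
            for k in range(self.top_k):                     # each routing slot
                for e, expert in enumerate(self.experts):
                    mask = idx[:, k] == e                   # tokens sent to expert e in slot k
                    if mask.any():
                        out[mask] += weights[mask, k, None] * expert(x[mask])
            return out

    # Route 4 tokens of width 512 through the layer.
    tokens = torch.randn(4, 512)
    print(Top2MoELayer()(tokens).shape)  # torch.Size([4, 512])

Because only two experts run per token, such a layer carries the parameter count of all eight experts but roughly the compute cost of two, which is what makes the architecture efficient at inference time.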

Sizes

The model is available in two configurations:

  • dolphin-mixtral:8x22b
  • dolphin-mixtral:8x7b

The two sizes allow deployment to be matched to available resources and desired performance. Dolphin is notably versatile, especially on tasks that require contextual understanding, making it an asset for advanced AI applications.

References

Hugging Face: https://huggingface.co/cognitivecomputations/dolphin-2.7-mixtral-8x7b

DETAILS

MODEL CLASS
PyTorch

MODULAR GITHUB

Modular

CREATED BY

cognitivecomputations

MODEL

cognitivecomputations/dolphin-2.7-mixtral-8x7b

TAGS

autotrain_compatible
conversational
dataset:LDJnr/Capybara
dataset:cognitivecomputations/dolphin
dataset:cognitivecomputations/dolphin-coder
dataset:ise-uiuc/Magicoder-Evol-Instruct-110K
dataset:ise-uiuc/Magicoder-OSS-Instruct-75K
dataset:jondurbin/airoboros-2.2.1
dataset:teknium/openhermes
en
endpoints_compatible
license:apache-2.0
mixtral
pytorch
region:us
text-generation
text-generation-inference
transformers
