mistral-openorca-7b

MAX Model

1 version

Mistral OpenOrca is a 7 billion parameter model, fine-tuned on top of the Mistral 7B model using the OpenOrca dataset.

Run this model

  1. Install our Magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX Pipelines to run this model:

    magic global install max-pipelines
  3. Start a local endpoint for mistral-openorca/7b:

    max-pipelines serve --huggingface-repo-id Open-Orca/Mistral-7B-OpenOrca

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal and send a request using curl (a non-streaming variant is sketched after these steps):

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "mistral-openorca/7b",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'
  5. 🎉 Hooray! You're running Generative AI. Our goal is to make this as easy as possible.
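
For a quick sanity check without streaming, you can ask for a complete response and extract just the assistant's message in one step. This is a minimal sketch against the same OpenAI-compatible chat completions endpoint; it assumes the jq JSON processor is installed on your machine (if it isn't, piping to python3 -m json.tool will at least pretty-print the full response):

    # Request a complete (non-streamed) response, then print only the message text.
    curl http://0.0.0.0:8000/v1/chat/completions \
        -H "Content-Type: application/json" \
        -d '{
            "model": "mistral-openorca/7b",
            "stream": false,
            "messages": [
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": "Who won the World Series in 2020?"}
            ]
        }' | jq -r '.choices[0].message.content'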

About

Mistral OpenOrca is a cutting-edge language model with 7 billion parameters, built on top of the Mistral 7B foundation and fine-tuned using the OpenOrca dataset. At the time of its release, it was positioned as the leading model of its size, outperforming all other 7B and 13B parameter models. Performance evaluations on the HuggingFace Leaderboard further established it as the best model under 30B parameters.

More information

DETAILS

MODEL CLASS
MAX Model

MAX Models are highly optimized inference pipelines that deliver state-of-the-art performance for a given model on both CPU and GPU. For many of these models, the MAX version is the fastest implementation available.

Browse 18+ MAX Models

MODULAR GITHUB

Modular

CREATED BY

Open-Orca

MODEL

Open-Orca/Mistral-7B-OpenOrca

TAGS

arxiv:2301.13688
arxiv:2306.02707
autotrain_compatible
conversational
dataset:Open-Orca/OpenOrca
en
endpoints_compatible
license:apache-2.0
mistral
pytorch
region:us
text-generation
text-generation-inference
transformers

© Copyright - Modular Inc - 2024