phi3.5-3.8b

PyTorch

1 version

A lightweight AI model with 3.8 billion parameters whose performance rivals, and often exceeds, that of similarly sized and larger models.

Run this model

  1. Install our magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX Pipelines to run this model:

    magic global install max-pipelines
  3. Start a local endpoint for phi3.5/3.8b:

    max-pipelines serve --huggingface-repo-id microsoft/Phi-3.5-mini-instruct

    The endpoint is ready when you see the URL printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal to send a request using curl:

    curl -N http://0.0.0.0:8000/v1/chat/completions -H "Content-Type: application/json" -d '{
        "model": "phi3.5/3.8b",
        "stream": true,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the World Series in 2020?"}
        ]
    }' | grep -o '"content":"[^"]*"' | sed 's/"content":"//g' | sed 's/"//g' | tr -d '\n' | sed 's/\\n/\n/g'
  5. 🎉 Hooray! You’re running Generative AI. Our goal is to make this as easy as possible.
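When `"stream": true` is set, the endpoint returns OpenAI-style server-sent events: each `data:` line carries a JSON chunk whose `content` delta is one piece of the reply, ending with `data: [DONE]`. As a sketch of how you might parse that stream in Python instead of the shell pipeline above (the field names here are assumptions based on the OpenAI chat-completions streaming format):

```python
import json

def extract_content(sse_line: str) -> str:
    """Pull the content delta out of one server-sent-event line.

    Expects OpenAI-style streaming chunks such as:
      data: {"choices": [{"delta": {"content": "Hello"}}]}
    Returns "" for blank lines, the final "data: [DONE]" marker,
    or chunks that carry no content delta.
    """
    line = sse_line.strip()
    if not line.startswith("data:"):
        return ""
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return ""
    chunk = json.loads(payload)
    delta = chunk["choices"][0].get("delta", {})
    return delta.get("content", "")

# Example: reassemble a reply from a few streamed chunks.
stream = [
    'data: {"choices": [{"delta": {"content": "The Dodgers"}}]}',
    'data: {"choices": [{"delta": {"content": " won in 2020."}}]}',
    "data: [DONE]",
]
reply = "".join(extract_content(line) for line in stream)
print(reply)  # The Dodgers won in 2020.
```

This handles the `[DONE]` sentinel and contentless chunks explicitly, which the `grep`/`sed` pipeline silently skips.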

About

Phi-3.5-mini is a compact, cutting-edge open AI model built on datasets from Phi-3, which includes synthetic data and high-quality, reasoning-rich public web content. It emphasizes precision and safety through technologies like supervised fine-tuning, proximal policy optimization, and direct preference optimization.

Part of the Phi-3 family, this model supports an extensive 128K token context length, enabling it to tackle tasks such as long document summarization, question answering, and information retrieval with impressive efficiency. Its advanced design ensures both strong instruction adherence and robust performance across various scenarios.

Long Context

With its 128K context length, Phi-3.5-mini excels at handling complex and extended content, making it suitable for applications involving long-form content processing.
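Even a 128K-token window has limits, so long-document workflows typically check whether the input fits before sending a request. A minimal sketch using the rough heuristic of about 4 characters per token (that ratio and the reserved output budget are assumptions; the model's own tokenizer would give an exact count):

```python
def fits_context(text: str, max_tokens: int = 128_000,
                 chars_per_token: float = 4.0,
                 reserved_for_output: int = 1_024) -> bool:
    """Estimate whether `text` fits in the model's context window,
    leaving headroom for the generated reply."""
    est_tokens = len(text) / chars_per_token
    return est_tokens <= max_tokens - reserved_for_output

# A short prompt easily fits; ~1M characters of text would not.
print(fits_context("Summarize this report."))   # True
print(fits_context("x" * 1_000_000))            # False
```

Documents that fail this check are usually split into chunks and summarized hierarchically.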

References

Hugging Face

DETAILS

MODEL CLASS
PyTorch

MODULAR GITHUB

Modular

CREATED BY

microsoft

MODEL

microsoft/Phi-3.5-mini-instruct

TAGS

arxiv:2403.06412
arxiv:2404.14219
arxiv:2407.13833
autotrain_compatible
code
conversational
custom_code
endpoints_compatible
license:mit
multilingual
nlp
phi3
region:us
safetensors
text-generation
text-generation-inference
transformers

© Copyright 2024, Modular Inc.