mxbai-embed-large-335m

PyTorch

1 version

State-of-the-art large embedding model from mixedbread.ai

Run this model

  1. Install our magic package manager:

    curl -ssL https://magic.modular.com/ | bash

    Then run the source command that's printed in your terminal.

  2. Install MAX Pipelines to run this model:

    magic global install max-pipelines

  3. Start a local endpoint for mxbai-embed-large/335m:

    max-pipelines serve --huggingface-repo-id mixedbread-ai/mxbai-embed-large-v1

    The endpoint is ready when you see the URI printed in your terminal:

    Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
  4. Now open another terminal to send an embedding request using curl:

    curl http://0.0.0.0:8000/v1/embeddings \
        -H "Content-Type: application/json" \
        -d '{
            "model": "mixedbread-ai/mxbai-embed-large-v1",
            "input": "Who won the World Series in 2020?"
        }'
  5. 🎉 Hooray! You’re serving embeddings with MAX. Our goal is to make this as easy as possible.
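The curl step above can also be done from Python. A minimal sketch of building the request body and parsing the response, assuming the endpoint follows the OpenAI-compatible embeddings response format (`data[i].embedding`); the sample response below is illustrative, not real model output:

```python
import json

# Build the JSON body for an OpenAI-compatible /v1/embeddings request.
# The model name matches the Hugging Face repo served in step 3.
def build_embedding_request(text: str) -> str:
    return json.dumps({
        "model": "mixedbread-ai/mxbai-embed-large-v1",
        "input": text,
    })

# Extract the embedding vector for the first input from a response body.
def parse_embedding_response(body: str) -> list[float]:
    data = json.loads(body)
    return data["data"][0]["embedding"]

# POST build_embedding_request(...) to http://0.0.0.0:8000/v1/embeddings
# with Content-Type: application/json, then parse the body:
sample = '{"object": "list", "data": [{"object": "embedding", "index": 0, "embedding": [0.1, -0.2, 0.3]}]}'
print(parse_embedding_response(sample))  # -> [0.1, -0.2, 0.3]
```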

About

mxbai-embed-large

As of March 2024, this model achieves state-of-the-art (SOTA) performance for models of BERT-large size on the MTEB benchmark. It surpasses commercial models like OpenAI’s text-embedding-3-large and demonstrates performance comparable to models 20x its size.

mxbai-embed-large was trained without any overlap with MTEB data, demonstrating strong generalization across various domains, tasks, and text lengths.
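In retrieval tasks, the vectors this model produces are typically compared with cosine similarity. A minimal sketch of that scoring step; the vectors here are toy stand-ins for the model's real 1024-dimensional embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # cosine(a, b) = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embeddings of a query and two documents.
query = [0.2, 0.8, 0.1]
doc_a = [0.25, 0.75, 0.05]  # points in nearly the same direction as the query
doc_b = [-0.9, 0.1, 0.4]    # points in a very different direction

# The more relevant document scores higher.
print(cosine_similarity(query, doc_a) > cosine_similarity(query, doc_b))  # True
```

In a real pipeline, `query` and the documents would each be embedded via the endpoint above and ranked by this score.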

References

Blog post
Hugging Face

DETAILS

MODEL CLASS
PyTorch

MODULAR GITHUB

Modular

CREATED BY

mixedbread-ai

MODEL

mixedbread-ai/mxbai-embed-large-v1

TAGS

arxiv:2309.12871
autotrain_compatible
bert
en
endpoints_compatible
feature-extraction
gguf
license:apache-2.0
model-index
mteb
onnx
openvino
region:us
safetensors
sentence-transformers
text-embeddings-inference
transformers
transformers.js

© Copyright - Modular Inc - 2024