
mixedbread-ai/mxbai-rerank-base-v2 (Encode)

Architecture:
Parameters: 150M
Tasks: Score
Outputs: Score
Max Sequence Length: 8,192 tokens
License:

Benchmarks

AskUbuntuDupQuestions (technology, reranking, en)

Quality:
- NDCG@10: 0.6638
- MAP@10: 0.5047
- MRR@10: 0.7531

Performance (L4, b1, c16):
- Query TPS: 5.0K
- Query p50: 58.6 ms

CMedQAv1Reranking (medical, reranking, zh)

Quality:
- MAP@10: 0.7981
- MRR@10: 0.8403

CMedQAv2Reranking (medical, reranking, zh)

Quality:
- MAP@10: 0.8032
- MRR@10: 0.8469

MMarcoReranking (general, reranking, zh)

Quality:
- MAP@10: 0.3216
- MRR@10: 0.3464

Performance (L4, b1, c16):

T2Reranking (general, reranking, zh)

Quality:
- MAP@10: 0.5466
- MRR@10: 0.7734
