jinaai/jina-colbert-v2 (Encode)

Architecture: XLM-RoBERTa
Parameters: 110M
Tasks: Encode
Outputs: Multi-Vec
Dimensions: 128 (Multi-Vec)
Max Sequence Length: 8,192 tokens
License:
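The "Multi-Vec" output means the encoder returns one 128-dimensional embedding per token rather than a single pooled vector; queries and documents are then compared with ColBERT-style late interaction (MaxSim). A minimal sketch of that scoring step, using random arrays as stand-ins for real model output:

```python
import numpy as np

def maxsim_score(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """ColBERT-style late-interaction score.

    query_vecs: (num_query_tokens, 128) L2-normalized token embeddings
    doc_vecs:   (num_doc_tokens, 128) L2-normalized token embeddings
    """
    # Cosine similarity of every query token against every document token.
    sim = query_vecs @ doc_vecs.T  # shape: (num_query_tokens, num_doc_tokens)
    # Each query token keeps its best-matching document token; the
    # per-token maxima are summed into a single relevance score.
    return float(sim.max(axis=1).sum())

# Toy example in the model's 128-dim space (random stand-in embeddings).
rng = np.random.default_rng(0)
q = rng.normal(size=(5, 128));  q /= np.linalg.norm(q, axis=1, keepdims=True)
d = rng.normal(size=(40, 128)); d /= np.linalg.norm(d, axis=1, keepdims=True)
print(maxsim_score(q, d))
```

Because each query token's best match is at most cosine similarity 1, the score is bounded above by the number of query tokens, and scoring a document against itself attains that bound.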

Benchmarks

All benchmarks are English retrieval tasks. Quality is reported as nDCG@10 / MAP@10 / MRR@10; throughput (TPS) and median latency (p50) were measured under the "L4 b1 c16" setup.

| Benchmark | Domain | nDCG@10 | MAP@10 | MRR@10 | Corpus TPS | Corpus p50 | Query TPS | Query p50 |
|---|---|---|---|---|---|---|---|---|
| CQADupstackPhysicsRetrieval | scientific | 0.4047 | 0.3496 | 0.4005 | 24.9K | 81.3 ms | 3.0K | 55.9 ms |
| CosQA | technology | 0.2607 | 0.2037 | 0.1946 | 13.9K | 63.3 ms | 1.5K | 59.7 ms |
| FiQA2018 | finance | 0.4051 | 0.3240 | 0.4875 | 27.1K | 93.4 ms | 3.0K | 59.5 ms |
| LegalBenchConsumerContractsQA | legal | 0.7615 | 0.7107 | 0.7116 | 30.7K | 259.5 ms | 3.4K | 60.1 ms |
| NFCorpus | medical | 0.3583 | 0.1422 | 0.5724 | 33.2K | 146.1 ms | 1.5K | 55.3 ms |
| NanoFiQA2018Retrieval | finance | 0.5208 | 0.4318 | 0.5644 | 28.9K | 77.4 ms | 2.6K | 49.5 ms |
| SCIDOCS | scientific | 0.1779 | 0.1045 | 0.3091 | 28.5K | 105.7 ms | 2.9K | 57.3 ms |
| SciFact | scientific | 0.6702 | 0.6266 | 0.6391 | 30.9K | 137.3 ms | 4.8K | 56.4 ms |
| StackOverflowQA | technology | 0.6085 | 0.5717 | 0.5717 | 27.4K | 127.2 ms | 58.2K | 80.3 ms |
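The quality columns are standard per-query ranking metrics averaged over each benchmark's queries. As a quick reference, a minimal implementation of two of them, nDCG@10 and MRR@10 (the relevance lists below are illustrative, not benchmark data):

```python
import math

def ndcg_at_10(rels):
    """nDCG@10 for one query; rels = graded relevance of results in ranked order."""
    def dcg(r):
        # Discounted cumulative gain over the top 10 positions.
        return sum(rel / math.log2(i + 2) for i, rel in enumerate(r[:10]))
    ideal = dcg(sorted(rels, reverse=True))  # DCG of the best possible ordering
    return dcg(rels) / ideal if ideal > 0 else 0.0

def mrr_at_10(rels):
    """MRR@10 for one query; rels = 1/0 relevance flags in ranked order."""
    return next((1.0 / (i + 1) for i, rel in enumerate(rels[:10]) if rel), 0.0)

# A run that ranks the most relevant document 2nd instead of 1st:
print(ndcg_at_10([0, 3, 2, 1]))  # < 1.0, since the ideal order is [3, 2, 1, 0]
print(mrr_at_10([0, 1, 0, 0]))   # 0.5
```

A perfectly ordered result list scores nDCG@10 = 1.0, so the table values can be read as "fraction of ideal ranking quality" within the top 10.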
