
lightonai/GTE-ModernColBERT-v1 (Encode)

Architecture
- Parameters: 305M
- Tasks: Encode
- Outputs: Multi-Vec
- Dimensions: Multi-Vec 128
- Max Sequence Length: 8,192 tokens
- License:
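The model emits a multi-vector output: one 128-dimensional embedding per token rather than a single pooled vector. ColBERT-style models score a query against a document with late interaction (MaxSim): each query token takes its maximum similarity over all document tokens, and these maxima are summed. As a minimal sketch of that scoring rule (the random embeddings below are stand-ins, not real model output):

```python
import numpy as np

def maxsim_score(query_vecs: np.ndarray, doc_vecs: np.ndarray) -> float:
    """ColBERT-style late-interaction (MaxSim) score.

    query_vecs: (num_query_tokens, dim) L2-normalized token embeddings
    doc_vecs:   (num_doc_tokens, dim) L2-normalized token embeddings
    For each query token, take the max cosine similarity over all
    document tokens, then sum over query tokens.
    """
    sim = query_vecs @ doc_vecs.T          # (q_tokens, d_tokens) cosine sims
    return float(sim.max(axis=1).sum())    # best doc-token match per query token

def normalize(x: np.ndarray) -> np.ndarray:
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Toy example at the model's multi-vector dimension (128).
rng = np.random.default_rng(0)
q = normalize(rng.standard_normal((5, 128)))    # 5 query-token vectors
d1 = normalize(rng.standard_normal((40, 128)))  # unrelated document
d2 = normalize(np.concatenate([q, rng.standard_normal((35, 128))]))  # overlaps the query

assert maxsim_score(q, d2) > maxsim_score(q, d1)
```

Because each query token is matched independently, MaxSim rewards documents that cover all query terms, which is what the retrieval benchmarks below measure.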

Benchmarks

CQADupstackPhysicsRetrieval (scientific retrieval, en)

Quality
- nDCG@10: 0.3886
- MAP@10: 0.3410
- MRR@10: 0.3904

Performance (L4, b1, c16)
- Corpus TPS: 21.7K
- Corpus p50: 88.1 ms
- Query TPS: 2.5K
- Query p50: 68.2 ms

CosQA (technology retrieval, en)

Quality
- nDCG@10: 0.3126
- MAP@10: 0.2347
- MRR@10: 0.2366

Performance (L4, b1, c16)
- Corpus TPS: 7.4K
- Corpus p50: 84.4 ms
- Query TPS: 462
- Query p50: 76.3 ms

FiQA2018 (finance retrieval, en)

Quality
- nDCG@10: 0.3838
- MAP@10: 0.3133
- MRR@10: 0.4648

Performance (L4, b1, c16)
- Corpus TPS: 18.9K
- Corpus p50: 106.6 ms
- Query TPS: 2.4K
- Query p50: 71.9 ms

LegalBenchConsumerContractsQA (legal retrieval, en)

Quality
- nDCG@10: 0.7773
- MAP@10: 0.7300
- MRR@10: 0.7321

Performance (L4, b1, c16)
- Corpus TPS: 42.9K
- Corpus p50: 192.4 ms
- Query TPS: 3.6K
- Query p50: 70.2 ms

NFCorpus (medical retrieval, en)

Quality
- nDCG@10: 0.3616
- MAP@10: 0.1390
- MRR@10: 0.5824

Performance (L4, b1, c16)
- Corpus TPS: 35.9K
- Corpus p50: 101.3 ms
- Query TPS: 1.7K
- Query p50: 45.7 ms

NanoFiQA2018Retrieval (finance retrieval, en)

Quality
- nDCG@10: 0.5229
- MAP@10: 0.4304
- MRR@10: 0.5544

SCIDOCS (scientific retrieval, en)

Quality
- nDCG@10: 0.1607
- MAP@10: 0.0934
- MRR@10: 0.2874

Performance (L4, b1, c16)
- Corpus TPS: 30.1K
- Corpus p50: 96.3 ms
- Query TPS: 2.1K
- Query p50: 68.6 ms

SciFact (scientific retrieval, en)

Quality
- nDCG@10: 0.7326
- MAP@10: 0.6940
- MRR@10: 0.7090

Performance (L4, b1, c16)
- Corpus TPS: 31.9K
- Corpus p50: 118.1 ms
- Query TPS: 3.4K
- Query p50: 75.1 ms

StackOverflowQA (technology retrieval, en)

Quality
- nDCG@10: 0.5067
- MAP@10: 0.4750
- MRR@10: 0.4750

Performance (L4, b1, c16)
- Corpus TPS: 26.0K
- Corpus p50: 127.7 ms
- Query TPS: 52.9K
- Query p50: 91.7 ms
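The quality figures above are rank-cutoff metrics averaged over a benchmark's queries. As a sketch of the standard definitions (binary relevance assumed; MAP@k normalization conventions vary slightly across evaluators, so treat this as illustrative rather than the exact scoring code behind these numbers):

```python
import math

def ndcg_at_k(ranked_rels, k=10):
    """nDCG@k: discounted gain of the ranking vs. the ideal ordering.
    ranked_rels[i] is 1 if the i-th ranked document is relevant, else 0."""
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ranked_rels[:k]))
    ideal = sorted(ranked_rels, reverse=True)
    idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
    return dcg / idcg if idcg > 0 else 0.0

def ap_at_k(ranked_rels, k=10):
    """Average precision at k (MAP@k is its mean over queries).
    Normalized by min(#relevant, k); some evaluators use #relevant instead."""
    hits, total = 0, 0.0
    for i, rel in enumerate(ranked_rels[:k]):
        if rel:
            hits += 1
            total += hits / (i + 1)
    n_rel = sum(ranked_rels)
    return total / min(n_rel, k) if n_rel else 0.0

def rr_at_k(ranked_rels, k=10):
    """Reciprocal rank of the first relevant hit (MRR@k is its mean)."""
    for i, rel in enumerate(ranked_rels[:k]):
        if rel:
            return 1.0 / (i + 1)
    return 0.0

# One query's top-10: relevant docs at ranks 1 and 3.
rels = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0]
scores = (ndcg_at_k(rels), ap_at_k(rels), rr_at_k(rels))
```

For the example ranking, the first relevant hit at rank 1 gives RR = 1.0, and the late second hit pulls nDCG@10 and AP@10 below 1.0, which is how the tables above can show a high MRR alongside a lower MAP.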
