
EmergentMethods/gliner_large_news-v2.1

Architecture: DeBERTa
Parameters: 435M
Tasks: Extract
Outputs: Entities
License:

Benchmarks

CoNLL-2003 (news, NER, English)

F1: 0.5527
Precision: 0.5704
Recall: 0.5361
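The reported F1 is the harmonic mean of precision and recall; a quick check against the benchmark values above:

```python
precision = 0.5704
recall = 0.5361

# F1 = 2PR / (P + R), the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.5527
```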
