urchade/gliner_medium-v2.1

Architecture: DeBERTa
Parameters:   150M
Tasks:        Extract
Outputs:      Entities
License:

Benchmarks

CoNLL-2003 (news, NER, English)

Quality:
  F1:        0.6111
  Precision: 0.6008
  Recall:    0.6218
Performance: L4 b1 c16
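The reported F1 can be checked against the precision and recall above, since F1 is their harmonic mean:

```python
# Sanity-check the reported CoNLL-2003 F1 from the card's precision and recall.
precision = 0.6008
recall = 0.6218

# F1 is the harmonic mean: 2 * P * R / (P + R).
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.6111
```

The result matches the F1 value reported in the table.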

