
urchade/gliner_multi_pii-v1

Architecture: DeBERTa
Parameters: 435M
Tasks: Extract
Outputs: Entities
License:
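Since the model extracts entities, a minimal usage sketch with the `gliner` Python package may help (assuming the standard GLiNER API; the example text and label names are illustrative, and GLiNER takes entity types as free-form strings at inference time):

```python
# Sketch: load the model and extract PII-style entities.
# Requires `pip install gliner`; downloads the weights on first use.
from gliner import GLiNER

model = GLiNER.from_pretrained("urchade/gliner_multi_pii-v1")

text = "John Doe's email is john.doe@example.com and his phone is 555-0100."
# Zero-shot: the entity types to look for are supplied at inference time.
labels = ["person", "email", "phone number"]

entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(ent["text"], "=>", ent["label"])
```

Each returned entity is a dict with the matched span, its label, and a confidence score, so results can be filtered or redacted downstream.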
Benchmarks

CoNLL-2003 (news, NER, English)

F1: 0.5357
Precision: 0.5287
Recall: 0.5428
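The reported F1 is the harmonic mean of precision and recall; a quick arithmetic check confirms the three numbers above are consistent:

```python
# Check that the reported F1 equals the harmonic mean of the
# reported precision and recall on CoNLL-2003.
precision = 0.5287
recall = 0.5428

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.5357, matching the reported score
```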
