
numind/NuNER_Zero

Architecture: DeBERTa
Parameters: 110M
Tasks: Extract
Outputs: Entities
License:

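The model takes raw text and returns labeled entity spans. Word-level extractors of this kind typically need one piece of post-processing: adjacent predictions that share a label must be merged into a single entity. A minimal, runnable sketch of that step, assuming character-offset predictions; the function name, the prediction format, and the offsets in the example are all illustrative, not an API this card defines:

```python
def merge_entities(text, entities):
    """Merge predicted spans that share a label and are separated
    only by whitespace into a single entity span."""
    if not entities:
        return []
    merged = [dict(entities[0])]
    for ent in entities[1:]:
        last = merged[-1]
        gap = text[last["end"]:ent["start"]]
        if ent["label"] == last["label"] and gap.strip() == "":
            # Extend the previous span instead of starting a new one.
            last["end"] = ent["end"]
        else:
            merged.append(dict(ent))
    # Attach the surface text of each merged span.
    for ent in merged:
        ent["text"] = text[ent["start"]:ent["end"]]
    return merged

text = "New York City is in the United States."
# Word-level predictions as a model might emit them (values illustrative):
raw = [
    {"start": 0, "end": 3, "label": "location"},
    {"start": 4, "end": 8, "label": "location"},
    {"start": 9, "end": 13, "label": "location"},
    {"start": 24, "end": 30, "label": "location"},
    {"start": 31, "end": 37, "label": "location"},
]
result = merge_entities(text, raw)
print(result)  # two entities: "New York City" and "United States"
```

The whitespace-only-gap test keeps "New York City" as one entity while leaving "United States" separate, since non-space text sits between the two groups.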
Benchmarks

CoNLL-2003
Tags: news, ner, en

Quality
F1: 0.6122
Precision: 0.5512
Recall: 0.6884
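The three quality numbers are mutually consistent: F1 is the harmonic mean of precision and recall, so it can be recomputed from the other two. A quick check:

```python
precision = 0.5512
recall = 0.6884

# Harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.6122, matching the reported F1
```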
