
jackboyla/glirel-large-v0

Architecture: DeBERTa
Parameters:   435M
Tasks:        Extract
Outputs:      Relations
License:

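GLiREL performs zero-shot relation extraction: given a tokenized text, a set of candidate relation labels, and entity spans, it scores entity pairs against each label. A minimal sketch of calling this model, assuming `pip install glirel spacy` plus the `en_core_web_sm` spaCy model; the `predict_relations` call follows the GLiREL project README, and the `extract_relations` helper, example text, and labels are illustrative:

```python
# Illustrative zero-shot relation extraction with glirel-large-v0.
# The example text and relation labels below are made up for this sketch.
text = "Jack lives in London and works for Acme Corp."
labels = ["lives in", "works for", "located in"]  # free-form relation types

def extract_relations(text, labels, threshold=0.5):
    """Score entity pairs in `text` against each candidate relation label."""
    # Heavy dependencies are imported here so the sketch loads without them.
    import spacy
    from glirel import GLiREL

    model = GLiREL.from_pretrained("jackboyla/glirel-large-v0")
    nlp = spacy.load("en_core_web_sm")
    doc = nlp(text)
    tokens = [t.text for t in doc]
    # Entity spans as [start_token, end_token, type, text]; the inclusive
    # end index here is an assumption -- check the GLiREL docs for the
    # exact span convention.
    ner = [[ent.start, ent.end - 1, ent.label_, ent.text] for ent in doc.ents]
    return model.predict_relations(tokens, labels, threshold=threshold,
                                   ner=ner, top_k=1)
```

`threshold` drops low-confidence pairs and `top_k=1` keeps only the best-scoring label per entity pair; both are worth tuning for precision/recall trade-offs like those reported in the benchmarks below.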
Benchmarks

FewRel (general · relation extraction · English)

Quality
  F1:        0.2639
  Precision: 0.2397
  Recall:    0.2934
Performance: NVIDIA L4 (spot), b1, c16
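As a sanity check, the reported F1 is consistent with the harmonic mean of the reported precision and recall:

```python
# F1 is the harmonic mean of precision and recall: F1 = 2PR / (P + R).
precision = 0.2397
recall = 0.2934
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # agrees with the reported 0.2639 to within rounding
```

The small residual difference comes from the table's metrics being rounded to four decimal places before publication.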

Self-hosted inference for search & document processing

Cut API costs by 50x, boost quality with 85+ SOTA models, and keep your data in your own cloud.

GitHub: 1.5K stars
