NuNER Zero-span (8-bit ONNX)

This is an 8-bit quantized ONNX version of numind/NuNER_Zero-span, a zero-shot named entity recognition (NER) model based on the GLiNER architecture.

🔧 Features

  • 🧠 Zero-shot, span-based NER
  • 📦 Quantized to 8-bit for smaller, faster inference
  • 💬 Input: text plus a list of entity labels
  • 🪄 Output: text spans per label (maximum span length: 12 tokens); see the usage sketch below
  • 📁 Format: ONNX

📄 License

MIT License – the same license as the original model.

πŸ™ Acknowledgements

  • Original model: numind/NuNER_Zero-span by NuMind
  • Framework: GLiNER
  • Quantized and exported to ONNX by Bijin Regi Panicker