orionweller committed on
Commit
9876074
·
verified ·
1 Parent(s): 04c021a

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -14,7 +14,7 @@ pipeline_tag: fill-mask
 
 > 🎯 **TL;DR**: State-of-the-art paired encoder and decoder models (17M-1B params) trained identically for fair comparison with open data. Encoders beat ModernBERT. Decoders beat Llama 3.2/SmolLM2.
 
- 📄 [Paper (Coming Soon)](https://github.com/jhu-clsp/ettin-encoder-vs-decoder) | 🚀 [GitHub Repository](https://github.com/jhu-clsp/ettin-encoder-vs-decoder)
+ 📄 [Paper](https://arxiv.org/abs/2507.11412) | 🚀 [GitHub Repository](https://github.com/jhu-clsp/ettin-encoder-vs-decoder)
 
 This model is part of the Ettin suite - the first collection of paired encoder-only and decoder-only models trained with identical data, architecture, and training recipes. Ettin enables fair comparisons between encoder and decoder architectures across multiple scales, providing state-of-the-art performance for open-data models in their respective size categories.
 