Token Classification
GLiNER
PyTorch
English
entity recognition
named-entity-recognition
zero-shot
zero-shot-ner
zero shot
biomedical-nlp
cancer-genetics
oncology
gene-regulation
cancer-research
amino_acid
anatomical_system
cancer
cell
cellular_component
developing_anatomical_structure
gene_or_gene_product
immaterial_anatomical_entity
multi-tissue_structure
organ
organism
organism_subdivision
organism_substance
pathological_formation
simple_chemical
tissue
feat: Upload fine-tuned medical NER model OpenMed-ZeroShot-NER-Oncology-Base-220M
- .gitattributes +2 -0
- README.md +226 -0
- added_tokens.json +4 -0
- gliner_config.json +134 -0
- openmed_vs_sota_grouped_bars.png +3 -0
- pytorch_model.bin +3 -0
- special_tokens_map.json +23 -0
- spiece.model +3 -0
- test_results.json +7 -0
- tokenizer.json +3 -0
- tokenizer_config.json +855 -0
.gitattributes
CHANGED
@@ -33,3 +33,5 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+openmed_vs_sota_grouped_bars.png filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,226 @@
---
widget:
- text: "Mutations in KRAS gene drive oncogenic transformation."
- text: "The tumor suppressor p53 pathway was disrupted."
- text: "EGFR amplification promotes cancer cell proliferation."
- text: "Loss of function of the PTEN gene is common in many cancers."
- text: "The PI3K/AKT/mTOR pathway is a critical regulator of cell growth."
tags:
- token-classification
- entity recognition
- named-entity-recognition
- zero-shot
- zero-shot-ner
- zero shot
- biomedical-nlp
- gliner
- cancer-genetics
- oncology
- gene-regulation
- cancer-research
- amino_acid
- anatomical_system
- cancer
- cell
- cellular_component
- developing_anatomical_structure
- gene_or_gene_product
- immaterial_anatomical_entity
- multi-tissue_structure
- organ
- organism
- organism_subdivision
- organism_substance
- pathological_formation
- simple_chemical
- tissue
language:
- en
license: apache-2.0
---

# [OpenMed-ZeroShot-NER-Oncology-Base-220M](https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-Base-220M)

**Specialized model for Cancer Genetics - Cancer-related genetic entities**

[Apache 2.0 License](https://opensource.org/licenses/Apache-2.0) · [OpenMed on Hugging Face](https://huggingface.co/OpenMed)

## Model Overview

An oncology-focused model for **cancer genetics**, capturing genes, variants, and cellular processes in tumor-biology contexts. Useful for **cancer pathway curation**, **driver gene analysis**, and **precision oncology literature mining**.

OpenMed ZeroShot NER is an advanced, domain-adapted Named Entity Recognition (NER) model designed specifically for medical, biomedical, and clinical text mining. Leveraging state-of-the-art zero-shot learning, this model empowers researchers, clinicians, and data scientists to extract expert-level biomedical entities, such as diseases, chemicals, genes, species, and clinical findings, directly from unstructured text, without the need for task-specific retraining.

Built on the robust GLiNER architecture and fine-tuned on curated biomedical corpora, OpenMed ZeroShot NER delivers high-precision entity recognition for critical healthcare and life sciences applications. Its zero-shot capability means you can flexibly define and extract any entity type relevant to your workflow, from standard biomedical categories to custom clinical concepts, supporting rapid adaptation to new research domains and regulatory requirements.

Whether you are working on clinical NLP, biomedical research, electronic health record (EHR) de-identification, or large-scale literature mining, OpenMed ZeroShot NER provides a production-ready, open-source solution that combines expert-level accuracy with unmatched flexibility. Join the OpenMed community to accelerate your medical text analytics with cutting-edge, zero-shot NER technology.

### Key Features
- **Zero-Shot Capability**: Can recognize any entity type without task-specific training
- **High Precision**: Optimized for biomedical entity recognition
- **Domain-Specific**: Fine-tuned on the curated BIONLP2013_CG dataset
- **Production-Ready**: Validated on clinical benchmarks
- **Easy Integration**: Compatible with the Hugging Face Transformers ecosystem
- **Flexible Entity Recognition**: Add custom entity types without retraining

### Supported Entity Types

This zero-shot model can identify and classify biomedical entities, including but not limited to the entity types below. **You can also add custom entity types without retraining the model**:

- `Amino_acid`
- `Anatomical_system`
- `Cancer`
- `Cell`
- `Cellular_component`

<details>
<summary>See 11 more entity types...</summary>

- `Developing_anatomical_structure`
- `Gene_or_gene_product`
- `Immaterial_anatomical_entity`
- `Multi-tissue_structure`
- `Organ`
- `Organism`
- `Organism_subdivision`
- `Organism_substance`
- `Pathological_formation`
- `Simple_chemical`
- `Tissue`
</details>

**Zero-Shot Flexibility**: As a GLiNER-based model, you can specify any entity types you want to detect, even if they weren't part of the original training. Simply provide the entity labels when calling the model, and it will adapt to recognize them, as in the short example below.

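As a quick illustration, the minimal sketch below passes a mixed label set that includes `Mutation_type`, a hypothetical label that was not part of the training schema, and groups the predictions by label. It assumes the usual GLiNER prediction fields (`text`, `label`, `score`).

```python
# Minimal zero-shot sketch: mix trained labels with a custom, unseen one.
from collections import defaultdict

from gliner import GLiNER

model = GLiNER.from_pretrained("OpenMed/OpenMed-ZeroShot-NER-Oncology-Base-220M")

text = "Loss of function of the PTEN gene is common in many cancers."
# "Mutation_type" is a hypothetical custom label, not part of the training schema.
labels = ["Gene_or_gene_product", "Cancer", "Mutation_type"]

entities = model.predict_entities(text, labels, flat_ner=True, threshold=0.5)

# Group the predicted spans by label for a quick overview.
by_label = defaultdict(list)
for entity in entities:
    by_label[entity["label"]].append(entity["text"])

for label, spans in by_label.items():
    print(f"{label}: {spans}")
```
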
## Dataset

The BioNLP 2013 CG corpus targets cancer genetics entities for oncology research and cancer genomics.

The BioNLP 2013 CG (Cancer Genetics) corpus is a specialized dataset focusing on cancer genetics entities and gene regulation in oncology research. This corpus contains annotations for genes, proteins, and molecular processes specifically related to cancer biology and tumor genetics. Developed for the BioNLP Shared Task 2013, it supports the development of text mining systems for cancer research, oncological studies, and precision medicine applications. The dataset is particularly valuable for identifying cancer-related biomarkers, tumor suppressor genes, oncogenes, and therapeutic targets mentioned in cancer research literature. It serves as a benchmark for evaluating NER systems used in cancer genomics, personalized medicine, and oncology informatics.

## Performance Metrics

### Current Model Performance

- **Finetuned F1 (on a test set excluded from training)**: `0.82`
- **F1 Improvement vs. Base Model**: `53.4%`

### Top F1 Improvements on the BIONLP2013_CG Dataset

| Rank | Model | Base F1 | Finetuned F1 | ΔF1 | ΔF1 % |
|------|-------|--------:|-------------:|----:|------:|
| 1 | [OpenMed-ZeroShot-NER-Oncology-Large-459M](https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-Large-459M) | 0.5534 | 0.8990 | 0.3456 | 62.5% |
| 2 | [OpenMed-ZeroShot-NER-Oncology-Medium-209M](https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-Medium-209M) | 0.4885 | 0.8765 | 0.3880 | 79.4% |
| 3 | [OpenMed-ZeroShot-NER-Oncology-XLarge-770M](https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-XLarge-770M) | 0.5953 | 0.8750 | 0.2797 | 47.0% |
| 4 | [OpenMed-ZeroShot-NER-Oncology-Base-220M](https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-Base-220M) | 0.5324 | 0.8167 | 0.2842 | 53.4% |
| 5 | [OpenMed-ZeroShot-NER-Oncology-Multi-209M](https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-Multi-209M) | 0.4343 | 0.7498 | 0.3154 | 72.6% |

*Rankings are sorted by finetuned F1 and show ΔF1 % over the base model; the calculation is reproduced below the figure. The test dataset is excluded from training.*



*Figure: OpenMed ZeroShot Clinical & Biomedical NER vs. the original GLiNER models.*

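For readers checking the numbers, the ΔF1 columns follow directly from the base and finetuned scores; the short calculation below reproduces the row for this model from the rounded table values.

```python
# Reproduce the ΔF1 figures for OpenMed-ZeroShot-NER-Oncology-Base-220M
# from the rounded scores in the table above.
base_f1 = 0.5324
finetuned_f1 = 0.8167

delta_f1 = finetuned_f1 - base_f1        # absolute improvement
delta_f1_pct = 100 * delta_f1 / base_f1  # relative improvement over the base model

print(f"dF1   = {delta_f1:.4f}")      # 0.2843 (the table's 0.2842 uses unrounded scores)
print(f"dF1 % = {delta_f1_pct:.1f}%") # 53.4%
```
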
## Quick Start

### Installation

```bash
pip install gliner==0.2.21
```

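To confirm that the pinned version is the one actually installed, a quick check with the standard library:

```python
# Verify the installed gliner distribution matches the pinned version.
from importlib.metadata import version

print(version("gliner"))  # expected: 0.2.21
```
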
### Usage

```python
from gliner import GLiNER

# Load the model
# Model card: https://huggingface.co/OpenMed/OpenMed-ZeroShot-NER-Oncology-Base-220M
model_name = "OpenMed/OpenMed-ZeroShot-NER-Oncology-Base-220M"
model = GLiNER.from_pretrained(model_name)

# Example usage with the default entity types
text = "Mutations in KRAS gene drive oncogenic transformation."

labels = ['Amino_acid', 'Anatomical_system', 'Cancer', 'Cell', 'Cellular_component', 'Developing_anatomical_structure', 'Gene_or_gene_product', 'Immaterial_anatomical_entity', 'Multi-tissue_structure', 'Organ', 'Organism', 'Organism_subdivision', 'Organism_substance', 'Pathological_formation', 'Simple_chemical', 'Tissue']
entities = model.predict_entities(text, labels, flat_ner=True, threshold=0.5)
for entity in entities:
    print(entity)
```

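The predictions come back as plain dictionaries; assuming the usual GLiNER fields (`text`, `label`, `score`, `start`, `end`), they can be printed as a compact table:

```python
# Pretty-print the spans returned by predict_entities, ordered by position.
for entity in sorted(entities, key=lambda e: e["start"]):
    print(f"{entity['start']:>4}-{entity['end']:<4}  "
          f"{entity['label']:<28} {entity['score']:.2f}  {entity['text']}")
```
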
### Zero-Shot Usage with Custom Entity Types
**Tip:** If you want to extract entities that are not present in the original training set (i.e., custom or rare entity types), you may get better results by lowering the `threshold` parameter in `model.predict_entities`. For example, try `threshold=0.3` or even lower, depending on your use case:

```python
# You can specify custom entity types for zero-shot recognition - for instance:
custom_entities = ["MISC", "Amino_acid", "PERSON", "LOCATION", "MEDICATION", "PROCEDURE"]

entities = model.predict_entities(text, custom_entities, flat_ner=True, threshold=0.1)
for entity in entities:
    print(entity)
```

> Lowering the threshold makes the model more permissive and can help it recognize new or less common entity types, but it may also increase false positives. Adjust as needed for your application.

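A practical way to choose a threshold for custom labels is to sweep a few values and compare what gets detected. The sketch below reuses `text` and `custom_entities` from the snippet above.

```python
# Sweep the confidence threshold: higher values keep only high-confidence spans,
# lower values admit more (and potentially noisier) detections for custom labels.
for threshold in (0.5, 0.3, 0.1):
    entities = model.predict_entities(text, custom_entities, flat_ner=True, threshold=threshold)
    print(f"threshold={threshold:.1f}: {len(entities)} entities")
    for entity in entities:
        print(f"  {entity['label']:<12} {entity['score']:.2f}  {entity['text']}")
```
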
## Dataset Information

- **Dataset**: BIONLP2013_CG
- **Description**: Cancer Genetics - Cancer-related genetic entities

### Training Details
- **Base Model**: gliner-x-base
- **Training Framework**: Hugging Face Transformers
- **Optimization**: AdamW optimizer with learning rate scheduling (see the sketch after this list)
- **Validation**: Cross-validation on a held-out test set

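The optimizer and schedule above correspond to the hyperparameters stored in `gliner_config.json` in this commit (`lr_encoder: 1e-5`, `lr_others: 3e-5`, `scheduler_type: cosine`, `warmup_ratio: 0.05`, weight decay 0.1/0.01, 80,000 steps). The sketch below shows what such a setup looks like in plain PyTorch/Transformers terms; it is illustrative only, not the actual training script, and the two `nn.Linear` modules are stand-ins for the encoder backbone and the GLiNER heads.

```python
# Illustrative AdamW + cosine-warmup setup implied by gliner_config.json.
import torch
from torch import nn
from transformers import get_cosine_schedule_with_warmup

encoder = nn.Linear(768, 768)  # stand-in for the mT5 encoder backbone
heads = nn.Linear(768, 512)    # stand-in for the GLiNER span/label heads

num_steps = 80_000             # "num_steps" in gliner_config.json
warmup_ratio = 0.05            # "warmup_ratio"

optimizer = torch.optim.AdamW(
    [
        # Encoder parameters: lower learning rate, stronger weight decay.
        {"params": encoder.parameters(), "lr": 1e-5, "weight_decay": 0.1},
        # Remaining parameters: higher learning rate, lighter weight decay.
        {"params": heads.parameters(), "lr": 3e-5, "weight_decay": 0.01},
    ]
)
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(warmup_ratio * num_steps),
    num_training_steps=num_steps,
)
```
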
## Use Cases

This model is particularly useful for:
- **Clinical Text Mining**: Extracting entities from medical records
- **Biomedical Research**: Processing scientific literature
- **Drug Discovery**: Identifying chemical compounds and drugs
- **Healthcare Analytics**: Analyzing patient data and outcomes
- **Academic Research**: Supporting biomedical NLP research
- **Custom Entity Recognition**: Zero-shot detection of domain-specific entities

## Model Architecture

- **Task**: Zero-Shot Named Entity Recognition
- **Labels**: Dataset-specific entity types, plus any user-supplied labels at inference time
- **Input**: Biomedical text
- **Output**: Named entity predictions

## License

Licensed under the Apache License 2.0. See [LICENSE](https://www.apache.org/licenses/LICENSE-2.0) for details.

## Contributing

I welcome contributions of all kinds! Whether you have ideas, feature requests, or want to join my mission to advance open-source Healthcare AI, I'd love to hear from you.

Follow [OpenMed Org](https://huggingface.co/OpenMed) on Hugging Face and click "Watch" to stay updated on my latest releases and developments.

## Citation

If you use this model in your research or applications, please cite the following paper:

```latex
@misc{panahi2025openmedneropensourcedomainadapted,
      title={OpenMed NER: Open-Source, Domain-Adapted State-of-the-Art Transformers for Biomedical NER Across 12 Public Datasets},
      author={Maziyar Panahi},
      year={2025},
      eprint={2508.01630},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.01630},
}
```

Proper citation helps support and acknowledge my work. Thank you!

added_tokens.json
ADDED
@@ -0,0 +1,4 @@
{
  "<<ENT>>": 250100,
  "<<SEP>>": 250101
}
gliner_config.json
ADDED
@@ -0,0 +1,134 @@
{
  "class_token_index": 250100,
  "dropout": 0.3,
  "embed_ent_token": true,
  "encoder_config": {
    "_name_or_path": "google/mt5-base",
    "add_cross_attention": false,
    "architectures": [
      "MT5ForConditionalGeneration"
    ],
    "bad_words_ids": null,
    "begin_suppress_tokens": null,
    "bos_token_id": null,
    "chunk_size_feed_forward": 0,
    "classifier_dropout": 0.0,
    "cross_attention_hidden_size": null,
    "d_ff": 2048,
    "d_kv": 64,
    "d_model": 768,
    "decoder_start_token_id": 0,
    "dense_act_fn": "gelu_new",
    "diversity_penalty": 0.0,
    "do_sample": false,
    "dropout_rate": 0.1,
    "early_stopping": false,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": 1,
    "exponential_decay_length_penalty": null,
    "feed_forward_proj": "gated-gelu",
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": null,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "initializer_factor": 1.0,
    "is_decoder": false,
    "is_encoder_decoder": true,
    "is_gated_act": true,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "layer_norm_epsilon": 1e-06,
    "length_penalty": 1.0,
    "max_length": 20,
    "min_length": 0,
    "model_type": "mt5",
    "no_repeat_ngram_size": 0,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_decoder_layers": 12,
    "num_heads": 12,
    "num_layers": 12,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_states": false,
    "output_past": true,
    "output_scores": false,
    "pad_token_id": 0,
    "prefix": null,
    "problem_type": null,
    "pruned_heads": {},
    "relative_attention_max_distance": 128,
    "relative_attention_num_buckets": 32,
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "sep_token_id": null,
    "suppress_tokens": null,
    "task_specific_params": null,
    "temperature": 1.0,
    "tf_legacy_loss": false,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": false,
    "tokenizer_class": "T5Tokenizer",
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "typical_p": 1.0,
    "use_bfloat16": false,
    "use_cache": true,
    "vocab_size": 250102
  },
  "ent_token": "<<ENT>>",
  "eval_every": 10000,
  "fine_tune": true,
  "freeze_token_rep": false,
  "fuse_layers": false,
  "has_rnn": true,
  "hidden_size": 768,
  "label_smoothing": 0,
  "labels_encoder": null,
  "labels_encoder_config": null,
  "log_dir": "models/",
  "loss_alpha": 0.75,
  "loss_gamma": 0,
  "loss_reduction": "sum",
  "lr_encoder": "1e-5",
  "lr_others": "3e-5",
  "max_grad_norm": 10.0,
  "max_len": 1024,
  "max_neg_type_ratio": 1,
  "max_types": 30,
  "max_width": 12,
  "model_name": "google/mt5-base",
  "model_type": "gliner",
  "name": "span level gliner",
  "num_post_fusion_layers": 1,
  "num_steps": 80000,
  "post_fusion_schema": "",
  "prev_path": null,
  "random_drop": true,
  "root_dir": "gliner_logs",
  "save_total_limit": 3,
  "scheduler_type": "cosine",
  "sep_token": "<<SEP>>",
  "shuffle_types": true,
  "size_sup": -1,
  "span_mode": "markerV0",
  "subtoken_pooling": "first",
  "train_batch_size": 8,
  "train_data": "data/multilingual_data.json",
  "transformers_version": "4.43.4",
  "val_data_dir": "none",
  "vocab_size": 250102,
  "warmup_ratio": 0.05,
  "weight_decay_encoder": 0.1,
  "weight_decay_other": 0.01,
  "words_splitter_type": "universal"
}
openmed_vs_sota_grouped_bars.png
ADDED
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:54c9159f647e50e3f969a6b4f27fe58b8bc72bc7f93d81d6419234bf1801f330
size 1207355594
special_tokens_map.json
ADDED
@@ -0,0 +1,23 @@
{
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<pad>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
spiece.model
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ef78f86560d809067d12bac6c09f19a462cb3af3f54d2b8acbba26e1433125d6
size 4309802
test_results.json
ADDED
@@ -0,0 +1,7 @@
{
  "eval_loss": 744.2429809570312,
  "seqeval_accuracy": 0.945583688803663,
  "seqeval_f1": 0.8166919356003176,
  "seqeval_precision": 0.8105474347950702,
  "seqeval_recall": 0.8229303069983995
}
tokenizer.json
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c23b87e1609c72116a5aea222f983df99723cb2afa554d9d137f289840c3097b
size 16335205
tokenizer_config.json
ADDED
@@ -0,0 +1,855 @@
1 |
+
{
|
2 |
+
"added_tokens_decoder": {
|
3 |
+
"0": {
|
4 |
+
"content": "<pad>",
|
5 |
+
"lstrip": false,
|
6 |
+
"normalized": false,
|
7 |
+
"rstrip": false,
|
8 |
+
"single_word": false,
|
9 |
+
"special": true
|
10 |
+
},
|
11 |
+
"1": {
|
12 |
+
"content": "</s>",
|
13 |
+
"lstrip": false,
|
14 |
+
"normalized": false,
|
15 |
+
"rstrip": false,
|
16 |
+
"single_word": false,
|
17 |
+
"special": true
|
18 |
+
},
|
19 |
+
"2": {
|
20 |
+
"content": "<unk>",
|
21 |
+
"lstrip": false,
|
22 |
+
"normalized": false,
|
23 |
+
"rstrip": false,
|
24 |
+
"single_word": false,
|
25 |
+
"special": true
|
26 |
+
},
|
27 |
+
"250000": {
|
28 |
+
"content": "β<extra_id_99>",
|
29 |
+
"lstrip": false,
|
30 |
+
"normalized": false,
|
31 |
+
"rstrip": false,
|
32 |
+
"single_word": false,
|
33 |
+
"special": false
|
34 |
+
},
|
35 |
+
"250001": {
|
36 |
+
"content": "β<extra_id_98>",
|
37 |
+
"lstrip": false,
|
38 |
+
"normalized": false,
|
39 |
+
"rstrip": false,
|
40 |
+
"single_word": false,
|
41 |
+
"special": false
|
42 |
+
},
|
43 |
+
"250002": {
|
44 |
+
"content": "β<extra_id_97>",
|
45 |
+
"lstrip": false,
|
46 |
+
"normalized": false,
|
47 |
+
"rstrip": false,
|
48 |
+
"single_word": false,
|
49 |
+
"special": false
|
50 |
+
},
|
51 |
+
"250003": {
|
52 |
+
"content": "β<extra_id_96>",
|
53 |
+
"lstrip": false,
|
54 |
+
"normalized": false,
|
55 |
+
"rstrip": false,
|
56 |
+
"single_word": false,
|
57 |
+
"special": false
|
58 |
+
},
|
59 |
+
"250004": {
|
60 |
+
"content": "β<extra_id_95>",
|
61 |
+
"lstrip": false,
|
62 |
+
"normalized": false,
|
63 |
+
"rstrip": false,
|
64 |
+
"single_word": false,
|
65 |
+
"special": false
|
66 |
+
},
|
67 |
+
"250005": {
|
68 |
+
"content": "β<extra_id_94>",
|
69 |
+
"lstrip": false,
|
70 |
+
"normalized": false,
|
71 |
+
"rstrip": false,
|
72 |
+
"single_word": false,
|
73 |
+
"special": false
|
74 |
+
},
|
75 |
+
"250006": {
|
76 |
+
"content": "β<extra_id_93>",
|
77 |
+
"lstrip": false,
|
78 |
+
"normalized": false,
|
79 |
+
"rstrip": false,
|
80 |
+
"single_word": false,
|
81 |
+
"special": false
|
82 |
+
},
|
83 |
+
"250007": {
|
84 |
+
"content": "β<extra_id_92>",
|
85 |
+
"lstrip": false,
|
86 |
+
"normalized": false,
|
87 |
+
"rstrip": false,
|
88 |
+
"single_word": false,
|
89 |
+
"special": false
|
90 |
+
},
|
91 |
+
"250008": {
|
92 |
+
"content": "β<extra_id_91>",
|
93 |
+
"lstrip": false,
|
94 |
+
"normalized": false,
|
95 |
+
"rstrip": false,
|
96 |
+
"single_word": false,
|
97 |
+
"special": false
|
98 |
+
},
|
99 |
+
"250009": {
|
100 |
+
"content": "β<extra_id_90>",
|
101 |
+
"lstrip": false,
|
102 |
+
"normalized": false,
|
103 |
+
"rstrip": false,
|
104 |
+
"single_word": false,
|
105 |
+
"special": false
|
106 |
+
},
|
107 |
+
"250010": {
|
108 |
+
"content": "β<extra_id_89>",
|
109 |
+
"lstrip": false,
|
110 |
+
"normalized": false,
|
111 |
+
"rstrip": false,
|
112 |
+
"single_word": false,
|
113 |
+
"special": false
|
114 |
+
},
|
115 |
+
"250011": {
|
116 |
+
"content": "β<extra_id_88>",
|
117 |
+
"lstrip": false,
|
118 |
+
"normalized": false,
|
119 |
+
"rstrip": false,
|
120 |
+
"single_word": false,
|
121 |
+
"special": false
|
122 |
+
},
|
123 |
+
"250012": {
|
124 |
+
"content": "β<extra_id_87>",
|
125 |
+
"lstrip": false,
|
126 |
+
"normalized": false,
|
127 |
+
"rstrip": false,
|
128 |
+
"single_word": false,
|
129 |
+
"special": false
|
130 |
+
},
|
131 |
+
"250013": {
|
132 |
+
"content": "β<extra_id_86>",
|
133 |
+
"lstrip": false,
|
134 |
+
"normalized": false,
|
135 |
+
"rstrip": false,
|
136 |
+
"single_word": false,
|
137 |
+
"special": false
|
138 |
+
},
|
139 |
+
"250014": {
|
140 |
+
"content": "β<extra_id_85>",
|
141 |
+
"lstrip": false,
|
142 |
+
"normalized": false,
|
143 |
+
"rstrip": false,
|
144 |
+
"single_word": false,
|
145 |
+
"special": false
|
146 |
+
},
|
147 |
+
"250015": {
|
148 |
+
"content": "β<extra_id_84>",
|
149 |
+
"lstrip": false,
|
150 |
+
"normalized": false,
|
151 |
+
"rstrip": false,
|
152 |
+
"single_word": false,
|
153 |
+
"special": false
|
154 |
+
},
|
155 |
+
"250016": {
|
156 |
+
"content": "β<extra_id_83>",
|
157 |
+
"lstrip": false,
|
158 |
+
"normalized": false,
|
159 |
+
"rstrip": false,
|
160 |
+
"single_word": false,
|
161 |
+
"special": false
|
162 |
+
},
|
163 |
+
"250017": {
|
164 |
+
"content": "β<extra_id_82>",
|
165 |
+
"lstrip": false,
|
166 |
+
"normalized": false,
|
167 |
+
"rstrip": false,
|
168 |
+
"single_word": false,
|
169 |
+
"special": false
|
170 |
+
},
|
171 |
+
"250018": {
|
172 |
+
"content": "β<extra_id_81>",
|
173 |
+
"lstrip": false,
|
174 |
+
"normalized": false,
|
175 |
+
"rstrip": false,
|
176 |
+
"single_word": false,
|
177 |
+
"special": false
|
178 |
+
},
|
179 |
+
"250019": {
|
180 |
+
"content": "β<extra_id_80>",
|
181 |
+
"lstrip": false,
|
182 |
+
"normalized": false,
|
183 |
+
"rstrip": false,
|
184 |
+
"single_word": false,
|
185 |
+
"special": false
|
186 |
+
},
|
187 |
+
"250020": {
|
188 |
+
"content": "β<extra_id_79>",
|
189 |
+
"lstrip": false,
|
190 |
+
"normalized": false,
|
191 |
+
"rstrip": false,
|
192 |
+
"single_word": false,
|
193 |
+
"special": false
|
194 |
+
},
|
195 |
+
"250021": {
|
196 |
+
"content": "β<extra_id_78>",
|
197 |
+
"lstrip": false,
|
198 |
+
"normalized": false,
|
199 |
+
"rstrip": false,
|
200 |
+
"single_word": false,
|
201 |
+
"special": false
|
202 |
+
},
|
203 |
+
"250022": {
|
204 |
+
"content": "β<extra_id_77>",
|
205 |
+
"lstrip": false,
|
206 |
+
"normalized": false,
|
207 |
+
"rstrip": false,
|
208 |
+
"single_word": false,
|
209 |
+
"special": false
|
210 |
+
},
|
211 |
+
"250023": {
|
212 |
+
"content": "β<extra_id_76>",
|
213 |
+
"lstrip": false,
|
214 |
+
"normalized": false,
|
215 |
+
"rstrip": false,
|
216 |
+
"single_word": false,
|
217 |
+
"special": false
|
218 |
+
},
|
219 |
+
"250024": {
|
220 |
+
"content": "β<extra_id_75>",
|
221 |
+
"lstrip": false,
|
222 |
+
"normalized": false,
|
223 |
+
"rstrip": false,
|
224 |
+
"single_word": false,
|
225 |
+
"special": false
|
226 |
+
},
|
227 |
+
"250025": {
|
228 |
+
"content": "β<extra_id_74>",
|
229 |
+
"lstrip": false,
|
230 |
+
"normalized": false,
|
231 |
+
"rstrip": false,
|
232 |
+
"single_word": false,
|
233 |
+
"special": false
|
234 |
+
},
|
235 |
+
"250026": {
|
236 |
+
"content": "β<extra_id_73>",
|
237 |
+
"lstrip": false,
|
238 |
+
"normalized": false,
|
239 |
+
"rstrip": false,
|
240 |
+
"single_word": false,
|
241 |
+
"special": false
|
242 |
+
},
|
243 |
+
"250027": {
|
244 |
+
"content": "β<extra_id_72>",
|
245 |
+
"lstrip": false,
|
246 |
+
"normalized": false,
|
247 |
+
"rstrip": false,
|
248 |
+
"single_word": false,
|
249 |
+
"special": false
|
250 |
+
},
|
251 |
+
"250028": {
|
252 |
+
"content": "β<extra_id_71>",
|
253 |
+
"lstrip": false,
|
254 |
+
"normalized": false,
|
255 |
+
"rstrip": false,
|
256 |
+
"single_word": false,
|
257 |
+
"special": false
|
258 |
+
},
|
259 |
+
"250029": {
|
260 |
+
"content": "β<extra_id_70>",
|
261 |
+
"lstrip": false,
|
262 |
+
"normalized": false,
|
263 |
+
"rstrip": false,
|
264 |
+
"single_word": false,
|
265 |
+
"special": false
|
266 |
+
},
|
267 |
+
"250030": {
|
268 |
+
"content": "β<extra_id_69>",
|
269 |
+
"lstrip": false,
|
270 |
+
"normalized": false,
|
271 |
+
"rstrip": false,
|
272 |
+
"single_word": false,
|
273 |
+
"special": false
|
274 |
+
},
|
275 |
+
"250031": {
|
276 |
+
"content": "β<extra_id_68>",
|
277 |
+
"lstrip": false,
|
278 |
+
"normalized": false,
|
279 |
+
"rstrip": false,
|
280 |
+
"single_word": false,
|
281 |
+
"special": false
|
282 |
+
},
|
283 |
+
"250032": {
|
284 |
+
"content": "β<extra_id_67>",
|
285 |
+
"lstrip": false,
|
286 |
+
"normalized": false,
|
287 |
+
"rstrip": false,
|
288 |
+
"single_word": false,
|
289 |
+
"special": false
|
290 |
+
},
|
291 |
+
"250033": {
|
292 |
+
"content": "β<extra_id_66>",
|
293 |
+
"lstrip": false,
|
294 |
+
"normalized": false,
|
295 |
+
"rstrip": false,
|
296 |
+
"single_word": false,
|
297 |
+
"special": false
|
298 |
+
},
|
299 |
+
"250034": {
|
300 |
+
"content": "β<extra_id_65>",
|
301 |
+
"lstrip": false,
|
302 |
+
"normalized": false,
|
303 |
+
"rstrip": false,
|
304 |
+
"single_word": false,
|
305 |
+
"special": false
|
306 |
+
},
|
307 |
+
"250035": {
|
308 |
+
"content": "β<extra_id_64>",
|
309 |
+
"lstrip": false,
|
310 |
+
"normalized": false,
|
311 |
+
"rstrip": false,
|
312 |
+
"single_word": false,
|
313 |
+
"special": false
|
314 |
+
},
|
315 |
+
"250036": {
|
316 |
+
"content": "β<extra_id_63>",
|
317 |
+
"lstrip": false,
|
318 |
+
"normalized": false,
|
319 |
+
"rstrip": false,
|
320 |
+
"single_word": false,
|
321 |
+
"special": false
|
322 |
+
},
|
323 |
+
"250037": {
|
324 |
+
"content": "β<extra_id_62>",
|
325 |
+
"lstrip": false,
|
326 |
+
"normalized": false,
|
327 |
+
"rstrip": false,
|
328 |
+
"single_word": false,
|
329 |
+
"special": false
|
330 |
+
},
|
331 |
+
"250038": {
|
332 |
+
"content": "β<extra_id_61>",
|
333 |
+
"lstrip": false,
|
334 |
+
"normalized": false,
|
335 |
+
"rstrip": false,
|
336 |
+
"single_word": false,
|
337 |
+
"special": false
|
338 |
+
},
|
339 |
+
"250039": {
|
340 |
+
"content": "β<extra_id_60>",
|
341 |
+
"lstrip": false,
|
342 |
+
"normalized": false,
|
343 |
+
"rstrip": false,
|
344 |
+
"single_word": false,
|
345 |
+
"special": false
|
346 |
+
},
|
347 |
+
"250040": {
|
348 |
+
"content": "β<extra_id_59>",
|
349 |
+
"lstrip": false,
|
350 |
+
"normalized": false,
|
351 |
+
"rstrip": false,
|
352 |
+
"single_word": false,
|
353 |
+
"special": false
|
354 |
+
},
|
355 |
+
"250041": {
|
356 |
+
"content": "β<extra_id_58>",
|
357 |
+
"lstrip": false,
|
358 |
+
"normalized": false,
|
359 |
+
"rstrip": false,
|
360 |
+
"single_word": false,
|
361 |
+
"special": false
|
362 |
+
},
|
363 |
+
"250042": {
|
364 |
+
"content": "β<extra_id_57>",
|
365 |
+
"lstrip": false,
|
366 |
+
"normalized": false,
|
367 |
+
"rstrip": false,
|
368 |
+
"single_word": false,
|
369 |
+
"special": false
|
370 |
+
},
|
371 |
+
"250043": {
|
372 |
+
"content": "β<extra_id_56>",
|
373 |
+
"lstrip": false,
|
374 |
+
"normalized": false,
|
375 |
+
"rstrip": false,
|
376 |
+
"single_word": false,
|
377 |
+
"special": false
|
378 |
+
},
|
379 |
+
"250044": {
|
380 |
+
"content": "β<extra_id_55>",
|
381 |
+
"lstrip": false,
|
382 |
+
"normalized": false,
|
383 |
+
"rstrip": false,
|
384 |
+
"single_word": false,
|
385 |
+
"special": false
|
386 |
+
},
|
387 |
+
"250045": {
|
388 |
+
"content": "β<extra_id_54>",
|
389 |
+
"lstrip": false,
|
390 |
+
"normalized": false,
|
391 |
+
"rstrip": false,
|
392 |
+
"single_word": false,
|
393 |
+
"special": false
|
394 |
+
},
|
395 |
+
"250046": {
|
396 |
+
"content": "β<extra_id_53>",
|
397 |
+
"lstrip": false,
|
398 |
+
"normalized": false,
|
399 |
+
"rstrip": false,
|
400 |
+
"single_word": false,
|
401 |
+
"special": false
|
402 |
+
},
|
403 |
+
"250047": {
|
404 |
+
"content": "β<extra_id_52>",
|
405 |
+
"lstrip": false,
|
406 |
+
"normalized": false,
|
407 |
+
"rstrip": false,
|
408 |
+
"single_word": false,
|
409 |
+
"special": false
|
410 |
+
},
|
411 |
+
"250048": {
|
412 |
+
"content": "β<extra_id_51>",
|
413 |
+
"lstrip": false,
|
414 |
+
"normalized": false,
|
415 |
+
"rstrip": false,
|
416 |
+
"single_word": false,
|
417 |
+
"special": false
|
418 |
+
},
|
419 |
+
"250049": {
|
420 |
+
"content": "β<extra_id_50>",
|
421 |
+
"lstrip": false,
|
422 |
+
"normalized": false,
|
423 |
+
"rstrip": false,
|
424 |
+
"single_word": false,
|
425 |
+
"special": false
|
426 |
+
},
|
427 |
+
"250050": {
|
428 |
+
"content": "β<extra_id_49>",
|
429 |
+
"lstrip": false,
|
430 |
+
"normalized": false,
|
431 |
+
"rstrip": false,
|
432 |
+
"single_word": false,
|
433 |
+
"special": false
|
434 |
+
},
|
435 |
+
"250051": {
|
436 |
+
"content": "β<extra_id_48>",
|
437 |
+
"lstrip": false,
|
438 |
+
"normalized": false,
|
439 |
+
"rstrip": false,
|
440 |
+
"single_word": false,
|
441 |
+
"special": false
|
442 |
+
},
|
443 |
+
"250052": {
|
444 |
+
"content": "β<extra_id_47>",
|
445 |
+
"lstrip": false,
|
446 |
+
"normalized": false,
|
447 |
+
"rstrip": false,
|
448 |
+
"single_word": false,
|
449 |
+
"special": false
|
450 |
+
},
|
451 |
+
"250053": {
|
452 |
+
"content": "β<extra_id_46>",
|
453 |
+
"lstrip": false,
|
454 |
+
"normalized": false,
|
455 |
+
"rstrip": false,
|
456 |
+
"single_word": false,
|
457 |
+
"special": false
|
458 |
+
},
|
459 |
+
"250054": {
|
460 |
+
"content": "β<extra_id_45>",
|
461 |
+
"lstrip": false,
|
462 |
+
"normalized": false,
|
463 |
+
"rstrip": false,
|
464 |
+
"single_word": false,
|
465 |
+
"special": false
|
466 |
+
},
|
467 |
+
"250055": {
|
468 |
+
"content": "β<extra_id_44>",
|
469 |
+
"lstrip": false,
|
470 |
+
"normalized": false,
|
471 |
+
"rstrip": false,
|
472 |
+
"single_word": false,
|
473 |
+
"special": false
|
474 |
+
},
|
475 |
+
"250056": {
|
476 |
+
"content": "β<extra_id_43>",
|
477 |
+
"lstrip": false,
|
478 |
+
"normalized": false,
|
479 |
+
"rstrip": false,
|
480 |
+
"single_word": false,
|
481 |
+
"special": false
|
482 |
+
},
|
483 |
+
"250057": {
|
484 |
+
"content": "β<extra_id_42>",
|
485 |
+
"lstrip": false,
|
486 |
+
"normalized": false,
|
487 |
+
"rstrip": false,
|
488 |
+
"single_word": false,
|
489 |
+
"special": false
|
490 |
+
},
|
491 |
+
"250058": {
|
492 |
+
"content": "β<extra_id_41>",
|
493 |
+
"lstrip": false,
|
494 |
+
"normalized": false,
|
495 |
+
"rstrip": false,
|
496 |
+
"single_word": false,
|
497 |
+
"special": false
|
498 |
+
},
|
499 |
+
"250059": {
|
500 |
+
"content": "β<extra_id_40>",
|
501 |
+
"lstrip": false,
|
502 |
+
"normalized": false,
|
503 |
+
"rstrip": false,
|
504 |
+
"single_word": false,
|
505 |
+
"special": false
|
506 |
+
},
|
507 |
+
"250060": {
|
508 |
+
"content": "β<extra_id_39>",
|
509 |
+
"lstrip": false,
|
510 |
+
"normalized": false,
|
511 |
+
"rstrip": false,
|
512 |
+
"single_word": false,
|
513 |
+
"special": false
|
514 |
+
},
|
515 |
+
"250061": {
|
516 |
+
"content": "β<extra_id_38>",
|
517 |
+
"lstrip": false,
|
518 |
+
"normalized": false,
|
519 |
+
"rstrip": false,
|
520 |
+
"single_word": false,
|
521 |
+
"special": false
|
522 |
+
},
|
523 |
+
"250062": {
|
524 |
+
"content": "β<extra_id_37>",
|
525 |
+
"lstrip": false,
|
526 |
+
"normalized": false,
|
527 |
+
"rstrip": false,
|
528 |
+
"single_word": false,
|
529 |
+
"special": false
|
530 |
+
},
|
531 |
+
"250063": {
|
532 |
+
"content": "β<extra_id_36>",
|
533 |
+
"lstrip": false,
|
534 |
+
"normalized": false,
|
535 |
+
"rstrip": false,
|
536 |
+
"single_word": false,
|
537 |
+
"special": false
|
538 |
+
},
|
539 |
+
"250064": {
|
540 |
+
"content": "β<extra_id_35>",
|
541 |
+
"lstrip": false,
|
542 |
+
"normalized": false,
|
543 |
+
"rstrip": false,
|
544 |
+
"single_word": false,
|
545 |
+
"special": false
|
546 |
+
},
|
547 |
+
"250065": {
|
548 |
+
"content": "β<extra_id_34>",
|
549 |
+
"lstrip": false,
|
550 |
+
"normalized": false,
|
551 |
+
"rstrip": false,
|
552 |
+
"single_word": false,
|
553 |
+
"special": false
|
554 |
+
},
|
555 |
+
"250066": {
|
556 |
+
"content": "β<extra_id_33>",
|
557 |
+
"lstrip": false,
|
558 |
+
"normalized": false,
|
559 |
+
"rstrip": false,
|
560 |
+
"single_word": false,
|
561 |
+
"special": false
|
562 |
+
},
|
563 |
+
"250067": {
|
564 |
+
"content": "β<extra_id_32>",
|
565 |
+
"lstrip": false,
|
566 |
+
"normalized": false,
|
567 |
+
"rstrip": false,
|
568 |
+
"single_word": false,
|
569 |
+
"special": false
|
570 |
+
},
|
571 |
+
"250068": {
|
572 |
+
"content": "β<extra_id_31>",
|
573 |
+
"lstrip": false,
|
574 |
+
"normalized": false,
|
575 |
+
"rstrip": false,
|
576 |
+
"single_word": false,
|
577 |
+
"special": false
|
578 |
+
},
|
579 |
+
"250069": {
|
580 |
+
"content": "β<extra_id_30>",
|
581 |
+
"lstrip": false,
|
582 |
+
"normalized": false,
|
583 |
+
"rstrip": false,
|
584 |
+
"single_word": false,
|
585 |
+
"special": false
|
586 |
+
},
|
587 |
+
"250070": {
|
588 |
+
"content": "β<extra_id_29>",
|
589 |
+
"lstrip": false,
|
590 |
+
"normalized": false,
|
591 |
+
"rstrip": false,
|
592 |
+
"single_word": false,
|
593 |
+
"special": false
|
594 |
+
},
|
595 |
+
"250071": {
|
596 |
+
"content": "β<extra_id_28>",
|
597 |
+
"lstrip": false,
|
598 |
+
"normalized": false,
|
599 |
+
"rstrip": false,
|
600 |
+
"single_word": false,
|
601 |
+
"special": false
|
602 |
+
},
|
603 |
+
"250072": {
|
604 |
+
"content": "β<extra_id_27>",
|
605 |
+
"lstrip": false,
|
606 |
+
"normalized": false,
|
607 |
+
"rstrip": false,
|
608 |
+
"single_word": false,
|
609 |
+
"special": false
|
610 |
+
},
|
611 |
+
"250073": {
|
612 |
+
"content": "β<extra_id_26>",
|
613 |
+
"lstrip": false,
|
614 |
+
"normalized": false,
|
615 |
+
"rstrip": false,
|
616 |
+
"single_word": false,
|
617 |
+
"special": false
|
618 |
+
},
|
619 |
+
"250074": {
|
620 |
+
"content": "β<extra_id_25>",
|
621 |
+
"lstrip": false,
|
622 |
+
"normalized": false,
|
623 |
+
"rstrip": false,
|
624 |
+
"single_word": false,
|
625 |
+
"special": false
|
626 |
+
},
|
627 |
+
"250075": {
|
628 |
+
"content": "β<extra_id_24>",
|
629 |
+
"lstrip": false,
|
630 |
+
"normalized": false,
|
631 |
+
"rstrip": false,
|
632 |
+
"single_word": false,
|
633 |
+
"special": false
|
634 |
+
},
|
635 |
+
"250076": {
|
636 |
+
"content": "β<extra_id_23>",
|
637 |
+
"lstrip": false,
|
638 |
+
"normalized": false,
|
639 |
+
"rstrip": false,
|
640 |
+
"single_word": false,
|
641 |
+
"special": false
|
642 |
+
},
|
643 |
+
"250077": {
|
644 |
+
"content": "β<extra_id_22>",
|
645 |
+
"lstrip": false,
|
646 |
+
"normalized": false,
|
647 |
+
"rstrip": false,
|
648 |
+
"single_word": false,
|
649 |
+
"special": false
|
650 |
+
},
|
651 |
+
"250078": {
|
652 |
+
"content": "β<extra_id_21>",
|
653 |
+
"lstrip": false,
|
654 |
+
"normalized": false,
|
655 |
+
"rstrip": false,
|
656 |
+
"single_word": false,
|
657 |
+
"special": false
|
658 |
+
},
|
659 |
+
"250079": {
|
660 |
+
"content": "β<extra_id_20>",
|
661 |
+
"lstrip": false,
|
662 |
+
"normalized": false,
|
663 |
+
"rstrip": false,
|
664 |
+
"single_word": false,
|
665 |
+
"special": false
|
666 |
+
},
|
667 |
+
"250080": {
|
668 |
+
"content": "β<extra_id_19>",
|
669 |
+
"lstrip": false,
|
670 |
+
"normalized": false,
|
671 |
+
"rstrip": false,
|
672 |
+
"single_word": false,
|
673 |
+
"special": false
|
674 |
+
},
|
675 |
+
"250081": {
|
676 |
+
"content": "β<extra_id_18>",
|
677 |
+
"lstrip": false,
|
678 |
+
"normalized": false,
|
679 |
+
"rstrip": false,
|
680 |
+
"single_word": false,
|
681 |
+
"special": false
|
682 |
+
},
|
683 |
+
"250082": {
|
684 |
+
"content": "β<extra_id_17>",
|
685 |
+
"lstrip": false,
|
686 |
+
"normalized": false,
|
687 |
+
"rstrip": false,
|
688 |
+
"single_word": false,
|
689 |
+
"special": false
|
690 |
+
},
|
691 |
+
"250083": {
|
692 |
+
"content": "β<extra_id_16>",
|
693 |
+
"lstrip": false,
|
694 |
+
"normalized": false,
|
695 |
+
"rstrip": false,
|
696 |
+
"single_word": false,
|
697 |
+
"special": false
|
698 |
+
},
|
699 |
+
"250084": {
|
700 |
+
"content": "β<extra_id_15>",
|
701 |
+
"lstrip": false,
|
702 |
+
"normalized": false,
|
703 |
+
"rstrip": false,
|
704 |
+
"single_word": false,
|
705 |
+
"special": false
|
706 |
+
},
|
707 |
+
"250085": {
|
708 |
+
"content": "β<extra_id_14>",
|
709 |
+
"lstrip": false,
|
710 |
+
"normalized": false,
|
711 |
+
"rstrip": false,
|
712 |
+
"single_word": false,
|
713 |
+
"special": false
|
714 |
+
},
|
715 |
+
"250086": {
|
716 |
+
"content": "β<extra_id_13>",
|
717 |
+
"lstrip": false,
|
718 |
+
"normalized": false,
|
719 |
+
"rstrip": false,
|
720 |
+
"single_word": false,
|
721 |
+
"special": false
|
722 |
+
},
|
723 |
+
"250087": {
|
724 |
+
"content": "β<extra_id_12>",
|
725 |
+
"lstrip": false,
|
726 |
+
"normalized": false,
|
727 |
+
"rstrip": false,
|
728 |
+
"single_word": false,
|
729 |
+
"special": false
|
730 |
+
},
|
731 |
+
"250088": {
|
732 |
+
"content": "β<extra_id_11>",
|
733 |
+
"lstrip": false,
|
734 |
+
"normalized": false,
|
735 |
+
"rstrip": false,
|
736 |
+
"single_word": false,
|
737 |
+
"special": false
|
738 |
+
},
|
739 |
+
"250089": {
|
740 |
+
"content": "β<extra_id_10>",
|
741 |
+
"lstrip": false,
|
742 |
+
"normalized": false,
|
743 |
+
"rstrip": false,
|
744 |
+
"single_word": false,
|
745 |
+
"special": false
|
746 |
+
},
|
747 |
+
"250090": {
|
748 |
+
"content": "β<extra_id_9>",
|
749 |
+
"lstrip": false,
|
750 |
+
"normalized": false,
|
751 |
+
"rstrip": false,
|
752 |
+
"single_word": false,
|
753 |
+
"special": false
|
754 |
+
},
|
755 |
+
"250091": {
|
756 |
+
"content": "β<extra_id_8>",
|
757 |
+
"lstrip": false,
|
758 |
+
"normalized": false,
|
759 |
+
"rstrip": false,
|
760 |
+
"single_word": false,
|
761 |
+
"special": false
|
762 |
+
},
|
763 |
+
"250092": {
|
764 |
+
"content": "β<extra_id_7>",
|
765 |
+
"lstrip": false,
|
766 |
+
"normalized": false,
|
767 |
+
"rstrip": false,
|
768 |
+
"single_word": false,
|
769 |
+
"special": false
|
770 |
+
},
|
771 |
+
"250093": {
|
772 |
+
"content": "β<extra_id_6>",
|
773 |
+
"lstrip": false,
|
774 |
+
"normalized": false,
|
775 |
+
"rstrip": false,
|
776 |
+
"single_word": false,
|
777 |
+
"special": false
|
778 |
+
},
|
779 |
+
"250094": {
|
780 |
+
"content": "β<extra_id_5>",
|
781 |
+
"lstrip": false,
|
782 |
+
"normalized": false,
|
783 |
+
"rstrip": false,
|
784 |
+
"single_word": false,
|
785 |
+
"special": false
|
786 |
+
},
|
787 |
+
"250095": {
|
788 |
+
"content": "β<extra_id_4>",
|
789 |
+
"lstrip": false,
|
790 |
+
"normalized": false,
|
791 |
+
"rstrip": false,
|
792 |
+
"single_word": false,
|
793 |
+
"special": false
|
794 |
+
},
|
795 |
+
"250096": {
|
796 |
+
"content": "β<extra_id_3>",
|
797 |
+
"lstrip": false,
|
798 |
+
"normalized": false,
|
799 |
+
"rstrip": false,
|
800 |
+
"single_word": false,
|
801 |
+
"special": false
|
802 |
+
},
|
803 |
+
"250097": {
|
804 |
+
"content": "β<extra_id_2>",
|
805 |
+
"lstrip": false,
|
806 |
+
"normalized": false,
|
807 |
+
"rstrip": false,
|
808 |
+
"single_word": false,
|
809 |
+
"special": false
|
810 |
+
},
|
811 |
+
"250098": {
|
812 |
+
"content": "β<extra_id_1>",
|
813 |
+
"lstrip": false,
|
814 |
+
"normalized": false,
|
815 |
+
"rstrip": false,
|
816 |
+
"single_word": false,
|
817 |
+
"special": false
|
818 |
+
},
|
819 |
+
"250099": {
|
820 |
+
"content": "β<extra_id_0>",
|
821 |
+
"lstrip": false,
|
822 |
+
"normalized": false,
|
823 |
+
"rstrip": false,
|
824 |
+
"single_word": false,
|
825 |
+
"special": false
|
826 |
+
},
|
827 |
+
"250100": {
|
828 |
+
"content": "<<ENT>>",
|
829 |
+
"lstrip": false,
|
830 |
+
"normalized": false,
|
831 |
+
"rstrip": false,
|
832 |
+
"single_word": false,
|
833 |
+
"special": true
|
834 |
+
},
|
835 |
+
"250101": {
|
836 |
+
"content": "<<SEP>>",
|
837 |
+
"lstrip": false,
|
838 |
+
"normalized": false,
|
839 |
+
"rstrip": false,
|
840 |
+
"single_word": false,
|
841 |
+
"special": true
|
842 |
+
}
|
843 |
+
},
|
844 |
+
"additional_special_tokens": [],
|
845 |
+
"clean_up_tokenization_spaces": false,
|
846 |
+
"eos_token": "</s>",
|
847 |
+
"extra_ids": 0,
|
848 |
+
"extra_special_tokens": {},
|
849 |
+
"legacy": true,
|
850 |
+
"model_max_length": 1000000000000000019884624838656,
|
851 |
+
"pad_token": "<pad>",
|
852 |
+
"sp_model_kwargs": {},
|
853 |
+
"tokenizer_class": "T5Tokenizer",
|
854 |
+
"unk_token": "<unk>"
|
855 |
+
}
|