Update README.md
README.md CHANGED
@@ -1,3 +1,27 @@
----
-license: mit
----
+---
+license: mit
+language:
+- ar
+---
+
+## Checkpoints
+
+### Pre-Trained Models
+
+| Model | Pre-train Dataset | Checkpoint | Tokenizer |
+| --- | --- | --- | --- |
+| ArTST v3 base | Multilingual | [Hugging Face](https://huggingface.co/MBZUAI/ArTSTv3/blob/main/pretrain_checkpoint.pt) | [Hugging Face](https://huggingface.co/MBZUAI/ArTSTv3/blob/main/tokenizer_artstv3.model) |
+
+## Acknowledgements
+
+ArTST is built on the [SpeechT5](https://arxiv.org/abs/2110.07205) architecture. If you use any of the ArTST models, please cite:
+
+```bibtex
+@inproceedings{toyin2023artst,
+  title={ArTST: Arabic Text and Speech Transformer},
+  author={Toyin, Hawau and Djanibekov, Amirbek and Kulkarni, Ajinkya and Aldarmaki, Hanan},
+  booktitle={Proceedings of ArabicNLP 2023},
+  pages={41--51},
+  year={2023}
+}
+```
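
The checkpoint and tokenizer linked in the table above can also be fetched programmatically with `huggingface_hub`. Below is a minimal sketch, assuming the repository id `MBZUAI/ArTSTv3` and the filenames visible in the links; it only downloads the files (loading the checkpoint afterwards, e.g. with fairseq, is not covered here).

```python
# Minimal sketch: fetch the ArTST v3 pre-trained checkpoint and tokenizer
# listed in the table above. Assumes `pip install huggingface_hub`.
from huggingface_hub import hf_hub_download

REPO_ID = "MBZUAI/ArTSTv3"

# Resolve and cache both files locally; each call returns the local file path.
checkpoint_path = hf_hub_download(repo_id=REPO_ID, filename="pretrain_checkpoint.pt")
tokenizer_path = hf_hub_download(repo_id=REPO_ID, filename="tokenizer_artstv3.model")

print("checkpoint:", checkpoint_path)
print("tokenizer :", tokenizer_path)

# Optional: if the tokenizer file is a SentencePiece model (an assumption based on
# its .model extension), it could be loaded like this:
# import sentencepiece as spm
# sp = spm.SentencePieceProcessor(model_file=tokenizer_path)
```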