austinbv committed
Commit 87cabee · verified · 1 Parent(s): d7339d3

Add files using upload-large-folder tool
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
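The hunk above adds `tokenizer.json` to the patterns handled by Git LFS. As a rough sketch of how such slash-free `.gitattributes` globs select files, Python's `fnmatch` approximates Git's basename matching (an approximation, not Git's exact attribute semantics):

```python
from fnmatch import fnmatch

# LFS-tracked patterns after this commit (the subset visible in the hunk above)
LFS_PATTERNS = ["*.zip", "*.zst", "*tfevents*", "tokenizer.json"]

def tracked_by_lfs(path: str) -> bool:
    """Approximate Git's basename matching for slash-free patterns."""
    name = path.rsplit("/", 1)[-1]  # patterns without '/' match the basename
    return any(fnmatch(name, pattern) for pattern in LFS_PATTERNS)
```

Under this rule `tokenizer.json` is stored as an LFS pointer (as its diff later in this commit shows), while small JSON files like `config.json` stay as regular blobs.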
README.md ADDED
@@ -0,0 +1,204 @@
---
library_name: mlx
language:
- ar
- de
- en
- es
- fr
- hi
- id
- it
- pt
- th
- tl
- vi
tags:
- facebook
- meta
- pytorch
- llama
- llama-4
- mlx
extra_gated_prompt: '**LLAMA 4 COMMUNITY LICENSE AGREEMENT**

  Llama 4 Version Effective Date: April 5, 2025

  "**Agreement**" means the terms and conditions for use, reproduction, distribution
  and modification of the Llama Materials set forth herein.

  "**Documentation**" means the specifications, manuals and documentation accompanying
  Llama 4 distributed by Meta at [https://www.llama.com/docs/overview](https://llama.com/docs/overview).

  "**Licensee**" or "**you**" means you, or your employer or any other person or entity
  (if you are entering into this Agreement on such person or entity’s behalf), of
  the age required under applicable laws, rules or regulations to provide legal consent
  and that has legal authority to bind your employer or such other person or entity
  if you are entering in this Agreement on their behalf.

  "**Llama 4**" means the foundational large language models and software and algorithms,
  including machine-learning model code, trained model weights, inference-enabling
  code, training-enabling code, fine-tuning enabling code and other elements of the
  foregoing distributed by Meta at [https://www.llama.com/llama-downloads](https://www.llama.com/llama-downloads).

  "**Llama Materials**" means, collectively, Meta’s proprietary Llama 4 and Documentation
  (and any portion thereof) made available under this Agreement.

  "**Meta**" or "**we**" means Meta Platforms Ireland Limited (if you are located
  in or, if you are an entity, your principal place of business is in the EEA or Switzerland)
  and Meta Platforms, Inc. (if you are located outside of the EEA or Switzerland).

  By clicking "I Accept" below or by using or distributing any portion or element
  of the Llama Materials, you agree to be bound by this Agreement.

  1\. **License Rights and Redistribution**.

  a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable
  and royalty-free limited license under Meta’s intellectual property or other rights
  owned by Meta embodied in the Llama Materials to use, reproduce, distribute, copy,
  create derivative works of, and make modifications to the Llama Materials.

  b. Redistribution and Use.

  i. If you distribute or make available the Llama Materials (or any derivative works
  thereof), or a product or service (including another AI model) that contains any
  of them, you shall (A) provide a copy of this Agreement with any such Llama Materials;
  and (B) prominently display "Built with Llama" on a related website, user interface,
  blogpost, about page, or product documentation. If you use the Llama Materials or
  any outputs or results of the Llama Materials to create, train, fine tune, or otherwise
  improve an AI model, which is distributed or made available, you shall also include
  "Llama" at the beginning of any such AI model name.

  ii. If you receive Llama Materials, or any derivative works thereof, from a Licensee
  as part of an integrated end user product, then Section 2 of this Agreement will
  not apply to you.

  iii. You must retain in all copies of the Llama Materials that you distribute the
  following attribution notice within a "Notice" text file distributed as a part of
  such copies: "Llama 4 is licensed under the Llama 4 Community License, Copyright
  © Meta Platforms, Inc. All Rights Reserved."

  iv. Your use of the Llama Materials must comply with applicable laws and regulations
  (including trade compliance laws and regulations) and adhere to the Acceptable Use
  Policy for the Llama Materials (available at [https://www.llama.com/llama4/use-policy](https://www.llama.com/llama4/use-policy)),
  which is hereby incorporated by reference into this Agreement.

  2\. **Additional Commercial Terms**. If, on the Llama 4 version release date, the
  monthly active users of the products or services made available by or for Licensee,
  or Licensee’s affiliates, is greater than 700 million monthly active users in the
  preceding calendar month, you must request a license from Meta, which Meta may grant
  to you in its sole discretion, and you are not authorized to exercise any of the
  rights under this Agreement unless or until Meta otherwise expressly grants you
  such rights.

  3\. **Disclaimer of Warranty**. UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA MATERIALS
  AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES
  OF ANY KIND, AND META DISCLAIMS ALL WARRANTIES OF ANY KIND, BOTH EXPRESS AND IMPLIED,
  INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY,
  OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING
  THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE LLAMA MATERIALS AND ASSUME ANY
  RISKS ASSOCIATED WITH YOUR USE OF THE LLAMA MATERIALS AND ANY OUTPUT AND RESULTS.

  4\. **Limitation of Liability**. IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE
  UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY,
  OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT,
  SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF META
  OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.

  5\. **Intellectual Property**.

  a. No trademark licenses are granted under this Agreement, and in connection with
  the Llama Materials, neither Meta nor Licensee may use any name or mark owned by
  or associated with the other or any of its affiliates, except as required for reasonable
  and customary use in describing and redistributing the Llama Materials or as set
  forth in this Section 5(a). Meta hereby grants you a license to use "Llama" (the
  "Mark") solely as required to comply with the last sentence of Section 1.b.i. You
  will comply with Meta’s brand guidelines (currently accessible at [https://about.meta.com/brand/resources/meta/company-brand/](https://about.meta.com/brand/resources/meta/company-brand/)).
  All goodwill arising out of your use of the Mark will inure to the benefit of Meta.

  b. Subject to Meta’s ownership of Llama Materials and derivatives made by or for
  Meta, with respect to any derivative works and modifications of the Llama Materials
  that are made by you, as between you and Meta, you are and will be the owner of
  such derivative works and modifications.

  c. If you institute litigation or other proceedings against Meta or any entity (including
  a cross-claim or counterclaim in a lawsuit) alleging that the Llama Materials or
  Llama 4 outputs or results, or any portion of any of the foregoing, constitutes
  infringement of intellectual property or other rights owned or licensable by you,
  then any licenses granted to you under this Agreement shall terminate as of the
  date such litigation or claim is filed or instituted. You will indemnify and hold
  harmless Meta from and against any claim by any third party arising out of or related
  to your use or distribution of the Llama Materials.

  6\. **Term and Termination**. The term of this Agreement will commence upon your
  acceptance of this Agreement or access to the Llama Materials and will continue
  in full force and effect until terminated in accordance with the terms and conditions
  herein. Meta may terminate this Agreement if you are in breach of any term or condition
  of this Agreement. Upon termination of this Agreement, you shall delete and cease
  use of the Llama Materials. Sections 3, 4 and 7 shall survive the termination of
  this Agreement.

  7\. **Governing Law and Jurisdiction**. This Agreement will be governed and construed
  under the laws of the State of California without regard to choice of law principles,
  and the UN Convention on Contracts for the International Sale of Goods does not
  apply to this Agreement. The courts of California shall have exclusive jurisdiction
  of any dispute arising out of this Agreement.'
extra_gated_fields:
  First Name: text
  Last Name: text
  Date of birth: date_picker
  Country: country
  Affiliation: text
  Job title:
    type: select
    options:
    - Student
    - Research Graduate
    - AI researcher
    - AI developer/engineer
    - Reporter
    - Other
  geo: ip_location
  ? By clicking Submit below I accept the terms of the license and acknowledge that
    the information I provide will be collected stored processed and shared in accordance
    with the Meta Privacy Policy
  : checkbox
extra_gated_description: The information you provide will be collected, stored, processed
  and shared in accordance with the [Meta Privacy Policy](https://www.facebook.com/privacy/policy/).
extra_gated_button_content: Submit
extra_gated_heading: Please be sure to provide your full legal name, date of birth,
  and full organization name with all corporate identifiers. Avoid the use of acronyms
  and special characters. Failure to follow these instructions may prevent you from
  accessing this model and others on Hugging Face. You will not have the ability to
  edit this form after submission, so please ensure all information is accurate.
license: other
license_name: llama4
base_model: meta-llama/Llama-4-Scout-17B-16E
pipeline_tag: text-generation
---

# mlx-community/meta-llama-Llama-4-Scout-17B-16E-8bit

This model [mlx-community/meta-llama-Llama-4-Scout-17B-16E-8bit](https://huggingface.co/mlx-community/meta-llama-Llama-4-Scout-17B-16E-8bit) was
converted to MLX format from [meta-llama/Llama-4-Scout-17B-16E](https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E)
using mlx-lm version **0.22.3**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/meta-llama-Llama-4-Scout-17B-16E-8bit")

prompt = "hello"

if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
config.json ADDED
@@ -0,0 +1,85 @@
{
    "architectures": [
        "Llama4ForConditionalGeneration"
    ],
    "boi_token_index": 200080,
    "eoi_token_index": 200081,
    "image_token_index": 200092,
    "model_type": "llama4",
    "quantization": {
        "group_size": 64,
        "bits": 8
    },
    "quantization_config": {
        "group_size": 64,
        "bits": 8
    },
    "text_config": {
        "attention_bias": false,
        "attention_chunk_size": 8192,
        "attention_dropout": 0.0,
        "bos_token_id": 200000,
        "eos_token_id": [
            200001,
            200007,
            200008
        ],
        "for_llm_compressor": false,
        "head_dim": 128,
        "hidden_act": "silu",
        "hidden_size": 5120,
        "initializer_range": 0.02,
        "interleave_moe_layer_step": 1,
        "intermediate_size": 8192,
        "intermediate_size_mlp": 16384,
        "max_position_embeddings": 262144,
        "model_type": "llama4_text",
        "no_rope_layers": [],
        "num_attention_heads": 40,
        "num_experts_per_tok": 1,
        "num_hidden_layers": 48,
        "num_key_value_heads": 8,
        "num_local_experts": 16,
        "output_router_logits": false,
        "pad_token_id": 200018,
        "rms_norm_eps": 1e-05,
        "rope_scaling": {
            "factor": 8.0,
            "high_freq_factor": 4.0,
            "low_freq_factor": 1.0,
            "original_max_position_embeddings": 8192,
            "rope_type": "llama3"
        },
        "rope_theta": 500000.0,
        "router_aux_loss_coef": 0.001,
        "router_jitter_noise": 0.0,
        "torch_dtype": "bfloat16",
        "use_cache": true,
        "use_qk_norm": true,
        "vocab_size": 202048
    },
    "transformers_version": "4.51.0.dev0",
    "vision_config": {
        "attention_dropout": 0.0,
        "hidden_act": "gelu",
        "hidden_size": 1408,
        "image_size": 336,
        "initializer_range": 0.02,
        "intermediate_size": 5632,
        "model_type": "llama4_vision_model",
        "multi_modal_projector_bias": false,
        "norm_eps": 1e-05,
        "num_attention_heads": 16,
        "num_channels": 3,
        "num_hidden_layers": 34,
        "patch_size": 14,
        "pixel_shuffle_ratio": 0.5,
        "projector_dropout": 0.0,
        "projector_input_dim": 4096,
        "projector_output_dim": 4096,
        "rope_theta": 10000,
        "vision_feature_layer": -1,
        "vision_feature_select_strategy": "default",
        "vision_output_dim": 4096
    }
}
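The `quantization` block above (`group_size: 64`, `bits: 8`) describes group-wise affine quantization. Assuming one scale and one bias per group, each stored as a 16-bit float (MLX's usual layout, stated here as an assumption), the effective storage cost per weight can be sketched as:

```python
def effective_bits_per_weight(group_size: int, bits: int, meta_bits: int = 16) -> float:
    """Storage cost per weight for group-wise affine quantization:
    `bits` per element, plus one scale and one bias (`meta_bits` each)
    amortized over every `group_size` elements."""
    return bits + 2 * meta_bits / group_size

# For this model's config: 8 + 2*16/64 = 8.5 bits per weight
print(effective_bits_per_weight(64, 8))
```

That overhead of roughly half a bit per weight is consistent with the shard sizes listed below being somewhat larger than a bare 8-bit packing of the parameters would suggest.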
model-00001-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b7957a7eea5278001c054b9b10f21d66402634ccc1ff6aae2eea4068781c84d2
size 4931801590

model-00002-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0655fcf15b035e6d68fa8f344e1446004bd2c6e75f4ff51f07016fc2fe904216
size 5350583948

model-00003-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:436acee90c15e7a1ba662edb5598c3a527f3d575301543f9ab50dedaf5978e60
size 4922089698

model-00004-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:43b7230b0daa52bd0dcc41e12ac3fdd3c04be63d7e1f461c008fca723d3b5a98
size 5350583958

model-00005-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a91970ec13f8883a3c8aa2a7953d73387fcdd6a6a08e92a7866cc7d76d07b0d0
size 4721438501

model-00006-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f2fab2113283420863c96dcc7e18b02955742ddc6013be4b8bc8ca973c8d2bf5
size 5350584015

model-00007-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:39a3ab3909c1fe69670589d54ac58c0d011ed38be31e0707025728c5ec08332d
size 4721438580

model-00008-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2eed72a138c8655faaae6d95422aac2f662589a1c426d781069c9145b0da92ef
size 5350584057

model-00009-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:baf07304eb9559dded61e078153c8580c9a0f4e8f6c9866a43cfa97dca656aa8
size 4922089776

model-00010-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:94cfc18bcc0dc0224ab352b8814fcfddee148f27dedb81f0d344533ef49c75d9
size 5350584035

model-00011-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:41e0f8b4d973270ad032648a20a36d1df4c49f23f7908d8ebc074855c2201790
size 4721438530

model-00012-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:af57afeda9818c288df3aa4af5e5efc2289f9531ca74077094d2d38d435c384a
size 5350584059

model-00013-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7c34d2541aec5abbf34334bab927e89d1a6c545a3afb0fb78c12c4f5e5579b2d
size 4721438582

model-00014-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:578adf130c292580e4e8ecd2595f4dd924236cce1eac048fde736ae2a4ceceef
size 5350584061

model-00015-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:99bc02fa71808ad169c82489ebf803bb828d02e10f3120a8774b21561ffad48f
size 4922089744

model-00016-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:99fc943c650be698564db49dbb5c311852572634447abdc97e459819a35e9f6c
size 5350584025

model-00017-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f6d9011620134a1b1ee5df4233c78b4fd98a06fd285c74abd9b06e00531b8120
size 4721438558

model-00018-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1ad06a0e8ec65a2e804d1acd5ec957cd73eb9faaa261670c7506be76b2cba1b5
size 5350583983

model-00019-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:92bf7a089e609c1bd48b9efc7814e57767676970541a39d6381dfe5104f024b4
size 4721438554

model-00020-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3d1516e8f311d87ad205695e7cfce1a49b3272ff054b7882c0c07268084711d4
size 5350584051

model-00021-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2c63c5e8dba009ed85b2ef73279f6ef6f33e288a1d8c55643fbe92660373efc2
size 4922089766

model-00022-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:933636a0f53edbdbf0cad88d19b838ebb8a73582bba62f4cc4edb6bf602573e3
size 5350583971

model-00023-of-00023.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0b1b5fe8ea9e0a2ac14c861e8c567c8338006caca2e9c5fa230ade8ba1fed234
size 2700962360
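Each pointer above records only the SHA-256 digest (`oid`) and byte size of the shard stored in LFS. A minimal sketch of an integrity check for a downloaded shard against its pointer fields (`verify_lfs_object` is a hypothetical helper written for this note, not part of any library):

```python
import hashlib

def verify_lfs_object(path: str, oid_sha256: str, size: int) -> bool:
    """Stream the file in 1 MiB chunks, comparing its SHA-256 digest and
    total length to the `oid sha256:` and `size` fields of an LFS pointer."""
    digest = hashlib.sha256()
    total = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
            total += len(chunk)
    return digest.hexdigest() == oid_sha256 and total == size
```

Running it on a downloaded shard with the `oid` and `size` values from the matching pointer confirms the transfer was complete and uncorrupted.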
model.safetensors.index.json ADDED
(diff too large to render)
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
{
    "bos_token": {
        "content": "<|begin_of_text|>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    },
    "eos_token": {
        "content": "<|eot|>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    },
    "pad_token": {
        "content": "<|finetune_right_pad_id|>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    }
}
tokenizer.json ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:172c9eb4beafc72601690da3ccfcede5c2e6806a8d5ec1fca33e22acea8023a4
size 27948578
tokenizer_config.json ADDED
(diff too large to render)