doctorpangloss committed
Commit 8ee2e44 · 1 Parent(s): 3d154ed

Convert to Hugging Face repo format

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. Cosmos-1.0-Guardrail/README.md +0 -528
  2. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/0d30a7faffd5631f68ca99856c40c252b1a5839a.lock +0 -0
  3. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/1a87b8f7340ada18ca4f047077a9d5b13882acc1.lock +0 -0
  4. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/451134b2ddc2e78555d1e857518c54b4bdc2e87d.lock +0 -0
  5. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8.lock +0 -0
  6. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/5f4117005b41815881fe7f26aee4cbec8c55aa32.lock +0 -0
  7. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347.lock +0 -0
  8. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf.lock +0 -0
  9. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a6e931b92caff4c79c5c56282f1e89569a0ae558.lock +0 -0
  10. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/e75756c38e88b19504b139e45c2bb1e925f3863c.lock +0 -0
  11. Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8.lock +0 -0
  12. Cosmos-1.0-Guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/c4d110b05e852cead25fcc7426bf251eb3d15aa0.lock +0 -0
  13. Cosmos-1.0-Guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c.lock +0 -0
  14. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/adapter_config.json +0 -0
  15. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/added_tokens.json +0 -0
  16. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/chat_template.jinja +0 -0
  17. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors +0 -0
  18. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/0d30a7faffd5631f68ca99856c40c252b1a5839a +0 -8
  19. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/1a87b8f7340ada18ca4f047077a9d5b13882acc1 +0 -42
  20. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/451134b2ddc2e78555d1e857518c54b4bdc2e87d +0 -23
  21. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/5f4117005b41815881fe7f26aee4cbec8c55aa32 +0 -298
  22. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347 +0 -0
  23. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/e75756c38e88b19504b139e45c2bb1e925f3863c +0 -26
  24. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/refs/main +0 -1
  25. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/config.json +0 -26
  26. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors.index.json +0 -298
  27. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/special_tokens_map.json +0 -23
  28. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.json +0 -0
  29. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.model +0 -3
  30. Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer_config.json +0 -42
  31. Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/c4d110b05e852cead25fcc7426bf251eb3d15aa0 +0 -33
  32. Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c +0 -3
  33. Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/refs/main +0 -1
  34. Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_config.json +0 -33
  35. Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_model.safetensors +0 -3
  36. Cosmos-1.0-Guardrail/blocklist/custom/branding +0 -209
  37. Cosmos-1.0-Guardrail/blocklist/custom/gore +0 -59
  38. Cosmos-1.0-Guardrail/blocklist/custom/notable +0 -109
  39. Cosmos-1.0-Guardrail/blocklist/custom/violence +0 -6
  40. Cosmos-1.0-Guardrail/blocklist/exact_match/blocked +0 -1339
  41. Cosmos-1.0-Guardrail/blocklist/nltk_data/corpora/wordnet.zip +0 -3
  42. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab.zip +0 -3
  43. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/README +0 -98
  44. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/abbrev_types.txt +0 -118
  45. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/collocations.tab +0 -96
  46. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/ortho_context.tab +0 -0
  47. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/sent_starters.txt +0 -54
  48. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/danish/abbrev_types.txt +0 -211
  49. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/danish/collocations.tab +0 -101
  50. Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/danish/ortho_context.tab +0 -0
Cosmos-1.0-Guardrail/README.md DELETED
@@ -1,528 +0,0 @@
---
license: other
license_name: nvidia-open-model-license
license_link: >-
  https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license
library_name: nemo
tags:
- nvidia
- nemo
- cosmos
extra_gated_prompt: >-
  # NVIDIA Open Model License Agreement

  Version Release Date: December 20, 2024

  This NVIDIA Open Model License Agreement (the "<ins>Agreement</ins>") is a legal agreement between the Legal Entity You represent, or if no entity is identified, You and NVIDIA Corporation and its Affiliates ("<ins>NVIDIA</ins>") and governs Your use of the Models that NVIDIA provides to You under this Agreement. NVIDIA and You are each a "<ins>party</ins>" and collectively the "<ins>parties</ins>."

  NVIDIA models released under this Agreement are intended to be used permissively and enable the further development of AI technologies. Subject to the terms of this Agreement, NVIDIA confirms that:

  * Models are commercially usable.

  * You are free to create and distribute Derivative Models.

  * NVIDIA does not claim ownership to any outputs generated using the Models or Model Derivatives.

  By using, reproducing, modifying, distributing, performing or displaying any portion or element of the Model or Derivative Model, or otherwise accepting the terms of this Agreement, you agree to be bound by this Agreement.

  ## 1. Definitions

  The following definitions apply to this Agreement:

  1.1. "<ins>NVIDIA Cosmos Model</ins>" means a multimodal Model shared under this Agreement.

  1.2. "<ins>Derivative Model</ins>" means all (a) modifications to the Model, (b) works based on the Model, and (c) any other derivative works of the Model. An output is not a Derivative Model.

  1.3. "<ins>Legal Entity</ins>" means the union of the acting entity and all other entities that <ins>control</ins>, are controlled by, or are under common control with that entity. For the purposes of this definition, "<ins>control</ins>" means (a) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (b) ownership of fifty percent (50%) or more of the outstanding shares, or (c) beneficial ownership of such entity.

  1.4. "<ins>Model</ins>" means the machine learning model, software, checkpoints, learnt weights, algorithms, parameters, configuration files and documentation shared under this Agreement.

  1.5. "<ins>You</ins>" or "<ins>Your</ins>" means an individual or Legal Entity exercising permissions granted by this Agreement.

  ## 2. Conditions for Use, License Grant, AI Ethics and IP Ownership

  2.1. Conditions for Use. The Model and any Derivative Model are subject to additional terms as described in Section 2 and Section 3 of this Agreement and govern Your use. If You institute copyright or patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Model or a Derivative Model constitutes direct or contributory copyright or patent infringement, then any licenses granted to You under this Agreement for that Model or Derivative Model will terminate as of the date such litigation is filed. If You bypass, disable, reduce the efficacy of, or circumvent any technical limitation, safety guardrail or associated safety guardrail hyperparameter, encryption, security, digital rights management, or authentication mechanism contained in the Model, your rights under this Agreement will automatically terminate. NVIDIA may update this Agreement to comply with legal and regulatory requirements at any time and You agree to either comply with any updated license or cease Your copying, use, and distribution of the Model and any Derivative Model.

  2.2. License Grant. The rights granted herein are explicitly conditioned on Your full compliance with the terms of this Agreement. Subject to the terms and conditions of this Agreement, NVIDIA hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, revocable (as stated in Section 2.1) license to publicly perform, publicly display, reproduce, use, create derivative works of, make, have made, sell, offer for sale, distribute (through multiple tiers of distribution) and import the Model.

  2.3. AI Ethics. Use of the Models under the Agreement must be consistent with NVIDIA's Trustworthy AI terms found at https://www.nvidia.com/en-us/agreements/trustworthy-ai/terms/.

  2.4. NVIDIA owns the Model and any Model Derivatives created by NVIDIA. Subject to NVIDIA's underlying ownership rights in the Model or its Model Derivatives, You are and will be the owner of Your Model Derivatives. NVIDIA claims no ownership rights in outputs. You are responsible for outputs and their subsequent uses. Except as expressly granted in this Agreement, (a) NVIDIA reserves all rights, interests and remedies in connection with the Model and (b) no other license or right is granted to you by implication, estoppel or otherwise.

  ## 3. Redistribution

  You may reproduce and distribute copies of the Model or Derivative Models thereof in any medium, with or without modifications, provided that You meet the following conditions:

  3.1. If you distribute the Model, You must give any other recipients of the Model a copy of this Agreement and include the following attribution notice within a "Notice" text file with such copies: "Licensed by NVIDIA Corporation under the NVIDIA Open Model License";

  3.2. If you distribute or make available a NVIDIA Cosmos Model, or a product or service (including an AI model) that contains or uses a NVIDIA Cosmos Model, use a NVIDIA Cosmos Model to create a Derivative Model, or use a NVIDIA Cosmos Model or its outputs to create, train, fine tune, or otherwise improve an AI model, you will include "Built on NVIDIA Cosmos" on a related website, user interface, blogpost, about page, or product documentation; and

  3.3. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Models as a whole, provided Your use, reproduction, and distribution of the Model otherwise complies with the conditions stated in this Agreement.

  ## 4. Trademarks

  This Agreement does not grant permission to use the trade names, trademarks, service marks, or product names of NVIDIA, except as required for reasonable and customary use in describing the origin of the Model and reproducing the content of the "Notice" text file.

  ## **5. Disclaimer of Warranty**

  **Unless required by applicable law or agreed to in writing, NVIDIA provides the Model on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Model, Derivative Models and outputs and assume any risks associated with Your exercise of permissions under this Agreement.**

  ## **6. Limitation of Liability**

  **In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, will NVIDIA be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this Agreement or out of the use or inability to use the Model, Derivative Models or outputs (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if NVIDIA has been advised of the possibility of such damages.**

  ## 7. Indemnity

  You will indemnify and hold harmless NVIDIA from and against any claim by any third party arising out of or related to your use or distribution of the Model, Model Derivatives or outputs.

  ## 8. Feedback

  NVIDIA appreciates your feedback, and You agree that NVIDIA may use it without restriction or compensation to You.

  ## 9. Governing Law

  This Agreement will be governed in all respects by the laws of the United States and the laws of the State of Delaware, without regard to conflict of laws principles or the United Nations Convention on Contracts for the International Sale of Goods. The state and federal courts residing in Santa Clara County, California will have exclusive jurisdiction over any dispute or claim arising out of or related to this Agreement, and the parties irrevocably consent to personal jurisdiction and venue in those courts; except that, either party may apply for injunctive remedies or an equivalent type of urgent legal relief in any jurisdiction.

  ## 10. Trade and Compliance

  You agree to comply with all applicable export, import, trade and economic sanctions laws and regulations, as amended, including without limitation U.S. Export Administration Regulations and Office of Foreign Assets Control regulations. These laws include restrictions on destinations, end-users and end-use.
extra_gated_fields:
  By clicking Submit below, I accept the terms of the NVIDIA Open Model License Agreement and acknowledge that I am an adult of legal age of majority in the country in which the Cosmos Models will be used and have authority to accept this Agreement: checkbox
extra_gated_description: >-
  The information you provide will be collected, stored, processed and shared in accordance with the [NVIDIA Privacy Policy](https://www.nvidia.com/en-us/about-nvidia/privacy-policy/).
extra_gated_button_content: Submit
---
# **Cosmos-1.0 Guardrail**

[**Cosmos**](https://huggingface.co/collections/nvidia/cosmos-6751e884dc10e013a0a0d8e6) | [**Code**](https://github.com/NVIDIA/Cosmos) | [**Paper**](https://research.nvidia.com/publication/2025-01_cosmos-world-foundation-model-platform-physical-ai)

# Model Overview

## Description:
**Cosmos World Foundation Models**: A family of highly performant pre-trained world foundation models purpose-built for generating physics-aware videos and world states for physical AI development.

Cosmos Guardrail is a content safety model comprising four components that enforce content safety:

1. Aegis-AI-Content-Safety-LlamaGuard-LLM-Defensive-1.0: An LLM fine-tuned for content safety. It is a parameter-efficient instruction-tuned version of Llama Guard based on Llama 2-7B, trained on NVIDIA's Aegis Content Safety Dataset, which covers NVIDIA's broad taxonomy of 13 critical safety risk categories. See the model card [here](https://huggingface.co/nvidia/Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0).

2. Blocklist: A set of human-curated keywords used to filter out corner cases.

3. Video Content Safety Filter: A multi-class classifier trained to distinguish between safe and unsafe frames of the generated video using [google/siglip-so400m-patch14-384](https://huggingface.co/google/siglip-so400m-patch14-384) SigLIP embeddings.

4. Face Blur Filter: A pixelation filter that uses [RetinaFace](https://github.com/biubug6/Pytorch_Retinaface) to identify facial regions with high confidence scores and applies pixelation to any detections larger than 20x20 pixels.

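Conceptually, the prompt-safety components above run as an ordered ensemble: each checker can veto a prompt, and the first unsafe verdict blocks generation. The sketch below illustrates only that control flow; `BlocklistChecker`, `PromptGuardrail`, and `GuardrailResult` are hypothetical stand-ins, not the released API (the real system pairs the blocklist with the Aegis LLM classifier).

```python
from dataclasses import dataclass

@dataclass
class GuardrailResult:
    is_safe: bool      # True for safe, False for unsafe
    reason: str = ""   # explanation accompanying an unsafe verdict

class BlocklistChecker:
    """Illustrative stand-in for the human-curated keyword blocklist."""

    def __init__(self, blocked_words):
        self.blocked = {w.lower() for w in blocked_words}

    def check(self, prompt: str) -> GuardrailResult:
        # naive whitespace tokenization; the real blocklist is more elaborate
        hits = [w for w in prompt.lower().split() if w in self.blocked]
        if hits:
            return GuardrailResult(False, f"blocklist hit: {hits[0]}")
        return GuardrailResult(True)

class PromptGuardrail:
    """Runs every checker in order; the first unsafe verdict blocks the prompt."""

    def __init__(self, checkers):
        self.checkers = checkers

    def check(self, prompt: str) -> GuardrailResult:
        for checker in self.checkers:
            result = checker.check(prompt)
            if not result.is_safe:
                return result
        return GuardrailResult(True)
```

Under this sketch, a benign prompt passes every checker and returns a safe verdict, while a prompt containing a blocked keyword is rejected with the reason attached.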
**Model Developer**: NVIDIA

## Model Versions

The initial release (v1.0) of Cosmos Guardrail ships with [Cosmos-1.0-Guardrail](https://huggingface.co/nvidia/Cosmos-1.0-Guardrail).

## License:
This model is released under the [NVIDIA Open Model License](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license). For a custom license, please contact [cosmos-license@nvidia.com](mailto:cosmos-license@nvidia.com).

Under the NVIDIA Open Model License, NVIDIA confirms:

* Models are commercially usable.
* You are free to create and distribute Derivative Models.
* NVIDIA does not claim ownership to any outputs generated using the Models or Derivative Models.

**Important Note**: If you bypass, disable, reduce the efficacy of, or circumvent any technical limitation, safety guardrail or associated safety guardrail hyperparameter, encryption, security, digital rights management, or authentication mechanism contained in the Model, your rights under the [NVIDIA Open Model License Agreement](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license) will automatically terminate.

**Additional Information**: [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://github.com/meta-llama/llama-models/blob/main/models/llama2/LICENSE).

## Model Architecture:

* **Aegis**: Llama 2 backbone
* **Video Content Safety Filter**: MLP backbone using SigLIP embeddings
* **Face Blur Filter**: ResNet-50 backbone

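The Video Content Safety Filter in the architecture list is an MLP head over per-frame SigLIP embeddings, with a whole video typically flagged if any single frame is classified unsafe. The sketch below shows that shape of computation only; the weights, dimensions, binary safe/unsafe simplification (the released filter is described as multi-class), and the any-frame aggregation rule are illustrative assumptions, not the released architecture.

```python
import numpy as np

def mlp_head(frame_embeddings, w1, b1, w2, b2):
    """Tiny MLP applied per frame: (T, D) embeddings -> (T,) unsafe probabilities."""
    hidden = np.maximum(frame_embeddings @ w1 + b1, 0.0)  # ReLU layer
    logits = hidden @ w2 + b2                             # one unsafe logit per frame
    return 1.0 / (1.0 + np.exp(-logits))                  # sigmoid -> probability

def video_is_safe(frame_embeddings, params, threshold=0.5):
    """Flag the whole video unsafe if any single frame crosses the threshold."""
    scores = mlp_head(frame_embeddings, *params)
    return bool((scores < threshold).all())
```

The design choice worth noting is the aggregation: per-frame scoring makes the filter robust to brief unsafe segments, since one flagged frame is enough to reject the video.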
## Input/Output Specifications

* **Input Type(s)**: Text, Video
* **Input Format(s)**:
  * Text (str): Input prompt
  * Video (np.ndarray): Video frames
* **Input Parameters**:
  * Text: One-dimensional (1D)
  * Video: Three-dimensional (3D)

## Output:
* **Output Type(s)**: Boolean, Text, Video
* **Output Format(s)**:
  * Boolean: True for safe and False for unsafe
  * Text (str): Reason for the unsafe determination
  * Video (np.ndarray): Processed video frames where faces are blurred
* **Output Parameters**:
  * Boolean: One-dimensional (1D)
  * Text: One-dimensional (1D)
  * Video: Three-dimensional (3D)

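The blurred-frame output described above comes from pixelating each detected face box. A minimal sketch of that operation is a downscale-and-repeat over the region; the function below is an illustrative stand-in (RetinaFace detection is not shown, `block=8` is an arbitrary tile size, and only the 20x20 minimum-detection threshold follows the description).

```python
import numpy as np

def pixelate_region(frame, x, y, w, h, block=8):
    """Pixelate a w x h face box in-place by sampling one value per block x block tile."""
    if w <= 20 or h <= 20:
        # per the model card, only detections larger than 20x20 pixels are pixelated
        return frame
    region = frame[y:y+h, x:x+w]
    small = region[::block, ::block]                      # one sample per tile
    tiled = np.repeat(np.repeat(small, block, axis=0), block, axis=1)
    frame[y:y+h, x:x+w] = tiled[:h, :w]                   # write tiles back over the face
    return frame
```

Note that the function mutates `frame` in place and returns it for convenience; a real pipeline would run it once per detected face box in every frame.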
## Software Integration

**Runtime Engine(s):**
* [Cosmos](https://github.com/NVIDIA/Cosmos)

**Supported Hardware Microarchitecture Compatibility:**
* NVIDIA Blackwell
* NVIDIA Hopper
* NVIDIA Ampere

**Supported Operating System(s):**
* Linux (We have not tested on other operating systems.)

# Usage

See [Cosmos](https://github.com/NVIDIA/Cosmos) for how to use the model.

Example of the prompt-checking portion of the Guardrail:

* Input: `"A dog is playing with a ball."`
* Output: The Guardrail allows generation of this video.

* Input: `"A man wearing only socks."`
* Output: The Guardrail blocks generation of this video.

## Ethical Considerations
NVIDIA believes Trustworthy AI is a shared responsibility and we have established policies and practices to enable development for a wide array of AI applications. When downloaded or used in accordance with our terms of service, developers should work with their internal model team to ensure this model meets requirements for the relevant industry and use case and addresses unforeseen product misuse.

For more detailed information on ethical considerations for this model, please see the subcards of Explainability, Bias, Safety & Security, and Privacy below. Please report security vulnerabilities or NVIDIA AI concerns [here](https://www.nvidia.com/en-us/support/submit-security-vulnerability/).

### Plus Plus (++) Promise

We value you, the datasets, the diversity they represent, and what we have been entrusted with. This model and its associated data have been:
* Verified to comply with current applicable disclosure laws, regulations, and industry standards.
* Verified to comply with applicable privacy labeling requirements.
* Annotated to describe the collector/source (NVIDIA or a third party).
* Characterized for technical limitations.
* Reviewed to ensure proper disclosure is accessible to, maintained for, and in compliance with NVIDIA data subjects and their requests.
* Reviewed before release.
* Tagged for known restrictions and potential safety implications.

### Bias

Field | Response
:---|:---
Participation considerations from adversely impacted groups ([protected classes](https://www.senate.ca.gov/content/protected-classes)) in model design and testing: | None
Measures taken to mitigate against unwanted bias: | None

### Explainability

Field | Response
:---|:---
Intended Application & Domain: | Content moderation for world generation
Model Type: | Ensemble
Intended Users: | Generative AI developers for world generation models
Output: | Boolean
Describe how the model works: | Checks the safety of input prompts or generated videos and outputs a safety classification
Technical Limitations: | The model may not moderate input prompts accurately and may produce incorrect responses.
Verified to have met prescribed NVIDIA quality standards: | Yes
Performance Metrics: | Human Evaluation
Potential Known Risks: | The model's output can potentially classify content considered toxic, offensive, or indecent as safe.
Licensing: | Governing Terms: Use of this model is governed by the [NVIDIA Open Model License](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license). Additional Information: [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://github.com/meta-llama/llama-models/blob/main/models/llama2/LICENSE).

### Privacy

Field | Response
:---|:---
Generatable or reverse engineerable personal information? | None Known
Protected class data used to create this model? | None Known
Was consent obtained for any personal data used? | None Known
How often is the dataset reviewed? | Before Release
Is a mechanism in place to honor data subject rights of access or deletion of personal data? | Not Applicable
If personal data was collected for the development of the model, was it collected directly by NVIDIA? | Not Applicable
If personal data was collected for the development of the model by NVIDIA, do you maintain or have access to disclosures made to data subjects? | Not Applicable
If personal data was collected for the development of this AI model, was it minimized to only what was required? | Not Applicable
Is there provenance for all datasets used in training? | Yes
Does data labeling (annotation, metadata) comply with privacy laws? | Yes
Is data compliant with data subject requests for data correction or removal, if such a request was made? | Not Applicable

### Safety

Field | Response
:---|:---
Model Application(s): | Prompt moderation for world generation
Describe the life-critical impact (if present). | None Known
Use Case Restrictions: | Governing Terms: Use of this model is governed by the [NVIDIA Open Model License](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license). Additional Information: [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://github.com/meta-llama/llama-models/blob/main/models/llama2/LICENSE).
Model and dataset restrictions: | The Principle of Least Privilege (PoLP) is applied, limiting access for dataset generation and model development. Access restrictions are enforced on datasets during training, and dataset license constraints are adhered to. Model checkpoints are made available on Hugging Face and may become available on cloud providers' model catalogs.
333
-
334
- ## **6. Limitation of Liability**
335
-
336
- **In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, will NVIDIA be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this Agreement or out of the use or inability to use the Model, Derivative Models or outputs (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if NVIDIA has been advised of the possibility of such damages.**
337
-
338
- ## 7. Indemnity
339
-
340
- You will indemnify and hold harmless NVIDIA from and against any claim by any third party arising out of or related to your use or distribution of the Model, Model Derivatives or outputs.
341
-
342
- ## 8. Feedback
343
-
344
- NVIDIA appreciates your feedback, and You agree that NVIDIA may use it without restriction or compensation to You.
345
-
346
- ## 9. Governing Law
347
-
348
- This Agreement will be governed in all respects by the laws of the United States and the laws of the State of Delaware, without regard to conflict of laws principles or the United Nations Convention on Contracts for the International Sale of Goods. The state and federal courts residing in Santa Clara County, California will have exclusive jurisdiction over any dispute or claim arising out of or related to this Agreement, and the parties irrevocably consent to personal jurisdiction and venue in those courts; except that, either party may apply for injunctive remedies or an equivalent type of urgent legal relief in any jurisdiction.
349
-
350
- ## 10. Trade and Compliance
351
-
352
- You agree to comply with all applicable export, import, trade and economic sanctions laws and regulations, as amended, including without limitation U.S. Export Administration Regulations and Office of Foreign Assets Control regulations. These laws include restrictions on destinations, end-users and end-use.
353
- extra_gated_fields:
354
- By clicking Submit below, I accept the terms of the NVIDIA Open Model License Agreement and acknowledge that I am an adult of legal age of majority in the country in which the Cosmos Models will be used and have authority to accept this Agreement: checkbox
355
- extra_gated_description: >-
356
- The information you provide will be collected, stored, processed and shared in accordance with the [NVIDIA Privacy Policy](https://www.nvidia.com/en-us/about-nvidia/privacy-policy/).
357
- extra_gated_button_content: Submit
358
- ---
359
- # **Cosmos-1.0 Guardrail**
360
-
361
- [**Cosmos**](https://huggingface.co/collections/nvidia/cosmos-6751e884dc10e013a0a0d8e6) | [**Code**](https://github.com/NVIDIA/Cosmos) | [**Paper**](https://research.nvidia.com/publication/2025-01_cosmos-world-foundation-model-platform-physical-ai)
362
-
363
- # Model Overview
364
-
365
- ## Description:
366
- **Cosmos World Foundation Models**: A family of highly performant pre-trained world foundation models purpose-built for generating physics-aware videos and world states for physical AI development.
367
-
368
- Cosmos Guardrail is a content safety model comprising the following four components that enforce content safety.
369
-
370
- 1. Aegis-AI-Content-Safety-LlamaGuard-LLM-Defensive-1.0: An LLM fine-tuned for content safety. It is a parameter-efficient instruction-tuned version of Llama-Guard based on Llama2-7B, which is trained on NVIDIA's Aegis Content Safety Dataset covering NVIDIA's broad taxonomy of 13 critical safety risk categories. See model card [here](https://huggingface.co/nvidia/Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0).
371
-
372
- 2. Blocklist: A set of human-curated keywords that are used to filter out corner cases.
373
-
374
- 3. Video Content Safety Filter: A multi-class classifier model that is trained to distinguish between safe and unsafe frames of the generated video using SigLIP embeddings [google/siglip-so400m-patch14-384](https://huggingface.co/google/siglip-so400m-patch14-384).
375
-
376
- 4. Face Blur Filter: A pixelation filter that uses [RetinaFace](https://github.com/biubug6/Pytorch_Retinaface) to identify facial regions with high confidence scores and apply pixelation to any detections larger than 20x20 pixels.
377
-
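The face-blur stage described above can be sketched as a block-averaging pixelation over detected boxes. This is a minimal illustration rather than the shipped implementation: the RetinaFace detection step is assumed to have already produced `(x0, y0, x1, y1)` boxes, the frame is represented as a nested list of grayscale values, and the size check mirrors the 20x20-pixel minimum detection size mentioned above.

```python
def pixelate_region(frame, box, block=8):
    """Pixelate frame rows y0..y1, cols x0..x1 by block averaging.

    frame: nested list of rows of grayscale pixel values (ints).
    box:   (x0, y0, x1, y1) detection rectangle (assumed format).
    block: side length of each pixelation cell.
    """
    x0, y0, x1, y1 = box
    # Skip small detections, mirroring the 20x20-pixel threshold.
    if (x1 - x0) < 20 or (y1 - y0) < 20:
        return frame
    for by in range(y0, y1, block):
        for bx in range(x0, x1, block):
            ys = range(by, min(by + block, y1))
            xs = range(bx, min(bx + block, x1))
            # Replace every pixel in the cell with the cell's mean value.
            vals = [frame[y][x] for y in ys for x in xs]
            mean = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    frame[y][x] = mean
    return frame
```

In the real pipeline this would run per frame on the RGB channels of detections above the confidence threshold; the grayscale nested-list form here just keeps the sketch dependency-free.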
378
- **Model Developer**: NVIDIA
379
-
380
- ## Model Versions
381
-
382
- Cosmos 1.0 ships with [Cosmos-1.0-Guardrail](https://huggingface.co/nvidia/Cosmos-1.0-Guardrail).
383
-
384
- ## License:
385
- This model is released under the [NVIDIA Open Model License](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license). For a custom license, please contact [cosmos-license@nvidia.com](mailto:cosmos-license@nvidia.com).
386
-
387
- Under the NVIDIA Open Model License, NVIDIA confirms:
388
-
389
- * Models are commercially usable.
390
- * You are free to create and distribute Derivative Models.
391
- * NVIDIA does not claim ownership of any outputs generated using the Models or Derivative Models.
392
-
393
- **Important Note**: If you bypass, disable, reduce the efficacy of, or circumvent any technical limitation, safety guardrail or
394
- associated safety guardrail hyperparameter, encryption, security, digital rights management, or authentication mechanism contained
395
- in the Model, your rights under [NVIDIA Open Model License Agreement](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license) will automatically terminate.
396
-
397
- **Additional Information**: [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://github.com/meta-llama/llama-models/blob/main/models/llama2/LICENSE).
398
-
399
- ## Model Architecture:
400
-
401
- * **Aegis**: Llama 2 backbone
402
- * **Video Content Safety Filter**: MLP backbone using SigLIP embeddings
403
- * **Face Blur Filter**: ResNet-50 backbone
404
-
405
- ## Input/Output Specifications
406
-
407
- * **Input Type(s)**: Text, Video
408
- * **Input Format(s)**:
409
- * Text (str): Input prompt
410
- * Video (np.ndarray): Video frames
411
- * **Input Parameters**:
412
- * Text: One-dimensional (1D)
413
- * Video: Three-dimensional (3D)
414
-
415
- ## Output:
416
- * **Output Type(s):** Boolean, Text, Video <br>
417
- * **Output Format(s)**:
418
- * Boolean: True for safe and False for unsafe
419
- * Text (str): Reason for the unsafe determination
420
- * Video (np.ndarray): Processed video frames where faces are blurred
421
- * **Output Parameters**:
422
- * Boolean: One-dimensional (1D)
423
- * Text: One-dimensional (1D)
424
- * Video: Three-dimensional (3D)
425
-
426
-
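The input/output specification above maps naturally onto a small result type. The names here are illustrative (the actual interface lives in the Cosmos repo), but they show how the Boolean verdict, the textual reason, and the processed frames travel together:

```python
from dataclasses import dataclass
from typing import Optional, List, Any

@dataclass
class GuardrailResult:
    """Mirrors the output spec: a safety verdict, an optional reason,
    and the processed (face-blurred) video frames."""
    is_safe: bool                       # True for safe, False for unsafe
    reason: str = ""                    # populated only for unsafe content
    frames: Optional[List[Any]] = None  # processed frames (video path only)

def summarize(result: GuardrailResult) -> str:
    # Human-readable one-liner for logs.
    if result.is_safe:
        return "safe"
    return f"unsafe: {result.reason}"
```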
427
436
440
- ## Software Integration
441
- **Runtime Engine(s):**
442
- * [Cosmos](https://github.com/NVIDIA/Cosmos)
443
-
444
- **Supported Hardware Microarchitecture Compatibility:**
445
- * NVIDIA Blackwell
446
- * NVIDIA Hopper
447
- * NVIDIA Ampere
448
-
449
- **Operating System(s):**
450
- * Linux (We have not tested on other operating systems.)
451
-
452
-
453
- # Usage
454
-
455
- See [Cosmos](https://github.com/NVIDIA/Cosmos) on how to use the model.
456
-
457
- Example for the prompt-checking portion of the Guardrail:
458
-
459
- * Input: `"A dog is playing with a ball."`
460
- * Output: Guardrail allows the generation of this video
461
-
462
- * Input: `"A man wearing only socks."`
463
- * Output: Guardrail blocks generation of this video
464
-
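The two examples above can be sketched as a two-stage prompt check that composes the keyword blocklist with the LLM safety classifier. All names here are hypothetical rather than the Cosmos API, and `toy_classifier` is a toy stand-in for the Aegis model:

```python
def check_prompt(prompt, blocklist, classifier):
    """Return (is_safe, reason) for a text prompt."""
    # Stage 1: human-curated keyword blocklist for corner cases.
    words = set(prompt.lower().replace(".", "").split())
    hits = words & blocklist
    if hits:
        return False, "blocklist keyword: " + ", ".join(sorted(hits))
    # Stage 2: content-safety LLM (Aegis in the real pipeline, stubbed here).
    if classifier(prompt) != "safe":
        return False, "classifier flagged prompt"
    return True, ""

def toy_classifier(prompt):
    # Toy stand-in for the Aegis LLM; flags the unsafe example above.
    return "unsafe" if "wearing only" in prompt.lower() else "safe"
```

With these stubs, the first example prompt passes both stages, while the second is rejected at the classifier stage before any video is generated.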
465
- ## Ethical Considerations
466
- NVIDIA believes Trustworthy AI is a shared responsibility and we have established policies and practices to enable development for a wide array of AI applications. When downloaded or used in accordance with our terms of service, developers should work with their internal model team to ensure this model meets requirements for the relevant industry and use case and addresses unforeseen product misuse.
467
-
468
- For more detailed information on ethical considerations for this model, please see the subcards of Explainability, Bias, Safety & Security, and Privacy below. Please report security vulnerabilities or NVIDIA AI Concerns [here](https://www.nvidia.com/en-us/support/submit-security-vulnerability/).
469
-
470
- ### Plus Plus (++) Promise
471
-
472
- We value you, the datasets, the diversity they represent, and what we have been entrusted with. This model and its associated data have been:
473
- * Verified to comply with current applicable disclosure laws, regulations, and industry standards.
474
- * Verified to comply with applicable privacy labeling requirements.
475
- * Annotated to describe the collector/source (NVIDIA or a third party).
476
- * Characterized for technical limitations.
477
- * Reviewed to ensure proper disclosure is accessible to, maintained for, and in compliance with NVIDIA data subjects and their requests.
478
- * Reviewed before release.
479
- * Tagged for known restrictions and potential safety implications.
480
-
481
- ### Bias
482
-
483
- Field | Response
484
- :---------------------------------------------------------------------------------------------------|:---------------
485
- Participation considerations from adversely impacted groups [protected classes](https://www.senate.ca.gov/content/protected-classes) in model design and testing: | None
486
- Measures taken to mitigate against unwanted bias: | None
487
-
488
-
489
- ### Explainability
490
-
491
- Field | Response
492
- :------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------
493
- Intended Application & Domain: | Content moderation for world generation
494
- Model Type: | Ensemble
495
- Intended Users: | Generative AI developers for world generation models
496
- Output: | Boolean
497
- Describe how the model works: | Check safety of input prompts or generated videos and output a safety classification
498
- Technical Limitations: | The model may not moderate input prompts accurately and may produce incorrect responses.
499
- Verified to have met prescribed NVIDIA quality standards: | Yes
500
- Performance Metrics: | Human Evaluation
501
- Potential Known Risks: | The model's output can potentially classify content considered toxic, offensive, or indecent as safe.
502
- Licensing: | Governing Terms: Use of this model is governed by the [NVIDIA Open Model License](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license). Additional Information: [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://github.com/meta-llama/llama-models/blob/main/models/llama2/LICENSE).
503
-
504
-
505
- ### Privacy
506
-
507
- Field | Response
508
- :------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------
509
- Generatable or reverse engineerable personal information? | None Known
510
- Protected class data used to create this model? | None Known
511
- Was consent obtained for any personal data used? | None Known
512
- How often is dataset reviewed? | Before Release
513
- Is a mechanism in place to honor data subject right of access or deletion of personal data? | Not Applicable
514
- If personal data was collected for the development of the model, was it collected directly by NVIDIA? | Not Applicable
515
- If personal data was collected for the development of the model by NVIDIA, do you maintain or have access to disclosures made to data subjects? | Not Applicable
516
- If personal data was collected for the development of this AI model, was it minimized to only what was required? | Not Applicable
517
- Is there provenance for all datasets used in training? | Yes
518
- Does data labeling (annotation, metadata) comply with privacy laws? | Yes
519
- Is data compliant with data subject requests for data correction or removal, if such a request was made? | Not Applicable
520
-
521
- ### Safety
522
-
523
- Field | Response
524
- :---------------------------------------------------|:----------------------------------
525
- Model Application(s): | Prompt moderation for world generation
526
- Describe the life critical impact (if present). | None Known
527
- Use Case Restrictions: | Governing Terms: Use of this model is governed by the [NVIDIA Open Model License](https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license). Additional Information: [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://github.com/meta-llama/llama-models/blob/main/models/llama2/LICENSE).
528
- Model and dataset restrictions: | The principle of least privilege (PoLP) is applied, limiting access for dataset generation and model development. Restrictions enforce dataset access during training, and dataset license constraints are adhered to. Model checkpoints are made available on Hugging Face, and may become available on cloud providers' model catalogs.
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/0d30a7faffd5631f68ca99856c40c252b1a5839a.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/1a87b8f7340ada18ca4f047077a9d5b13882acc1.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/451134b2ddc2e78555d1e857518c54b4bdc2e87d.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/5f4117005b41815881fe7f26aee4cbec8c55aa32.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a6e931b92caff4c79c5c56282f1e89569a0ae558.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/e75756c38e88b19504b139e45c2bb1e925f3863c.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/c4d110b05e852cead25fcc7426bf251eb3d15aa0.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c.lock DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/adapter_config.json DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/added_tokens.json DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/chat_template.jinja DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors DELETED
File without changes
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/0d30a7faffd5631f68ca99856c40c252b1a5839a DELETED
@@ -1,8 +0,0 @@
1
- {
2
- "_from_model_config": true,
3
- "pad_token_id": 0,
4
- "bos_token_id": 1,
5
- "eos_token_id": 2,
6
- "do_sample": false,
7
- "transformers_version": "4.35.2"
8
- }
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/1a87b8f7340ada18ca4f047077a9d5b13882acc1 DELETED
@@ -1,42 +0,0 @@
1
- {
2
- "add_bos_token": true,
3
- "add_eos_token": false,
4
- "added_tokens_decoder": {
5
- "0": {
6
- "content": "<unk>",
7
- "lstrip": false,
8
- "normalized": false,
9
- "rstrip": false,
10
- "single_word": false,
11
- "special": true
12
- },
13
- "1": {
14
- "content": "<s>",
15
- "lstrip": false,
16
- "normalized": false,
17
- "rstrip": false,
18
- "single_word": false,
19
- "special": true
20
- },
21
- "2": {
22
- "content": "</s>",
23
- "lstrip": false,
24
- "normalized": false,
25
- "rstrip": false,
26
- "single_word": false,
27
- "special": true
28
- }
29
- },
30
- "bos_token": "<s>",
31
- "chat_template": "{% if messages|length % 2 == 0 %}{% set role = 'Agent' %}{% else %}{% set role = 'User' %}{% endif %}{{ \"[INST] Task: Check if there is unsafe content in '\" + role + \"' messages in conversations according our safety policy with the below categories.\n\n<BEGIN UNSAFE CONTENT CATEGORIES>\nO1: Violence and Hate. \nShould not\n- Help people plan or engage in violence.\n- Encourage people to engage in violence.\n- Express hateful, derogatory, or demeaning sentiments against people based on sensitive personal characteristics like their race, color, religion, national origin, sexual orientation, gender, gender identity, or disability.\n- Encourage discrimination against people based on those characteristics and should not use slurs or create insults based on those characteristics.\nCan\n- Provide information on violence and discrimination.\n- Discuss the topics of hate and violence, and can discuss historical events involving violence.\nO2: Sexual Content.\nShould not\n- Engage in sexually explicit (i.e., erotic) conversations and should not encourage people to engage in sexual activities.\nCan\n- Discuss topics related to sexuality and sexual education.\n- Provide general information regarding sexual activities.\n- Provide guidance on topics related to sexual health.\nO3: Criminal Planning.\nShould not\n- Help people plan or engage in criminal activities like theft, kidnapping, or financial crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in criminal activities or condone criminal behavior.\nCan\n- Help people understand how crimes are committed, provided that the response is not so detailed that it could be used to provide a roadmap for future crimes.\n- Help people understand how systems work that are used to prevent crimes, provided those descriptions do not provide detailed information on how to defeat those systems.\nO4: Guns and Illegal Weapons.\nShould not\n- Assist in the planning or execution of firearm or illegal weapons crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in firearm or illegal weapons crimes or condone such behavior.\nCan\n- Discuss firearms and the arguments for and against firearm ownership.\nO5: Regulated or Controlled Substances.\nShould not\n- Encourage or assist people in the trafficking, consumption, or use of regulated or controlled substances, including but not limited to illegal drugs, tobacco, alcohol, and cannabis.\n- Assist or encourage people to create such substances when it is illegal to do so.\nCan\n- Discuss the use of regulated or controlled substances, and can provide general information on those substances (ex: what their chemical composition is and what their effects are).\n- Provide information on how those substances are manufactured, provided that the information is not so detailed as to provide step-by-step instructions for their creation.\nO6: Self-Harm.\nShould not\n- Encourage people to harm themselves.\n- Romanticize or condone self-harm or suicide.\n- Provide information regarding the methods of suicide or self-harm.\n- Provide any information or advice, other than information regarding appropriate health resources, to people who express or imply that they wish to harm themselves.\nCan\n- Provide general information regarding self-harm and suicide (ex: statistics, the mental health challenges faced by those who take these actions).\nShould\n- Provide information regarding appropriate health resources to people who imply or express the desire to harm themselves.\n<END UNSAFE CONTENT CATEGORIES>\n\n<BEGIN CONVERSATION>\n\n\" }}{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{% set role = 'User' %}{% elif message['role'] == 'assistant' %}{% set role = 'Agent' %}{% endif %}{{ role + ': ' + content.strip() + '\n\n' }}{% endfor %}{{ \"<END CONVERSATION>\n\nProvide your safety assessment for \" + role + \" in the above conversation:\n- First line must read 'safe' or 'unsafe'.\n- If unsafe, a second line must include a comma-separated list of violated categories. [/INST]\" }}",
32
- "clean_up_tokenization_spaces": false,
33
- "eos_token": "</s>",
34
- "legacy": true,
35
- "model_max_length": 1000000000000000019884624838656,
36
- "pad_token": null,
37
- "sp_model_kwargs": {},
38
- "spaces_between_special_tokens": false,
39
- "tokenizer_class": "LlamaTokenizer",
40
- "unk_token": "<unk>",
41
- "use_default_system_prompt": false
42
- }
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/451134b2ddc2e78555d1e857518c54b4bdc2e87d DELETED
@@ -1,23 +0,0 @@
1
- {
2
- "bos_token": {
3
- "content": "<s>",
4
- "lstrip": false,
5
- "normalized": false,
6
- "rstrip": false,
7
- "single_word": false
8
- },
9
- "eos_token": {
10
- "content": "</s>",
11
- "lstrip": false,
12
- "normalized": false,
13
- "rstrip": false,
14
- "single_word": false
15
- },
16
- "unk_token": {
17
- "content": "<unk>",
18
- "lstrip": false,
19
- "normalized": false,
20
- "rstrip": false,
21
- "single_word": false
22
- }
23
- }
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/5f4117005b41815881fe7f26aee4cbec8c55aa32 DELETED
@@ -1,298 +0,0 @@
1
- {
2
- "metadata": {
3
- "total_size": 13476831232
4
- },
5
- "weight_map": {
6
- "lm_head.weight": "model-00003-of-00003.safetensors",
7
- "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
8
- "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
9
- "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
10
- "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
11
- "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
12
- "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
13
- "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
14
- "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
15
- "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
16
- "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
17
- "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
18
- "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
19
- "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
20
- "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
21
- "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
22
- "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
23
- "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
24
- "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
25
- "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
26
- "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors",
27
- "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
28
- "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
29
- "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
30
- "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
31
- "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
32
- "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
33
- "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
34
- "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
35
- "model.layers.11.input_layernorm.weight": "model-00002-of-00003.safetensors",
36
- "model.layers.11.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
37
- "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
38
- "model.layers.11.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
39
- "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
40
- "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
41
- "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
42
- "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
43
- "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
44
- "model.layers.12.input_layernorm.weight": "model-00002-of-00003.safetensors",
45
- "model.layers.12.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
46
- "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
47
- "model.layers.12.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
48
- "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
49
- "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
50
- "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
51
- "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
52
- "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
53
- "model.layers.13.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.23.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.24.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.30.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.30.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.norm.weight": "model-00003-of-00003.safetensors"
- }
- }
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347 DELETED
Binary file (500 kB)
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/e75756c38e88b19504b139e45c2bb1e925f3863c DELETED
@@ -1,26 +0,0 @@
- {
- "architectures": [
- "LlamaForCausalLM"
- ],
- "attention_bias": false,
- "bos_token_id": 1,
- "eos_token_id": 2,
- "hidden_act": "silu",
- "hidden_size": 4096,
- "initializer_range": 0.02,
- "intermediate_size": 11008,
- "max_position_embeddings": 4096,
- "model_type": "llama",
- "num_attention_heads": 32,
- "num_hidden_layers": 32,
- "num_key_value_heads": 32,
- "pretraining_tp": 1,
- "rms_norm_eps": 1e-05,
- "rope_scaling": null,
- "rope_theta": 10000.0,
- "tie_word_embeddings": false,
- "torch_dtype": "bfloat16",
- "transformers_version": "4.35.2",
- "use_cache": true,
- "vocab_size": 32000
- }
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/refs/main DELETED
@@ -1 +0,0 @@
- dfcfa3409b9994a4722d44e05f82e81ea73c5106
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/config.json DELETED
@@ -1,26 +0,0 @@
- {
- "architectures": [
- "LlamaForCausalLM"
- ],
- "attention_bias": false,
- "bos_token_id": 1,
- "eos_token_id": 2,
- "hidden_act": "silu",
- "hidden_size": 4096,
- "initializer_range": 0.02,
- "intermediate_size": 11008,
- "max_position_embeddings": 4096,
- "model_type": "llama",
- "num_attention_heads": 32,
- "num_hidden_layers": 32,
- "num_key_value_heads": 32,
- "pretraining_tp": 1,
- "rms_norm_eps": 1e-05,
- "rope_scaling": null,
- "rope_theta": 10000.0,
- "tie_word_embeddings": false,
- "torch_dtype": "bfloat16",
- "transformers_version": "4.35.2",
- "use_cache": true,
- "vocab_size": 32000
- }
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors.index.json DELETED
@@ -1,298 +0,0 @@
- {
- "metadata": {
- "total_size": 13476831232
- },
- "weight_map": {
- "lm_head.weight": "model-00003-of-00003.safetensors",
- "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.11.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.11.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.11.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.12.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
- "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.input_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.23.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
- "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
- "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
- "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
158
- "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
159
- "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
160
- "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
161
- "model.layers.24.input_layernorm.weight": "model-00003-of-00003.safetensors",
162
- "model.layers.24.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
163
- "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
164
- "model.layers.24.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
165
- "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
166
- "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
167
- "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
168
- "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
169
- "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
170
- "model.layers.25.input_layernorm.weight": "model-00003-of-00003.safetensors",
171
- "model.layers.25.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
172
- "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
173
- "model.layers.25.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
174
- "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
175
- "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
176
- "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
177
- "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
178
- "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
179
- "model.layers.26.input_layernorm.weight": "model-00003-of-00003.safetensors",
180
- "model.layers.26.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
181
- "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
182
- "model.layers.26.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
183
- "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
184
- "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
185
- "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
186
- "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
187
- "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
188
- "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
189
- "model.layers.27.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
190
- "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
191
- "model.layers.27.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
192
- "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
193
- "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
194
- "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
195
- "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
196
- "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
197
- "model.layers.28.input_layernorm.weight": "model-00003-of-00003.safetensors",
198
- "model.layers.28.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
199
- "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
200
- "model.layers.28.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
201
- "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
202
- "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
203
- "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
204
- "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
205
- "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
206
- "model.layers.29.input_layernorm.weight": "model-00003-of-00003.safetensors",
207
- "model.layers.29.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
208
- "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
209
- "model.layers.29.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
210
- "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
211
- "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
212
- "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
213
- "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
214
- "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
215
- "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
216
- "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
217
- "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
218
- "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
219
- "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
220
- "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
221
- "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
222
- "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
223
- "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
224
- "model.layers.30.input_layernorm.weight": "model-00003-of-00003.safetensors",
225
- "model.layers.30.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
226
- "model.layers.30.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
227
- "model.layers.30.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
228
- "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
229
- "model.layers.30.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
230
- "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
231
- "model.layers.30.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
232
- "model.layers.30.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
233
- "model.layers.31.input_layernorm.weight": "model-00003-of-00003.safetensors",
234
- "model.layers.31.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
235
- "model.layers.31.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
236
- "model.layers.31.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
237
- "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
238
- "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
239
- "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
240
- "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
241
- "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
242
- "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
243
- "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
244
- "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
245
- "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
246
- "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
247
- "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
248
- "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
249
- "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
250
- "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
251
- "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
252
- "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
253
- "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
254
- "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
255
- "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
256
- "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
257
- "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
258
- "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
259
- "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
260
- "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
261
- "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
262
- "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
263
- "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
264
- "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
265
- "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
266
- "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
267
- "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
268
- "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
269
- "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
270
- "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
271
- "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
272
- "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
273
- "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
274
- "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
275
- "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
276
- "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
277
- "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
278
- "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
279
- "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
280
- "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
281
- "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
282
- "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
283
- "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
284
- "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
285
- "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
286
- "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
287
- "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
288
- "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
289
- "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
290
- "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
291
- "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
292
- "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
293
- "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
294
- "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
295
- "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
296
- "model.norm.weight": "model-00003-of-00003.safetensors"
297
- }
298
- }
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/special_tokens_map.json DELETED
@@ -1,23 +0,0 @@
- {
- "bos_token": {
- "content": "<s>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- },
- "eos_token": {
- "content": "</s>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- },
- "unk_token": {
- "content": "<unk>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- }
- }
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.json DELETED
The diff for this file is too large to render. See raw diff
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.model DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
- size 499723
 
Cosmos-1.0-Guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer_config.json DELETED
@@ -1,42 +0,0 @@
- {
- "add_bos_token": true,
- "add_eos_token": false,
- "added_tokens_decoder": {
- "0": {
- "content": "<unk>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "1": {
- "content": "<s>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "2": {
- "content": "</s>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- }
- },
- "bos_token": "<s>",
- "chat_template": "{% if messages|length % 2 == 0 %}{% set role = 'Agent' %}{% else %}{% set role = 'User' %}{% endif %}{{ \"[INST] Task: Check if there is unsafe content in '\" + role + \"' messages in conversations according our safety policy with the below categories.\n\n<BEGIN UNSAFE CONTENT CATEGORIES>\nO1: Violence and Hate. \nShould not\n- Help people plan or engage in violence.\n- Encourage people to engage in violence.\n- Express hateful, derogatory, or demeaning sentiments against people based on sensitive personal characteristics like their race, color, religion, national origin, sexual orientation, gender, gender identity, or disability.\n- Encourage discrimination against people based on those characteristics and should not use slurs or create insults based on those characteristics.\nCan\n- Provide information on violence and discrimination.\n- Discuss the topics of hate and violence, and can discuss historical events involving violence.\nO2: Sexual Content.\nShould not\n- Engage in sexually explicit (i.e., erotic) conversations and should not encourage people to engage in sexual activities.\nCan\n- Discuss topics related to sexuality and sexual education.\n- Provide general information regarding sexual activities.\n- Provide guidance on topics related to sexual health.\nO3: Criminal Planning.\nShould not\n- Help people plan or engage in criminal activities like theft, kidnapping, or financial crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in criminal activities or condone criminal behavior.\nCan\n- Help people understand how crimes are committed, provided that the response is not so detailed that it could be used to provide a roadmap for future crimes.\n- Help people understand how systems work that are used to prevent crimes, provided those descriptions do not provide detailed information on how to defeat those systems.\nO4: Guns and Illegal Weapons.\nShould not\n- Assist in the planning or execution of firearm or illegal weapons crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in firearm or illegal weapons crimes or condone such behavior.\nCan\n- Discuss firearms and the arguments for and against firearm ownership.\nO5: Regulated or Controlled Substances.\nShould not\n- Encourage or assist people in the trafficking, consumption, or use of regulated or controlled substances, including but not limited to illegal drugs, tobacco, alcohol, and cannabis.\n- Assist or encourage people to create such substances when it is illegal to do so.\nCan\n- Discuss the use of regulated or controlled substances, and can provide general information on those substances (ex: what their chemical composition is and what their effects are).\n- Provide information on how those substances are manufactured, provided that the information is not so detailed as to provide step-by-step instructions for their creation.\nO6: Self-Harm.\nShould not\n- Encourage people to harm themselves.\n- Romanticize or condone self-harm or suicide.\n- Provide information regarding the methods of suicide or self-harm.\n- Provide any information or advice, other than information regarding appropriate health resources, to people who express or imply that they wish to harm themselves.\nCan\n- Provide general information regarding self-harm and suicide (ex: statistics, the mental health challenges faced by those who take these actions).\nShould\n- Provide information regarding appropriate health resources to people who imply or express the desire to harm themselves.\n<END UNSAFE CONTENT CATEGORIES>\n\n<BEGIN CONVERSATION>\n\n\" }}{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{% set role = 'User' %}{% elif message['role'] == 'assistant' %}{% set role = 'Agent' %}{% endif %}{{ role + ': ' + content.strip() + '\n\n' }}{% endfor %}{{ \"<END CONVERSATION>\n\nProvide your safety assessment for \" + role + \" in the above conversation:\n- First line must read 'safe' or 'unsafe'.\n- If unsafe, a second line must include a comma-separated list of violated categories. [/INST]\" }}",
- "clean_up_tokenization_spaces": false,
- "eos_token": "</s>",
- "legacy": true,
- "model_max_length": 1000000000000000019884624838656,
- "pad_token": null,
- "sp_model_kwargs": {},
- "spaces_between_special_tokens": false,
- "tokenizer_class": "LlamaTokenizer",
- "unk_token": "<unk>",
- "use_default_system_prompt": false
- }
 
Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/c4d110b05e852cead25fcc7426bf251eb3d15aa0 DELETED
@@ -1,33 +0,0 @@
- {
- "alpha_pattern": {},
- "auto_mapping": null,
- "base_model_name_or_path": "/workspace/hugging_face/hub/models--meta-llama--LlamaGuard-7b/snapshots/3e764390d6b39028ddea5b20603c89476107b41e/",
- "bias": "none",
- "fan_in_fan_out": false,
- "inference_mode": true,
- "init_lora_weights": true,
- "layers_pattern": null,
- "layers_to_transform": null,
- "loftq_config": {},
- "lora_alpha": 32,
- "lora_dropout": 0.05,
- "megatron_config": null,
- "megatron_core": "megatron.core",
- "modules_to_save": null,
- "peft_type": "LORA",
- "r": 16,
- "rank_pattern": {},
- "revision": null,
- "target_modules": [
- "v_proj",
- "lm_head",
- "k_proj",
- "up_proj",
- "gate_proj",
- "q_proj",
- "down_proj",
- "o_proj"
- ],
- "task_type": "CAUSAL_LM",
- "use_rslora": false
- }
 
Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c
- size 162278280
 
Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/refs/main DELETED
@@ -1 +0,0 @@
- f54cb2302ee876705dc0f7df2288f442c034b2f3
 
Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_config.json DELETED
@@ -1,33 +0,0 @@
- {
- "alpha_pattern": {},
- "auto_mapping": null,
- "base_model_name_or_path": "/workspace/hugging_face/hub/models--meta-llama--LlamaGuard-7b/snapshots/3e764390d6b39028ddea5b20603c89476107b41e/",
- "bias": "none",
- "fan_in_fan_out": false,
- "inference_mode": true,
- "init_lora_weights": true,
- "layers_pattern": null,
- "layers_to_transform": null,
- "loftq_config": {},
- "lora_alpha": 32,
- "lora_dropout": 0.05,
- "megatron_config": null,
- "megatron_core": "megatron.core",
- "modules_to_save": null,
- "peft_type": "LORA",
- "r": 16,
- "rank_pattern": {},
- "revision": null,
- "target_modules": [
- "v_proj",
- "lm_head",
- "k_proj",
- "up_proj",
- "gate_proj",
- "q_proj",
- "down_proj",
- "o_proj"
- ],
- "task_type": "CAUSAL_LM",
- "use_rslora": false
- }
 
Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_model.safetensors DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c
- size 162278280
 
Cosmos-1.0-Guardrail/blocklist/custom/branding DELETED
@@ -1,209 +0,0 @@
- 3M
- Acura
- Adidas
- Airpods
- Armani
- AstraZeneca
- Aston Martin
- AMD
- Amy Somerville
- Anthony Theakston
- AT&T
- Audi
- Aventador
- Barbie
- Batman
- Batmobile
- Bayer
- Bently
- BMW
- Boston Dynamics
- Blue Origin
- Bosch
- Brawl Stars
- Brokis
- Budweiser
- Burger King
- Bugatti
- Burberry
- Buick
- Cadiliac
- Canon
- Cartier
- Cassina
- Chanel
- Chick-fil-A
- Cisco
- Chevron
- Chromebook
- Chrystler
- Citroen
- Colgate
- CocaCola
- Countach
- DC Comics
- Daimler
- Dell
- DHL
- Dior
- Disney
- Duracell
- Dunkin Donuts
- Eichholtz
- Entler Studio
- Elemento
- Electrolux
- Estee Lauder
- Exxon
- Fedex
- Ferrari
- Ferdinand Kramer
- Fiat
- Fiskars
- Fisker
- Ford
- Fortnite
- Gatorade
- Gandalf
- Gameboy
- Gamecube
- General Motors
- Gillette
- Glock
- GMC
- Graef
- Grant Featherston Contour
- Gucci
- Gulfstream
- Gundam
- Haier
- Harley Davidson
- Harry Potter
- Heineken
- Hennessy
- Hermes
- Hitachi
- Hogwarts
- Homepod
- Honda
- Honeywell
- Hoka
- Huawei
- Huracan
- Ikea
- Imac
- Ipad
- Iphone
- Ipod
- Isuzu
- Infiniti
- Intel
- Jabulani
- John Deere
- John Pomp Studios
- Kelloggs
- KFC
- Kia
- Land Rover
- Lamborghini
- Lancel
- Lego
- Lenovo
- Lexus
- LG
- L'Oreal
- Lockheed
- Louis Vuitton
- Lululemon
- Macbook
- Martinelli Luce
- Mattel
- Mazda
- McDonalds
- McDonald's
- McClaren
- Medtronic
- Mercedes
- Mettler Toledo
- Michelin
- Microsoft
- Mini Cooper
- Mitsubishi
- Miura
- MLB
- Mondial
- Montblanc
- Motopanfilo
- Motorola
- Mortal Kombat
- Murciélago
- N64
- NBA
- NCAA
- Nike
- Nintendo
- Nissan
- Northrop Grumman
- Nokia
- NVIDIA
- Oakley
- Panasonic
- Pantene
- Piaget
- Pepsi
- Peugeot
- Philips
- Playstation
- Poltrona Frau
- Porsche
- Prada
- Publix
- Purina
- Rakuten
- Rayban
- Raytheon
- Reebok
- Red Bull
- Renault
- Rimet
- Rivian
- Roche
- Rolex
- Rolls Royce
- Saleen
- Samsung
- Schwan-Stabilo
- Serapian
- Shelby
- Siemens
- SpaceX
- Starbucks
- Stanley Cup
- Stavros
- Steiner Optics
- Sonos
- Sony
- Skidata
- Superman
- Subaru
- Suzuki
- Taylormade
- Tesla
- Terminator
- Toyota
- Tuttuno
- USPS
- Vespa
- Verizon
- Volocopter
- Volvo
- Volkswagen
- Walmart
- Wii
- WonderGlass
- Woolworths
- Xiaomi
- Xbox
- YKK
- Zaha Hadid
- Zuny
 
Cosmos-1.0-Guardrail/blocklist/custom/gore DELETED
@@ -1,59 +0,0 @@
- blood
- horror
- gross
- gore
- butcherly
- brutal
- zombie
- crucifixion
- secretion
- horrific
- disgusting
- carnage
- gritty
- creepy
- grotesque
- gruesome
- macabre
- harrowing
- freaky
- morbid
- hideous
- bloodstream
- horrifying
- hemorrhage
- lurid
- sordid
- bloodthirsty
- ghastly
- gratuitous
- nightmarish
- savagery
- cronenberg
- sadism
- salacious
- barbarity
- gunplay
- heart wrenching
- ghoulish
- tawdry
- nauseating
- titillating
- bloodthirsty
- pulpy
- systole
- fetishistic
- sanguinary
- discomforting
- bulletstorm
- slaughterous
- sanguineous
- bespatter
- beflake
- deskin
- deskinned
- deskinning
- haunting
- hauntingly
- lifeless
- bleeding
 
Cosmos-1.0-Guardrail/blocklist/custom/notable DELETED
@@ -1,109 +0,0 @@
- The Nightmare Before Christmas
- Snow White and the Seven Dwarfs
- Snow White
- the Seven Dwarfs
- Baymax
- Big Hero 6
- zootopia
- aristocats
- Mary Poppins
- Inside Out
- Nemo
- Finding Nemo
- Monsters, Inc.
- Buzz Lightyear
- DuckTales
- 101 Dalmatians
- Lilo & Stitch
- Mickey Mouse
- Minnie Mouse
- Mary Poppins
- Mulan
- The Lion King
- Oliver & Company
- Dumbo
- Bugs Bunny
- Daffy Duck
- Porky Pig
- Elmer Fudd
- Tweety Bird
- Sylvester the Cat
- Yosemite Sam
- Foghorn Leghorn
- Marvin the Martian
- Pepé le Pew
- Superman
- Batman
- Wonder Woman
- Green Lantern
- The Flash
- Aquaman
- Harry Potter
- Frodo Baggins
- Gandalf the Grey
- Bilbo Baggins
- The Joker
- Lex Luthor
- Darkseid
- Sinestro
- Brainiac
- Black Adam
- Ra's al Ghul
- The Penguin
- Mr. Freeze
- Lord Voldemort
- Tom and Jerry
- Scooby-Doo
- Sylvester & Tweety
- The Flintstones
- Johnny Bravo
- Popeye
- Yogi Bear
- Ant-Man
- Captain America
- Captain Marvel
- Hawkeye
- Magneto
- She-Hulk
- Silver Surfer
- Spider-Man
- Spider-Woman
- Star-Lord
- Thanos
- Super Mario
- Princess Peach
- Bowser
- Toadette
- Yoshi
- Wario
- Waluigi
- Donkey Kong
- Diddy Kong
- Rosalina
- Bowser Jr.
- Koopaling
- Princess Zelda
- Ganondorf
- Pikachu
- Charizard
- Bulbasaur
- Squirtle
- Jigglypuff
- Meowth
- Lucario
- Greninja
- Mewtwo
- Eevee
- Trump
- Monet
- Mona Lisa
- Jensen Huang
- Mark Zuckerberg
- Avatar
- Obama
- Oprah Winfrey
- Tom Cruise
- Lady Gaga
- Kim Kardashian
- Taylor Swift
- Jennifer Lopez
 
Cosmos-1.0-Guardrail/blocklist/custom/violence DELETED
@@ -1,6 +0,0 @@
- war
- torture
- murder
- sniper
- shotgun
- rifle
 
Cosmos-1.0-Guardrail/blocklist/exact_match/blocked DELETED
@@ -1,1339 +0,0 @@
- Abu
- Adonis Creed
- Aerith Gainsborough
- Agatha Harkness
- Agnes
- Ahri
- Ahsoka Tano
- Aku
- Akuma
- Aladdin
- Alan Partridge
- Alex
- Alex Levy
- Alex the Lion
- Ali G
- Alice
- Alpha Pig
- Alphonse Elric
- Alyx Vance
- America Chavez
- Amethyst
- Amicia de Rune
- Anakin
- Anakin Skywalker
- Anastasia
- Anastasia Tremaine
- Angelica Pickles
- Angelina Ballerina
- Anger
- Angie
- Angus
- Anna
- Annie
- Anpanman
- Ant-Man
- Anton Chigurh
- Anton Ego
- Apollo Justice
- Aqua Teen Carl
- Aquaman
- Archie Andrews
- Archimedes
- Ariel
- Arkwright
- Arlo
- Arnim Zola
- Arnold Shortman
- Arrietty
- Arthas Menethil
- Arthur
- Arthur Morgan
- Arthur Nudge
- Arthur Read
- Ash Ketchum
- Ashitaka
- Asterix
- Astrid
- Astro Boy
- Astroboy
- Atom Ant
- Atreus
- Aunt Jemima
- Aviva Corcovado
- B.O.B.
- Baba Looey
- Baba Voss
- Baby Bop
- Baby Yoda
- Bagheera
- Baldrick
- Baloo
- Balthazar Bratt
- Bambi
- Bamm-Bamm Rubble
- Barbie
- Barley Lightfoot
- Barney
- Barney Rubble
- Barnyard Dawg
- Baron Humbert von Gikkingen
- Barry B. Benson
- Bart
- Bart Simpson
- Bartok
- Bastion Narrator
- Batgirl
- Batman
- Baymax
- Bayonetta
- Baze Malbus
- BB-8
- Beaky Buzzard
- Beast and Belle
- Beast Boy
- Beavis
- Beetle Bailey
- Belle and Beast
- Belle and the Beast
- Bender
- Benjamin Buford "Bubba" Blue
- Benson
- Berlioz
- Bertie
- Betty Boop
- Betty Cooper
- Betty Rubble
- Bicycle Repair Man
- Biff Tannen
- Big Boss
- Big Daddy
- Big Nate
- Biggie
- Biggus Dickus
- Bikini Bottom
- Bill the Cat
- Billy Mack
- Birdie
- Bishop
- Bitzer
- BJ
- Black Adam
- Black Knight
- Black Panther
- Black Widow
- Blackadder
- Blathers
- Blondie Bumstead
- Blue Beetle (Jaime Reyes)
- BMO
- Bo Peep
- Bo-Katan Kryze
- Bob Parr
- Bob the Builder
- Bob the Minion
- Boba Fett
- Bobby Hill
- Boo
- Boo-Boo Bear
- Boog
- Booker DeWitt
- Boomhauer
- Bow Lion
- Bowser
- Bowser Jr.
- Bradley Jackson
- Brain (Alan Powers)
- Brainiac
- Brainy Smurf
- Brak
- Brian Cohen
- Bridget
- Brock
- Brock Samson
- Brother Maynard
- Bud Brigman
- Buddy the Elf
- Buddy Tyrannosaurus
- Bugs Bunny
- Bullwinkle
- Buster Baxter
- Buster Moon
- Butt Head
- Butt-Head
- ButtHead
- Buzz Lightyear
- BuzzBee
- C-3PO
- Caesar
- Caillou
- Calcifer
- Calvin
- Calvin's Mom
- Cap'n Crunch
- Cap'n Turbot
- Cappy
- Captain America
- Captain Crunch
- Captain Falcon
- Captain Gutt
- Captain Haddock
- Captain Huggyface
- Captain Mainwaring
- Captain Rex
- Carl Fredricksen
- Carol Danvers (Captain Marvel)
185
- Carol Danvers (Captain Marvel)
186
- Carrie White
187
- Cartoon mouse
188
- Casper
189
- Casper the Friendly Ghost
190
- Cassandra
191
- Cassian Andor
192
- Catbus
193
- Catwoman
194
- Celeste
195
- Chara
196
- Charlie B. Barkin
197
- Charlie Brown
198
- Charlie Dog
199
- Charlie the Tuna
200
- Charlotte
201
- Chef Boyardee
202
- Chell
203
- Chester Cheetah
204
- Chester V
205
- Chewbacca
206
- Chickaletta
207
- Chicken Joe
208
- Chihiro Ogino
209
- Chirrut Imwe
210
- Chloe
211
- Chris Kratt
212
- Christopher Robin
213
- Chuck Noland
214
- Chuckie Finster
215
- Chun-Li
216
- Cinderella
217
- Cindy Lou Who
218
- Ciri
219
- Claptrap
220
- Clarabel
221
- Clarence
222
- Clarence Odbody
223
- Clark Griswold
224
- classical animation
225
- Claude Cat
226
- Cleo
227
- Clifford
228
- Clippy
229
- Cloud Strife
230
- Clumsy Smurf
231
- Coach Beard
232
- Coco
233
- Cody Maverick
234
- Cogsworth
235
- Colonel Hathi
236
- Colonel Miles Quaritch
237
- Colonel Sanders
238
- Commander Shepard
239
- Compo Simmonite
240
- Connie Maheswaran
241
- Constantine
242
- Coraline Jones
243
- Cortana
244
- Cory Ellison
245
- Count Chocula
246
- Count Dooku
247
- Count Dracula
248
- Cousin Eddie
249
- Cranky
250
- Crash Bandicoot
251
- Crisbell
252
- Cruella de Vil
253
- Cruella de Ville
254
- Cuphead
255
- Curious George
256
- Cyborg
257
- Cyclops
258
- D.Va
259
- D.W. Read
260
- Daddy Pig
261
- Daddy Warbucks
262
- Daffy Duck
263
- Dagwood Bumstead
264
- Daisy Duck
265
- Dale Gribble
266
- Daniel Tiger
267
- Dante
268
- Daphne Blake
269
- Daredevil
270
- Darkseid
271
- Darth Vader
272
- Dash
273
- Dash Parr
274
- David Brent
275
- Deadpool
276
- Deathstroke
277
- Dee Dee
278
- Del Boy
279
- Demoman
280
- Dennis
281
- Dennis Mitchell
282
- Devi D.
283
- Dewey Duck
284
- Dexter
285
- Dib
286
- Dib Membrane
287
- Dick Tracy
288
- Diddy Kong
289
- Dig’em Frog
290
- Dil Pickles
291
- Dilbert
292
- Din Djarin
293
- Din Song
294
- Disgust
295
- Dixie Kong
296
- Doc Hudson
297
- Doctor Doom
298
- Doctor Strange
299
- Dogbert
300
- Domino
301
- Don Lino
302
- Don Pteranodon
303
- Donald Duck
304
- Donkey Kong
305
- Doom Slayer
306
- Dora
307
- Dora Marquez
308
- Doraemon
309
- Dorothy
310
- Dorothy Gale
311
- Dorothy the Dinosaur
312
- Dorothy Turner
313
- Dory
314
- Doug Funnie
315
- Dr. Cockroach
316
- Dr. Frank-N-Furter
317
- Dr. Girlfriend
318
- Dr. Grace Augustine
319
- Dr. Robotnik
320
- Dr. Two Brains
321
- Dracula
322
- Drax
323
- Drax the Destroyer
324
- Drizella Tremaine
325
- Duchess
326
- Dudley Do-Right
327
- Duffy Bear
328
- Duke
329
- Dumbo
330
- Ebenezer Scrooge
331
- Ed Bighead
332
- Eddie Valiant
333
- Edgar Balthazar
334
- Edith
335
- Edna Mode
336
- Edward
337
- Edward Elric
338
- Eep Crood
339
- Eeyore
340
- Egghead Jr.
341
- Elastigirl
342
- Elektra
343
- Ella
344
- Ellen Ripley
345
- Ellie
346
- Elliot
347
- Elmer Fudd
348
- Elroy Jetson
349
- Elsa
350
- Emile
351
- Emily
352
- Emily Dickinson
353
- Emily Elizabeth Howard
354
- Emmett "Doc" Brown
355
- Emperor Palpatine
356
- Eren Yeager
357
- Eric Cartman
358
- Esmeralda
359
- EVE
360
- Evil Queen
361
- Ewoks
362
- Explosm Cyanide Character
363
- Ezra Bridger
364
- Fairy Godmother
365
- Falco Lombardi
366
- Falcon
367
- Father Ted Crilly
368
- Fear
369
- Felix the Cat
370
- Ferb Fletcher
371
- Ferdinand
372
- Figaro
373
- Finn
374
- Finn the Human
375
- Fiona
376
- Fireman Sam
377
- Fishlegs
378
- Flint Lockwood
379
- Flit
380
- Flo
381
- Flounder
382
- Flynn Rider
383
- Foghorn Leghorn
384
- Forky
385
- Forrest Gump
386
- Fox McCloud
387
- Francine Frensky
388
- Francine Peters
389
- Francois Turbot
390
- Frank Spencer
391
- Franklin
392
- Fred
393
- Fred Flintstone
394
- Frieza
395
- Frigga
396
- Frisk
397
- Frosty the Snowman
398
- Frou-Frou
399
- Frozone
400
- Frylock
401
- Fudgie the Whale
402
- Gabi
403
- Gabriela
404
- Gambit
405
- Gamora
406
- Gandalf
407
- Ganondorf
408
- Garfield
409
- Garfield’s Nermal
410
- Gargamel
411
- Garnet
412
- Garrus Vakarian
413
- Gaz
414
- Geico Gecko
415
- General Grievous
416
- Genie
417
- Genji
418
- Geoffrey the Giraffe
419
- George Bailey
420
- George Jetson
421
- George McFly
422
- George Pig
423
- Gerald
424
- Gerald Johanssen
425
- Geraldine Granger
426
- Geralt
427
- Geralt of Rivia
428
- Ghost Rider
429
- Gideon
430
- Gidget
431
- Ginny Grainger
432
- GIR
433
- GLaDOS
434
- Globox
435
- Gloria
436
- Gloria the Hippo
437
- Gobber
438
- Gohan
439
- Goko
440
- Goku
441
- Goldfish Cracker
442
- Gon Freecss
443
- Goofy
444
- Gordon
445
- Gordon Freeman
446
- Gossamer
447
- Gotham City
448
- Gran
449
- Gran'ma Ben
450
- Grand Admiral Thrawn
451
- Grandmaster
452
- Granny
453
- Green Arrow
454
- Green Lantern (Hal Jordan)
455
- Greg Universe
456
- Gregg
457
- Grimace
458
- Grogu
459
- Grogu (Baby Yoda)
460
- Gromit
461
- Groot
462
- Gru
463
- Grug Crood
464
- Guile
465
- Gwen Stacy
466
- Hagar the Horrible
467
- Haku
468
- HAL 9000
469
- Hal Stewart / Tighten
470
- Hamm
471
- Hammy
472
- Han Solo
473
- Hancock
474
- Handsome Jack
475
- Haniwa
476
- Happy Noodle Boy
477
- Harley Quinn
478
- Harold
479
- Harry Potter
480
- Harry Tasker
481
- Hat Kid
482
- Hawkeye
483
- Heat Miser
484
- Heather
485
- Hector
486
- Hector the Bulldog
487
- Heffer Wolfe
488
- Hei Hei
489
- Heihachi Mishima
490
- Heimdall
491
- Hela
492
- Helen Hunt
493
- Helen Tasker
494
- Helga Pataki
495
- Hello Kitty
496
- Henery Hawk
497
- Henry
498
- Henry the Octopus
499
- Hera Syndulla
500
- Hercules
501
- Hermey the Elf
502
- Hi-5
503
- Hiccup
504
- Hiro
505
- Hisoka
506
- Hobbes
507
- Hogwarts
508
- Homer
509
- Homer Simpson
510
- Homestar Runner
511
- Homsar
512
- Honest John
513
- Horton
514
- Howard Hughes
515
- Hubie
516
- Huckleberry Hound
517
- Huey Duck
518
- Hugo de Rune
519
- Hulk
520
- Human Torch
521
- Humphrey Bogart
522
- Hyacinth Bucket
523
- Héctor Rivera
524
- Ian Lightfoot
525
- Ice King
526
- Incontinentia Buttocks
527
- Indiana Jones
528
- Inkling
529
- Inspector Clouseau
530
- Inuyasha
531
- Invader Skoodge
532
- Invisible Woman
533
- Iron Fist
534
- Iron Man
535
- Isaac Clarke
536
- Isabelle
537
- Itchy Itchiford
538
- Ivy Valentine
539
- Jack Dawson
540
- Jack Skellington
541
- Jack-Jack
542
- Jack-Jack Parr
543
- Jackie Brown
544
- Jafar
545
- Jailbreak
546
- Jake from State Farm
547
- Jake Sully
548
- Jake the Dog
549
- James
550
- James Bond
551
- James Henry Trotter
552
- Jane Jetson
553
- Jane Porter
554
- Jasmine
555
- Jean Grey
556
- Jeff
557
- Jennifer Parker
558
- Jenny Curran
559
- Jerry
560
- Jesper
561
- Jess (the Cat)
562
- Jessica Jones
563
- Jessica Rabbit
564
- Jessie
565
- Jett
566
- Jiji
567
- Jill Valentine
568
- Jim Raynor
569
- Jiminy Cricket
570
- Jimmy Z
571
- Jin Kazama
572
- Jinx
573
- Joe Camel
574
- Joe Gardner
575
- Joel
576
- Joel Miller
577
- Johann Gambolputty...
578
- John Connor
579
- John Marston
580
- John McClane
581
- John Smith
582
- Johnny
583
- Johnny Bravo
584
- Johnny C. (Nny)
585
- Johnny Loughran
586
- JoJo McDodd
587
- Jon Arbuckle
588
- Josie McCoy
589
- Judge Doom
590
- Judy Hopps
591
- Judy Jetson
592
- Julian Pearce
593
- Jyn Erso
594
- K-2SO
595
- K.K. Slider
596
- Kaa
597
- Kai
598
- Kakashi Hatake
599
- Kamala Khan (Ms. Marvel)
600
- Kanga
601
- Katchoo
602
- Kate Bishop
603
- Katie Mitchell
604
- Kazooie
605
- Kazuya Mishima
606
- Keeley Jones
607
- Ken
608
- Ken Masters
609
- Kerrigan
610
- Kevin
611
- Kevin McCallister
612
- Kevin the Minion
613
- Kiki
614
- Killjoy
615
- Killmonger (Erik Stevens)
616
- Killua Zoldyck
617
- King Arthur
618
- King Candy
619
- King Dedede
620
- King Gristle Jr.
621
- King Julien
622
- King Knight
623
- King Louie
624
- King of the Hill
625
- King Triton
626
- King Vitamin
627
- Kingpin
628
- Kirby
629
- Kitana
630
- Kofun
631
- Koki
632
- Kool-Aid
633
- Kool-Aid Man
634
- Korath
635
- Korg
636
- Kowalski
637
- Kraglin
638
- Kratos
639
- Kris Kringle
640
- Kristoff
641
- Kurapika
642
- Kuzco
643
- Kyle Reese
644
- Kylo Ren
645
- Lady Eboshi
646
- Lady Sif
647
- Lady Tremaine
648
- Lando Calrissian
649
- Lara
650
- Lara Croft
651
- Larry the Lobster
652
- Lars Barriga
653
- Leanne Grayson
654
- Lenny
655
- Lenore
656
- Leon Kennedy
657
- Levi Ackerman
658
- Lex Luthor
659
- Liara T'Soni
660
- Liberty
661
- Lieutenant Dan Taylor
662
- Light Yagami
663
- Lightning McQueen
664
- Lil DeVille
665
- Lilo
666
- Linda Gunderson
667
- Lindsey Brigman
668
- Linus
669
- Linus van Pelt
670
- Lisa
671
- Lisa Simpson
672
- Little Caesar
673
- Little Orphan Annie
674
- Liu Kang
675
- Logan
676
- Loki
677
- Lola Bunny
678
- Lord Shen
679
- Lorraine Baines
680
- Lotso
681
- Louie Duck
682
- Luanne Platter
683
- Lucas
684
- Lucifer
685
- Lucky Eddie
686
- Lucky the Leprechaun
687
- Lucy Van Pelt
688
- Luigi
689
- Luke
690
- Luke Cage
691
- Luke Skywalker
692
- Luke Triton
693
- Lumberjack
694
- Lumière
695
- M'Baku
696
- Mace Windu
697
- Madam Mim
698
- Madame Adelaide Bonfamille
699
- Madeline
700
- Mae Borowski
701
- Mafalda
702
- Maggie Simpson
703
- Maghra
704
- Magilla Gorilla
705
- Magneto
706
- Mai Shiranui
707
- Maisy
708
- Malcolm Tucker
709
- Maleficent
710
- Mandalorian
711
- Mandark
712
- Mando
713
- Manny
714
- Marceline
715
- Margaret Tiger
716
- Marge
717
- Marge Simpson
718
- Margo
719
- Margo Leadbetter
720
- Maria Hill
721
- Maria Rambeau
722
- Marina
723
- Mario
724
- Marlin
725
- Marmaduke
726
- Marshall
727
- Martian Manhunter
728
- Martin Bryce
729
- Martin Kratt
730
- Marty
731
- Marty McFly
732
- Marty the Zebra
733
- Marvin Acme
734
- Marvin the Martian
735
- Mary Poppins
736
- Master Chief
737
- Master Shake
738
- Mater
739
- Mavis Dracula
740
- Maximus
741
- Maya Fey
742
- Mayor Goodway
743
- Meatwad
744
- Meeko
745
- Meena
746
- Mega Man
747
- Megamind
748
- Megara
749
- Mei Kusakabe
750
- Mel
751
- Melman
752
- Melman the Giraffe
753
- Melody
754
- Meowth
755
- Merida
756
- Merlin
757
- Merryweather
758
- Meta Knight
759
- Metro Man
760
- Micah Keith
761
- Michigan J. Frog
762
- Mickey Mouse
763
- Miek
764
- Mighty Mouse
765
- Miguel
766
- Miguel Rivera
767
- Mike Wazowski
768
- Miles Edgeworth
769
- Miles Morales
770
- Milo
771
- Minion
772
- Minnie Mouse
773
- Miranda
774
- Misa Amane
775
- Misato Katsuragi
776
- Mitch Kessler
777
- Mitsurugi
778
- Moana
779
- Mojo Jojo
780
- Monica Rambeau
781
- Monkey D. Luffy
782
- Monkeybone
783
- Monstro
784
- Mordecai
785
- Morris the Cat
786
- Morty Smith
787
- Mowgli
788
- Mr. Bean
789
- Mr. Clean
790
- Mr. Creosote
791
- Mr. Eric Praline
792
- Mr. Incredible
793
- Mr. Krabs
794
- Mr. Peabody
795
- Mr. Peanut
796
- Mr. Potato Head
797
- Mr. Potatohead
798
- Mr. Toad
799
- Mr. Wilson
800
- Mrs. Brisby
801
- Mrs. Brown
802
- Mrs. Gump
803
- Mrs. Merton
804
- Mrs. Potts
805
- Mrs. Spider
806
- Ms. Pac-Man
807
- Mufasa
808
- Muffy Crosswire
809
- Mugman
810
- Mugsy
811
- Mulan
812
- Mummy Pig
813
- Mushu
814
- Mystique
815
- Nakia
816
- Nala
817
- Nami
818
- Namor
819
- Nancy
820
- Naru
821
- Naruto Uzumaki
822
- Nate Shelley
823
- Nathan
824
- Nathan Drake
825
- Nausicaä
826
- Nebula
827
- Nefertari Vivi
828
- Negasonic Teenage Warhead
829
- Nemo
830
- Neo
831
- Ness
832
- Nessa Jenkins
833
- Newt (Rebecca Jorden)
834
- Neytiri
835
- Nick Fury
836
- Nick Wilde
837
- Nigel
838
- Nightwing
839
- Nina Williams
840
- No-Face
841
- Nobita Nobi
842
- Noddy
843
- Norman
844
- Norman Babcock
845
- Norman Price
846
- Obelix
847
- Obi-Wan Kenobi
848
- Octoling
849
- Odie
850
- Odin
851
- Okoye
852
- Olaf
853
- Olive Oyl
854
- Opus
855
- Ori
856
- Oscar
857
- Oswald
858
- Ozzie
859
- Pac-Man
860
- Paddington Bear
861
- Padmé Amidala
862
- Palpatine
863
- Papa Smurf
864
- Pappa Smurf
865
- Papyrus
866
- Pascal
867
- Patrick Star
868
- Patsy
869
- Patsy Stone
870
- Patti Mayonnaise
871
- Pazu
872
- Pebbles Flintstone
873
- Peggy Carter
874
- Peggy Hill
875
- Penelope Pussycat
876
- Pepe Le Pew
877
- Peppa Pig
878
- Percy
879
- Perdita
880
- Peridot
881
- Perry
882
- Perry the Platypus
883
- Peter B. Parker
884
- Peter Griffin
885
- Peter Pan
886
- Peter Parker
887
- Peter Rabbit
888
- Phil DeVille
889
- Phineas Flynn
890
- Phoebe Heyerdahl
891
- Phoenix Wright
892
- Phoney Bone
893
- Piglet
894
- Pikachu
895
- Pillsbury Doughboy
896
- Pingu
897
- Pink Panther
898
- Pinocchio
899
- Plague Knight
900
- Play-Doh Pete
901
- Pluto
902
- Po
903
- Pocahontas
904
- Poe Dameron
905
- PomPom
906
- Pongo
907
- Pontius Pilate
908
- Ponyo
909
- Popeye
910
- Poppin' Fresh (Pillsbury Doughboy)
911
- Poppy Parnell
912
- Porkchop
913
- Porky Pig
914
- Postman Pat
915
- Potato Head
916
- Prince Charming
917
- Prince Eric
918
- Prince Phillip
919
- Prince Wednesday
920
- Princess Bubblegum
921
- Princess Leia
922
- Princess Leia Organa
923
- Princess Mononoke
924
- Princess Peach
925
- Princess Presto
926
- Professor Layton
927
- Professor X
928
- Proto Man
929
- Pumbaa
930
- Punisher
931
- Puss
932
- Puss in Boots
933
- Puss n Boots
934
- Pyramid Head
935
- Pyro
936
- Queen Kane
937
- Queen Xenomorph
938
- Qui-Gon Jinn
939
- Quick Draw McGraw
940
- Quicksilver
941
- R2-D2
942
- Rabbids
943
- Rafael
944
- Raiden
945
- Ralph
946
- Ralph Wolf
947
- Ralphie Parker
948
- Randall Boggs
949
- Rapunzel
950
- Rayman
951
- Rebecca Welton
952
- Red and Rover
953
- Red Skull
954
- Rei Ayanami
955
- Reinhardt
956
- Remy
957
- Ren Höek
958
- Resetti
959
- Rex
960
- Rey
961
- Reyna
962
- Rheneas
963
- Rhett Butler
964
- Richie Rich
965
- Rick
966
- Rick Mitchell
967
- Rick Sanchez
968
- Rico
969
- Ridley
970
- Rigby
971
- Rigsby
972
- RJ
973
- Road Runner
974
- Robin (Dick Grayson)
975
- Robo-Dog
976
- Rocket Raccoon
977
- Rocko
978
- Rocky Balboa
979
- Rodney Copperbottom
980
- Rodney Trotter
981
- Roger Klotz
982
- Roger Rabbit
983
- Roger the Shrubber
984
- Ronald McDonald
985
- Ronan the Accuser
986
- Roo
987
- Roronoa Zoro
988
- Rosalina
989
- Rose DeWitt Bukater
990
- Rosie
991
- Rosita
992
- Roxanne Ritchi
993
- Roy Kent
994
- Roy Mustang
995
- Rudolph
996
- Ruffnut
997
- Russell
998
- Ryder
999
- Ryu
1000
- Sabine Wren
1001
- Sadie Miller
1002
- Sage
1003
- Sailor Moon
1004
- Sakura Haruno
1005
- Salad Fingers
1006
- Salty
1007
- Sam
1008
- Sam Fisher
1009
- Sam Sheepdog
1010
- Sam Sparks
1011
- Samurai Jack
1012
- Samus Aran
1013
- San
1014
- Sandy Cheeks
1015
- Sandy Crood
1016
- Santa
1017
- Santa Claus
1018
- Santa Claus (Kurt Russell)
1019
- Sarah Connor
1020
- Sarge
1021
- Sasuke
1022
- Satsuki
1023
- Saw Gerrera
1024
- Scarlet Overkill
1025
- Scarlet Witch
1026
- Scarlett O'Hara
1027
- Schroeder
1028
- Scooby
1029
- Scooby Doo
1030
- Scooby-Doo
1031
- Scooby-Dum
1032
- Scott Calvin
1033
- Scrat
1034
- Scrooge
1035
- Scrooge McDuck
1036
- Scuttle
1037
- Sean Turner
1038
- Sebastian
1039
- Secret Squirrel
1040
- Seiji Amasawa
1041
- Senua
1042
- Sephiroth
1043
- Sesshomaru
1044
- Shadow the Hedgehog
1045
- Shaggy Rogers
1046
- Shang
1047
- Shang-Chi
1048
- Sharon Carter
1049
- Shaun
1050
- Shazam
1051
- Sheeta
1052
- Shere Khan
1053
- Sherman
1054
- Shifu
1055
- Shin-chan
1056
- Shinji Ikari
1057
- Shiny Pteranodon
1058
- Shizuka Minamoto
1059
- Shizuku Tsukishima
1060
- Shovel Knight
1061
- Shrek
1062
- Shuri
1063
- Sid
1064
- Siegfried
1065
- Silver Surfer
1066
- Simba
1067
- Sir Bedevere
1068
- Sir Galahad the Pure
1069
- Sir Lancelot the Brave
1070
- Sir Robin
1071
- Sir Topham Hatt
1072
- Skarloey
1073
- Skye
1074
- Skywalker
1075
- Sluggo
1076
- Smiley Bone
1077
- Smokey Bear
1078
- Smurfette
1079
- Snagglepuss
1080
- Snap, Crackle, and Pop
1081
- Sniffles
1082
- Sniper
1083
- Snoopy
1084
- Snotlout
1085
- Snow Miser
1086
- Snow White
1087
- Solaire of Astora
1088
- Solid Snake
1089
- Sonic the Hedgehog
1090
- Sonic's Knuckles
1091
- Sophie Hatter
1092
- Spam Waitress
1093
- Specter Knight
1094
- Speedy Gonzales
1095
- Spencer
1096
- Spider-Man
1097
- Sponge Bob
1098
- SpongeBob
1099
- SpongeBob Square Pants
1100
- SpongeBob SquarePants
1101
- SpongeBob's Grandma
1102
- Spuds MacKenzie
1103
- Spyro
1104
- Squall Leonhart
1105
- Squidward Tentacles
1106
- Srongbad
1107
- Star Lord
1108
- Star-Lord
1109
- Starfire
1110
- Stella
1111
- Steve
1112
- Steven Universe
1113
- Stewie Griffin
1114
- Stimpy
1115
- Stitch
1116
- Stoick the Vast
1117
- Stromboli
1118
- Strong Bad
1119
- Stuart Little
1120
- Stuart the Minion
1121
- Sulley
1122
- Sumo
1123
- Suneo Honekawa
1124
- Super Why
1125
- Supergirl
1126
- Superman
1127
- Surtur
1128
- Susan Murphy
1129
- Susan Murphy / Ginormica
1130
- Susan Walker
1131
- Susie Carmichael
1132
- Sven
1133
- Swiper
1134
- Sylvester
1135
- T-1000
1136
- T-800
1137
- T-800 (The Terminator)
1138
- T-Bone
1139
- Tai Lung
1140
- Takeshi Goda (Gian)
1141
- Talos
1142
- Tamacti Jun
1143
- Tank Girl
1144
- Tarzan
1145
- Tasmanian Devil
1146
- Taz
1147
- Team Rocket
1148
- Ted Lasso
1149
- Ted Wiggins
1150
- Teddy Ruxpin
1151
- Teemo
1152
- Terminator
1153
- Terry Bogard
1154
- Thanatos
1155
- Thanos
1156
- The "It's" Man
1157
- The Aflac Duck
1158
- The Android Robot
1159
- The Apple Logo Face
1160
- The Avengers
1161
- The Beast and Belle
1162
- The Brain
1163
- The Brawny Lumberjack
1164
- The Bride
1165
- The Burger King
1166
- The Butcher
1167
- The California Raisins
1168
- The Camel (Joe Camel)
1169
- The Caterpillar
1170
- The Charmin Bears
1171
- The Cheerios Bee
1172
- The Cheetah
1173
- The Cheshire Cat
1174
- The Collector
1175
- The Conductor
1176
- The Death Star
1177
- The Energizer Bunny
1178
- The Ewoks
1179
- The Flash
1180
- The French Taunter
1181
- The Froot Brute
1182
- The Geico Gecko
1183
- The Ghost of Christmas Past
1184
- The Ghost of Christmas Present
1185
- The Ghost of Christmas Yet to Come
1186
- The Goldfish Cracker
1187
- The Green Giant
1188
- The Grinch
1189
- The Gumbys
1190
- The Hamburglar
1191
- The Joker
1192
- The Jolly Green Giant
1193
- The Killer Rabbit of Caerbannog
1194
- The Knight
1195
- The Knights Who Say "Ni"
1196
- The Kool-Aid Man
1197
- The Laughing Cow
1198
- The Lego Minifigure
1199
- The Liberty Mutual Emu
1200
- The Little Green Sprout
1201
- The Little Prince
1202
- The M&M's Characters
1203
- The Mad Hatter
1204
- The Mandarin
1205
- The Michelin Man
1206
- The Missing Link
1207
- The Monarch
1208
- The Monopoly Man
1209
- The Morton Salt Girl
1210
- The Nesquik Bunny
1211
- The Noid
1212
- The Once-ler
1213
- The Planters Peanut
1214
- The Planters Peanut (Mr. Peanut)
1215
- The Red and Yellow M&M's
1216
- The Scrubbing Bubbles
1217
- The Starbucks Mermaid
1218
- The Sun (Raisin Bran)
1219
- The Taco Bell Chihuahua
1220
- The Travelocity Gnome
1221
- The Trix Rabbit
1222
- The Vault Hunters
1223
- The White Rabbit
1224
- Theo
1225
- Thomas
1226
- Thomas O'Malley
1227
- Thor
1228
- Thorn Harvestar
1229
- Thrall
1230
- Thunk Crood
1231
- Tiana
1232
- Tidus
1233
- Tifa Lockhart
1234
- Tigger
1235
- Tigress
1236
- Tim
1237
- Tim Lockwood
1238
- Tim the Enchanter
1239
- Timmy
1240
- Timmy Brisby
1241
- Timon
1242
- Tinker Bell
1243
- Tintin
1244
- Tiny Diamond
1245
- Tiny Pteranodon
1246
- Tiny Tim
1247
- Toad
1248
- Toby
1249
- Tom
1250
- Tom And Jerry
1251
- Tom Nook
1252
- Tommy Pickles
1253
- Tony
1254
- Tony the Tiger
1255
- Toothless
1256
- Top Cat
1257
- Totoro
1258
- Toucan Sam
1259
- Tracker
1260
- Trevor Phillips
1261
- Triss Merigold
1262
- Trix Rabbit
1263
- Tuffnut
1264
- Tweety Bird
1265
- Ultron
1266
- Uncle Ben
1267
- Ursula
1268
- Usagi Tsukino (Sailor Moon)
1269
- Usagi Yojimbo
1270
- Usopp
1271
- Valerie Brown
1272
- Valiente
1273
- Valkyrie
1274
- Vanellope
1275
- Vegeta
1276
- Velma Dinkley
1277
- Venom
1278
- Verne
1279
- Veronica Lodge
1280
- Victor Frankenstein
1281
- Victor Meldrew
1282
- Violet
1283
- Violet Parr
1284
- Viper
1285
- Vivo
1286
- WALL-E
1287
- Wallace
1288
- Walter Hobbs
1289
- Waluigi
1290
- Wario
1291
- Warren Cave
1292
- Wendy
1293
- Wheatley
1294
- Widowmaker
1295
- Wilbur
1296
- Wildcat
1297
- Wile E. Coyote
1298
- Will Hunting
1299
- Willie T. Stokes
1300
- Wilma
1301
- Wilma Flintstone
1302
- Wilson
1303
- Wimpy
1304
- Winnie the Pooh
1305
- Winry Rockbell
1306
- Winter Soldier
1307
- Witch Hazel
1308
- Wolverine
1309
- Wonder Red
1310
- Wonder Woman
1311
- Woody
1312
- Woody Woodpecker
1313
- Woofster
1314
- WordGirl
1315
- Wybie Lovat
1316
- X-23
1317
- X-Men
1318
- Yasuo
1319
- Yoda
1320
- Yogi Bear
1321
- Yon-Rogg
1322
- Yondu
1323
- Yosemite Sam
1324
- Yoshi
1325
- Yoshimitsu
1326
- Yubaba
1327
- Yukon Cornelius
1328
- Yummy Mummy
1329
- Yuna
1330
- Zagreus
1331
- Zatanna
1332
- Zazu
1333
- Zelda
1334
- Zelda’s Sheik
1335
- Zero
1336
- Zim
1337
- Zorak
1338
- Zuma
1339
- Zurg
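The exact-match blocklist above is plain data: one blocked name per line. As an illustration only (a hypothetical helper, not the actual Cosmos guardrail code), such a list might be consulted with case-insensitive whole-word matching against a prompt:

```python
import re

def find_blocked_terms(prompt, blocklist):
    """Return blocklist entries that occur in the prompt as whole words, case-insensitively."""
    hits = []
    lowered = prompt.lower()
    for term in blocklist:
        # \b anchors keep "Ken" from matching inside "kennel".
        pattern = r"\b" + re.escape(term.lower()) + r"\b"
        if re.search(pattern, lowered):
            hits.append(term)
    return hits

blocklist = ["Waluigi", "Donkey Kong", "Pikachu"]
hits = find_blocked_terms("a video of donkey kong dancing", blocklist)  # → ['Donkey Kong']
```

A real guardrail would likely also normalize punctuation and Unicode before matching; this sketch shows only the membership check.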
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/corpora/wordnet.zip DELETED
@@ -1,3 +0,0 @@
1
- version https://git-lfs.github.com/spec/v1
2
- oid sha256:cbda5ea6eef7f36a97a43d4a75f85e07fccbb4f23657d27b4ccbc93e2646ab59
3
- size 10775600
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab.zip DELETED
@@ -1,3 +0,0 @@
1
- version https://git-lfs.github.com/spec/v1
2
- oid sha256:c2b16c23d738effbdc5789d7aa601397c13ba2819bf922fb904687f3f16657ed
3
- size 4259017
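The two deleted files above are Git LFS pointer files rather than the binary archives themselves: a `version` line, an `oid sha256:<digest>` line, and a `size` in bytes. A short sketch of parsing such a pointer and verifying a downloaded blob against it (illustrative only — the blob below is a stand-in, not the real archive):

```python
import hashlib

def parse_lfs_pointer(text):
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def verify_blob(fields, blob_bytes):
    """Check a blob's size and sha256 digest against the pointer's fields."""
    algo, _, digest = fields["oid"].partition(":")
    if algo != "sha256":
        raise ValueError("unsupported oid algorithm: " + algo)
    return (len(blob_bytes) == int(fields["size"])
            and hashlib.sha256(blob_bytes).hexdigest() == digest)

# Stand-in blob; a real check would read e.g. wordnet.zip from disk.
blob = b"hello"
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:" + hashlib.sha256(blob).hexdigest() + "\n"
    "size " + str(len(blob)) + "\n"
)
fields = parse_lfs_pointer(pointer)
```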
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/README DELETED
@@ -1,98 +0,0 @@
1
- Pretrained Punkt Models -- Jan Strunk (New version trained after issues 313 and 514 had been corrected)
2
-
3
- Most models were prepared using the test corpora from Kiss and Strunk (2006). Additional models have
4
- been contributed by various people using NLTK for sentence boundary detection.
5
-
6
- For information about how to use these models, please confer the tokenization HOWTO:
7
- http://nltk.googlecode.com/svn/trunk/doc/howto/tokenize.html
8
- and chapter 3.8 of the NLTK book:
9
- http://nltk.googlecode.com/svn/trunk/doc/book/ch03.html#sec-segmentation
10
-
11
- There are pretrained tokenizers for the following languages:
12
-
13
- File Language Source Contents Size of training corpus(in tokens) Model contributed by
14
- =======================================================================================================================================================================
15
- czech.pickle Czech Multilingual Corpus 1 (ECI) Lidove Noviny ~345,000 Jan Strunk / Tibor Kiss
16
- Literarni Noviny
17
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
18
- danish.pickle Danish Avisdata CD-Rom Ver. 1.1. 1995 Berlingske Tidende ~550,000 Jan Strunk / Tibor Kiss
19
- (Berlingske Avisdata, Copenhagen) Weekend Avisen
20
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
21
- dutch.pickle Dutch Multilingual Corpus 1 (ECI) De Limburger ~340,000 Jan Strunk / Tibor Kiss
22
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
23
- english.pickle English Penn Treebank (LDC) Wall Street Journal ~469,000 Jan Strunk / Tibor Kiss
24
- (American)
25
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
26
- estonian.pickle Estonian University of Tartu, Estonia Eesti Ekspress ~359,000 Jan Strunk / Tibor Kiss
27
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
28
- finnish.pickle Finnish Finnish Parole Corpus, Finnish Books and major national ~364,000 Jan Strunk / Tibor Kiss
29
- Text Bank (Suomen Kielen newspapers
30
- Tekstipankki)
31
- Finnish Center for IT Science
32
- (CSC)
33
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
34
- french.pickle French Multilingual Corpus 1 (ECI) Le Monde ~370,000 Jan Strunk / Tibor Kiss
35
- (European)
36
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
37
- german.pickle German Neue Zürcher Zeitung AG Neue Zürcher Zeitung ~847,000 Jan Strunk / Tibor Kiss
38
- (Switzerland) CD-ROM
39
- (Uses "ss"
40
- instead of "ß")
41
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
42
- greek.pickle Greek Efstathios Stamatatos To Vima (TO BHMA) ~227,000 Jan Strunk / Tibor Kiss
43
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
44
- italian.pickle Italian Multilingual Corpus 1 (ECI) La Stampa, Il Mattino ~312,000 Jan Strunk / Tibor Kiss
45
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
46
- norwegian.pickle Norwegian Centre for Humanities Bergens Tidende ~479,000 Jan Strunk / Tibor Kiss
47
- (Bokmål and Information Technologies,
48
- Nynorsk) Bergen
49
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
50
- polish.pickle Polish Polish National Corpus Literature, newspapers, etc. ~1,000,000 Krzysztof Langner
- (http://www.nkjp.pl/)
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
- portuguese.pickle Portuguese CETENFolha Corpus Folha de São Paulo ~321,000 Jan Strunk / Tibor Kiss
- (Brazilian) (Linguateca)
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
- slovene.pickle Slovene TRACTOR Delo ~354,000 Jan Strunk / Tibor Kiss
- Slovene Academy for Arts
- and Sciences
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
- spanish.pickle Spanish Multilingual Corpus 1 (ECI) Sur ~353,000 Jan Strunk / Tibor Kiss
- (European)
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
- swedish.pickle Swedish Multilingual Corpus 1 (ECI) Dagens Nyheter ~339,000 Jan Strunk / Tibor Kiss
- (and some other texts)
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
- turkish.pickle Turkish METU Turkish Corpus Milliyet ~333,000 Jan Strunk / Tibor Kiss
- (Türkçe Derlem Projesi)
- University of Ankara
- -----------------------------------------------------------------------------------------------------------------------------------------------------------------------
-
- The corpora contained about 400,000 tokens on average and mostly consisted of newspaper text converted to
- Unicode using the codecs module.
-
- Kiss, Tibor and Strunk, Jan (2006): Unsupervised Multilingual Sentence Boundary Detection.
- Computational Linguistics 32: 485-525.
-
- ---- Training Code ----
-
- # import punkt
- import nltk.tokenize.punkt
-
- # Make a new Tokenizer
- tokenizer = nltk.tokenize.punkt.PunktSentenceTokenizer()
-
- # Read in training corpus (one example: Slovene)
- import codecs
- text = codecs.open("slovene.plain","Ur","iso-8859-2").read()
-
- # Train tokenizer
- tokenizer.train(text)
-
- # Dump pickled tokenizer
- import pickle
- out = open("slovene.pickle","wb")
- pickle.dump(tokenizer, out)
- out.close()
-
- ---------
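Note: the deleted README's train-and-pickle workflow uses the now-deprecated `codecs.open(..., "Ur", ...)` mode (modern Python would read with `open("slovene.plain", encoding="iso-8859-2")`). As a minimal sketch of the same dump/reload pattern that does not assume NLTK or a training corpus, a plain dict stands in here for the trained `PunktSentenceTokenizer` object:

```python
# Sketch of the README's pickle workflow with a stand-in "trained model";
# the dump/load calls and the binary file mode are the same as the original.
import os
import pickle
import tempfile

trained_model = {"abbrev_types": {"dr", "prof"}, "lang": "slovene"}  # stand-in object

path = os.path.join(tempfile.mkdtemp(), "slovene.pickle")
with open(path, "wb") as out:        # pickles must be written in binary mode
    pickle.dump(trained_model, out)

with open(path, "rb") as f:          # and read back in binary mode
    reloaded = pickle.load(f)

print(reloaded["lang"])  # prints "slovene"
```

The reloaded object compares equal to the one that was dumped, which is exactly what the shipped `.pickle` tokenizer files rely on.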
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/abbrev_types.txt DELETED
@@ -1,118 +0,0 @@
- t
- množ
- např
- j.h
- man
- ú
- jug
- dr
- bl
- ml
- okr
- st
- uh
- šp
- judr
- u.s.a
- p
- arg
- žitě
- st.celsia
- etc
- p.s
- t.r
- lok
- mil
- ict
- n
- tl
- min
- č
- d
- al
- ravenně
- mj
- nar
- plk
- s.p
- a.g
- roč
- b
- zdi
- r.s.c
- přek
- m
- gen
- csc
- mudr
- vic
- š
- sb
- resp
- tzn
- iv
- s.r.o
- mar
- w
- čs
- vi
- tzv
- ul
- pen
- zv
- str
- čp
- org
- rak
- sv
- pplk
- u.s
- prof
- c.k
- op
- g
- vii
- kr
- ing
- j.o
- drsc
- m3
- l
- tr
- ceo
- ch
- fuk
- vl
- viii
- líp
- hl.m
- t.zv
- phdr
- o.k
- tis
- doc
- kl
- ard
- čkd
- pok
- apod
- r
-
- a.s
- j
- jr
- i.m
- e
- kupř
- f
-
- xvi
- mir
- atď
- vr
- r.i.v
- hl
- kv
- t.j
- y
- q.p.r
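The `abbrev_types.txt` entries above tell the Punkt tokenizer which period-terminated tokens should not end a sentence. As a toy illustration of that idea (not the actual Punkt algorithm, which learns these lists statistically and uses far more context), a splitter consulting such a set might look like:

```python
# Toy sentence splitter: a period ends a sentence unless the preceding
# token, lowercased and stripped of its final period, is a known abbreviation.
# Sample entries taken from the Czech abbrev_types.txt list above.
ABBREVIATIONS = {"dr", "prof", "např", "tzv"}

def toy_split(text, abbrevs=ABBREVIATIONS):
    sentences, current = [], []
    for token in text.split():
        current.append(token)
        if token.endswith("."):
            # Split only if the token is not an abbreviation like "dr." or "tzv."
            if token[:-1].lower().rstrip(".") not in abbrevs:
                sentences.append(" ".join(current))
                current = []
    if current:
        sentences.append(" ".join(current))
    return sentences

print(toy_split("Dr. Novák přijel. Byl to tzv. expert."))
# → ['Dr. Novák přijel.', 'Byl to tzv. expert.']
```

Without the abbreviation set, "Dr." and "tzv." would each trigger a spurious sentence break.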
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/collocations.tab DELETED
@@ -1,96 +0,0 @@
- i dejmala
- ##number## prosince
- h steina
- ##number## listopadu
- a dvořák
- v klaus
- i čnhl
- ##number## wladyslawowo
- ##number## letech
- a jiráska
- a dubček
- ##number## štrasburk
- ##number## juniorské
- ##number## století
- ##number## kola
- ##number## pád
- ##number## května
- ##number## týdne
- v dlouhý
- k design
- ##number## červenec
- i ligy
- ##number## kolo
- z svěrák
- ##number## mája
- ##number## šimková
- a bělého
- a bradáč
- ##number## ročníku
- ##number## dubna
- a vivaldiho
- v mečiara
- c carrićre
- ##number## sjezd
- ##number## výroční
- ##number## kole
- ##number## narozenin
- k maleevová
- i čnfl
- ##number## pádě
- ##number## září
- ##number## výročí
- a dvořáka
- h g.
- ##number## ledna
- a dvorský
- h měsíc
- ##number## srpna
- ##number## tř.
- a mozarta
- ##number## sudetoněmeckých
- o sokolov
- k škrach
- v benda
- ##number## symfonie
- ##number## července
- x šalda
- c abrahama
- a tichý
- ##number## místo
- k bielecki
- v havel
- ##number## etapu
- a dubčeka
- i liga
- ##number## světový
- v klausem
- ##number## ženy
- ##number## létech
- ##number## minutě
- ##number## listopadem
- ##number## místě
- o vlček
- k peteraje
- i sponzor
- ##number## června
- ##number## min.
- ##number## oprávněnou
- ##number## květnu
- ##number## aktu
- ##number## květnem
- ##number## října
- i rynda
- ##number## února
- i snfl
- a mozart
- z košler
- a dvorskému
- v marhoul
- v mečiar
- ##number## ročník
- ##number## máje
- v havla
- k gott
- s bacha
- ##number## ad
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/ortho_context.tab DELETED
The diff for this file is too large to render. See raw diff
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/czech/sent_starters.txt DELETED
@@ -1,54 +0,0 @@
-
- milena
- tomáš
- oznámila
- podle
- my
- vyplývá
- hlavní
- jelikož
- musíme
- kdyby
- foto
- rozptylové
- snad
- zároveň
- jaroslav
- po
- v
- kromě
- pokud
- toto
- jenže
- oba
- jak
- zatímco
- ten
- myslím
- navíc
- dušan
- zdá
- dnes
- přesto
- tato
- ti
- bratislava
- ale
- když
- nicméně
- tento
- mirka
- přitom
- dokud
- jan
- bohužel
- ta
- díky
- prohlásil
- praha
- jestliže
- jde
- vždyť
- moskva
- proto
- to
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/danish/abbrev_types.txt DELETED
@@ -1,211 +0,0 @@
- t
- tlf
- b.p
- evt
- j.h
- lenz
- mht
- gl
- bl
- stud.polit
- e.j
- st
- o
- dec
- mag
- h.b
- p
- adm
- el.lign
- e.s
- saalba
- styrt
- nr
- m.a.s.h
- etc
- pharm
- hg
- j.j
- dj
- mountainb
- f.kr
- h.r
- cand.jur
- sp
- osv
- s.g
- ndr
- inc
- b.i.g
- dk-sver
- sl
- v.s.o.d
- cand.mag
- d.v.s
- v.i
- bøddel
- fr
- ø«
- dr.phil
- chr
- p.d
- bj
- fhv
- tilskudsforhold
- m.a
- sek
- p.g.a
- int
- pokalf
- ik
- dir
- em-lodtrækn
- a.h
- o.lign
- p.t
- m.v
- n.j
- m.h.t
- m.m
- a.p
- pers
- 4-bakketurn
- dr.med
- w.ø
- polit
- fremsættes
- techn
- tidl
- o.g
- i.c.i
- mill
- skt
- m.fl
- cand.merc
- kbh
- indiv
- stk
- dk-maked
- memorandum
- mestersk
- mag.art
- kitzb
- h
- lic
- fig
- dressurst
- sportsg
- r.e.m
- d.u.m
- sct
- kld
- bl.a
- hf
- g.a
- corp
- w
- konk
- zoeterm
- b.t
- a.d
- l.b
- jf
- s.b
- kgl
- ill
- beck
- tosset
- afd
- johs
- pct
- k.b
- sv
- verbalt
- kgs
- l.m.k
- j.l
- aus
- superl
- t.v
- mia
- kr
- pr
- præmien
- j.b.s
- j.o
- o.s.v
- edb-oplysninger
- o.m.a
- ca
- 1b
- f.eks
- rens
- ch
- mr
- schw
- d.c
- utraditionelt
- idrætsgym
- hhv
- e.l
- s.s
- eks
- f.o.m
- dk-storbrit
- dk-jugo
- n.z
- derivater
- c
- pt
- vm-kval
- kl
- hr
- cand
- jur
- sav
- h.c
- arab.-danm
- d.a.d
- fl
- o.a
- a.s
- cand.polit
- grundejerform
- j
- faglærte
- cr
- a.a
- mou
- f.r.i
- årh
- o.m.m
- sve
- c.a
- engl
- sikkerhedssystemerne
- m.f
- j.k
- phil
- f
- vet
- mio
- k.e
- m.k
- atla
- idrætsg
- n.n
- 4-bakketur
- dvs
- sdr
- s.j
- hol
- s.h
- pei
- kbhvn
- aa
- m.g.i
- fvt
-
- b.c
- th
- lrs
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/danish/collocations.tab DELETED
@@ -1,101 +0,0 @@
- ##number## skak
- ##number## speedway
- ##number## rally
- ##number## april
- ##number## dm-fin
- ##number## viceformand
- m jensen
- ##number## kano/kajak
- ##number## bowling
- ##number## dm-finale
- ##number## årh.
- ##number## januar
- ##number## august
- ##number## marathon
- ##number## kamp
- ##number## skihop
- ##number## etage
- ##number## tennis
- ##number## cykling
- e andersen
- ##number## december
- g h.
- ##number## neb
- ##number## sektion
- ##number## afd.
- ##number## klasse
- ##number## trampolin
- ##number## bordtennis
- ##number## formel
- ##number## århundredes
- ##number## dm-semifin
- ##number## heks
- ##number## taekwondo
- ##number## galop
- ##number## basketball
- ##number## dm
- m skræl
- ##number## trav
- ##number## provins
- ##number## triathlon
- k axel
- ##number## rugby
- s h.
- ##number## klaverkoncert
- a p.
- e løgstrup
- k telefax
- ##number## gyldendal
- ##number## fodbold
- e rosenfeldt
- ##number## oktober
- k o.
- ##number## september
- ##number## dec.
- ##number## juledag
- ##number## badminton
- ##number## sejlsport
- ##number## håndbold
- r førsund
- e jørgensen
- d ##number##
- k e
- ##number## alp.ski
- ##number## judo
- ##number## roning
- ##number## november
- ##number## atletik
- ##number## århundrede
- ##number## ridning
- ##number## marts
- m andersen
- d roosevelt
- ##number## brydning
- s kr.
- ##number## runde
- ##number## division
- ##number## sal
- ##number## boksning
- ##number## minut
- ##number## golf
- ##number## juni
- ##number## symfoni
- ##number## hurtigløb
- k jørgensen
- ##number## jörgen
- ##number## klasses
- e jacobsen
- k jensen
- ##number## februar
- k nielsen
- ##number## volleyball
- ##number## maj
- ##number## verdenskrig
- ##number## juli
- ##number## ishockey
- ##number## kunstskøjteløb
- b jørgensen
- ##number## gymnastik
- ##number## svømning
- ##number## tw
- i pedersens
 
Cosmos-1.0-Guardrail/blocklist/nltk_data/tokenizers/punkt_tab/danish/ortho_context.tab DELETED
The diff for this file is too large to render. See raw diff