Dolphin-Mistral-24B-Venice-Edition-Q3_K_M.gguf - GGUF Internal File Dump
- Endian: LITTLE endian
Key Value Metadata Store
There are 46 entries in this table (positions 1-3 are the fixed GGUF header fields; the remaining 43 key-value pairs match GGUF.kv_count)
POS | TYPE | Count | Key | Value |
---|---|---|---|---|
1 | UINT32 | 1 | GGUF.version | 3 |
2 | UINT64 | 1 | GGUF.tensor_count | 363 |
3 | UINT64 | 1 | GGUF.kv_count | 43 |
4 | STRING | 1 | general.architecture | llama |
5 | STRING | 1 | general.type | model |
6 | STRING | 1 | general.name | Dolphin Mistral 24B Venice Edition |
7 | STRING | 1 | general.finetune | Venice-Edition |
8 | STRING | 1 | general.basename | Dolphin-Mistral |
9 | STRING | 1 | general.size_label | 24B |
10 | STRING | 1 | general.license | apache-2.0 |
11 | UINT32 | 1 | general.base_model.count | 1 |
12 | STRING | 1 | general.base_model.0.name | Mistral Small 24B Instruct 2501 |
13 | STRING | 1 | general.base_model.0.version | 2501 |
14 | STRING | 1 | general.base_model.0.organization | Mistralai |
15 | STRING | 1 | general.base_model.0.repo_url | https://huggingface.co/mistral ...istral-Small-24B-Instruct-2501 |
16 | UINT32 | 1 | llama.block_count | 40 |
17 | UINT32 | 1 | llama.context_length | 32768 |
18 | UINT32 | 1 | llama.embedding_length | 5120 |
19 | UINT32 | 1 | llama.feed_forward_length | 32768 |
20 | UINT32 | 1 | llama.attention.head_count | 32 |
21 | UINT32 | 1 | llama.attention.head_count_kv | 8 |
22 | FLOAT32 | 1 | llama.rope.freq_base | 100000000.0 |
23 | FLOAT32 | 1 | llama.attention.layer_norm_rms_epsilon | 1e-05 |
24 | UINT32 | 1 | llama.attention.key_length | 128 |
25 | UINT32 | 1 | llama.attention.value_length | 128 |
26 | UINT32 | 1 | llama.vocab_size | 131072 |
27 | UINT32 | 1 | llama.rope.dimension_count | 128 |
28 | STRING | 1 | tokenizer.ggml.model | gpt2 |
29 | STRING | 1 | tokenizer.ggml.pre | tekken |
30 | [STRING] | 131072 | tokenizer.ggml.tokens | [ <unk> , <s> , </s> , [INST] , [/INST] , ... ] |
31 | [INT32] | 131072 | tokenizer.ggml.token_type | [ 3, 3, 3, 3, 3, 3, 3, ... ] |
32 | [STRING] | 269443 | tokenizer.ggml.merges | [ Ġ Ġ , Ġ t , e r , i n , Ġ ĠĠĠ , ... ] |
33 | UINT32 | 1 | tokenizer.ggml.bos_token_id | 1 |
34 | UINT32 | 1 | tokenizer.ggml.eos_token_id | 2 |
35 | UINT32 | 1 | tokenizer.ggml.unknown_token_id | 0 |
36 | UINT32 | 1 | tokenizer.ggml.padding_token_id | 11 |
37 | BOOL | 1 | tokenizer.ggml.add_bos_token | True |
38 | BOOL | 1 | tokenizer.ggml.add_eos_token | False |
39 | STRING | 1 | tokenizer.chat_template | {%- set today = strftime_now(" ... {%- endif %}{%- endfor %} |
40 | BOOL | 1 | tokenizer.ggml.add_space_prefix | False |
41 | UINT32 | 1 | general.quantization_version | 2 |
42 | UINT32 | 1 | general.file_type | 12 |
43 | STRING | 1 | quantize.imatrix.file | ./imatrix/imatrix-Dolphin-Mist ...l-24B-Venice-Edition-small.dat |
44 | STRING | 1 | quantize.imatrix.dataset | ../../datasets/imatrix/combined_eur_small.txt |
45 | INT32 | 1 | quantize.imatrix.entries_count | 281 |
46 | INT32 | 1 | quantize.imatrix.chunks_count | 3192 |
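The three header fields above (POS 1-3) sit at a fixed offset at the start of every GGUF file and can be read directly. A minimal sketch in Python, assuming a GGUF v3 little-endian file like this one; the function name and commented-out path are illustrative:

```python
import struct

def read_gguf_header(path):
    """Read the fixed GGUF header: magic, version, tensor_count, kv_count."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError("not a GGUF file")
        # All integers below are little endian, matching this dump.
        (version,) = struct.unpack("<I", f.read(4))       # GGUF.version       -> 3
        (tensor_count,) = struct.unpack("<Q", f.read(8))  # GGUF.tensor_count  -> 363
        (kv_count,) = struct.unpack("<Q", f.read(8))      # GGUF.kv_count      -> 43
    return version, tensor_count, kv_count

# version, tensors, kvs = read_gguf_header(
#     "Dolphin-Mistral-24B-Venice-Edition-Q3_K_M.gguf")
```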
Tensors Overview ~24B Elements
Total number of elements in all tensors: 23572403200
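The total decomposes exactly into the group totals reported below (one base group plus 40 identical transformer blocks); a quick arithmetic check, with all counts taken from this dump:

```python
base_group = 1_342_182_400  # output, output_norm, token_embd (see "Base Tensor Group")
per_block  = 555_755_520    # identical for each of the 40 transformer blocks
assert base_group + 40 * per_block == 23_572_403_200
```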
Tensor Data Offset
This table lists each tensor's data offset and size in bytes, relative to the start of the file
T_ID | Tensor Layer Name | Data Offset (B) | Data Size (B) |
---|---|---|---|
0 | output.weight | 0x784920 | 0x11300000 |
1 | output_norm.weight | 0x11a84920 | 0x5000 |
2 | token_embd.weight | 0x11a89920 | 0x11300000 |
3 | blk.0.attn_k.weight | 0x22d89920 | 0x1a4000 |
4 | blk.0.attn_norm.weight | 0x22f2d920 | 0x5000 |
5 | blk.0.attn_output.weight | 0x22f32920 | 0xb40000 |
6 | blk.0.attn_q.weight | 0x23a72920 | 0x690000 |
7 | blk.0.attn_v.weight | 0x24102920 | 0x226000 |
8 | blk.0.ffn_down.weight | 0x24328920 | 0x6e00000 |
9 | blk.0.ffn_gate.weight | 0x2b128920 | 0x3480000 |
10 | blk.0.ffn_norm.weight | 0x2e5a8920 | 0x5000 |
11 | blk.0.ffn_up.weight | 0x2e5ad920 | 0x3480000 |
12 | blk.1.attn_k.weight | 0x31a2d920 | 0x1a4000 |
13 | blk.1.attn_norm.weight | 0x31bd1920 | 0x5000 |
14 | blk.1.attn_output.weight | 0x31bd6920 | 0xb40000 |
15 | blk.1.attn_q.weight | 0x32716920 | 0x690000 |
16 | blk.1.attn_v.weight | 0x32da6920 | 0x226000 |
17 | blk.1.ffn_down.weight | 0x32fcc920 | 0x6e00000 |
18 | blk.1.ffn_gate.weight | 0x39dcc920 | 0x3480000 |
19 | blk.1.ffn_norm.weight | 0x3d24c920 | 0x5000 |
20 | blk.1.ffn_up.weight | 0x3d251920 | 0x3480000 |
21 | blk.2.attn_k.weight | 0x406d1920 | 0x1a4000 |
22 | blk.2.attn_norm.weight | 0x40875920 | 0x5000 |
23 | blk.2.attn_output.weight | 0x4087a920 | 0xb40000 |
24 | blk.2.attn_q.weight | 0x413ba920 | 0x690000 |
25 | blk.2.attn_v.weight | 0x41a4a920 | 0x226000 |
26 | blk.2.ffn_down.weight | 0x41c70920 | 0x5a00000 |
27 | blk.2.ffn_gate.weight | 0x47670920 | 0x3480000 |
28 | blk.2.ffn_norm.weight | 0x4aaf0920 | 0x5000 |
29 | blk.2.ffn_up.weight | 0x4aaf5920 | 0x3480000 |
30 | blk.3.attn_k.weight | 0x4df75920 | 0x1a4000 |
31 | blk.3.attn_norm.weight | 0x4e119920 | 0x5000 |
32 | blk.3.attn_output.weight | 0x4e11e920 | 0xb40000 |
33 | blk.3.attn_q.weight | 0x4ec5e920 | 0x690000 |
34 | blk.3.attn_v.weight | 0x4f2ee920 | 0x226000 |
35 | blk.3.ffn_down.weight | 0x4f514920 | 0x5a00000 |
36 | blk.3.ffn_gate.weight | 0x54f14920 | 0x3480000 |
37 | blk.3.ffn_norm.weight | 0x58394920 | 0x5000 |
38 | blk.3.ffn_up.weight | 0x58399920 | 0x3480000 |
39 | blk.4.attn_k.weight | 0x5b819920 | 0x1a4000 |
40 | blk.4.attn_norm.weight | 0x5b9bd920 | 0x5000 |
41 | blk.4.attn_output.weight | 0x5b9c2920 | 0xb40000 |
42 | blk.4.attn_q.weight | 0x5c502920 | 0x690000 |
43 | blk.4.attn_v.weight | 0x5cb92920 | 0x226000 |
44 | blk.4.ffn_down.weight | 0x5cdb8920 | 0x5a00000 |
45 | blk.4.ffn_gate.weight | 0x627b8920 | 0x3480000 |
46 | blk.4.ffn_norm.weight | 0x65c38920 | 0x5000 |
47 | blk.4.ffn_up.weight | 0x65c3d920 | 0x3480000 |
48 | blk.5.attn_k.weight | 0x690bd920 | 0x1a4000 |
49 | blk.5.attn_norm.weight | 0x69261920 | 0x5000 |
50 | blk.5.attn_output.weight | 0x69266920 | 0xb40000 |
51 | blk.5.attn_q.weight | 0x69da6920 | 0x690000 |
52 | blk.5.attn_v.weight | 0x6a436920 | 0x226000 |
53 | blk.5.ffn_down.weight | 0x6a65c920 | 0x5a00000 |
54 | blk.5.ffn_gate.weight | 0x7005c920 | 0x3480000 |
55 | blk.5.ffn_norm.weight | 0x734dc920 | 0x5000 |
56 | blk.5.ffn_up.weight | 0x734e1920 | 0x3480000 |
57 | blk.6.attn_k.weight | 0x76961920 | 0x1a4000 |
58 | blk.6.attn_norm.weight | 0x76b05920 | 0x5000 |
59 | blk.6.attn_output.weight | 0x76b0a920 | 0xb40000 |
60 | blk.6.attn_q.weight | 0x7764a920 | 0x690000 |
61 | blk.6.attn_v.weight | 0x77cda920 | 0x226000 |
62 | blk.6.ffn_down.weight | 0x77f00920 | 0x5a00000 |
63 | blk.6.ffn_gate.weight | 0x7d900920 | 0x3480000 |
64 | blk.6.ffn_norm.weight | 0x80d80920 | 0x5000 |
65 | blk.6.ffn_up.weight | 0x80d85920 | 0x3480000 |
66 | blk.7.attn_k.weight | 0x84205920 | 0x1a4000 |
67 | blk.7.attn_norm.weight | 0x843a9920 | 0x5000 |
68 | blk.7.attn_output.weight | 0x843ae920 | 0xb40000 |
69 | blk.7.attn_q.weight | 0x84eee920 | 0x690000 |
70 | blk.7.attn_v.weight | 0x8557e920 | 0x226000 |
71 | blk.7.ffn_down.weight | 0x857a4920 | 0x5a00000 |
72 | blk.7.ffn_gate.weight | 0x8b1a4920 | 0x3480000 |
73 | blk.7.ffn_norm.weight | 0x8e624920 | 0x5000 |
74 | blk.7.ffn_up.weight | 0x8e629920 | 0x3480000 |
75 | blk.8.attn_k.weight | 0x91aa9920 | 0x1a4000 |
76 | blk.8.attn_norm.weight | 0x91c4d920 | 0x5000 |
77 | blk.8.attn_output.weight | 0x91c52920 | 0xb40000 |
78 | blk.8.attn_q.weight | 0x92792920 | 0x690000 |
79 | blk.8.attn_v.weight | 0x92e22920 | 0x226000 |
80 | blk.8.ffn_down.weight | 0x93048920 | 0x5a00000 |
81 | blk.8.ffn_gate.weight | 0x98a48920 | 0x3480000 |
82 | blk.8.ffn_norm.weight | 0x9bec8920 | 0x5000 |
83 | blk.8.ffn_up.weight | 0x9becd920 | 0x3480000 |
84 | blk.9.attn_k.weight | 0x9f34d920 | 0x1a4000 |
85 | blk.9.attn_norm.weight | 0x9f4f1920 | 0x5000 |
86 | blk.9.attn_output.weight | 0x9f4f6920 | 0xb40000 |
87 | blk.9.attn_q.weight | 0xa0036920 | 0x690000 |
88 | blk.9.attn_v.weight | 0xa06c6920 | 0x226000 |
89 | blk.9.ffn_down.weight | 0xa08ec920 | 0x5a00000 |
90 | blk.9.ffn_gate.weight | 0xa62ec920 | 0x3480000 |
91 | blk.9.ffn_norm.weight | 0xa976c920 | 0x5000 |
92 | blk.9.ffn_up.weight | 0xa9771920 | 0x3480000 |
93 | blk.10.attn_k.weight | 0xacbf1920 | 0x1a4000 |
94 | blk.10.attn_norm.weight | 0xacd95920 | 0x5000 |
95 | blk.10.attn_output.weight | 0xacd9a920 | 0xb40000 |
96 | blk.10.attn_q.weight | 0xad8da920 | 0x690000 |
97 | blk.10.attn_v.weight | 0xadf6a920 | 0x226000 |
98 | blk.10.ffn_down.weight | 0xae190920 | 0x5a00000 |
99 | blk.10.ffn_gate.weight | 0xb3b90920 | 0x3480000 |
100 | blk.10.ffn_norm.weight | 0xb7010920 | 0x5000 |
101 | blk.10.ffn_up.weight | 0xb7015920 | 0x3480000 |
102 | blk.11.attn_k.weight | 0xba495920 | 0x1a4000 |
103 | blk.11.attn_norm.weight | 0xba639920 | 0x5000 |
104 | blk.11.attn_output.weight | 0xba63e920 | 0xb40000 |
105 | blk.11.attn_q.weight | 0xbb17e920 | 0x690000 |
106 | blk.11.attn_v.weight | 0xbb80e920 | 0x226000 |
107 | blk.11.ffn_down.weight | 0xbba34920 | 0x5a00000 |
108 | blk.11.ffn_gate.weight | 0xc1434920 | 0x3480000 |
109 | blk.11.ffn_norm.weight | 0xc48b4920 | 0x5000 |
110 | blk.11.ffn_up.weight | 0xc48b9920 | 0x3480000 |
111 | blk.12.attn_k.weight | 0xc7d39920 | 0x1a4000 |
112 | blk.12.attn_norm.weight | 0xc7edd920 | 0x5000 |
113 | blk.12.attn_output.weight | 0xc7ee2920 | 0xb40000 |
114 | blk.12.attn_q.weight | 0xc8a22920 | 0x690000 |
115 | blk.12.attn_v.weight | 0xc90b2920 | 0x226000 |
116 | blk.12.ffn_down.weight | 0xc92d8920 | 0x5a00000 |
117 | blk.12.ffn_gate.weight | 0xcecd8920 | 0x3480000 |
118 | blk.12.ffn_norm.weight | 0xd2158920 | 0x5000 |
119 | blk.12.ffn_up.weight | 0xd215d920 | 0x3480000 |
120 | blk.13.attn_k.weight | 0xd55dd920 | 0x1a4000 |
121 | blk.13.attn_norm.weight | 0xd5781920 | 0x5000 |
122 | blk.13.attn_output.weight | 0xd5786920 | 0xb40000 |
123 | blk.13.attn_q.weight | 0xd62c6920 | 0x690000 |
124 | blk.13.attn_v.weight | 0xd6956920 | 0x226000 |
125 | blk.13.ffn_down.weight | 0xd6b7c920 | 0x5a00000 |
126 | blk.13.ffn_gate.weight | 0xdc57c920 | 0x3480000 |
127 | blk.13.ffn_norm.weight | 0xdf9fc920 | 0x5000 |
128 | blk.13.ffn_up.weight | 0xdfa01920 | 0x3480000 |
129 | blk.14.attn_k.weight | 0xe2e81920 | 0x1a4000 |
130 | blk.14.attn_norm.weight | 0xe3025920 | 0x5000 |
131 | blk.14.attn_output.weight | 0xe302a920 | 0xb40000 |
132 | blk.14.attn_q.weight | 0xe3b6a920 | 0x690000 |
133 | blk.14.attn_v.weight | 0xe41fa920 | 0x226000 |
134 | blk.14.ffn_down.weight | 0xe4420920 | 0x5a00000 |
135 | blk.14.ffn_gate.weight | 0xe9e20920 | 0x3480000 |
136 | blk.14.ffn_norm.weight | 0xed2a0920 | 0x5000 |
137 | blk.14.ffn_up.weight | 0xed2a5920 | 0x3480000 |
138 | blk.15.attn_k.weight | 0xf0725920 | 0x1a4000 |
139 | blk.15.attn_norm.weight | 0xf08c9920 | 0x5000 |
140 | blk.15.attn_output.weight | 0xf08ce920 | 0xb40000 |
141 | blk.15.attn_q.weight | 0xf140e920 | 0x690000 |
142 | blk.15.attn_v.weight | 0xf1a9e920 | 0x226000 |
143 | blk.15.ffn_down.weight | 0xf1cc4920 | 0x5a00000 |
144 | blk.15.ffn_gate.weight | 0xf76c4920 | 0x3480000 |
145 | blk.15.ffn_norm.weight | 0xfab44920 | 0x5000 |
146 | blk.15.ffn_up.weight | 0xfab49920 | 0x3480000 |
147 | blk.16.attn_k.weight | 0xfdfc9920 | 0x1a4000 |
148 | blk.16.attn_norm.weight | 0xfe16d920 | 0x5000 |
149 | blk.16.attn_output.weight | 0xfe172920 | 0xb40000 |
150 | blk.16.attn_q.weight | 0xfecb2920 | 0x690000 |
151 | blk.16.attn_v.weight | 0xff342920 | 0x226000 |
152 | blk.16.ffn_down.weight | 0xff568920 | 0x5a00000 |
153 | blk.16.ffn_gate.weight | 0x104f68920 | 0x3480000 |
154 | blk.16.ffn_norm.weight | 0x1083e8920 | 0x5000 |
155 | blk.16.ffn_up.weight | 0x1083ed920 | 0x3480000 |
156 | blk.17.attn_k.weight | 0x10b86d920 | 0x226000 |
157 | blk.17.attn_norm.weight | 0x10ba93920 | 0x5000 |
158 | blk.17.attn_output.weight | 0x10ba98920 | 0xb40000 |
159 | blk.17.attn_q.weight | 0x10c5d8920 | 0x898000 |
160 | blk.17.attn_v.weight | 0x10ce70920 | 0x2d0000 |
161 | blk.17.ffn_down.weight | 0x10d140920 | 0x5a00000 |
162 | blk.17.ffn_gate.weight | 0x112b40920 | 0x3480000 |
163 | blk.17.ffn_norm.weight | 0x115fc0920 | 0x5000 |
164 | blk.17.ffn_up.weight | 0x115fc5920 | 0x3480000 |
165 | blk.18.attn_k.weight | 0x119445920 | 0x226000 |
166 | blk.18.attn_norm.weight | 0x11966b920 | 0x5000 |
167 | blk.18.attn_output.weight | 0x119670920 | 0xb40000 |
168 | blk.18.attn_q.weight | 0x11a1b0920 | 0x898000 |
169 | blk.18.attn_v.weight | 0x11aa48920 | 0x2d0000 |
170 | blk.18.ffn_down.weight | 0x11ad18920 | 0x5a00000 |
171 | blk.18.ffn_gate.weight | 0x120718920 | 0x3480000 |
172 | blk.18.ffn_norm.weight | 0x123b98920 | 0x5000 |
173 | blk.18.ffn_up.weight | 0x123b9d920 | 0x3480000 |
174 | blk.19.attn_k.weight | 0x12701d920 | 0x1a4000 |
175 | blk.19.attn_norm.weight | 0x1271c1920 | 0x5000 |
176 | blk.19.attn_output.weight | 0x1271c6920 | 0xb40000 |
177 | blk.19.attn_q.weight | 0x127d06920 | 0x690000 |
178 | blk.19.attn_v.weight | 0x128396920 | 0x226000 |
179 | blk.19.ffn_down.weight | 0x1285bc920 | 0x5a00000 |
180 | blk.19.ffn_gate.weight | 0x12dfbc920 | 0x3480000 |
181 | blk.19.ffn_norm.weight | 0x13143c920 | 0x5000 |
182 | blk.19.ffn_up.weight | 0x131441920 | 0x3480000 |
183 | blk.20.attn_k.weight | 0x1348c1920 | 0x226000 |
184 | blk.20.attn_norm.weight | 0x134ae7920 | 0x5000 |
185 | blk.20.attn_output.weight | 0x134aec920 | 0xb40000 |
186 | blk.20.attn_q.weight | 0x13562c920 | 0x898000 |
187 | blk.20.attn_v.weight | 0x135ec4920 | 0x2d0000 |
188 | blk.20.ffn_down.weight | 0x136194920 | 0x5a00000 |
189 | blk.20.ffn_gate.weight | 0x13bb94920 | 0x44c0000 |
190 | blk.20.ffn_norm.weight | 0x140054920 | 0x5000 |
191 | blk.20.ffn_up.weight | 0x140059920 | 0x44c0000 |
192 | blk.21.attn_k.weight | 0x144519920 | 0x1a4000 |
193 | blk.21.attn_norm.weight | 0x1446bd920 | 0x5000 |
194 | blk.21.attn_output.weight | 0x1446c2920 | 0xb40000 |
195 | blk.21.attn_q.weight | 0x145202920 | 0x690000 |
196 | blk.21.attn_v.weight | 0x145892920 | 0x226000 |
197 | blk.21.ffn_down.weight | 0x145ab8920 | 0x5a00000 |
198 | blk.21.ffn_gate.weight | 0x14b4b8920 | 0x44c0000 |
199 | blk.21.ffn_norm.weight | 0x14f978920 | 0x5000 |
200 | blk.21.ffn_up.weight | 0x14f97d920 | 0x44c0000 |
201 | blk.22.attn_k.weight | 0x153e3d920 | 0x226000 |
202 | blk.22.attn_norm.weight | 0x154063920 | 0x5000 |
203 | blk.22.attn_output.weight | 0x154068920 | 0xb40000 |
204 | blk.22.attn_q.weight | 0x154ba8920 | 0x898000 |
205 | blk.22.attn_v.weight | 0x155440920 | 0x2d0000 |
206 | blk.22.ffn_down.weight | 0x155710920 | 0x5a00000 |
207 | blk.22.ffn_gate.weight | 0x15b110920 | 0x44c0000 |
208 | blk.22.ffn_norm.weight | 0x15f5d0920 | 0x5000 |
209 | blk.22.ffn_up.weight | 0x15f5d5920 | 0x44c0000 |
210 | blk.23.attn_k.weight | 0x163a95920 | 0x226000 |
211 | blk.23.attn_norm.weight | 0x163cbb920 | 0x5000 |
212 | blk.23.attn_output.weight | 0x163cc0920 | 0xb40000 |
213 | blk.23.attn_q.weight | 0x164800920 | 0x898000 |
214 | blk.23.attn_v.weight | 0x165098920 | 0x2d0000 |
215 | blk.23.ffn_down.weight | 0x165368920 | 0x5a00000 |
216 | blk.23.ffn_gate.weight | 0x16ad68920 | 0x44c0000 |
217 | blk.23.ffn_norm.weight | 0x16f228920 | 0x5000 |
218 | blk.23.ffn_up.weight | 0x16f22d920 | 0x44c0000 |
219 | blk.24.attn_k.weight | 0x1736ed920 | 0x226000 |
220 | blk.24.attn_norm.weight | 0x173913920 | 0x5000 |
221 | blk.24.attn_output.weight | 0x173918920 | 0xb40000 |
222 | blk.24.attn_q.weight | 0x174458920 | 0x898000 |
223 | blk.24.attn_v.weight | 0x174cf0920 | 0x2d0000 |
224 | blk.24.ffn_down.weight | 0x174fc0920 | 0x5a00000 |
225 | blk.24.ffn_gate.weight | 0x17a9c0920 | 0x44c0000 |
226 | blk.24.ffn_norm.weight | 0x17ee80920 | 0x5000 |
227 | blk.24.ffn_up.weight | 0x17ee85920 | 0x44c0000 |
228 | blk.25.attn_k.weight | 0x183345920 | 0x226000 |
229 | blk.25.attn_norm.weight | 0x18356b920 | 0x5000 |
230 | blk.25.attn_output.weight | 0x183570920 | 0xb40000 |
231 | blk.25.attn_q.weight | 0x1840b0920 | 0x898000 |
232 | blk.25.attn_v.weight | 0x184948920 | 0x2d0000 |
233 | blk.25.ffn_down.weight | 0x184c18920 | 0x5a00000 |
234 | blk.25.ffn_gate.weight | 0x18a618920 | 0x44c0000 |
235 | blk.25.ffn_norm.weight | 0x18ead8920 | 0x5000 |
236 | blk.25.ffn_up.weight | 0x18eadd920 | 0x44c0000 |
237 | blk.26.attn_k.weight | 0x192f9d920 | 0x226000 |
238 | blk.26.attn_norm.weight | 0x1931c3920 | 0x5000 |
239 | blk.26.attn_output.weight | 0x1931c8920 | 0xb40000 |
240 | blk.26.attn_q.weight | 0x193d08920 | 0x898000 |
241 | blk.26.attn_v.weight | 0x1945a0920 | 0x2d0000 |
242 | blk.26.ffn_down.weight | 0x194870920 | 0x5a00000 |
243 | blk.26.ffn_gate.weight | 0x19a270920 | 0x44c0000 |
244 | blk.26.ffn_norm.weight | 0x19e730920 | 0x5000 |
245 | blk.26.ffn_up.weight | 0x19e735920 | 0x44c0000 |
246 | blk.27.attn_k.weight | 0x1a2bf5920 | 0x1a4000 |
247 | blk.27.attn_norm.weight | 0x1a2d99920 | 0x5000 |
248 | blk.27.attn_output.weight | 0x1a2d9e920 | 0xb40000 |
249 | blk.27.attn_q.weight | 0x1a38de920 | 0x690000 |
250 | blk.27.attn_v.weight | 0x1a3f6e920 | 0x226000 |
251 | blk.27.ffn_down.weight | 0x1a4194920 | 0x5a00000 |
252 | blk.27.ffn_gate.weight | 0x1a9b94920 | 0x44c0000 |
253 | blk.27.ffn_norm.weight | 0x1ae054920 | 0x5000 |
254 | blk.27.ffn_up.weight | 0x1ae059920 | 0x44c0000 |
255 | blk.28.attn_k.weight | 0x1b2519920 | 0x226000 |
256 | blk.28.attn_norm.weight | 0x1b273f920 | 0x5000 |
257 | blk.28.attn_output.weight | 0x1b2744920 | 0xb40000 |
258 | blk.28.attn_q.weight | 0x1b3284920 | 0x898000 |
259 | blk.28.attn_v.weight | 0x1b3b1c920 | 0x2d0000 |
260 | blk.28.ffn_down.weight | 0x1b3dec920 | 0x5a00000 |
261 | blk.28.ffn_gate.weight | 0x1b97ec920 | 0x44c0000 |
262 | blk.28.ffn_norm.weight | 0x1bdcac920 | 0x5000 |
263 | blk.28.ffn_up.weight | 0x1bdcb1920 | 0x44c0000 |
264 | blk.29.attn_k.weight | 0x1c2171920 | 0x226000 |
265 | blk.29.attn_norm.weight | 0x1c2397920 | 0x5000 |
266 | blk.29.attn_output.weight | 0x1c239c920 | 0xb40000 |
267 | blk.29.attn_q.weight | 0x1c2edc920 | 0x898000 |
268 | blk.29.attn_v.weight | 0x1c3774920 | 0x2d0000 |
269 | blk.29.ffn_down.weight | 0x1c3a44920 | 0x5a00000 |
270 | blk.29.ffn_gate.weight | 0x1c9444920 | 0x44c0000 |
271 | blk.29.ffn_norm.weight | 0x1cd904920 | 0x5000 |
272 | blk.29.ffn_up.weight | 0x1cd909920 | 0x44c0000 |
273 | blk.30.attn_k.weight | 0x1d1dc9920 | 0x226000 |
274 | blk.30.attn_norm.weight | 0x1d1fef920 | 0x5000 |
275 | blk.30.attn_output.weight | 0x1d1ff4920 | 0xb40000 |
276 | blk.30.attn_q.weight | 0x1d2b34920 | 0x898000 |
277 | blk.30.attn_v.weight | 0x1d33cc920 | 0x2d0000 |
278 | blk.30.ffn_down.weight | 0x1d369c920 | 0x5a00000 |
279 | blk.30.ffn_gate.weight | 0x1d909c920 | 0x44c0000 |
280 | blk.30.ffn_norm.weight | 0x1dd55c920 | 0x5000 |
281 | blk.30.ffn_up.weight | 0x1dd561920 | 0x44c0000 |
282 | blk.31.attn_k.weight | 0x1e1a21920 | 0x226000 |
283 | blk.31.attn_norm.weight | 0x1e1c47920 | 0x5000 |
284 | blk.31.attn_output.weight | 0x1e1c4c920 | 0xb40000 |
285 | blk.31.attn_q.weight | 0x1e278c920 | 0x898000 |
286 | blk.31.attn_v.weight | 0x1e3024920 | 0x2d0000 |
287 | blk.31.ffn_down.weight | 0x1e32f4920 | 0x5a00000 |
288 | blk.31.ffn_gate.weight | 0x1e8cf4920 | 0x44c0000 |
289 | blk.31.ffn_norm.weight | 0x1ed1b4920 | 0x5000 |
290 | blk.31.ffn_up.weight | 0x1ed1b9920 | 0x44c0000 |
291 | blk.32.attn_k.weight | 0x1f1679920 | 0x226000 |
292 | blk.32.attn_norm.weight | 0x1f189f920 | 0x5000 |
293 | blk.32.attn_output.weight | 0x1f18a4920 | 0xb40000 |
294 | blk.32.attn_q.weight | 0x1f23e4920 | 0x898000 |
295 | blk.32.attn_v.weight | 0x1f2c7c920 | 0x2d0000 |
296 | blk.32.ffn_down.weight | 0x1f2f4c920 | 0x5a00000 |
297 | blk.32.ffn_gate.weight | 0x1f894c920 | 0x44c0000 |
298 | blk.32.ffn_norm.weight | 0x1fce0c920 | 0x5000 |
299 | blk.32.ffn_up.weight | 0x1fce11920 | 0x44c0000 |
300 | blk.33.attn_k.weight | 0x2012d1920 | 0x226000 |
301 | blk.33.attn_norm.weight | 0x2014f7920 | 0x5000 |
302 | blk.33.attn_output.weight | 0x2014fc920 | 0xb40000 |
303 | blk.33.attn_q.weight | 0x20203c920 | 0x898000 |
304 | blk.33.attn_v.weight | 0x2028d4920 | 0x2d0000 |
305 | blk.33.ffn_down.weight | 0x202ba4920 | 0x5a00000 |
306 | blk.33.ffn_gate.weight | 0x2085a4920 | 0x44c0000 |
307 | blk.33.ffn_norm.weight | 0x20ca64920 | 0x5000 |
308 | blk.33.ffn_up.weight | 0x20ca69920 | 0x44c0000 |
309 | blk.34.attn_k.weight | 0x210f29920 | 0x226000 |
310 | blk.34.attn_norm.weight | 0x21114f920 | 0x5000 |
311 | blk.34.attn_output.weight | 0x211154920 | 0xb40000 |
312 | blk.34.attn_q.weight | 0x211c94920 | 0x898000 |
313 | blk.34.attn_v.weight | 0x21252c920 | 0x2d0000 |
314 | blk.34.ffn_down.weight | 0x2127fc920 | 0x5a00000 |
315 | blk.34.ffn_gate.weight | 0x2181fc920 | 0x44c0000 |
316 | blk.34.ffn_norm.weight | 0x21c6bc920 | 0x5000 |
317 | blk.34.ffn_up.weight | 0x21c6c1920 | 0x44c0000 |
318 | blk.35.attn_k.weight | 0x220b81920 | 0x226000 |
319 | blk.35.attn_norm.weight | 0x220da7920 | 0x5000 |
320 | blk.35.attn_output.weight | 0x220dac920 | 0xb40000 |
321 | blk.35.attn_q.weight | 0x2218ec920 | 0x898000 |
322 | blk.35.attn_v.weight | 0x222184920 | 0x2d0000 |
323 | blk.35.ffn_down.weight | 0x222454920 | 0x5a00000 |
324 | blk.35.ffn_gate.weight | 0x227e54920 | 0x44c0000 |
325 | blk.35.ffn_norm.weight | 0x22c314920 | 0x5000 |
326 | blk.35.ffn_up.weight | 0x22c319920 | 0x44c0000 |
327 | blk.36.attn_k.weight | 0x2307d9920 | 0x226000 |
328 | blk.36.attn_norm.weight | 0x2309ff920 | 0x5000 |
329 | blk.36.attn_output.weight | 0x230a04920 | 0xb40000 |
330 | blk.36.attn_q.weight | 0x231544920 | 0x898000 |
331 | blk.36.attn_v.weight | 0x231ddc920 | 0x2d0000 |
332 | blk.36.ffn_down.weight | 0x2320ac920 | 0x5a00000 |
333 | blk.36.ffn_gate.weight | 0x237aac920 | 0x44c0000 |
334 | blk.36.ffn_norm.weight | 0x23bf6c920 | 0x5000 |
335 | blk.36.ffn_up.weight | 0x23bf71920 | 0x44c0000 |
336 | blk.37.attn_k.weight | 0x240431920 | 0x226000 |
337 | blk.37.attn_norm.weight | 0x240657920 | 0x5000 |
338 | blk.37.attn_output.weight | 0x24065c920 | 0xb40000 |
339 | blk.37.attn_q.weight | 0x24119c920 | 0x898000 |
340 | blk.37.attn_v.weight | 0x241a34920 | 0x2d0000 |
341 | blk.37.ffn_down.weight | 0x241d04920 | 0x5a00000 |
342 | blk.37.ffn_gate.weight | 0x247704920 | 0x44c0000 |
343 | blk.37.ffn_norm.weight | 0x24bbc4920 | 0x5000 |
344 | blk.37.ffn_up.weight | 0x24bbc9920 | 0x44c0000 |
345 | blk.38.attn_k.weight | 0x250089920 | 0x226000 |
346 | blk.38.attn_norm.weight | 0x2502af920 | 0x5000 |
347 | blk.38.attn_output.weight | 0x2502b4920 | 0xb40000 |
348 | blk.38.attn_q.weight | 0x250df4920 | 0x898000 |
349 | blk.38.attn_v.weight | 0x25168c920 | 0x2d0000 |
350 | blk.38.ffn_down.weight | 0x25195c920 | 0x5a00000 |
351 | blk.38.ffn_gate.weight | 0x25735c920 | 0x44c0000 |
352 | blk.38.ffn_norm.weight | 0x25b81c920 | 0x5000 |
353 | blk.38.ffn_up.weight | 0x25b821920 | 0x44c0000 |
354 | blk.39.attn_k.weight | 0x25fce1920 | 0x226000 |
355 | blk.39.attn_norm.weight | 0x25ff07920 | 0x5000 |
356 | blk.39.attn_output.weight | 0x25ff0c920 | 0xb40000 |
357 | blk.39.attn_q.weight | 0x260a4c920 | 0x898000 |
358 | blk.39.attn_v.weight | 0x2612e4920 | 0x2d0000 |
359 | blk.39.ffn_down.weight | 0x2615b4920 | 0x5a00000 |
360 | blk.39.ffn_gate.weight | 0x266fb4920 | 0x44c0000 |
361 | blk.39.ffn_norm.weight | 0x26b474920 | 0x5000 |
362 | blk.39.ffn_up.weight | 0x26b479920 | 0x44c0000 |
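The Data Size column follows directly from each tensor's element count and quantization type. Assuming llama.cpp's k-quant layout, each quantized type packs 256 elements into a fixed-size super-block (Q2_K: 84 B, Q3_K: 110 B, Q4_K: 144 B, Q5_K: 176 B), while F32 tensors store 4 bytes per element. A sketch spot-checking a few rows above against the element counts in the group tables below; the super-block byte constants are an assumption based on the upstream ggml definitions:

```python
# Bytes per 256-element super-block for llama.cpp k-quant formats
# (assumed from the ggml block definitions; F32 is unquantized).
BLOCK_BYTES = {"Q2_K": 84, "Q3_K": 110, "Q4_K": 144, "Q5_K": 176}

def tensor_bytes(n_elements, qtype):
    if qtype == "F32":
        return 4 * n_elements
    assert n_elements % 256 == 0
    return (n_elements // 256) * BLOCK_BYTES[qtype]

# Spot-checks against the offset table:
assert tensor_bytes(671_088_640, "Q3_K") == 0x11300000  # output.weight
assert tensor_bytes(5_242_880, "Q2_K")   == 0x1a4000    # blk.0.attn_k.weight
assert tensor_bytes(20_971_520, "Q4_K")  == 0xb40000    # blk.0.attn_output.weight
assert tensor_bytes(167_772_160, "Q5_K") == 0x6e00000   # blk.0.ffn_down.weight
assert tensor_bytes(5_120, "F32")        == 0x5000      # blk.0.attn_norm.weight
```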
Base Tensor Group : ~1B Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
0 | output.weight | Output (W) | (~671M) 671088640 | 5120 x 131072 x 1 x 1 | Q3_K |
1 | output_norm.weight | Output Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
2 | token_embd.weight | Token Embedding (W) | (~671M) 671088640 | 5120 x 131072 x 1 x 1 | Q3_K |
- Total elements in base: ( ~1B) 1342182400
- Percentage of total elements: 5.69%
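The base-group total follows from the shapes above: output.weight and token_embd.weight are both 5120 x 131072 projections between the embedding width (llama.embedding_length = 5120) and the vocabulary (llama.vocab_size = 131072), plus the 5120-element output norm:

```python
embed, vocab = 5_120, 131_072
assert 2 * (embed * vocab) + embed == 1_342_182_400
```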
Block 0 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
3 | blk.0.attn_k.weight | Block 0 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
4 | blk.0.attn_norm.weight | Block 0 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
5 | blk.0.attn_output.weight | Block 0 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
6 | blk.0.attn_q.weight | Block 0 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
7 | blk.0.attn_v.weight | Block 0 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
8 | blk.0.ffn_down.weight | Block 0 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K |
9 | blk.0.ffn_gate.weight | Block 0 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
10 | blk.0.ffn_norm.weight | Block 0 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
11 | blk.0.ffn_up.weight | Block 0 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.0: (~556M) 555755520
- Percentage of total elements: 2.36%
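The per-block shapes line up with the attention metadata in the KV store: 32 query heads and 8 KV heads at head dim 128 (llama.attention.key_length) give a 5120 -> 4096 Q projection and 5120 -> 1024 K/V projections, and the FFN width is 32768 (llama.feed_forward_length). A short sanity check, reusing those counts; the same arithmetic holds for every block group that follows:

```python
embed, ffn = 5_120, 32_768
q_dim  = 32 * 128   # head_count * key_length      = 4096
kv_dim = 8 * 128    # head_count_kv * key_length   = 1024

block = (2 * embed * kv_dim   # attn_k, attn_v
         + 2 * embed * q_dim  # attn_q, attn_output
         + 3 * embed * ffn    # ffn_gate, ffn_up, ffn_down
         + 2 * embed)         # attn_norm, ffn_norm
assert block == 555_755_520
```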
Block 1 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
12 | blk.1.attn_k.weight | Block 1 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
13 | blk.1.attn_norm.weight | Block 1 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
14 | blk.1.attn_output.weight | Block 1 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
15 | blk.1.attn_q.weight | Block 1 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
16 | blk.1.attn_v.weight | Block 1 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
17 | blk.1.ffn_down.weight | Block 1 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q5_K |
18 | blk.1.ffn_gate.weight | Block 1 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
19 | blk.1.ffn_norm.weight | Block 1 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
20 | blk.1.ffn_up.weight | Block 1 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.1: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 2 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
21 | blk.2.attn_k.weight | Block 2 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
22 | blk.2.attn_norm.weight | Block 2 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
23 | blk.2.attn_output.weight | Block 2 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
24 | blk.2.attn_q.weight | Block 2 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
25 | blk.2.attn_v.weight | Block 2 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
26 | blk.2.ffn_down.weight | Block 2 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
27 | blk.2.ffn_gate.weight | Block 2 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
28 | blk.2.ffn_norm.weight | Block 2 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
29 | blk.2.ffn_up.weight | Block 2 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.2: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 3 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
30 | blk.3.attn_k.weight | Block 3 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
31 | blk.3.attn_norm.weight | Block 3 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
32 | blk.3.attn_output.weight | Block 3 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
33 | blk.3.attn_q.weight | Block 3 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
34 | blk.3.attn_v.weight | Block 3 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
35 | blk.3.ffn_down.weight | Block 3 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
36 | blk.3.ffn_gate.weight | Block 3 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
37 | blk.3.ffn_norm.weight | Block 3 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
38 | blk.3.ffn_up.weight | Block 3 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.3: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 4 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
39 | blk.4.attn_k.weight | Block 4 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
40 | blk.4.attn_norm.weight | Block 4 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
41 | blk.4.attn_output.weight | Block 4 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
42 | blk.4.attn_q.weight | Block 4 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
43 | blk.4.attn_v.weight | Block 4 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
44 | blk.4.ffn_down.weight | Block 4 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
45 | blk.4.ffn_gate.weight | Block 4 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
46 | blk.4.ffn_norm.weight | Block 4 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
47 | blk.4.ffn_up.weight | Block 4 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.4: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 5 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
48 | blk.5.attn_k.weight | Block 5 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
49 | blk.5.attn_norm.weight | Block 5 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
50 | blk.5.attn_output.weight | Block 5 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
51 | blk.5.attn_q.weight | Block 5 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
52 | blk.5.attn_v.weight | Block 5 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
53 | blk.5.ffn_down.weight | Block 5 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
54 | blk.5.ffn_gate.weight | Block 5 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
55 | blk.5.ffn_norm.weight | Block 5 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
56 | blk.5.ffn_up.weight | Block 5 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.5: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 6 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
57 | blk.6.attn_k.weight | Block 6 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
58 | blk.6.attn_norm.weight | Block 6 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
59 | blk.6.attn_output.weight | Block 6 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
60 | blk.6.attn_q.weight | Block 6 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
61 | blk.6.attn_v.weight | Block 6 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
62 | blk.6.ffn_down.weight | Block 6 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
63 | blk.6.ffn_gate.weight | Block 6 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
64 | blk.6.ffn_norm.weight | Block 6 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
65 | blk.6.ffn_up.weight | Block 6 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.6: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 7 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
66 | blk.7.attn_k.weight | Block 7 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
67 | blk.7.attn_norm.weight | Block 7 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
68 | blk.7.attn_output.weight | Block 7 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
69 | blk.7.attn_q.weight | Block 7 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
70 | blk.7.attn_v.weight | Block 7 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
71 | blk.7.ffn_down.weight | Block 7 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
72 | blk.7.ffn_gate.weight | Block 7 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
73 | blk.7.ffn_norm.weight | Block 7 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
74 | blk.7.ffn_up.weight | Block 7 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.7: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 8 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
75 | blk.8.attn_k.weight | Block 8 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
76 | blk.8.attn_norm.weight | Block 8 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
77 | blk.8.attn_output.weight | Block 8 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
78 | blk.8.attn_q.weight | Block 8 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
79 | blk.8.attn_v.weight | Block 8 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
80 | blk.8.ffn_down.weight | Block 8 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
81 | blk.8.ffn_gate.weight | Block 8 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
82 | blk.8.ffn_norm.weight | Block 8 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
83 | blk.8.ffn_up.weight | Block 8 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.8: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 9 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
84 | blk.9.attn_k.weight | Block 9 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
85 | blk.9.attn_norm.weight | Block 9 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
86 | blk.9.attn_output.weight | Block 9 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
87 | blk.9.attn_q.weight | Block 9 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
88 | blk.9.attn_v.weight | Block 9 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
89 | blk.9.ffn_down.weight | Block 9 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
90 | blk.9.ffn_gate.weight | Block 9 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
91 | blk.9.ffn_norm.weight | Block 9 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
92 | blk.9.ffn_up.weight | Block 9 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.9: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 10 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
93 | blk.10.attn_k.weight | Block 10 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
94 | blk.10.attn_norm.weight | Block 10 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
95 | blk.10.attn_output.weight | Block 10 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
96 | blk.10.attn_q.weight | Block 10 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
97 | blk.10.attn_v.weight | Block 10 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
98 | blk.10.ffn_down.weight | Block 10 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
99 | blk.10.ffn_gate.weight | Block 10 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
100 | blk.10.ffn_norm.weight | Block 10 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
101 | blk.10.ffn_up.weight | Block 10 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.10: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 11 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
102 | blk.11.attn_k.weight | Block 11 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
103 | blk.11.attn_norm.weight | Block 11 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
104 | blk.11.attn_output.weight | Block 11 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
105 | blk.11.attn_q.weight | Block 11 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
106 | blk.11.attn_v.weight | Block 11 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
107 | blk.11.ffn_down.weight | Block 11 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
108 | blk.11.ffn_gate.weight | Block 11 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
109 | blk.11.ffn_norm.weight | Block 11 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
110 | blk.11.ffn_up.weight | Block 11 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.11: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 12 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
111 | blk.12.attn_k.weight | Block 12 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
112 | blk.12.attn_norm.weight | Block 12 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
113 | blk.12.attn_output.weight | Block 12 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
114 | blk.12.attn_q.weight | Block 12 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
115 | blk.12.attn_v.weight | Block 12 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
116 | blk.12.ffn_down.weight | Block 12 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
117 | blk.12.ffn_gate.weight | Block 12 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
118 | blk.12.ffn_norm.weight | Block 12 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
119 | blk.12.ffn_up.weight | Block 12 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.12: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 13 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
120 | blk.13.attn_k.weight | Block 13 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
121 | blk.13.attn_norm.weight | Block 13 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
122 | blk.13.attn_output.weight | Block 13 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
123 | blk.13.attn_q.weight | Block 13 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
124 | blk.13.attn_v.weight | Block 13 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
125 | blk.13.ffn_down.weight | Block 13 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
126 | blk.13.ffn_gate.weight | Block 13 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
127 | blk.13.ffn_norm.weight | Block 13 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
128 | blk.13.ffn_up.weight | Block 13 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.13: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 14 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
129 | blk.14.attn_k.weight | Block 14 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
130 | blk.14.attn_norm.weight | Block 14 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
131 | blk.14.attn_output.weight | Block 14 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
132 | blk.14.attn_q.weight | Block 14 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
133 | blk.14.attn_v.weight | Block 14 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
134 | blk.14.ffn_down.weight | Block 14 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
135 | blk.14.ffn_gate.weight | Block 14 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
136 | blk.14.ffn_norm.weight | Block 14 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
137 | blk.14.ffn_up.weight | Block 14 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.14: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 15 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
138 | blk.15.attn_k.weight | Block 15 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
139 | blk.15.attn_norm.weight | Block 15 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
140 | blk.15.attn_output.weight | Block 15 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
141 | blk.15.attn_q.weight | Block 15 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
142 | blk.15.attn_v.weight | Block 15 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
143 | blk.15.ffn_down.weight | Block 15 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
144 | blk.15.ffn_gate.weight | Block 15 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
145 | blk.15.ffn_norm.weight | Block 15 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
146 | blk.15.ffn_up.weight | Block 15 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.15: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 16 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
147 | blk.16.attn_k.weight | Block 16 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
148 | blk.16.attn_norm.weight | Block 16 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
149 | blk.16.attn_output.weight | Block 16 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
150 | blk.16.attn_q.weight | Block 16 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
151 | blk.16.attn_v.weight | Block 16 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
152 | blk.16.ffn_down.weight | Block 16 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
153 | blk.16.ffn_gate.weight | Block 16 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
154 | blk.16.ffn_norm.weight | Block 16 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
155 | blk.16.ffn_up.weight | Block 16 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.16: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 17 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
156 | blk.17.attn_k.weight | Block 17 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
157 | blk.17.attn_norm.weight | Block 17 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
158 | blk.17.attn_output.weight | Block 17 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
159 | blk.17.attn_q.weight | Block 17 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
160 | blk.17.attn_v.weight | Block 17 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
161 | blk.17.ffn_down.weight | Block 17 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
162 | blk.17.ffn_gate.weight | Block 17 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
163 | blk.17.ffn_norm.weight | Block 17 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
164 | blk.17.ffn_up.weight | Block 17 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.17: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 18 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
165 | blk.18.attn_k.weight | Block 18 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
166 | blk.18.attn_norm.weight | Block 18 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
167 | blk.18.attn_output.weight | Block 18 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
168 | blk.18.attn_q.weight | Block 18 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
169 | blk.18.attn_v.weight | Block 18 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
170 | blk.18.ffn_down.weight | Block 18 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
171 | blk.18.ffn_gate.weight | Block 18 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
172 | blk.18.ffn_norm.weight | Block 18 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
173 | blk.18.ffn_up.weight | Block 18 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.18: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 19 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
174 | blk.19.attn_k.weight | Block 19 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
175 | blk.19.attn_norm.weight | Block 19 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
176 | blk.19.attn_output.weight | Block 19 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
177 | blk.19.attn_q.weight | Block 19 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
178 | blk.19.attn_v.weight | Block 19 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
179 | blk.19.ffn_down.weight | Block 19 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
180 | blk.19.ffn_gate.weight | Block 19 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
181 | blk.19.ffn_norm.weight | Block 19 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
182 | blk.19.ffn_up.weight | Block 19 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q2_K |
- Total elements in blk.19: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 20 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
183 | blk.20.attn_k.weight | Block 20 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
184 | blk.20.attn_norm.weight | Block 20 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
185 | blk.20.attn_output.weight | Block 20 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
186 | blk.20.attn_q.weight | Block 20 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
187 | blk.20.attn_v.weight | Block 20 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
188 | blk.20.ffn_down.weight | Block 20 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
189 | blk.20.ffn_gate.weight | Block 20 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
190 | blk.20.ffn_norm.weight | Block 20 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
191 | blk.20.ffn_up.weight | Block 20 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.20: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 21 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
192 | blk.21.attn_k.weight | Block 21 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
193 | blk.21.attn_norm.weight | Block 21 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
194 | blk.21.attn_output.weight | Block 21 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
195 | blk.21.attn_q.weight | Block 21 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
196 | blk.21.attn_v.weight | Block 21 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
197 | blk.21.ffn_down.weight | Block 21 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
198 | blk.21.ffn_gate.weight | Block 21 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
199 | blk.21.ffn_norm.weight | Block 21 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
200 | blk.21.ffn_up.weight | Block 21 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.21: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 22 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
201 | blk.22.attn_k.weight | Block 22 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
202 | blk.22.attn_norm.weight | Block 22 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
203 | blk.22.attn_output.weight | Block 22 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
204 | blk.22.attn_q.weight | Block 22 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
205 | blk.22.attn_v.weight | Block 22 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
206 | blk.22.ffn_down.weight | Block 22 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
207 | blk.22.ffn_gate.weight | Block 22 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
208 | blk.22.ffn_norm.weight | Block 22 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
209 | blk.22.ffn_up.weight | Block 22 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.22: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 23 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
210 | blk.23.attn_k.weight | Block 23 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
211 | blk.23.attn_norm.weight | Block 23 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
212 | blk.23.attn_output.weight | Block 23 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
213 | blk.23.attn_q.weight | Block 23 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
214 | blk.23.attn_v.weight | Block 23 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
215 | blk.23.ffn_down.weight | Block 23 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
216 | blk.23.ffn_gate.weight | Block 23 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
217 | blk.23.ffn_norm.weight | Block 23 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
218 | blk.23.ffn_up.weight | Block 23 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.23: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 24 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
219 | blk.24.attn_k.weight | Block 24 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
220 | blk.24.attn_norm.weight | Block 24 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
221 | blk.24.attn_output.weight | Block 24 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
222 | blk.24.attn_q.weight | Block 24 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
223 | blk.24.attn_v.weight | Block 24 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
224 | blk.24.ffn_down.weight | Block 24 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
225 | blk.24.ffn_gate.weight | Block 24 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
226 | blk.24.ffn_norm.weight | Block 24 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
227 | blk.24.ffn_up.weight | Block 24 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.24: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 25 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
228 | blk.25.attn_k.weight | Block 25 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
229 | blk.25.attn_norm.weight | Block 25 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
230 | blk.25.attn_output.weight | Block 25 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
231 | blk.25.attn_q.weight | Block 25 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
232 | blk.25.attn_v.weight | Block 25 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
233 | blk.25.ffn_down.weight | Block 25 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
234 | blk.25.ffn_gate.weight | Block 25 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
235 | blk.25.ffn_norm.weight | Block 25 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
236 | blk.25.ffn_up.weight | Block 25 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.25: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 26 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
237 | blk.26.attn_k.weight | Block 26 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
238 | blk.26.attn_norm.weight | Block 26 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
239 | blk.26.attn_output.weight | Block 26 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
240 | blk.26.attn_q.weight | Block 26 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
241 | blk.26.attn_v.weight | Block 26 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
242 | blk.26.ffn_down.weight | Block 26 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
243 | blk.26.ffn_gate.weight | Block 26 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
244 | blk.26.ffn_norm.weight | Block 26 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
245 | blk.26.ffn_up.weight | Block 26 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.26: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 27 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
246 | blk.27.attn_k.weight | Block 27 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q2_K |
247 | blk.27.attn_norm.weight | Block 27 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
248 | blk.27.attn_output.weight | Block 27 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
249 | blk.27.attn_q.weight | Block 27 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q2_K |
250 | blk.27.attn_v.weight | Block 27 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
251 | blk.27.ffn_down.weight | Block 27 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
252 | blk.27.ffn_gate.weight | Block 27 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
253 | blk.27.ffn_norm.weight | Block 27 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
254 | blk.27.ffn_up.weight | Block 27 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.27: (~556M) 555755520
- Percentage of total elements: 2.36%
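blk.27 is the only group in this stretch where the attention tensors drop a quantization tier (Q2_K for attn_k/attn_q and Q3_K for attn_v, versus Q3_K/Q4_K elsewhere); K-quant mixes such as Q3_K_M assign types per tensor, often guided by the imatrix. A hedged sketch for spotting such per-layer mixes, assuming llama.cpp's gguf-py package (`pip install gguf`) and its GGUFReader attribute names:

```python
from gguf import GGUFReader  # llama.cpp's gguf-py

reader = GGUFReader("Dolphin-Mistral-24B-Venice-Edition-Q3_K_M.gguf")

# Print the quantization type of every attention weight so per-layer
# downgrades (like blk.27's Q2_K attn_k/attn_q) stand out at a glance.
for t in reader.tensors:
    if ".attn_" in t.name and t.name.endswith(".weight"):
        print(f"{t.name:34s} {t.tensor_type.name}")
```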
Block 28 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
255 | blk.28.attn_k.weight | Block 28 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
256 | blk.28.attn_norm.weight | Block 28 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
257 | blk.28.attn_output.weight | Block 28 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
258 | blk.28.attn_q.weight | Block 28 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
259 | blk.28.attn_v.weight | Block 28 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
260 | blk.28.ffn_down.weight | Block 28 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
261 | blk.28.ffn_gate.weight | Block 28 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
262 | blk.28.ffn_norm.weight | Block 28 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
263 | blk.28.ffn_up.weight | Block 28 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.28: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 29 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
264 | blk.29.attn_k.weight | Block 29 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
265 | blk.29.attn_norm.weight | Block 29 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
266 | blk.29.attn_output.weight | Block 29 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
267 | blk.29.attn_q.weight | Block 29 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
268 | blk.29.attn_v.weight | Block 29 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
269 | blk.29.ffn_down.weight | Block 29 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
270 | blk.29.ffn_gate.weight | Block 29 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
271 | blk.29.ffn_norm.weight | Block 29 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
272 | blk.29.ffn_up.weight | Block 29 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.29: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 30 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
273 | blk.30.attn_k.weight | Block 30 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
274 | blk.30.attn_norm.weight | Block 30 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
275 | blk.30.attn_output.weight | Block 30 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
276 | blk.30.attn_q.weight | Block 30 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
277 | blk.30.attn_v.weight | Block 30 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
278 | blk.30.ffn_down.weight | Block 30 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
279 | blk.30.ffn_gate.weight | Block 30 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
280 | blk.30.ffn_norm.weight | Block 30 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
281 | blk.30.ffn_up.weight | Block 30 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.30: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 31 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
282 | blk.31.attn_k.weight | Block 31 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
283 | blk.31.attn_norm.weight | Block 31 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
284 | blk.31.attn_output.weight | Block 31 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
285 | blk.31.attn_q.weight | Block 31 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
286 | blk.31.attn_v.weight | Block 31 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
287 | blk.31.ffn_down.weight | Block 31 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
288 | blk.31.ffn_gate.weight | Block 31 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
289 | blk.31.ffn_norm.weight | Block 31 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
290 | blk.31.ffn_up.weight | Block 31 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.31: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 32 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
291 | blk.32.attn_k.weight | Block 32 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
292 | blk.32.attn_norm.weight | Block 32 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
293 | blk.32.attn_output.weight | Block 32 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
294 | blk.32.attn_q.weight | Block 32 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
295 | blk.32.attn_v.weight | Block 32 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
296 | blk.32.ffn_down.weight | Block 32 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
297 | blk.32.ffn_gate.weight | Block 32 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
298 | blk.32.ffn_norm.weight | Block 32 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
299 | blk.32.ffn_up.weight | Block 32 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.32: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 33 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
300 | blk.33.attn_k.weight | Block 33 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
301 | blk.33.attn_norm.weight | Block 33 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
302 | blk.33.attn_output.weight | Block 33 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
303 | blk.33.attn_q.weight | Block 33 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
304 | blk.33.attn_v.weight | Block 33 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
305 | blk.33.ffn_down.weight | Block 33 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
306 | blk.33.ffn_gate.weight | Block 33 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
307 | blk.33.ffn_norm.weight | Block 33 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
308 | blk.33.ffn_up.weight | Block 33 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.33: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 34 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
309 | blk.34.attn_k.weight | Block 34 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
310 | blk.34.attn_norm.weight | Block 34 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
311 | blk.34.attn_output.weight | Block 34 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
312 | blk.34.attn_q.weight | Block 34 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
313 | blk.34.attn_v.weight | Block 34 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
314 | blk.34.ffn_down.weight | Block 34 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
315 | blk.34.ffn_gate.weight | Block 34 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
316 | blk.34.ffn_norm.weight | Block 34 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
317 | blk.34.ffn_up.weight | Block 34 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.34: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 35 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
318 | blk.35.attn_k.weight | Block 35 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
319 | blk.35.attn_norm.weight | Block 35 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
320 | blk.35.attn_output.weight | Block 35 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
321 | blk.35.attn_q.weight | Block 35 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
322 | blk.35.attn_v.weight | Block 35 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
323 | blk.35.ffn_down.weight | Block 35 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
324 | blk.35.ffn_gate.weight | Block 35 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
325 | blk.35.ffn_norm.weight | Block 35 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
326 | blk.35.ffn_up.weight | Block 35 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.35: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 36 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
327 | blk.36.attn_k.weight | Block 36 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
328 | blk.36.attn_norm.weight | Block 36 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
329 | blk.36.attn_output.weight | Block 36 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
330 | blk.36.attn_q.weight | Block 36 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
331 | blk.36.attn_v.weight | Block 36 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
332 | blk.36.ffn_down.weight | Block 36 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
333 | blk.36.ffn_gate.weight | Block 36 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
334 | blk.36.ffn_norm.weight | Block 36 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
335 | blk.36.ffn_up.weight | Block 36 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.36: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 37 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
336 | blk.37.attn_k.weight | Block 37 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
337 | blk.37.attn_norm.weight | Block 37 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
338 | blk.37.attn_output.weight | Block 37 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
339 | blk.37.attn_q.weight | Block 37 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
340 | blk.37.attn_v.weight | Block 37 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
341 | blk.37.ffn_down.weight | Block 37 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
342 | blk.37.ffn_gate.weight | Block 37 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
343 | blk.37.ffn_norm.weight | Block 37 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
344 | blk.37.ffn_up.weight | Block 37 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.37: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 38 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
345 | blk.38.attn_k.weight | Block 38 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
346 | blk.38.attn_norm.weight | Block 38 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
347 | blk.38.attn_output.weight | Block 38 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
348 | blk.38.attn_q.weight | Block 38 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
349 | blk.38.attn_v.weight | Block 38 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
350 | blk.38.ffn_down.weight | Block 38 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
351 | blk.38.ffn_gate.weight | Block 38 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
352 | blk.38.ffn_norm.weight | Block 38 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
353 | blk.38.ffn_up.weight | Block 38 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.38: (~556M) 555755520
- Percentage of total elements: 2.36%
Block 39 Tensor Group : ~556M Elements
T_ID | Tensor Layer Name | Human Friendly Tensor Layer Name | Elements | Shape | Type |
---|---|---|---|---|---|
354 | blk.39.attn_k.weight | Block 39 Attention Key (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q3_K |
355 | blk.39.attn_norm.weight | Block 39 Attention Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
356 | blk.39.attn_output.weight | Block 39 Attention Output (W) | ( ~21M) 20971520 | 4096 x 5120 x 1 x 1 | Q4_K |
357 | blk.39.attn_q.weight | Block 39 Attention Query (W) | ( ~21M) 20971520 | 5120 x 4096 x 1 x 1 | Q3_K |
358 | blk.39.attn_v.weight | Block 39 Attention Value (W) | ( ~5M) 5242880 | 5120 x 1024 x 1 x 1 | Q4_K |
359 | blk.39.ffn_down.weight | Block 39 Feed-Forward Network "Down" (W) | (~168M) 167772160 | 32768 x 5120 x 1 x 1 | Q4_K |
360 | blk.39.ffn_gate.weight | Block 39 Feed-Forward Network "Gate" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
361 | blk.39.ffn_norm.weight | Block 39 Feed-Forward Network Normalization (W) | ( ~5K) 5120 | 5120 x 1 x 1 x 1 | F32 |
362 | blk.39.ffn_up.weight | Block 39 Feed-Forward Network "Up" (W) | (~168M) 167772160 | 5120 x 32768 x 1 x 1 | Q3_K |
- Total elements in blk.39: (~556M) 555755520
- Percentage of total elements: 2.36%
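With blk.39 the per-block listing is complete. As a closing cross-check, the 40 identical block groups plus the base tensor group should reproduce the file-level total of 23,572,403,200 elements; a minimal sketch (the base-group breakdown in the comments is inferred from the vocab and embedding sizes, since those tensors are not shown in this section):

```python
BLOCK_ELEMENTS = 555_755_520      # per block, from the group summaries
N_BLOCKS       = 40               # llama.block_count
TOTAL          = 23_572_403_200   # file-level total from the overview

blocks_total = N_BLOCKS * BLOCK_ELEMENTS   # 22,230,220,800 (~94.3% of total)
base_group   = TOTAL - blocks_total        # 1,342,182,400 (the ~1B base group)

# Consistent with two 131072 x 5120 matrices (token embedding and output
# head) plus a 5120-element output norm: 2 * 671_088_640 + 5_120.
assert base_group == 2 * 131_072 * 5_120 + 5_120
print(f"blocks: {blocks_total:,}  base: {base_group:,}")
```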