Update README.md
README.md CHANGED

@@ -40,8 +40,8 @@ Many of the mergekit MoEs I have found, frequently combine several experts that
 | Model | Description |
 | ----- | ----------- |
 | [Velvet Eclipse 2x12B](https://huggingface.co/SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI) | A slimmer model with the ERP and RP experts.|
-| [Velvet Eclipse 2x12B Reasoning](https://huggingface.co/SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI-Reasoning) | A slimmer model with the ERP and the Reasoning Experts|
-
+| [Velvet Eclipse 2x12B Reasoning](https://huggingface.co/SuperbEmphasis/Viloet-Eclipse-2x12B-v0.2-MINI-Reasoning) | A slimmer model with the ERP and the Reasoning Experts |
+| [Velvet Eclipse 4x12B Reasoning](https://huggingface.co/SuperbEmphasis/Velvet-Eclipse-4x12B-v0.2) | Full 4x12B Parameter Velvet Eclipse |
 
 Want to sacrifice speed, for more intelligence? Just increase the active experts.
 In llamacpp:
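The exact command is not part of this hunk, but in llama.cpp the number of active experts can be raised at load time with the `--override-kv` flag. A hedged sketch, assuming a Mixtral-style GGUF export where the relevant metadata key is `llama.expert_used_count` (the model filename and quantization are illustrative, not from this commit):

```shell
# Activate 3 of the 4 experts per token instead of the model's default.
# The GGUF key prefix ("llama.") depends on the exported architecture;
# check your file with gguf-dump if unsure.
./llama-server \
  -m ./Velvet-Eclipse-4x12B-v0.2-Q4_K_M.gguf \
  --override-kv llama.expert_used_count=int:3
```

More active experts means more computation per token, so expect slower generation in exchange for the extra capacity.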