wolfram / miquliz-120b-v2.0-GGUF
28 likes
Transformers · GGUF · 5 languages · mergekit · Merge · conversational
arXiv: 2203.05482
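Each GGUF quant listed under Files and versions below can be pulled individually with the huggingface_hub client that the commit messages reference. A minimal sketch, assuming huggingface_hub is installed and using one of the smaller quant filenames from the table as an example:

```python
# Download a single quant from this repo into the local Hugging Face cache.
# Filename taken from the file table below; swap in any other quant as needed.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="wolfram/miquliz-120b-v2.0-GGUF",
    filename="miquliz-120b-v2.0.IQ2_XXS.gguf",  # 31.8 GB per the listing
)
print(path)  # local path to the downloaded .gguf file
```

For the split quants (the `-split-a`/`-split-b` pairs), both parts have to be downloaded and rejoined before use; see the sketch after the file table.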
Files and versions
1 contributor · History: 12 commits
Latest commit by wolfram: Upload folder using huggingface_hub (#3) · 98b4b75 (verified) · over 1 year ago
| File | Size | Last commit | Date |
|---|---|---|---|
| .gitattributes | 2.36 kB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| README.md | 30.4 kB | Update README.md | over 1 year ago |
| miquliz-120b-v2.0.IQ1_S.gguf | 25.2 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.IQ2_XS.gguf | 35.4 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.IQ2_XXS.gguf | 31.8 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.IQ3_XXS.gguf | 46.2 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.IQ3_XXS.old.gguf | 49 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.IQ4_XS.gguf-split-a | 50 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.IQ4_XS.gguf-split-b | 14.2 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.Q2_K.gguf | 44.2 GB | Upload folder using huggingface_hub | over 1 year ago |
| miquliz-120b-v2.0.Q4_K_M.gguf-split-a | 50 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.Q4_K_M.gguf-split-b | 22.1 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.Q5_K_M.gguf-split-a | 50 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
| miquliz-120b-v2.0.Q5_K_M.gguf-split-b | 35 GB | Upload folder using huggingface_hub (#3) | over 1 year ago |
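The IQ4_XS, Q4_K_M, and Q5_K_M quants are shipped as `-split-a`/`-split-b` pairs. Assuming these are plain byte-level splits (the usual convention for files above Hugging Face's per-file size limit, rather than the newer `gguf-split` container format), they can be rejoined by straight concatenation before loading. A sketch for the Q4_K_M pair:

```python
# Rejoin a byte-split GGUF (assumption: raw split parts, not gguf-split shards).
# Expects both -split-a and -split-b parts to already be in the current directory.
import shutil
from pathlib import Path

parts = sorted(Path(".").glob("miquliz-120b-v2.0.Q4_K_M.gguf-split-*"))
assert len(parts) == 2, f"expected 2 parts, found {len(parts)}"

with open("miquliz-120b-v2.0.Q4_K_M.gguf", "wb") as out:
    for part in parts:  # sorted() puts -split-a before -split-b
        with part.open("rb") as src:
            shutil.copyfileobj(src, out)
```

Check free disk space first: the joined Q4_K_M file comes to roughly 72 GB (50 GB + 22.1 GB per the table), and the split parts can be deleted once the join is verified.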