Model file
- Download the GGUF model from Hugging Face
huggingface-cli download \
  c00cjz00/phi-4-14b-it-R1-m22k_lora \
  unsloth.Q2_K.gguf \
  --local-dir /work/c00cjz00/github/hpc_ollama/home \
  --local-dir-use-symlinks False
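After the download completes, the GGUF file should sit in the --local-dir given above; a quick sanity check (path copied from the command above) before writing the Modelfile:
ls -lh /work/c00cjz00/github/hpc_ollama/home/unsloth.Q2_K.gguf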
- Write phi4.modelfile
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM phi4:latest
FROM ./unsloth.Q2_K.gguf
TEMPLATE """{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 -}}
<|im_start|>{{ .Role }}<|im_sep|>
{{ .Content }}{{ if not $last }}<|im_end|>
{{ end }}
{{- if and (ne .Role "assistant") $last }}<|im_end|>
<|im_start|>assistant<|im_sep|>
{{ end }}
{{- end }}"""
PARAMETER stop <|im_start|>
PARAMETER stop <|im_end|>
PARAMETER stop <|im_sep|>
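For reference, a conversation with a single user message such as "Hello" should render through this TEMPLATE into roughly the following prompt (a sketch of the expected phi-4 chat format, not output captured from ollama):
<|im_start|>user<|im_sep|>
Hello<|im_end|>
<|im_start|>assistant<|im_sep|>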
- Create the Ollama model
ollama create myphi4-on-q2 -f ./phi4.modelfile
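Once the create step finishes, the new model should appear in the local model list and can be queried directly; the prompt below is only an illustrative example:
ollama list
ollama run myphi4-on-q2 "Explain what a GGUF file is in one sentence."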