microsoft/Phi-4-multimodal-instruct Not working #183
by jivaniyash - opened
Hello,
I am getting an error while converting this model to GGUF. Is anyone else seeing the same error when converting?
Error converting to fp16:
INFO:hf-to-gguf:Loading model: Phi-4-multimodal-instruct
WARNING:hf-to-gguf:Failed to load model config from downloads/tmpsb3bygox/Phi-4-multimodal-instruct: The repository downloads/tmpsb3bygox/Phi-4-multimodal-instruct contains custom code which must be executed to correctly load the model. You can inspect the repository content at /home/user/app/downloads/tmpsb3bygox/Phi-4-multimodal-instruct .
You can inspect the repository content at https://hf.co/downloads/tmpsb3bygox/Phi-4-multimodal-instruct.
Please pass the argument `trust_remote_code=True` to allow custom code to be run.
WARNING:hf-to-gguf:Trying to load config.json instead
INFO:hf-to-gguf:Model architecture: Phi4MMForCausalLM
ERROR:hf-to-gguf:Model Phi4MMForCausalLM is not supported
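From the log, the last line looks like the real blocker: after falling back to config.json, the converter reads the architecture Phi4MMForCausalLM and rejects it as unsupported, so passing trust_remote_code alone would not help. A minimal sketch to confirm which architecture string the converter sees (assuming huggingface_hub is installed and the Hub is reachable; this just reads the same config.json the converter falls back to):

```python
# Minimal sketch: download the model's config.json and print the
# architecture string that the GGUF converter checks and rejects.
# Assumes `huggingface_hub` is installed and network access to the Hub.
import json

from huggingface_hub import hf_hub_download

config_path = hf_hub_download(
    repo_id="microsoft/Phi-4-multimodal-instruct",
    filename="config.json",
)

with open(config_path) as f:
    config = json.load(f)

# Per the log above this should show ["Phi4MMForCausalLM"], which the
# converter does not recognize, hence the "is not supported" error.
print(config.get("architectures"))
```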