Upgrade transformers in requirements
I tried to convert HuggingFaceTB/SmolLM3-3B, and also tried Qwen/Qwen3-4B. In both cases I get the following error (which I have seen before locally and fixed by upgrading transformers to 4.54.0):
ValueError: The checkpoint you are trying to load has model type [smollm3 | ....] but Transformers does not recognize this architecture.
I would recommend upgrading transformers to a more recent version unless there is a compatibility reason not to.
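For reference, the Space's requirements.txt currently pins transformers[torch]==4.49.0, so the change I'm suggesting would be a one-line bump (assuming nothing else in the conversion stack requires the older release):

```diff
-transformers[torch]==4.49.0
+transformers[torch]==4.54.0
```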
Hi @david-thrower!
Thanks for bringing it up. Could you suggest this in the transformers.js repository? Currently, we keep the dependencies here in sync with the convert.py requirements from transformers.js.
I see, thanks for the clarification. Sorry to step out of my lane here; honestly, I'm not as deeply familiar with the HF JS package as I should be.
The ValueError I saw when I tried to convert HuggingFaceTB/SmolLM3-3B with this app ("unrecognized model architecture") appeared identical to the one I ran into earlier with an older version of the Python transformers package, when calling transformers.AutoModelForCausalLM.from_pretrained("HuggingFaceTB/SmolLM3-3B"). In my case, upgrading the Python transformers package to 4.54.0 resolved the issue, and I was able to use the model without problems.
Looking at the requirements.txt in this project (https://huggingface.co/spaces/onnx-community/convert-to-onnx/blob/main/requirements.txt), I noticed it pins transformers[torch]==4.49.0 (line 3), an earlier version that probably lacks the model architecture definitions added in later releases such as 4.54.0. My knee-jerk reaction was to assume this might be a simple one-line upgrade.
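To make the version gap explicit, here's a trivial stdlib sketch comparing the pinned release against the one that worked for me (the helper `version_tuple` is mine, and whether 4.54.0 is the first release with smollm3 support is an assumption based only on my local test):

```python
def version_tuple(version: str) -> tuple:
    """Parse a simple 'X.Y.Z' version string into a comparable int tuple."""
    return tuple(int(part) for part in version.split("."))

PINNED = "4.49.0"          # transformers[torch] pin in this Space's requirements.txt
WORKED_LOCALLY = "4.54.0"  # release that resolved the ValueError for me

# The pin predates the release that recognized the smollm3 architecture locally.
print(version_tuple(PINNED) < version_tuple(WORKED_LOCALLY))  # prints True
```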
I take it the dependencies here are coupled with the convert.py requirements in transformers.js, so there may be a more structural reason this Space is on 4.49.0...