When exporting hunyuan-1.8b to ONNX, the BitShift operator fails with this "tensor(int32) is invalid" error. Is something wrong with the layers.31 network layer?
Traceback (most recent call last):
  File "/opt/conda/bin/optimum-cli", line 7, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.9/site-packages/optimum/commands/optimum_cli.py", line 208, in main
    service.run()
  File "/opt/conda/lib/python3.9/site-packages/optimum/commands/export/onnx.py", line 276, in run
    main_export(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/main.py", line 418, in main_export
    onnx_export_from_model(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/convert.py", line 1186, in onnx_export_from_model
    _, onnx_outputs = export_models(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/convert.py", line 770, in export_models
    export(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/convert.py", line 903, in export
    config.fix_dynamic_axes(output, device=device, input_shapes=input_shapes, dtype=dtype)
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/base.py", line 235, in fix_dynamic_axes
    session = InferenceSession(model_path.as_posix(), providers=providers, sess_options=session_options)
  File "/opt/conda/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/opt/conda/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 480, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from hunyuan_tflite_float32/model.onnx failed:This is an invalid model. Type Error: Type 'tensor(int32)' of input parameter (/model/layers.31/mlp/up_proj/Expand_output_0) of operator (BitShift) in node (/model/layers.31/mlp/up_proj/BitShift) is invalid.