# configs.py — static model / adapter configuration tables.
#
# NOTE: several numeric and boolean constants in this file could not be
# recovered from the only available (compiled) copy.  Fields set to None are
# unknown; values commented "assumed" are the standard defaults for the named
# checkpoints rather than recovered originals, and dictionary keys marked
# "placeholder" are descriptive stand-ins for key names that were lost.

# Dual-stream shunt adapter repositories.  The recovered label for the single
# entry reads "vit-bigG-14 flan-t5-base".
T5_SHUNT_REPOS = {
    "vit-bigG-14": {                                                          # placeholder key
        "name": "DualStreamAdapter-G",
        "repo": "AbstractPhil/t5-flan-base-vit-bigG-14-dual-stream-adapter",  # placeholder key
        "config_file": "config.json",                                         # placeholder key
        "shunts": [                                                           # placeholder key
            "t5-flan-vit-bigG-14-dual_shunt_caption.safetensors",
            "t5-flan-vit-bigG-14-dual_shunt_no_caption_e1.safetensors",
            "t5-flan-vit-bigG-14-dual_shunt_no_caption_e2.safetensors",
            "t5-flan-vit-bigG-14-dual_shunt_no_caption_e3.safetensors",
            "t5-flan-vit-bigG-14-dual_shunt_summarize.safetensors",
            "dual_shunt_omega_no_caption_e1_step_10000.safetensors",
            "dual_shunt_omega_no_caption_noised_e1_step_1000.safetensors",
            "dual_shunt_omega_no_caption_noised_e1_step_4000.safetensors",
            # (any further checkpoint names were not recoverable)
        ],
        # Adapter hyper-parameters (placeholder key "config").
        "config": {
            "adapter_id": "003",
            "name": "DualShuntAdapter-G",
            "t5": {
                "model": "google/flan-t5-base",
                "hidden_size": 768,          # assumed: d_model of flan-t5-base
            },
            "clip": {
                "model": "openai/clip-vit-large-patch14",
                "hidden_size": None,         # unknown
            },
            "bottleneck": None,              # unknown
            "heads": None,                   # unknown
            "tau_init": None,                # unknown
            "max_guidance": 10.0,
            "proj_layers": None,             # unknown
            "layer_norm": True,
            "dropout": None,                 # unknown
            "use_dropout": None,             # unknown
            "use_proj_stack": None,          # unknown
            "assert_input_dims": None,       # unknown
            "routing": {
                "type": "cross_attention",
                "enable_causal_mask": False,
                "bidirectional": None,       # unknown
            },
            "version": "v0.3.2",
        },
    },
}

# BERT adapter configurations; no entries could be recovered.
BERT_CONFIGS = {}

# T5 text-encoder configurations, keyed by checkpoint name.
T5_CONFIGS = {
    "t5_small_human_attentive_try2_pass3": {
        "file_name": None,                   # unknown
        "tokenizer": None,                   # unknown
        "config": {                          # placeholder key; HF-style T5 model config
            "model_type": "t5",
            "n_positions": None,             # unknown
            "num_decoder_layers": None,      # unknown
            "num_heads": None,               # unknown
            "num_layers": None,              # unknown
            "output_past": None,             # unknown
            "pad_token_id": None,            # unknown
            "relative_attention_max_distance": None,   # unknown
            "relative_attention_num_buckets": None,    # unknown
            "task_specific_params": {
                "caption": {
                    "early_stopping": None,  # unknown
                    "length_penalty": None,  # unknown
                    "max_length": None,      # unknown
                    "num_beams": None,       # unknown
                    "prefix": "caption: ",
                },
            },
            "torch_dtype": "float32",
            "transformers_version": "4.51.3",
            "use_cache": None,               # unknown
            "vocab_size": 32128,             # assumed: standard T5 SentencePiece vocabulary
        },
    },
}
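
# Illustrative helper (not part of the original module): a minimal sketch of
# how these tables might be consumed, assuming the `huggingface_hub` and
# `safetensors` packages are available.  The key names it reads ("repo",
# "shunts") are the descriptive placeholders used above, and the helper name
# itself is hypothetical.
def fetch_shunt_state_dict(entry_key: str = "vit-bigG-14", index: int = 0):
    """Download one shunt checkpoint listed in T5_SHUNT_REPOS and load its tensors."""
    # Imported lazily so that importing configs.py adds no hard dependencies.
    from huggingface_hub import hf_hub_download
    from safetensors.torch import load_file

    entry = T5_SHUNT_REPOS[entry_key]
    local_path = hf_hub_download(repo_id=entry["repo"], filename=entry["shunts"][index])
    return load_file(local_path)  # {tensor_name: torch.Tensor}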