from __future__ import annotations

import os
from contextlib import contextmanager
from typing import Any, Optional, Union

import torch
from accelerate.hooks import remove_hook_from_submodules
from torch import nn
from transformers.utils import PushToHubMixin

from peft.tuners.mixed import COMPATIBLE_TUNER_TYPES

from .config import PeftConfig
from .peft_model import PeftModel
from .tuners import (
    AdaLoraModel,
    IA3Model,
    LoHaModel,
    LoKrModel,
    LoraModel,
    MixedModel,
    OFTModel,
)
from .utils import PeftType, _set_adapter, _set_trainable

PEFT_TYPE_TO_MODEL_MAPPING = {
    PeftType.LORA: LoraModel,
    PeftType.LOHA: LoHaModel,
    PeftType.LOKR: LoKrModel,
    PeftType.ADALORA: AdaLoraModel,
    PeftType.IA3: IA3Model,
    PeftType.OFT: OFTModel,
}


def _prepare_model_for_gradient_checkpointing(model: nn.Module) -> None:
    r"""
    Prepares the model for gradient checkpointing if necessary
    """
    if not getattr(model, "is_gradient_checkpointing", True):
        return model

    if not (
        getattr(model, "is_loaded_in_8bit", False)
        or getattr(model, "is_loaded_in_4bit", False)
        or getattr(model, "is_quantized", False)
    ):
        if hasattr(model, "enable_input_require_grads"):
            model.enable_input_require_grads()
        elif hasattr(model, "get_input_embeddings"):

            def make_inputs_require_grad(module, input, output):
                output.requires_grad_(True)

            model.get_input_embeddings().register_forward_hook(make_inputs_require_grad)


def _check_config_compatible(peft_config: PeftConfig) -> None:
    if peft_config.peft_type not in COMPATIBLE_TUNER_TYPES:
        raise ValueError(
            f"The provided `peft_type` '{peft_config.peft_type.value}' is not compatible with the "
            f"`PeftMixedModel`. Compatible types are: {COMPATIBLE_TUNER_TYPES}"
        )


class PeftMixedModel(PushToHubMixin, torch.nn.Module):
    """
    PeftMixedModel for loading and mixing different types of adapters for inference.

    This class does not support loading/saving, and it shouldn't usually be initialized directly. Instead, use
    `get_peft_model` with the argument `mixed=True`.

    <Tip>

    Read the [Mixed adapter types](https://huggingface.co/docs/peft/en/developer_guides/mixed_models) guide to learn
    more about using different adapter types.

    </Tip>

    Example:

    ```py
    >>> from peft import get_peft_model

    >>> base_model = ...  # load the base model, e.g. from transformers
    >>> peft_model = PeftMixedModel.from_pretrained(base_model, path_to_adapter1, "adapter1").eval()
    >>> peft_model.load_adapter(path_to_adapter2, "adapter2")
    >>> peft_model.set_adapter(["adapter1", "adapter2"])  # activate both adapters
    >>> peft_model(data)  # forward pass using both adapters
    ```

    Args:
        model (`torch.nn.Module`):
            The model to be tuned.
        config (`PeftConfig`):
            The config of the model to be tuned. The adapter type must be compatible.
        adapter_name (`str`, `optional`, defaults to `"default"`):
            The name of the first adapter.
    """

    def __init__(self, model: nn.Module, peft_config: PeftConfig, adapter_name: str = "default") -> None:
        super().__init__()
        _check_config_compatible(peft_config)
        _prepare_model_for_gradient_checkpointing(model)
        self.modules_to_save = None
        self.base_model = MixedModel(model, {adapter_name: peft_config}, adapter_name)
        self.set_modules_to_save(peft_config, adapter_name)

        self.config = getattr(model, "config", {"model_type": "custom"})

        # `pretraining_tp` is set for some models to simulate Tensor Parallelism during inference to avoid
        # numerical differences; disable it here to avoid unexpected behavior
        if hasattr(self.base_model, "config") and hasattr(self.base_model.config, "pretraining_tp"):
            self.base_model.config.pretraining_tp = 1

    @property
    def peft_config(self) -> dict[str, PeftConfig]:
        return self.base_model.peft_config

    @property
    def active_adapter(self) -> str:
        return self.base_model.active_adapter

    @property
    def active_adapters(self) -> list[str]:
        return self.base_model.active_adapters

    def get_nb_trainable_parameters(self):
        r"""
        Returns the number of trainable parameters and number of all parameters in the model.
        """
        trainable_params = 0
        all_param = 0
        for _, param in self.named_parameters():
            num_params = param.numel()
            # if using DS Zero 3 and the weights are initialized empty
            if num_params == 0 and hasattr(param, "ds_numel"):
                num_params = param.ds_numel

            # Due to the design of 4bit linear layers from bitsandbytes, one needs to multiply the number of
            # parameters by 2 to get the correct number of parameters
            if param.__class__.__name__ == "Params4bit":
                num_params = num_params * 2

            all_param += num_params
            if param.requires_grad:
                trainable_params += num_params

        return trainable_params, all_param

    def print_trainable_parameters(self):
        """
        Prints the number of trainable parameters in the model.

        Note: print_trainable_parameters() uses get_nb_trainable_parameters() which is different from
        num_parameters(only_trainable=True) from huggingface/transformers.
        get_nb_trainable_parameters() returns (trainable parameters, all parameters) of the Peft Model which includes
        modified backbone transformer model. For techniques like LoRA, the backbone transformer model is modified in
        place with LoRA modules. However, for prompt tuning, the backbone transformer model is unmodified.
        num_parameters(only_trainable=True) returns number of trainable parameters of the backbone transformer model
        which can be different.
        """
        trainable_params, all_param = self.get_nb_trainable_parameters()

        print(
            f"trainable params: {trainable_params:,d} || "
            f"all params: {all_param:,d} || "
            f"trainable%: {100 * trainable_params / all_param:.4f}"
        )

    def __getattr__(self, name: str):
        """Forward missing attributes to the wrapped module."""
        try:
            return super().__getattr__(name)  # defer to nn.Module's logic
        except AttributeError:
            return getattr(self.base_model, name)

    def forward(self, *args: Any, **kwargs: Any):
        """
        Forward pass of the model.
        """
        return self.base_model(*args, **kwargs)

    def generate(self, *args: Any, **kwargs: Any):
        """
        Generate output.
        """
        return self.base_model.generate(*args, **kwargs)

    @contextmanager
    def disable_adapter(self):
        """
        Disables the adapter module.
        """
        try:
            self.base_model.disable_adapter_layers()
            yield
        finally:
            self.base_model.enable_adapter_layers()

    def add_adapter(self, adapter_name: str, peft_config: PeftConfig):
        _check_config_compatible(peft_config)

        try:
            self.peft_config[adapter_name] = peft_config
            self.base_model.inject_adapter(self, adapter_name)
        except Exception:  # something went wrong, roll back
            if adapter_name in self.peft_config:
                del self.peft_config[adapter_name]
            raise

        self.set_modules_to_save(peft_config, adapter_name)

    def set_modules_to_save(self, peft_config: PeftConfig, adapter_name: str) -> None:
        if (modules_to_save := getattr(peft_config, "modules_to_save", None)) is None:
            return

        if self.modules_to_save is None:
            self.modules_to_save = set(modules_to_save)
        else:
            self.modules_to_save.update(modules_to_save)
        _set_trainable(self, adapter_name)

    def set_adapter(self, adapter_name: Union[str, list[str]]) -> None:
        """
        Sets the active adapter(s) for the model.

        Note that the order in which the adapters are applied during the forward pass may not be the same as the
        order in which they are passed to this function. Instead, the order during the forward pass is determined by
        the order in which the adapters were loaded into the model. The active adapters only determine which adapters
        are active during the forward pass, but not the order in which they are applied.

        Additionally, this function will set the specified adapters to trainable (i.e., requires_grad=True). If this
        is not desired, use the following code.

        ```py
        >>> for name, param in model_peft.named_parameters():
        ...     if ...:  # some check on name (ex. if 'lora' in name)
        ...         param.requires_grad = False
        ```

        Args:
            adapter_name (`str` or `List[str]`):
                The name of the adapter(s) to be activated.
        """
        if isinstance(adapter_name, str):
            adapter_name = [adapter_name]

        mismatched = set(adapter_name) - set(self.peft_config.keys())
        if mismatched:
            raise ValueError(
                f"Adapter(s) {sorted(mismatched)} not found, available adapters: {sorted(self.peft_config.keys())}"
            )

        self.base_model.set_adapter(adapter_name)
        _set_adapter(self, adapter_name)

    def delete_adapter(self, adapter_name: Union[str, list[str]]) -> None:
        if isinstance(adapter_name, str):
            adapter_name = [adapter_name]

        mismatched = set(adapter_name) - set(self.peft_config.keys())
        if mismatched:
            raise ValueError(
                f"Adapter(s) {sorted(mismatched)} not found, available adapters: {sorted(self.peft_config.keys())}"
            )

        self.base_model.delete_adapter(adapter_name)

    def merge_and_unload(self, *args: Any, **kwargs: Any):
        r"""
        This method merges the adapter layers into the base model. This is needed if someone wants to use the base
        model as a standalone model.

        Args:
            progressbar (`bool`):
                whether to show a progressbar indicating the unload and merge process
            safe_merge (`bool`):
                whether to activate the safe merging check to check if there is any potential Nan in the adapter
                weights
            adapter_names (`List[str]`, *optional*):
                The list of adapter names that should be merged. If None, all active adapters will be merged.
                Defaults to `None`.
        """
        return self.base_model.merge_and_unload(*args, **kwargs)

    def unload(self, *args: Any, **kwargs: Any):
        """
        Gets back the base model by removing all the adapter modules without merging. This gives back the original
        base model.
        """
        return self.base_model.unload(*args, **kwargs)

    def get_layer_status(self):
        raise TypeError(f"get_layer_status is not supported for {self.__class__.__name__}.")
    def get_model_status(self):
        raise TypeError(f"get_model_status is not supported for {self.__class__.__name__}.")

    @classmethod
    def _split_kwargs(cls, kwargs: dict[str, Any]):
        return PeftModel._split_kwargs(kwargs)

    def load_adapter(self, model_id: str, adapter_name: str, *args: Any, **kwargs: Any):
        output = PeftModel.load_adapter(self, model_id, adapter_name, *args, **kwargs)
        # ensure that the freshly loaded adapter does not silently change the set of active adapters
        self.set_adapter(self.active_adapters)
        return output

    def create_or_update_model_card(self, output_dir: str):
        raise NotImplementedError(f"Model card creation is not supported for {self.__class__.__name__} (yet).")

    def save_pretrained(
        self,
        save_directory: str,
        safe_serialization: bool = False,
        selected_adapters: Optional[list[str]] = None,
        **kwargs: Any,
    ):
        raise NotImplementedError(f"Saving is not supported for {self.__class__.__name__} (yet).")

    @classmethod
    def from_pretrained(
        cls,
        model: nn.Module,
        model_id: str | os.PathLike,
        adapter_name: str = "default",
        is_trainable: bool = False,
        config: Optional[PeftConfig] = None,
        **kwargs: Any,
    ):
        r"""
        Instantiate a PEFT mixed model from a pretrained model and loaded PEFT weights.

        Note that the passed `model` may be modified inplace.

        Args:
            model (`nn.Module`):
                The model to be adapted.
            model_id (`str` or `os.PathLike`):
                The name of the PEFT configuration to use. Can be either:
                    - A string, the `model id` of a PEFT configuration hosted inside a model repo on the Hugging Face
                      Hub.
                    - A path to a directory containing a PEFT configuration file saved using the `save_pretrained`
                      method (`./my_peft_config_directory/`).
            adapter_name (`str`, *optional*, defaults to `"default"`):
                The name of the adapter to be loaded. This is useful for loading multiple adapters.
            is_trainable (`bool`, *optional*, defaults to `False`):
                Whether the adapter should be trainable or not. If `False`, the adapter will be frozen and used for
                inference.
            config ([`~peft.PeftConfig`], *optional*):
                The configuration object to use instead of an automatically loaded configuration. This configuration
                object is mutually exclusive with `model_id` and `kwargs`. This is useful when configuration is
                already loaded before calling `from_pretrained`.
            kwargs: (`optional`):
                Additional keyword arguments passed along to the specific PEFT configuration class.
        """
        from .mapping import PEFT_TYPE_TO_CONFIG_MAPPING

        # load the config
        if config is None:
            config = PEFT_TYPE_TO_CONFIG_MAPPING[
                PeftConfig._get_peft_type(
                    model_id,
                    subfolder=kwargs.get("subfolder", None),
                    revision=kwargs.get("revision", None),
                    cache_dir=kwargs.get("cache_dir", None),
                    use_auth_token=kwargs.get("use_auth_token", None),
                )
            ].from_pretrained(model_id, **kwargs)
        elif isinstance(config, PeftConfig):
            config.inference_mode = not is_trainable
        else:
            raise ValueError(f"The input config must be a PeftConfig, got {config.__class__}")

        # only a subset of tuner types is supported for mixed models
        if config.peft_type not in PEFT_TYPE_TO_MODEL_MAPPING:
            raise ValueError(f"Adapter of type {config.peft_type} is not supported for mixed models.")

        if (getattr(model, "hf_device_map", None) is not None) and len(
            set(model.hf_device_map.values()).intersection({"cpu", "disk"})
        ) > 0:
            remove_hook_from_submodules(model)

        if config.is_prompt_learning and is_trainable:
            raise ValueError("Cannot set a prompt learning adapter to trainable when loading pretrained adapter.")
        else:
            config.inference_mode = not is_trainable

        # note: unlike PeftModel.from_pretrained, this always returns a PeftMixedModel
        model = cls(model, config, adapter_name)
        model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)
        return model
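
# ---------------------------------------------------------------------------
# Minimal usage sketch (not part of the class above): it shows how the public
# methods defined in this module are typically combined. `AutoModelForCausalLM`
# comes from `transformers`; the model path and adapter directories below are
# hypothetical placeholders, so the snippet is kept commented out to leave the
# module import-safe.
# ---------------------------------------------------------------------------
# from transformers import AutoModelForCausalLM
#
# base_model = AutoModelForCausalLM.from_pretrained("path/to/base_model")
# # Create the mixed model from the first adapter, then stack a second,
# # differently-typed but compatible adapter on top (e.g. LoRA + LoHa).
# peft_model = PeftMixedModel.from_pretrained(base_model, "path/to/adapter1", "adapter1").eval()
# peft_model.load_adapter("path/to/adapter2", "adapter2")
# peft_model.set_adapter(["adapter1", "adapter2"])  # both adapters active in the forward pass
# peft_model.print_trainable_parameters()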