import logging
import math

import torch
import torch.nn.functional as F

logger = logging.getLogger(__name__)


def swish(x):
    return x * torch.sigmoid(x)


def _gelu_python(x):
    """ Original Implementation of the gelu activation function in Google Bert repo when initially created.
        For information: OpenAI GPT's gelu is slightly different (and gives slightly different results):
        0.5 * x * (1 + torch.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * torch.pow(x, 3))))
        This is now written in C in torch.nn.functional
        Also see https://arxiv.org/abs/1606.08415
    """
    return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))


def gelu_new(x):
    """ Implementation of the gelu activation function currently in Google Bert repo (identical to OpenAI GPT).
        Also see https://arxiv.org/abs/1606.08415
    """
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))


# Use the pure-Python gelu on PyTorch versions before 1.4.0, the built-in
# torch.nn.functional.gelu otherwise.
if torch.__version__ < "1.4.0":
    gelu = _gelu_python
else:
    gelu = F.gelu


def gelu_fast(x):
    return 0.5 * x * (1.0 + torch.tanh(x * 0.7978845608 * (1.0 + 0.044715 * x * x)))


ACT2FN = {
    "relu": F.relu,
    "swish": swish,
    "gelu": gelu,
    "tanh": torch.tanh,
    "gelu_new": gelu_new,
    "gelu_fast": gelu_fast,
}


def get_activation(activation_string):
    if activation_string in ACT2FN:
        return ACT2FN[activation_string]
    else:
        raise KeyError(
            "function {} not found in ACT2FN mapping {}".format(activation_string, list(ACT2FN.keys()))
        )
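
# ---------------------------------------------------------------------------
# Illustrative usage sketch (not part of the original module): it shows how
# get_activation() resolves names from the ACT2FN mapping and that the gelu
# variants agree closely on a sample tensor. The tensor values and the "selu"
# lookup below are arbitrary examples chosen for this demonstration.
if __name__ == "__main__":
    sample = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

    # Look up an activation by name, e.g. from a model config string.
    act = get_activation("gelu_new")
    print("gelu_new:", act(sample))

    # The erf-based gelu and the tanh approximations should be very close.
    print("max |gelu - gelu_new| :", (gelu(sample) - gelu_new(sample)).abs().max().item())
    print("max |gelu - gelu_fast|:", (gelu(sample) - gelu_fast(sample)).abs().max().item())

    # Unknown names raise a KeyError listing the available activations.
    try:
        get_activation("selu")
    except KeyError as err:
        print(err)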