import logging
import math

import torch
import torch.nn.functional as F


logger = logging.getLogger(__name__)


def swish(x):
    return x * torch.sigmoid(x)


def _gelu_python(x):
    """ Original Implementation of the gelu activation function in Google Bert repo when initially created.
        For information: OpenAI GPT's gelu is slightly different (and gives slightly different results):
        0.5 * x * (1 + torch.tanh(math.sqrt(2 / math.pi) * (x + 0.044715 * torch.pow(x, 3))))
        This is now written in C in torch.nn.functional
        Also see https://arxiv.org/abs/1606.08415
    """
    return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))


def gelu_new(x):
    """ Implementation of the gelu activation function currently in Google Bert repo (identical to OpenAI GPT).
        Also see https://arxiv.org/abs/1606.08415
    """
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))


# Prefer torch's built-in gelu when the installed PyTorch provides it;
# otherwise fall back to the pure-Python erf-based implementation above.
if torch.__version__ < "1.4.0":
    gelu = _gelu_python
else:
    gelu = F.gelu


def gelu_fast(x):
    # Faster tanh-based approximation of gelu; 0.7978845608 ~= sqrt(2 / pi).
    return 0.5 * x * (1.0 + torch.tanh(x * 0.7978845608 * (1.0 + 0.044715 * x * x)))


# Mapping from activation name to the corresponding callable.
ACT2FN = {
    "relu": F.relu,
    "swish": swish,
    "gelu": gelu,
    "tanh": torch.tanh,
    "gelu_new": gelu_new,
    "gelu_fast": gelu_fast,
}


def get_activation(activation_string):
    if activation_string in ACT2FN:
        return ACT2FN[activation_string]
    else:
        raise KeyError(
            "function {} not found in ACT2FN mapping {}".format(activation_string, list(ACT2FN.keys()))
        )
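
# A minimal usage sketch (illustrative addition, not part of the original
# module): resolves an activation by name via get_activation and applies it
# to a dummy tensor. The "selu" lookup only demonstrates the KeyError path,
# since "selu" is not registered in ACT2FN here.
if __name__ == "__main__":
    act = get_activation("gelu_new")   # fetch the callable from ACT2FN
    hidden = torch.randn(2, 4)         # arbitrary dummy hidden states
    print(act(hidden).shape)           # elementwise activation keeps the shape

    try:
        get_activation("selu")         # not in ACT2FN -> raises KeyError
    except KeyError as err:
        logger.warning(err)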