"""Some utilities for backbones, in particular for windowing"""

from typing import Tuple

import torch
import torch.nn as nn
import torch.nn.functional as F


def window_partition(x, window_size):
    """
    Partition into non-overlapping windows with padding if needed.
    Args:
        x (tensor): input tokens with [B, H, W, C].
        window_size (int): window size.
    Returns:
        windows: windows after partition with [B * num_windows, window_size, window_size, C].
        (Hp, Wp): padded height and width before partition
    """
    B, H, W, C = x.shape

    # Pad H and W up to the next multiple of window_size.
    pad_h = (window_size - H % window_size) % window_size
    pad_w = (window_size - W % window_size) % window_size
    if pad_h > 0 or pad_w > 0:
        x = F.pad(x, (0, 0, 0, pad_w, 0, pad_h))
    Hp, Wp = H + pad_h, W + pad_w

    x = x.view(B, Hp // window_size, window_size, Wp // window_size, window_size, C)
    windows = (
        x.permute(0, 1, 3, 2, 4, 5).contiguous().view(-1, window_size, window_size, C)
    )
    return windows, (Hp, Wp)


def window_unpartition(windows, window_size, pad_hw, hw):
    """
    Window unpartition into original sequences and removing padding.
    Args:
        windows (tensor): input tokens with [B * num_windows, window_size, window_size, C].
        window_size (int): window size.
        pad_hw (Tuple): padded height and width (Hp, Wp).
        hw (Tuple): original height and width (H, W) before padding.
    Returns:
        x: unpartitioned sequences with [B, H, W, C].
    """
    Hp, Wp = pad_hw
    H, W = hw
    B = windows.shape[0] // (Hp * Wp // window_size // window_size)
    x = windows.view(
        B, Hp // window_size, Wp // window_size, window_size, window_size, -1
    )
    x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B, Hp, Wp, -1)

    # Remove the padding added in window_partition.
    if Hp > H or Wp > W:
        x = x[:, :H, :W, :].contiguous()
    return x


class PatchEmbed(nn.Module):
    """
    Image to Patch Embedding.
    """

    def __init__(
        self,
        kernel_size: Tuple[int, ...] = (7, 7),
        stride: Tuple[int, ...] = (4, 4),
        padding: Tuple[int, ...] = (3, 3),
        in_chans: int = 3,
        embed_dim: int = 768,
    ):
        """
        Args:
            kernel_size (Tuple): kernel size of the projection layer.
            stride (Tuple): stride of the projection layer.
            padding (Tuple): padding size of the projection layer.
            in_chans (int): Number of input image channels.
            embed_dim (int): Patch embedding dimension.
        """
        super().__init__()
        self.proj = nn.Conv2d(
            in_chans, embed_dim, kernel_size=kernel_size, stride=stride, padding=padding
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.proj(x)
        # B C H W -> B H W C
        x = x.permute(0, 2, 3, 1)
        return x
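A quick self-contained sanity sketch (assuming PyTorch is installed): it re-implements the partition/unpartition logic as local helpers and checks that partitioning followed by unpartitioning is a lossless round trip, including when H and W are not multiples of `window_size`.

```python
# Round-trip check for the windowing utilities above.
# The two helpers mirror window_partition / window_unpartition; names with a
# leading underscore are local to this sketch.
import torch
import torch.nn.functional as F


def _partition(x, window_size):
    B, H, W, C = x.shape
    pad_h = (window_size - H % window_size) % window_size
    pad_w = (window_size - W % window_size) % window_size
    if pad_h > 0 or pad_w > 0:
        x = F.pad(x, (0, 0, 0, pad_w, 0, pad_h))
    Hp, Wp = H + pad_h, W + pad_w
    x = x.view(B, Hp // window_size, window_size, Wp // window_size, window_size, C)
    windows = (
        x.permute(0, 1, 3, 2, 4, 5).contiguous().view(-1, window_size, window_size, C)
    )
    return windows, (Hp, Wp)


def _unpartition(windows, window_size, pad_hw, hw):
    Hp, Wp = pad_hw
    H, W = hw
    B = windows.shape[0] // (Hp * Wp // window_size // window_size)
    x = windows.view(
        B, Hp // window_size, Wp // window_size, window_size, window_size, -1
    )
    x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B, Hp, Wp, -1)
    return x[:, :H, :W, :].contiguous()


# 10x13 is deliberately not divisible by window_size=7: it pads to 14x14,
# giving 2x2 = 4 windows per image and 8 windows for the batch of 2.
x = torch.arange(2 * 10 * 13 * 4, dtype=torch.float32).reshape(2, 10, 13, 4)
windows, pad_hw = _partition(x, window_size=7)
print(windows.shape)  # torch.Size([8, 7, 7, 4])
y = _unpartition(windows, window_size=7, pad_hw=pad_hw, hw=(10, 13))
print(torch.equal(x, y))  # True: padding is cropped away exactly
```

The zero-padding added before partitioning is what `(Hp, Wp)` records; passing it back to the unpartition step is what makes the crop `x[:, :H, :W, :]` exact.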