U ۦb @sDddlZddlZddlmZddlmZddZddZ ddZ dS) N) rearrangecCsVtjt|dddddd}t||jddd}t||d|d}|S) Nzb c h w -> b c (h w)T)dimkeepdimr)rgư>)thmedianr unsqueezeabsmean)inputstargetsZ median_valuescaleZnllr1/home/zsyue/code/python/GradDiff/models/losses.pynll_l1 s rcsd||||fD]}t|tjr|q*qdk s:tdfdd||fD\}}dd||t||||dt| S)z Compute the KL divergence between two gaussians. Shapes are automatically broadcasted, so batches can be compared to scalars, among other use cases. Nz&at least one argument must be a Tensorcs,g|]$}t|tjr|nt|qSr) isinstancerTensortensorto).0xrrr "sznormal_kl..g?g)rrrAssertionErrorexp)mean1logvar1mean2logvar2objrrr normal_kls(    r#c Cs|j|jkr|jks nt||}t| }||d}t|}||d}t|}t|jdd} td|jdd} ||} t|dk| t|dk| t| jdd} | j|jkst| S)a{ Compute the log-likelihood of a Gaussian distribution discretizing to a given image. :param x: the target images. It is assumed that this was uint8 values, rescaled to the range [-1, 1]. :param means: the Gaussian mean Tensor. :param log_scales: the Gaussian log stddev Tensor. :return: a tensor like x of log probabilities (in nats). gp?g-q=)ming?g+g+?)shaperrrZapprox_standard_normal_cdflogclampwhere) rmeans log_scalesZ centered_xZinv_stdvZplus_inZcdf_plusZmin_inZcdf_minZ log_cdf_plusZlog_one_minus_cdf_minZ cdf_delta log_probsrrr#discretized_gaussian_log_likelihood/s"    r,) torchrnumpynptorch.nnnneinopsrrr#r,rrrrs