YemenOpenSource/DDPM
DDPM 🎨


🎨"Denoising Diffusion Probabilistic Models" paper implementation.


Overview

Denoising Diffusion Probabilistic Models (DDPM) are a class of generative models that learn to reverse a gradual noising (diffusion) process. Starting from pure Gaussian noise, the model iteratively denoises its input, gradually transforming it into samples from the target distribution. This approach has shown strong results in generating high-quality images and has garnered considerable attention in the field of generative modeling.
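As a rough sketch of the sampling loop described above (not the repository's own code): starting from noise, each step subtracts the predicted noise and re-injects a small amount of fresh noise. The `model` callable, the linear beta schedule, and the tensor shapes here are illustrative assumptions.

```python
import torch

def ddpm_sample(model, shape, num_steps=1000, beta_start=1e-4, beta_end=0.02):
    # Linear beta schedule and derived quantities (as in Ho et al., 2020).
    betas = torch.linspace(beta_start, beta_end, num_steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    x = torch.randn(shape)  # start from pure noise x_T
    for t in reversed(range(num_steps)):
        eps = model(x, t)  # model predicts the noise added at step t
        # Posterior mean: (x_t - beta_t / sqrt(1 - alpha_bar_t) * eps) / sqrt(alpha_t)
        coef = betas[t] / (1 - alpha_bars[t]).sqrt()
        mean = (x - coef * eps) / alphas[t].sqrt()
        if t > 0:
            # Add noise with sigma_t = sqrt(beta_t); the last step is deterministic.
            x = mean + betas[t].sqrt() * torch.randn_like(x)
        else:
            x = mean
    return x
```

For example, `ddpm_sample(trained_model, (16, 3, 32, 32))` would produce a batch of 16 samples, assuming `trained_model(x, t)` returns a noise prediction of the same shape as `x`.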


Gaussian Distribution:

$$q(\mathbf{x}_t \vert \mathbf{x}_0) = \mathcal{N}(\mathbf{x}_t; \sqrt{\bar{\alpha}_t} \mathbf{x}_0, (1 - \bar{\alpha}_t)\mathbf{I})$$
def add_noise(self,
              original_samples: torch.FloatTensor,
              timestep: torch.IntTensor):
    # Sample from q(x_t | x_0) in closed form:
    # x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise
    alphas_cumlative_product = self.alphas_cumlative_product.to(
        device=original_samples.device, dtype=original_samples.dtype)
    timestep = timestep.to(original_samples.device)

    # sqrt(alpha_bar_t), reshaped so it broadcasts against the sample shape
    sqrt_alphas_cumlative_product = alphas_cumlative_product[timestep] ** 0.5
    sqrt_alphas_cumlative_product = sqrt_alphas_cumlative_product.flatten()
    while len(sqrt_alphas_cumlative_product.shape) < len(original_samples.shape):
        sqrt_alphas_cumlative_product = sqrt_alphas_cumlative_product.unsqueeze(-1)

    # sqrt(1 - alpha_bar_t), reshaped the same way
    sqrt_one_minus_alphas_cumlative_product = (1 - alphas_cumlative_product[timestep]) ** 0.5
    sqrt_one_minus_alphas_cumlative_product = sqrt_one_minus_alphas_cumlative_product.flatten()
    while len(sqrt_one_minus_alphas_cumlative_product.shape) < len(original_samples.shape):
        sqrt_one_minus_alphas_cumlative_product = sqrt_one_minus_alphas_cumlative_product.unsqueeze(-1)

    noise = torch.randn(original_samples.shape, generator=self.generator,
                        device=original_samples.device, dtype=original_samples.dtype)
    noisy_samples = (sqrt_alphas_cumlative_product * original_samples
                     + sqrt_one_minus_alphas_cumlative_product * noise)
    return noisy_samples
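The closed-form expression for q(x_t | x_0) can be exercised in isolation. The linear beta schedule below is an illustrative assumption, not necessarily the one the repository uses:

```python
import torch

# Illustrative linear schedule over 1000 steps.
betas = torch.linspace(1e-4, 0.02, 1000)
alpha_bars = torch.cumprod(1.0 - betas, dim=0)  # cumulative product alpha_bar_t

x0 = torch.randn(4, 3, 8, 8)           # a batch of "clean" samples
t = torch.tensor([0, 99, 499, 999])    # one timestep per batch element
abar = alpha_bars[t].view(-1, 1, 1, 1) # reshape for broadcasting

noise = torch.randn_like(x0)
# x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise
xt = abar.sqrt() * x0 + (1 - abar).sqrt() * noise
```

As t grows, alpha_bar_t shrinks toward zero, so x_t becomes dominated by the noise term; at small t, x_t stays close to x_0.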
$$q(\mathbf{x}_t \vert \mathbf{x}_{t-1}) = \mathcal{N}(\mathbf{x}_t; \sqrt{1 - \beta_t} \mathbf{x}_{t-1}, \beta_t\mathbf{I}) \quad q(\mathbf{x}_{1:T} \vert \mathbf{x}_0) = \prod^T_{t=1} q(\mathbf{x}_t \vert \mathbf{x}_{t-1})$$
class GaussianDistribution:
    def __init__(self, parameters: torch.Tensor) -> None:
        # The network predicts mean and log-variance stacked along the channel dim.
        self.mean, log_variance = torch.chunk(parameters, 2, dim=1)
        # Clamp log-variance for numerical stability before exponentiating.
        self.log_variance = torch.clamp(log_variance, -30.0, 20.0)
        self.std = torch.exp(0.5 * self.log_variance)

    def sample(self):
        # Reparameterization trick: mean + std * eps with eps ~ N(0, I).
        # Note randn_like (standard normal), not rand_like (uniform).
        return self.mean + self.std * torch.randn_like(self.std)
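The reparameterization trick that the class above implements can be sketched standalone; the tensor shapes here are illustrative assumptions:

```python
import torch

params = torch.randn(2, 8, 4, 4)               # e.g. network output: 2*C channels
mean, log_var = torch.chunk(params, 2, dim=1)  # split into mean and log-variance
log_var = torch.clamp(log_var, -30.0, 20.0)    # numerical safety before exp
std = torch.exp(0.5 * log_var)
sample = mean + std * torch.randn_like(std)    # eps ~ N(0, I), not uniform
```

Sampling this way keeps the operation differentiable with respect to `mean` and `std`, which is why the noise is drawn separately and scaled rather than sampled from the distribution directly.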

Citation

@misc{ho2020denoising,
    title   = {Denoising Diffusion Probabilistic Models},
    author  = {Jonathan Ho and Ajay Jain and Pieter Abbeel},
    year    = {2020},
    eprint  = {2006.11239},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG}
}

References

Original paper: Ho, J., Jain, A., & Abbeel, P. (2020). Denoising Diffusion Probabilistic Models. arXiv:2006.11239.

