
Deep-Generative-Modeling

This is a repository about Deep Generative Modeling, with particular attention to probabilistic time series forecasting with Normalizing Flows.

Updating every day.

Contact me: [email protected]

2023

  • Deep Generative Wasserstein Gradient Flows. [Paper] [Code]

      In this paper, the authors present Deep Generative Wasserstein Gradient Flows (DGGF), which constructs a Wasserstein gradient flow (WGF) between two distributions by minimizing the entropy-regularized f-divergence.
    
  • Invertible normalizing flow neural networks by JKO scheme. [Paper] [Code]

      This paper develops a neural ODE flow network inspired by the Jordan-Kinderlehrer-Otto (JKO) scheme, which allows an efficient block-wise training procedure.
    

🔥 Book for generative modeling

2022

  • Deep Generative Modeling (with code) [Book]

🔥 Papers for generative modeling

2022

  • Poisson Flow Generative Models. [Paper] [Code] [Datasets 1] [Datasets 2]

      This work proposes a new "Poisson flow" generative model (PFGM) that maps a uniform distribution on a high-dimensional hemisphere into any data distribution.
    
  • Fair Normalizing Flows. [Paper]

      In this work, we present Fair Normalizing Flows (FNF), a new approach offering more rigorous fairness guarantees for learned representations.
    
  • Autoregressive Quantile Flows for Predictive Uncertainty Estimation. [Paper]

      These authors propose autoregressive quantile flows, a flexible class of normalizing flow models trained using a novel objective based on proper scoring rules.
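
      A minimal numpy sketch (not the paper's code) of the pinball/quantile loss, the kind of proper scoring rule such quantile models can be trained with; averaging it over a grid of quantile levels is, up to a constant factor, an approximation of the CRPS.

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) loss: a proper scoring rule for the tau-quantile."""
    diff = y - q_pred
    return np.maximum(tau * diff, (tau - 1.0) * diff)

# Toy forecast: quantiles of a N(1.0, 0.5) predictive distribution.
rng = np.random.default_rng(0)
taus = np.linspace(0.05, 0.95, 19)
q_preds = np.quantile(rng.normal(1.0, 0.5, 10_000), taus)
y_obs = 1.3
score = np.mean([pinball_loss(y_obs, q, t) for q, t in zip(q_preds, taus)])
print(f"mean quantile score: {score:.4f}")
```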
    
  • Probabilistic Forecasting through Reformer Conditioned Normalizing Flows. [Thesis]

      This thesis introduces a new model for forecasting, the Reformer Masked Autoregressive Model (RMAF), based on the Transformer Masked Autoregressive Flow (TMAF), in which the Transformer part of the model is replaced with a Reformer.
    
  • E(n) Equivariant Normalizing Flows. [Paper] [Code]

      This paper introduces equivariant graph neural networks into the normalizing flow framework, which combine to give invertible equivariant functions. The authors demonstrate that their flow outperforms prior equivariant models and allows sampling of molecular configurations with positions, atom types, and charges.
    
  • BayesFlow: Learning complex stochastic models with invertible neural networks. [Paper] [Code]

      This paper proposes BayesFlow, a novel method for globally amortized Bayesian inference based on invertible neural networks. The method uses simulation to learn a global estimator for the probabilistic mapping from observed data to underlying model parameters.
    
  • PFVAE: A Planar Flow-Based Variational Auto-Encoder Prediction Model for Time Series Data. [Paper]

      This paper proposes a novel planar flow-based variational auto-encoder prediction model (PFVAE), which uses a long short-term memory (LSTM) network as the auto-encoder and designs the variational auto-encoder (VAE) as a time series data predictor to overcome noise effects.
    
  • Multi-scale Attention Flow for Probabilistic Time Series Forecasting. [Paper]

      The authors propose a novel non-autoregressive deep learning model, Multi-scale Attention Normalizing Flow (MANF), which integrates multi-scale attention and relative position information; the multivariate data distribution is represented by a conditioned normalizing flow.
    
  • Embedded-model flows: Combining the inductive biases of model-free deep learning and explicit probabilistic modeling. [Paper]

      This paper proposes embedded-model flows (EMF), which alternate general-purpose transformations with structured layers that embed domain-specific inductive biases.
    
  • Efficient CDF Approximations for Normalizing Flows. [Paper - under double-blind review] [Code]

      In this paper, the authors build upon the diffeomorphic properties of normalizing flows and leverage the divergence theorem to estimate the CDF over a closed region in target space in terms of the flux across its boundary, as induced by the normalizing flow.
    
  • Short-Term Density Forecasting of Low-Voltage Load using Bernstein-Polynomial Normalizing Flows. [Paper] [Code]

      These authors propose an approach for flexible conditional density forecasting of short-term load based on Bernstein polynomial normalizing flows, where a neural network controls the parameters of the flow. 
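
      A minimal numpy sketch (hypothetical coefficients, not the paper's implementation) of a monotone Bernstein-polynomial transformation; in the paper, a neural network outputs the coefficients from the conditioning features.

```python
import numpy as np
from math import comb

def bernstein_transform(z, theta):
    """Map z in [0, 1] through f(z) = sum_k theta_k * B_{k,M}(z).
    The map is monotone (hence invertible) if theta is non-decreasing."""
    M = len(theta) - 1
    basis = np.array([comb(M, k) * z**k * (1 - z)**(M - k) for k in range(M + 1)])
    return np.dot(theta, basis)

def bernstein_derivative(z, theta):
    """df/dz, needed for the log-det-Jacobian term in the flow likelihood."""
    M = len(theta) - 1
    diffs = np.diff(theta)  # theta_{k+1} - theta_k
    basis = np.array([comb(M - 1, k) * z**k * (1 - z)**(M - 1 - k) for k in range(M)])
    return M * np.dot(diffs, basis)

rng = np.random.default_rng(0)
theta = np.sort(rng.random(9))   # non-decreasing coefficients -> monotone map
z = 0.3
print(bernstein_transform(z, theta), bernstein_derivative(z, theta))
```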
    
  • EXIT: Extrapolation and Interpolation-based Neural Controlled Differential Equations for Time-series Classification and Forecasting. [Paper]

      This paper proposes to i) generate another latent continuous path using an encoder-decoder architecture, corresponding to the interpolation process of NCDEs (a neural network-based interpolation instead of the existing explicit interpolation), and ii) exploit the generative characteristic of the decoder, i.e., extrapolation beyond the time domain of the original data when needed.
    
  • Normalizing Flows with Multi-Scale Autoregressive Priors. [Paper] [Code]

      In this work, we improve the representational power of flow-based models by introducing channel-wise dependencies in their latent space through multi-scale autoregressive priors (mAR). 
    

2021

  • Probabilistic Forecast of Time Series with Transformers and Normalizing Flows. [Thesis]

      This thesis aims to understand normalizing flows and do multivariate probabilistic forecasting using normalizing flows conditioned on autoregressive models like GRUs and Transformers.
    
  • CInC Flow: Characterizable Invertible 3x3 Convolution. [Paper] [Code]

      This paper seeks to improve emerging convolutions, which were expensive. The authors investigate the conditions (e.g. padding) under which 3x3 convolutions are invertible and obtain significant speedups. They also develop a more expressive, invertible Quad coupling layer.
    
  • Orthogonalizing Convolutional Layers with the Cayley Transform. [Paper] [Code]

      The authors parameterize multichannel convolutions to be orthogonal via the Cayley transform (skew-symmetric convolutions in the Fourier domain), which enables the inverse to be computed efficiently.
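
      A minimal numpy sketch of the Cayley transform on a plain matrix (the paper applies it to convolutions in the Fourier domain): a skew-symmetric A maps to an orthogonal W = (I - A)(I + A)^{-1}, so the layer is easy to invert.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.normal(size=(n, n))
A = B - B.T                                  # skew-symmetric: A^T = -A
I = np.eye(n)
W = (I - A) @ np.linalg.inv(I + A)           # Cayley transform
print(np.allclose(W.T @ W, I))               # True: W is orthogonal
```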
    
  • Automatic variational inference with cascading flows. [Paper]

      This paper combines the flexibility of normalizing flows and the prior-embedding property of ASVI in a new family of variational programs, named cascading flows.
    
  • Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models. [Paper] [Datasets Used]

      This compendium covers energy-based models, variational autoencoders, generative adversarial networks, autoregressive models, and normalizing flows, in addition to numerous hybrid approaches.
    
  • Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows. [Paper] [Code]

      This paper models the multivariate temporal dynamics of time series via an autoregressive deep learning model, where the data distribution is represented by a conditioned normalizing flow.
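
      A minimal numpy sketch of the core idea, with hypothetical shapes and weights rather than the paper's code: an RNN hidden state h_t conditions a simple affine flow whose change-of-variables log-likelihood scores the multivariate observation x_t.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H = 3, 8                                  # observation dim, hidden dim
W_mu = rng.normal(size=(D, H)) * 0.1         # conditioner weights (toy)
W_s = rng.normal(size=(D, H)) * 0.1

def log_prob(x_t, h_t):
    """log p(x_t | h_t) under a conditional affine flow with a N(0, I) base."""
    mu, log_s = W_mu @ h_t, W_s @ h_t        # condition-dependent shift and log-scale
    z = (x_t - mu) * np.exp(-log_s)          # inverse flow: data -> base space
    log_base = -0.5 * np.sum(z**2 + np.log(2 * np.pi))
    return log_base - np.sum(log_s)          # change-of-variables correction

h_t = rng.normal(size=H)                     # e.g. produced by an RNN over x_{<t}
x_t = rng.normal(size=D)
print(log_prob(x_t, h_t))
```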
    
  • Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting. [Paper] [Code]

      In this work, the authors propose TimeGrad, an autoregressive model for multivariate probabilistic time series forecasting which samples from the data distribution at each time step by estimating its gradient.
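
      A minimal numpy sketch of a single denoising-diffusion (DDPM-style) ancestral sampling step, the kind of update TimeGrad performs per forecast time step; `eps_model` is a hypothetical stand-in for the learned, RNN-conditioned noise-prediction network.

```python
import numpy as np

rng = np.random.default_rng(0)

def reverse_step(x_t, t, betas, eps_model):
    """Sample x_{t-1} ~ p_theta(x_{t-1} | x_t) with sigma_t^2 = beta_t."""
    alphas = 1.0 - betas
    alpha_bar_t = np.prod(alphas[: t + 1])
    eps_hat = eps_model(x_t, t)              # predicted noise
    mean = (x_t - betas[t] / np.sqrt(1.0 - alpha_bar_t) * eps_hat) / np.sqrt(alphas[t])
    noise = rng.normal(size=x_t.shape) if t > 0 else 0.0
    return mean + np.sqrt(betas[t]) * noise

betas = np.linspace(1e-4, 0.02, 100)         # noise schedule
eps_model = lambda x, t: np.zeros_like(x)    # dummy network for illustration
x = rng.normal(size=3)                       # start from Gaussian noise
for t in reversed(range(100)):               # iterate t = T-1, ..., 0
    x = reverse_step(x, t, betas, eps_model)
print(x)
```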
    
  • RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting. [Paper]

       In this work, we consider the time-series data as a random realization from a nonlinear state-space model and target Bayesian inference of the hidden states for probabilistic forecasting. 
    
  • Masked Autoencoder for Distribution Estimation on Small Structured Data Sets. [Paper]

      In this article, we propose two autoencoders for estimating the density of a small set of observations, where the data have a known Markov random field (MRF) structure.
    

2020

  • Invertible DenseNets. [Paper]

      We introduce Invertible Dense Networks (i-DenseNets), a more parameter efficient alternative to Residual Flows. The method relies on an analysis of the Lipschitz continuity of the concatenation in DenseNets, where we enforce the invertibility of the network by satisfying the Lipschitz constraint.
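
      A minimal numpy sketch of the residual-flow mechanism that i-DenseNets build on: if the residual branch g has Lipschitz constant below one, x -> x + g(x) is invertible and the inverse can be recovered by fixed-point iteration (toy dense layer, not the paper's DenseNet concatenations).

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))
W = 0.5 * W / np.linalg.norm(W, 2)           # rescale so Lip(g) <= 0.5 < 1

def g(x):                                    # contractive residual branch
    return np.tanh(W @ x)

def forward(x):
    return x + g(x)

def inverse(y, n_iter=50):
    x = y.copy()
    for _ in range(n_iter):                  # Banach fixed-point iteration
        x = y - g(x)
    return x

x = rng.normal(size=4)
print(np.allclose(inverse(forward(x)), x))   # True
```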
    
  • VideoFlow: A Conditional Flow-Based Model for Stochastic Video Generation. [Paper] [Code]

       This work is the first to propose multi-frame video prediction with normalizing flows, which allows for direct optimization of the data likelihood, and produces high-quality stochastic predictions.
    
  • Normalizing Kalman Filters for Multivariate Time Series Analysis. [Paper]

      This paper presents a novel approach reconciling classical state space models with deep learning methods.
    
  • Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows. [Paper] [Code]

      In this work, we propose a novel type of normalizing flow driven by a differential deformation of the Wiener process.
    
  • Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification. [Paper] [Code]

      In this work, the authors develop the theory and methodology of IB-INNs, a class of conditional normalizing flows where INNs are trained using the IB objective: introducing a small amount of controlled information loss allows for an asymptotically exact formulation of the IB, while keeping the INN's generative capabilities intact.
    

2019

  • Block Neural Autoregressive Flow. [Paper]

      This paper proposes block neural autoregressive flow (B-NAF), a much more compact universal approximator of density functions, where a bijection is modelled directly using a single feed-forward network.
    
  • Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design. [Paper] [Code] [Dataset Used 1] [Dataset Used 2]

      In this paper, we investigate and improve upon three limiting design choices employed by flow-based models in prior work: the use of uniform noise for dequantization, the use of inexpressive affine flows, and the use of purely convolutional conditioning networks in coupling layers. 
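
      A minimal numpy sketch of the uniform-dequantization baseline that Flow++'s variational dequantization improves on: discrete 8-bit values are made continuous by adding U[0, 1) noise before the flow models them.

```python
import numpy as np

rng = np.random.default_rng(0)
x_discrete = rng.integers(0, 256, size=(4, 3, 32, 32))     # toy 8-bit images
u = rng.uniform(0.0, 1.0, size=x_discrete.shape)           # uniform noise
x_dequant = (x_discrete + u) / 256.0                        # continuous input in [0, 1)
# Flow++ instead draws u from a learned q(u | x), which tightens the
# variational bound on the discrete log-likelihood.
print(x_dequant.min(), x_dequant.max())
```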
    
  • Sum-of-Squares Polynomial Flow. [Paper]

      Based on triangular maps, this paper proposes a general framework for high-dimensional density estimation, by specifying one-dimensional transformations (equivalently conditional densities) and appropriate conditioner networks.
    

2018

  • Neural Processes. [Paper] [Code]

      This paper introduces a class of neural latent variable models called Neural Processes (NPs), combining the best of both worlds. Like Gaussian processes, NPs define distributions over functions, are capable of rapid adaptation to new observations, and can estimate the uncertainty in their predictions.
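
      A minimal numpy sketch of the Neural Process forward pass, with random weights and hypothetical dimensions for illustration: encode each context (x, y) pair, aggregate with a permutation-invariant mean, and decode a predictive mean and standard deviation at a target input.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 8                                         # representation dimension
W_enc = rng.normal(size=(R, 2)) * 0.5         # encoder for [x_i, y_i]
W_dec = rng.normal(size=(2, R + 1)) * 0.5     # decoder for [x_target, r]

def predict(x_ctx, y_ctx, x_target):
    r_i = np.tanh(W_enc @ np.stack([x_ctx, y_ctx]))   # per-point representations
    r = r_i.mean(axis=1)                              # aggregate (order-invariant)
    out = W_dec @ np.concatenate([[x_target], r])
    mu, sigma = out[0], np.exp(out[1])                # predictive distribution
    return mu, sigma

x_ctx, y_ctx = rng.normal(size=5), rng.normal(size=5)
print(predict(x_ctx, y_ctx, x_target=0.2))
```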
    
  • Sylvester Normalizing Flows for Variational Inference. [Paper] [Code]

      We introduce Sylvester normalizing flows, which can be seen as a generalization of planar flows. Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible. 
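
      A minimal numpy sketch of a planar flow, the single-unit transformation that Sylvester flows generalize to full matrices (not the paper's code; in general u must additionally be constrained so that the map stays invertible).

```python
import numpy as np

def planar_flow(z, u, w, b):
    """f(z) = z + u * tanh(w·z + b), with the log|det Jacobian| needed for the
    change-of-variables formula. (Invertibility requires w·u >= -1.)"""
    a = np.tanh(w @ z + b)
    f = z + u * a
    psi = (1.0 - a**2) * w                    # d tanh(w·z + b) / dz
    log_det = np.log(np.abs(1.0 + u @ psi))   # det(I + u psi^T) = 1 + u·psi
    return f, log_det

rng = np.random.default_rng(0)
z = rng.normal(size=4)
u, w, b = rng.normal(size=4), rng.normal(size=4), 0.1
print(planar_flow(z, u, w, b))
```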
    

2017

  • Masked Autoregressive Flow for Density Estimation. [Paper] [Code] [Dataset Used 1] [Dataset Used 2] [Dataset Used 3] [Dataset Used 4]

     These authors describe an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data.
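
      A minimal 2-D numpy sketch of the MAF density computation, with toy linear conditioners standing in for the paper's MADE networks.

```python
import numpy as np

def maf_log_prob(x):
    """log p(x1, x2) where u_i = (x_i - mu_i(x_{<i})) * exp(-alpha_i(x_{<i}))."""
    # Autoregressive conditioners: mu_1, alpha_1 are constants; mu_2, alpha_2
    # may depend only on x_1 (toy choices for illustration).
    mu1, alpha1 = 0.0, 0.0
    mu2, alpha2 = 0.5 * x[0], 0.1 * x[0]
    u = np.array([(x[0] - mu1) * np.exp(-alpha1),
                  (x[1] - mu2) * np.exp(-alpha2)])
    log_base = -0.5 * np.sum(u**2 + np.log(2 * np.pi))   # standard normal base
    return log_base - (alpha1 + alpha2)                   # log|det| correction

print(maf_log_prob(np.array([0.3, -1.2])))
```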
    
  • Conditional Recurrent Flow: Conditional Generation of Longitudinal Samples with Applications to Neuroimaging. [Paper] [Dataset Used]

      These authors seek to develop a conditional generative model for longitudinal data generation by designing an invertible neural network. Inspired by the recurrent nature of longitudinal data, they propose a novel neural network that incorporates a recurrent subnetwork and context gating to include smooth transitions in a sequence of generated data.
    
