
Time-multiplexed Neural Holography: A Flexible Framework for Holographic Near-eye Displays with Fast Heavily-quantized Spatial Light Modulators
SIGGRAPH 2022

PyTorch implementation of
Time-multiplexed Neural Holography: A Flexible Framework for Holographic Near-eye Displays with Fast Heavily-quantized Spatial Light Modulators
Suyeon Choi*, Manu Gopakumar*, Yifan Peng, Jonghyun Kim, Matthew O'Toole, Gordon Wetzstein
*denotes equal contribution
in SIGGRAPH 2022

Get started

Our code uses PyTorch Lightning and PyTorch >=1.10.0.

You can set up a conda environment with all dependencies like so:

conda env create -f env.yml
conda activate tmnh

High-level structure

The code is organized as follows:

./

  • main.py generates phase patterns from LF/RGBD/RGB data using SGD.

  • holo2lf.py contains the Light-field ↔ Hologram conversion implementations.

  • algorithms.py contains the gradient-descent-based algorithms for LF/RGBD/RGB supervision.

  • params.py contains our default parameter settings. ❗(Replace values here with those in your setup.)

  • quantization.py contains the quantization modules (projected gradient, sigmoid, Gumbel-Softmax).

  • image_loader.py contains data loader modules.

  • utils.py has some other utilities.
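The Gumbel-Softmax mode listed for quantization.py can be illustrated with a minimal sketch: quantize continuous phase values to a set of discrete SLM levels while keeping the operation differentiable. All names and signatures below are hypothetical, not the repo's actual API.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_quantize(phase, levels, tau=1.0):
    """Differentiably snap continuous phase values to discrete levels.

    Illustrative sketch only; the actual quantization.py differs in detail.
    """
    # Negative squared distance to each allowed level serves as logits.
    logits = -(phase.unsqueeze(-1) - levels) ** 2
    # Draw a (hard) one-hot sample over levels; gradients flow via the
    # straight-through Gumbel-Softmax estimator.
    one_hot = F.gumbel_softmax(logits, tau=tau, hard=True)
    return (one_hot * levels).sum(-1)

phase = torch.rand(4, 4) * 6.2831853
levels = torch.linspace(0.0, 6.2831853, 4)
q = gumbel_softmax_quantize(phase, levels)
```

In the forward pass each output value is exactly one of the allowed levels; in the backward pass the soft relaxation provides usable gradients for SGD.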

./props/ contains the wave propagation operators (simulated and physical).
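A standard simulated propagation operator of the kind a ./props/ module would implement is angular-spectrum (ASM) free-space propagation. The sketch below is illustrative; the function name and signature are assumptions, not the repo's interface.

```python
import math
import torch

def asm_propagate(field, z, wavelength, dx):
    """Angular-spectrum free-space propagation over distance z.

    Illustrative sketch: field is a complex 2-D tensor, dx the pixel pitch.
    """
    n, m = field.shape[-2:]
    fx = torch.fft.fftfreq(m, d=dx)
    fy = torch.fft.fftfreq(n, d=dx)
    fyy, fxx = torch.meshgrid(fy, fx, indexing="ij")
    # Transfer function H = exp(i 2π z sqrt(1/λ² − fx² − fy²)),
    # with evanescent components clamped out.
    arg = torch.clamp((1.0 / wavelength) ** 2 - fxx ** 2 - fyy ** 2, min=0.0)
    h = torch.exp(2j * math.pi * z * torch.sqrt(arg))
    return torch.fft.ifft2(torch.fft.fft2(field) * h)
```

Propagating by z and then by −z recovers the original field (up to floating-point error), which is a convenient sanity check for any implementation.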

./hw/ contains modules for hardware control and homography calibration.

  • ti.py contains data given by Texas Instruments.
  • ti_encodings.py contains phase encoding and decoding functionalities for the TI SLM.
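The kind of encode/decode step ti_encodings.py provides can be sketched as mapping continuous phase (radians) to a few bits per pixel, which is what "heavily quantized" means for fast TI phase SLMs. The bit depth and mapping below are illustrative assumptions, not the actual TI encoding.

```python
import numpy as np

BITS = 4  # illustrative bit depth for a heavily-quantized phase SLM

def encode_phase(phase, bits=BITS):
    """Map continuous phase (radians) to integer codes (sketch only)."""
    levels = 2 ** bits
    wrapped = np.mod(phase, 2 * np.pi)
    return (np.round(wrapped / (2 * np.pi) * levels) % levels).astype(np.uint8)

def decode_phase(code, bits=BITS):
    """Inverse mapping back to radians in [0, 2π)."""
    return code.astype(np.float64) / 2 ** bits * 2 * np.pi
```

Round-tripping any phase through encode/decode introduces at most half a quantization step of error (π/16 radians at 4 bits, modulo 2π).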

Run

To run, download the sample images from here and place the contents in the data/ folder.

Dataset generation / Model training

Please see the supplement and Neural 3D Holography repo for more details on dataset generation and model training.

# Train TMNH models
for c in 0 1 2
do
  python train.py -c=configs_model_training.txt --channel=$c --data_path=${dataset_path}
done

Run SGD with various target distributions (RGB images, focal stacks, and light fields)

for c in 0 1 2
do
  # 2D rgb images
  python main.py -c=configs_2d.txt --channel=$c
  # 3D focal stacks
  python main.py -c=configs_3d.txt --channel=$c
  # 4D light fields
  python main.py -c=configs_4d.txt --channel=$c
done

Run SGD with advanced quantizations

q=gumbel-softmax  # try none, nn, nn_sigmoid as well.
for c in 0 1 2
do
  python main.py -c=configs_2d.txt --channel=$c --quan_method=$q
done

Citation

If you find our work useful in your research, please cite:

@inproceedings{choi2022time,
  author    = {Choi, Suyeon and Gopakumar, Manu and Peng, Yifan and Kim, Jonghyun and O'Toole, Matthew and Wetzstein, Gordon},
  title     = {Time-multiplexed Neural Holography: A Flexible Framework for Holographic Near-eye Displays with Fast Heavily-quantized Spatial Light Modulators},
  booktitle = {ACM SIGGRAPH 2022 Conference Proceedings},
  pages     = {1--9},
  year      = {2022}
}

Acknowledgments

Thanks to Brian Chao for the help with code updates and Cindy Nguyen for helpful discussions. This project was in part supported by a Kwanjeong Scholarship, a Stanford SGF, Intel, NSF (award 1839974), a PECASE by the ARO (W911NF-19-1-0120), and Sony.

Contact

If you have any questions, please feel free to email the authors.
