This repository contains the code and data used in the study "Deep Learning with Passive Optical Nonlinear Mapping." We constructed a passive optical nonlinear mapping setup with a multiple-scattering cavity and demonstrate learning from the highly nonlinear optical features it produces. The repository provides the code used for training on and analyzing the data produced by the setup.
- `data/`: Contains partial data collected through the passive optical nonlinear mapping setup (only a subset is included because of GitHub's storage limits).
- `code/`: Contains the code for training and analyzing the data produced by the setup.
- **Data**: The partial data reconstructed through the passive optical nonlinear mapping setup can be found in the `data/` folder; it includes FashionMNIST reconstruction results.
- **Code**: The `code/` folder contains the code used for training and analyzing the data produced by the setup, including scripts for preprocessing the data, training models, and evaluating performance.
- **Training and Analysis**: To get started, navigate to the `code/` folder and run the Python scripts provided; a minimal sketch of a typical workflow is shown below. Detailed instructions on how to run the code and reproduce the results are given within the scripts.
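As a starting point, the sketch below shows one way a linear readout could be trained on recorded optical features and evaluated on FashionMNIST labels. It is a minimal illustration under assumed conventions, not the repository's actual pipeline: the file names `data/fashionmnist_features.npy` and `data/fashionmnist_labels.npy` are hypothetical placeholders, and the ridge-regression readout is a generic stand-in for the models trained by the scripts in `code/`.

```python
import numpy as np

# Hypothetical file names -- substitute the actual files found in data/.
features = np.load("data/fashionmnist_features.npy")  # (n_samples, n_features) optical features
labels = np.load("data/fashionmnist_labels.npy")      # (n_samples,) integer class labels

# Simple random train/test split.
rng = np.random.default_rng(0)
idx = rng.permutation(len(features))
split = int(0.8 * len(features))
train, test = idx[:split], idx[split:]

# One-hot encode the labels for a least-squares readout.
n_classes = int(labels.max()) + 1
Y = np.eye(n_classes)[labels]

# Ridge-regression readout: W = (X^T X + lambda * I)^{-1} X^T Y.
X = features[train]
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y[train])

# Evaluate classification accuracy on the held-out samples.
pred = (features[test] @ W).argmax(axis=1)
acc = (pred == labels[test]).mean()
print(f"Test accuracy: {acc:.3f}")
```

A linear readout like this is the simplest way to probe how much class information the nonlinear optical mapping has already extracted; the scripts in `code/` document the actual preprocessing, models, and evaluation used in the paper.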
If you use any of the datasets or code in this repository for your research, please consider citing our work:
Xia, F., Kim, K., Eliezer, Y., Shaughnessy, L., Gigan, S., & Cao, H. (2023). Deep Learning with Passive Optical Nonlinear Mapping. arXiv preprint arXiv:2307.08558.
```bibtex
@article{xia2023deep,
  title={Deep Learning with Passive Optical Nonlinear Mapping},
  author={Xia, Fei and Kim, Kyungduk and Eliezer, Yaniv and Shaughnessy, Liam and Gigan, Sylvain and Cao, Hui},
  journal={arXiv preprint arXiv:2307.08558},
  year={2023}
}
```
The code in this repository is released under the terms of the license provided in this repository as License.txt.
Fei Xia, Kyungduk Kim, Yaniv Eliezer, Liam Shaughnessy
For any questions, issues, or concerns, please feel free to open an issue on this repository or contact Fei Xia ([email protected]).