
A deep predictive processing model of episodic memory and time perception

This source code release accompanies the manuscript:

Z. Fountas, A. Sylaidi, K. Nikiforou, A. Seth, M. Shanahan, W. Roseboom, "A predictive processing model of episodic memory and time perception," Neural Computation 34.7 (2022): 1501-1544,

which is also available as a preprint.


Requirements

  • Programming language: Python 3
  • Libraries: tensorflow >= 1.12.0, matplotlib, scipy, scikit-learn, PrettyTable, networkx
  • Video dataset: To be uploaded soon

Instructions

Install Python dependencies

Assuming you have a working version of Python 3, open a terminal and type:

sudo pip install tensorflow-gpu==1.15 scikit-learn matplotlib scipy PrettyTable networkx
Run the model using any video or webcam

Assuming you have a video named video.mp4, open a terminal and type:

python run.py -i video.mp4

To run the model using your webcam, open a terminal and simply type:

python run.py
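The input selection above can be sketched as follows. This is a minimal illustration, assuming run.py uses a standard argparse `-i/--input` flag and falls back to the default webcam device when no file is given; the actual script may differ:

```python
import argparse

# Sketch of the assumed command-line interface: `-i video.mp4` selects a
# video file, while omitting it falls back to the webcam (device index 0).
parser = argparse.ArgumentParser(description="Run the predictive processing model")
parser.add_argument("-i", "--input", default=None,
                    help="path to a video file; if omitted, the webcam is used")

args = parser.parse_args(["-i", "video.mp4"])
source = args.input if args.input is not None else 0  # 0 = default webcam
print(source)  # → video.mp4
```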
Reproduce paper figures

To reproduce the results of the computational experiments presented in the paper, please follow these steps:

  1. Run the model for a short episode in each video (frames 1 to XXX) to train its semantic memory:
python step1.py

This will create the file semantic_memory.pkl.
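The resulting file is a standard Python pickle and can be inspected from an interpreter. A minimal sketch, using a dummy dictionary because the actual structure of semantic_memory.pkl is an assumption here:

```python
import pickle

# Write and read back a dummy structure the way step1.py presumably
# serializes semantic memory (the real keys and values may differ).
dummy_memory = {"layer_0": [0.1, 0.2], "layer_1": [0.3]}
with open("semantic_memory_demo.pkl", "wb") as f:
    pickle.dump(dummy_memory, f)

with open("semantic_memory_demo.pkl", "rb") as f:
    loaded = pickle.load(f)

print(loaded == dummy_memory)  # → True
```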

  2. Generate the time series of surprises for each trial; the output files are written to the folder /trials_surpr:
python step2.py python timestorm_dataset
  3. Run a large number of trials to estimate the average number of children stored in episodic memory per layer:
python step3_save_av_children.py

This will create a file called all_av_children.pkl.

  4. To produce a figure similar to Fig.1 of the manuscript purely based on the accumulators, type

python step4_accumulators_fig_different_att_only_used_trials.py

This uses different attention levels for prospective and retrospective trials, and different effort for high and low cognitive load.

  5. Finally, to produce the final figures with model reports based on linear regression, type

python step5_linear.py

This uses the file produced in step 4 and sklearn's LinearSVR.
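The regression in step 5 can be sketched as follows: fit sklearn's LinearSVR to map accumulator totals to reported durations. The data below is synthetic and the feature/target names are assumptions; the real inputs come from the files produced in step 4:

```python
import numpy as np
from sklearn.svm import LinearSVR

# Synthetic stand-in for the step-4 output: one accumulator total per trial
# (feature) and a noisy, roughly linear duration report (target).
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 60.0, size=(200, 1))         # hypothetical accumulator totals
y = 0.8 * X[:, 0] + rng.normal(0.0, 1.0, 200)     # hypothetical duration reports

model = LinearSVR(C=1.0, max_iter=10000)
model.fit(X, y)

# Predict the reported duration for a trial with an accumulator total of 30.
pred = model.predict(np.array([[30.0]]))
```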

License

This project is licensed under the GPLv3 License - see the LICENSE file for details.
